How to handle time-consuming PHP scripts

Sometimes you have to write PHP scripts that can take a long time to finish.
For example: creating or restoring backups, installing CMS demos, parsing large amounts of data, etc.
To make sure your scripts work as expected, you need to know and remember a few things.

Timeouts

The first thing to do is set the max_execution_time parameter in your PHP configuration.
If the script is executed by the web server (i.e. in response to an HTTP request from a user), then you will also have to set the correct timeout parameters in the web server config.
For Apache these are TimeOut and FastCgiServer … -idle-timeout (if you are using FastCGI).
For Nginx they are send_timeout and fastcgi_read_timeout (if you are using FastCGI).
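
If the limit cannot be raised in php.ini, it can also be lifted from the script itself; a minimal sketch (note that this does not affect the web server timeouts mentioned above):

// lift the PHP execution time limit for this run only; 0 means "no limit"
// (web server timeouts still apply and have to be raised separately)
set_time_limit(0);
// equivalent: ini_set('max_execution_time', 0);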

The web server can also proxy requests to another web server that will in turn execute the PHP script (e.g. Nginx as frontend, Apache as backend). In this case you will have to set the correct timeout parameters for the proxy as well.
For Apache it's ProxyTimeout, and for Nginx it's proxy_read_timeout.

User Interruption

If the script is executed in response to an HTTP request, the user can easily interrupt it by cancelling the request in their browser.
If you want the PHP script to keep running even after the user has cancelled/interrupted the request, set the ignore_user_abort PHP parameter to TRUE.
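
This can be set in php.ini or directly at the start of the script; a minimal sketch:

// keep running even if the user cancels the request or closes the browser
ignore_user_abort(true);
// usually combined with removing the execution time limit
set_time_limit(0);

// connection_aborted() still reports whether the client has gone away,
// in case the script wants to react to it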

Loss of Open Connection

If the script opens a connection to another service (DB, mail, FTP...) and that connection happens to close or time out, the script will fail to execute correctly.
For example, if during the execution of the script we open a connection to MySQL and then don't use it for a while, MySQL will close the connection according to its wait_timeout parameter.
In this case the first thing to try is increasing the connection timeout. For MySQL, for example, we can run the following query:

SET SESSION wait_timeout = 9999

But if we don't have that option, we can check whether the connection is still open at the points in the code where it could have timed out, and re-connect if needed.
For example, the mysqli extension has a useful method, mysqli::ping, for checking the status of the connection. There is also the configuration parameter mysqli.reconnect for automatic re-connection if the connection closes.
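
A minimal sketch of such a check with mysqli (the class name, host and credentials are placeholders):

class MysqlConnection
{
    private $db;

    public function connect()
    {
        // placeholder credentials
        $this->db = new mysqli('db.server', 'user', 'password', 'database');
    }

    public function reconnect()
    {
        // mysqli::ping checks whether the server is still reachable;
        // with mysqli.reconnect enabled it can re-establish the connection itself
        if (!$this->db->ping())
            $this->connect();
    }
}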
If you are trying to do the same thing for another kind of service and there is no similar configuration option or function available, you can write one yourself.
The function should check the connection to your service and, in case of an error (you can use try/catch), re-connect. For example:

class FtpConnection
{
    private $ftp;

    public function connect()
    {
        // open the FTP connection (placeholder server name)
        $this->ftp = ftp_connect('ftp.server');
        ...
    }

    public function reconnect()
    {
        try
        {
            // ftp_pwd fails on a dead connection, so it works as a cheap check
            if (!ftp_pwd($this->ftp))
                $this->connect();
        }
        catch (\Exception $e)
        {
            $this->connect();
        }
    }

    ...
}

or

class MssqlConnection
{
    private $db;

    public function connect()
    {
        // open the MSSQL connection (placeholder server name)
        $this->db = mssql_connect('mssql.server');
        ...
    }

    public function reconnect()
    {
        try
        {
            // a trivial query works as a connection check
            if (!mssql_query('SELECT 1', $this->db))
                $this->connect();
        }
        catch (\Exception $e)
        {
            $this->connect();
        }
    }

    ...
}

Parallel Execution

It's not uncommon for long scripts to be run on a schedule (cron), with the expectation that only one copy of the script is running at any given moment.
But it can happen that the script is started by the scheduler while the previous run is still going. This leads to unpredictable problems and errors.
In this case you should lock the resources being used, but that task always has to be solved individually. You can check whether another copy is running and either wait for it to finish or cancel the current execution. To do this you can inspect the list of running processes, or take a lock on the script itself, something like:

if (lockStart('script.php'))
{
    // main php code
    ...
    lockStop('script.php');
}
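
lockStart and lockStop are not built-in functions; one possible minimal implementation uses flock on a lock file:

function lockStart($name)
{
    // try to take an exclusive, non-blocking lock on a lock file
    $fp = fopen(sys_get_temp_dir() . '/' . $name . '.lock', 'c');
    if (!$fp || !flock($fp, LOCK_EX | LOCK_NB))
        return false;

    $GLOBALS['locks'][$name] = $fp;
    return true;
}

function lockStop($name)
{
    // release the lock and close the file
    if (isset($GLOBALS['locks'][$name]))
    {
        flock($GLOBALS['locks'][$name], LOCK_UN);
        fclose($GLOBALS['locks'][$name]);
        unset($GLOBALS['locks'][$name]);
    }
}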

Web Server Load

When long scripts are executed by the web server, the client connection with the web server stays open until the script finishes. This is not good, since the web server's job is to handle the request as fast as possible and return the result. If the connection stays open, one of the web server's workers (processes) is kept busy for a long time. And if many of these scripts run at the same time, all the workers (Apache MaxClients) will be busy executing them, so the server will simply stop responding to new requests, resulting in downtime.

That's why, when processing a user's request, you should run the script in the background using php-cli to keep the web server load as low as possible. If needed, you can use AJAX to check the status of the script execution.
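
A minimal sketch of starting such a background job from the request handler (the script path is a placeholder):

// launch the long-running script with the CLI binary, detached from the request:
// output is discarded and the process keeps running after the response is sent
exec('php /path/to/long-script.php > /dev/null 2>&1 &');

// the request returns immediately; the script can write its progress
// to a file or the database, which the page can then poll via AJAX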

Feel free to comment and share the article. Also don't forget to check out and use jsDelivr.