Jubin

Reputation: 51

How to set a timeout for some routes in a Laravel project?

I have a project built on Laravel, with routes for web pages and routes for the API. My question is: how can I set a different timeout for those two groups?

I tried a middleware that just calls set_time_limit, but it didn't work.
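
Roughly what I tried, as a minimal sketch (the class name and the 300-second value are just examples), registered on the api middleware group in app/Http/Kernel.php:

<?php

namespace App\Http\Middleware;

use Closure;

class RouteTimeout
{
    public function handle($request, Closure $next)
    {
        // Raise the PHP execution time limit for this request only.
        set_time_limit(300);

        return $next($request);
    }
}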

So I think I can do this in my Nginx vhost file, but I'm kind of stuck. Here is what I have ended up with so far:

server {
    listen 80;
    listen 443 ssl http2;
    server_name mysite;
    root "/home/vagrant/www/mysite/public";

    index index.html index.htm index.php;

    charset utf-8;

    location / {
        try_files $uri $uri/ /index.php?$query_string;
    }


    location = /favicon.ico { access_log off; log_not_found off; }
    location = /robots.txt  { access_log off; log_not_found off; }

    access_log off;
    error_log  /var/log/nginx/mysite-error.log error;

    sendfile off;

    client_max_body_size 100m;

    location ~ \.php$ {
        fastcgi_split_path_info ^(.+\.php)(/.+)$;
        fastcgi_pass unix:/var/run/php/php7.1-fpm.sock;
        fastcgi_index index.php;
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;

        fastcgi_intercept_errors off;
        fastcgi_buffer_size 16k;
        fastcgi_buffers 4 16k;

        fastcgi_connect_timeout 300;
        fastcgi_send_timeout 300;
        fastcgi_read_timeout 300;
    }

    location ~ ^/api/v1 {
        try_files $uri $uri/ /index.php?$query_string;
        client_body_timeout 1;
        send_timeout 1;
        fastcgi_connect_timeout 300;
        fastcgi_send_timeout 300;
        fastcgi_read_timeout 300;
    }

    location ~ /\.ht {
        deny all;
    }
}

(Of course, I set some of the timeouts to 1 just for testing.)

Does anyone have an idea on how to approach this, please?

Thanks!

Upvotes: 2

Views: 4953

Answers (2)

mewm

Reputation: 1277

The reason your config isn't working is that the try_files directive redirects the request to your location ~ \.php$ block, so the timeouts you set in the /api/v1 location never apply. To avoid that, you have to remove try_files from your specific route and hardcode fastcgi_param SCRIPT_FILENAME.

This is how I solved the issue; I had to allow a longer timeout for a route used for video uploads:

  • max_execution_time left at 30s in php.ini
  • the default request_terminate_timeout for PHP-FPM (which is 0, i.e. no hard kill)
  • set_time_limit(1800); at the top of my Laravel controller (the one that resolves /api/posts), as sketched right after this list
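
The controller part is just one call at the top of the action; a minimal sketch (class and method names are illustrative):

<?php

namespace App\Http\Controllers;

use Illuminate\Http\Request;

class PostController extends Controller
{
    public function store(Request $request)
    {
        // Allow this request up to 30 minutes; this has to stay within
        // the fastcgi_read_timeout configured in Nginx below.
        set_time_limit(1800);

        // ... handle the video upload ...
    }
}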

And then use nginx locations like this:

# Long-running upload route: no try_files here, so this block's timeouts apply
location ~ ^/api/posts {
    include fastcgi_params;
    fastcgi_connect_timeout 30s;
    fastcgi_read_timeout 1800s;
    fastcgi_send_timeout 1800s;
    fastcgi_buffers 256 4k;
    fastcgi_param SCRIPT_FILENAME '${document_root}/index.php';
    fastcgi_pass php:9000;
}

# Everything else: default 30-second timeouts
location ~ \.php$ {
    try_files $uri /index.php?$query_string;
    include fastcgi_params;
    fastcgi_connect_timeout 30s;
    fastcgi_read_timeout 30s;
    fastcgi_send_timeout 30s;
    fastcgi_buffers 256 4k;
    fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
    fastcgi_pass php:9000;
}

You might have to adjust the params for your use case.

For the record, I use PHP 7.3 and nginx 1.12.2.

Upvotes: 6

Alexander Kolin

Reputation: 69

You can't do this with Nginx alone; you have to set the maximum timeout in Nginx and control your application timeout in your classes or in php.ini.

Here is the answer, including info on why it is so:

So set request_terminate_timeout (a PHP-FPM pool setting, not an Nginx one) to a very high value, or better yet to 0 to disable that timeout. And in your Nginx configuration's location ~ \.php$ section, set fastcgi_read_timeout to the highest number of seconds you could possibly want any script to run.

Then fine-tune by setting a sensible default via max_execution_time in your php.ini. When you have a script that you want to allow to run for a long time, call the set_time_limit() PHP function in that script.
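
A sketch of where each setting lives; the file paths are typical Debian/Ubuntu locations and the values are only examples:

; /etc/php/7.1/fpm/pool.d/www.conf (PHP-FPM pool configuration)
; 0 disables PHP-FPM's hard per-request kill:
request_terminate_timeout = 0

; /etc/php/7.1/fpm/php.ini (default limit for ordinary requests):
max_execution_time = 30

# Nginx, inside the location ~ \.php$ block, let long scripts through FastCGI:
fastcgi_read_timeout 1800;

// And at the top of the one PHP script that legitimately runs long:
set_time_limit(1800);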

Upvotes: 0
