Reputation: 41
I have an auto-deployment bash script that is called by a GitHub webhook through nginx + fcgiwrap on every push event. But when the payload of the GitHub webhook POST request is bigger than 64 KB, I get the following error in the nginx error.log and the script is not called:
fastcgi request record is too big
Below is the nginx location block for this webhook:
location /deploy {
    gzip off;
    client_body_buffer_size 1M;
    fastcgi_pass unix:/var/run/fcgiwrap.socket;
    include /etc/nginx/fastcgi_params;
    fastcgi_param DOCUMENT_ROOT /var/www;
    fastcgi_param REQUEST_BODY $request_body;
    fastcgi_param SCRIPT_FILENAME /var/www/deploy.sh;
}
Is there any way to increase this limit? Or is there another way to pass the request body to the script and run it?
Upvotes: 2
Views: 2456
Reputation: 41
The issue was resolved with the following workaround:
location /deploy {
    gzip off;
    client_body_in_file_only clean;
    client_body_temp_path /var/tmp;
    fastcgi_pass_request_body off;
    include /etc/nginx/fastcgi_params;
    fastcgi_param REQUEST_BODY_FILE $request_body_file;
    fastcgi_param SCRIPT_FILENAME /var/www/deploy.sh;
    fastcgi_pass unix:/var/run/fcgiwrap.socket;
}
In this case nginx writes the request body to a temporary file (client_body_in_file_only clean removes it after the request) and passes the file path to the script via REQUEST_BODY_FILE, while direct passing of the body is disabled with fastcgi_pass_request_body off. Since the body is no longer sent as a FastCGI parameter, it never hits the record size limit that caused the error above, no matter how large the payload is.
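For completeness, here is a minimal sketch of what the receiving script might look like. It assumes fcgiwrap exposes the REQUEST_BODY_FILE fastcgi_param to the script as an environment variable (which it does for params it receives); the script name and output are illustrative, not the original author's deploy.sh:

```shell
#!/bin/bash
# Sketch of a CGI deploy script that reads the webhook payload from the
# temporary file whose path nginx passed in REQUEST_BODY_FILE.

handle_webhook() {
    # CGI convention: headers first, then a blank line, then the body.
    echo "Content-Type: text/plain"
    echo ""

    if [ -n "$REQUEST_BODY_FILE" ] && [ -r "$REQUEST_BODY_FILE" ]; then
        # The payload may exceed 64 KB, so read it from the file rather
        # than from a FastCGI parameter. (tr strips BSD wc's padding.)
        bytes=$(wc -c < "$REQUEST_BODY_FILE" | tr -d ' ')
        echo "received $bytes bytes"
        # ...parse the JSON payload and run the actual deployment here...
    else
        echo "no request body received"
    fi
}

handle_webhook
```

Note that the file is removed by nginx after the request completes (the `clean` argument), so the script should finish reading it before exiting.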
Upvotes: 2