Yeo Bryan

Reputation: 429

Nginx Upstream prematurely closed FastCGI stdout while reading response header from upstream

I am aware that there are many similar questions posted on stackoverflow, I have been through them but it still doesn't resolve my issue. Please read on before marking it as a similar question.

I have hosted my Laravel-based web application with Nginx. The web application is accessible just fine both locally and through the server. However, there is one particular URL where, when too much data is returned, the request fails and the server returns a 404 error.

In the Nginx error log, the following error message is shown:

Nginx Upstream prematurely closed FastCGI stdout while reading response header from upstream

Attempted solutions

I have tried adjusting settings in both the PHP ini file and the Nginx conf file, to no avail. I also restarted the services using

systemctl restart nginx
systemctl restart php-fpm

PHP.ini

upload_max_filesize = 256M
post_max_size = 1000M

nginx conf

client_max_body_size 300M;
client_body_timeout 2024;
client_header_timeout 2024;

fastcgi_buffers 16 512k;
fastcgi_buffer_size 512k;
fastcgi_read_timeout 500;
fastcgi_send_timeout 500;
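
For reference, a minimal sketch of where these directives would sit in a typical Nginx + PHP-FPM server block is shown below; the server name, document root, and PHP-FPM socket path are assumptions and need to match the actual setup.

server {
    listen 80;
    server_name example.com;                       # assumed
    root /var/www/laravel/public;                  # assumed Laravel public dir

    client_max_body_size 300M;
    client_body_timeout 2024;
    client_header_timeout 2024;

    location ~ \.php$ {
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_pass unix:/run/php-fpm/www.sock;   # assumed PHP-FPM socket

        # Larger buffers so a big upstream response does not overflow them
        fastcgi_buffer_size 512k;
        fastcgi_buffers 16 512k;

        fastcgi_read_timeout 500;
        fastcgi_send_timeout 500;
    }
}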

Can someone kindly tell me what I am missing?

Upvotes: 1

Views: 3420

Answers (1)

Vico

Reputation: 11

This error usually occurs when a page produces more output than the server can handle. So, the first steps to resolve this error are the following:

1. Reduce the amount of content on the page

2. Remove any errors, warnings and notices

3. Make sure your code is clean

Then modify the configuration of Nginx, as you did above. If there are too many warning messages that cannot be removed, you can raise the PHP error reporting level so that only errors are reported, for example error_reporting(E_ERROR);
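
As a minimal sketch (assuming the goal is simply to keep warnings and notices out of the response that PHP sends back through FastCGI), the error reporting level could be raised at the top of the PHP entry script, or equivalently in php.ini:

<?php
// Sketch only: report fatal errors and stop warnings/notices from being
// printed into the response returned to Nginx.
error_reporting(E_ERROR);
ini_set('display_errors', '0');

// Equivalent php.ini settings:
//   error_reporting = E_ERROR
//   display_errors = Off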

Upvotes: 1
