Logan Phillips

Reputation: 710

Gunicorn/Django/Nginx - 502 Bad Gateway Error when uploading files above 100 MB

I have been stuck on this error for a week. I am officially at a loss with this one.

I have a React/Django web app where users can upload audio files (.WAV) via react-dropzone. The React and Django code are completely separated into frontend/ and backend/ folders, communicating via fetch() calls. For some reason, I am able to upload files smaller than 100 MB, but if I upload a larger file, for example 180 MB, Nginx errors with the following:

2020/07/14 02:29:18 [error] 21023#21023: *71 upstream prematurely closed connection while reading response header from upstream, client: 50.***.***.***, server: api.example.com, request: "POST /api/upload_audio HTTP/1.1", upstream: "http://unix:/home/exampleuser/AudioUploadApp/AudioUploadApp.sock:/api/upload_audio", host: "api.example.com", referrer: "https://example.com/profile/audio/record"

My Gunicorn error log does not show any errors. I can see each of the 5 workers starting, but there are no WORKER TIMEOUT errors or anything else that I can see.

My gunicorn.service file:

[Unit]
Description=gunicorn daemon
After=network.target

[Service]
User=exampleuser
Group=www-data
WorkingDirectory=/home/exampleuser/AudioUploadApp/Backend
ExecStart=/home/exampleuser/virtualenvs/uploadenv/bin/gunicorn --access-logfile "/tmp/gunicorn_access.log" --error-logfile "/tmp/gunicorn_error.log" --capture-output --workers 5 --worker-class=gevent --timeout=900 --bind unix:/home/exampleuser/AudioUploadApp/AudioUploadApp.sock AudioUploadApp.wsgi:application --log-level=error

[Install]
WantedBy=multi-user.target
My Nginx configuration:

server {
        server_name api.example.com;

        location / {
                include proxy_params;
                proxy_pass http://unix:/home/exampleuser/AudioUploadApp/AudioUploadApp.sock;
                client_max_body_size 200M;
        }

    location /static {
        autoindex on;
        alias /home/exampleuser/AudioUploadApp/Backend/static/;
    }

    listen 443 ssl; # managed by Certbot
    ssl_certificate /etc/letsencrypt/live/example.com/fullchain.pem; # managed by Certbot
    ssl_certificate_key /etc/letsencrypt/live/example.com/privkey.pem; # managed by Certbot
    include /etc/letsencrypt/options-ssl-nginx.conf; # managed by Certbot
    ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem; # managed by Certbot
    client_max_body_size 200M;

}

server {
        server_name www.example.com example.com;
        root /home/exampleuser/AudioUploadApp/build;
        index index.html;
        location / {
                try_files $uri /index.html;
        }

    listen 443 ssl; # managed by Certbot
    ssl_certificate /etc/letsencrypt/live/example.com/fullchain.pem; # managed by Certbot
    ssl_certificate_key /etc/letsencrypt/live/example.com/privkey.pem; # managed by Certbot
    include /etc/letsencrypt/options-ssl-nginx.conf; # managed by Certbot
    ssl_dhparam /etc/letsencrypt/ssl-dhparams.pem; # managed by Certbot
    client_max_body_size 200M;
}

server {
    if ($host = www.example.com) {
        return 301 https://$host$request_uri;
    } # managed by Certbot


    if ($host = example.com) {
        return 301 https://$host$request_uri;
    } # managed by Certbot

    server_name www.example.com example.com;
    listen 80;
    return 404; # managed by Certbot

}

server {
    if ($host = api.example.com) {
        return 301 https://$host$request_uri;
    } # managed by Certbot

    listen 80;
    server_name api.example.com;
    client_max_body_size 200M;
    return 404; # managed by Certbot

}

And my Nginx proxy params:

proxy_set_header Host $http_host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
proxy_connect_timeout   900s;
proxy_send_timeout      900s;
proxy_read_timeout      900s;

I realize that my Gunicorn and Nginx timeouts are way too high, but I don't have the best upload speed where I live, so I just want to make sure that timeouts due to upload speed are not the issue.
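As a sanity check on whether a long timeout is even needed, here is a quick back-of-the-envelope calculation (the 2 Mbps figure is an assumed example of a slow uplink, not a measured speed):

```python
# Estimate how long a large upload takes on a slow uplink.
file_size_mb = 180      # size of the .WAV upload, in megabytes
uplink_mbps = 2         # assumed upload speed, in megabits per second

file_size_megabits = file_size_mb * 8          # 1 byte = 8 bits
upload_seconds = file_size_megabits / uplink_mbps

print(round(upload_seconds))  # 720 -- fits inside a 900 s timeout
```

So even at 2 Mbps, a 180 MB upload finishes well inside the 900 s window, which points away from timeouts as the cause.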

Here is what I’ve tried, with no luck:

To reiterate, this only seems to happen with .WAV files above 100 MB. I have successfully uploaded files of around 80 MB, but have not been able to upload files of 150 MB.

I have been on this for about a week and am pretty stuck. I would really appreciate any help, and I can include any other information that would be useful.

Upvotes: 2

Views: 3090

Answers (1)

Logan Phillips

Reputation: 710

The fix for this was to upgrade the EC2 instance that Gunicorn/Django/Nginx is running on. I went from a t2.medium instance to an r5.large instance, which worked. I then went from the r5.large down to a t2.large instance, and it still works. t2.medium and t2.large have the same number of virtual CPUs, but t2.large has twice the memory (8 GiB vs 4 GiB). I claimed to have already tried this, but I must have tried it back when the first error I was getting was about the client body being too large. I fixed that error by changing client_max_body_size in Nginx, and it was after that change that I started getting the error this post is about. I had just tried to upgrade the hardware at the wrong point.
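The memory explanation is consistent with how buffering whole uploads behaves. A minimal pure-Python sketch (no Django involved; the function name, 1 MiB size, and 8 KiB chunk size are illustrative choices) of why streaming in fixed-size chunks keeps peak memory near the chunk size rather than the file size:

```python
import io

def save_upload(src, dst, chunk_size=8192):
    """Copy a file-like upload to dst in fixed-size chunks.

    Peak memory stays around chunk_size bytes, no matter how large
    the upload is -- unlike src.read(), which buffers it all at once.
    """
    while True:
        chunk = src.read(chunk_size)
        if not chunk:
            break
        dst.write(chunk)

# Simulate a 1 MiB "upload" entirely in memory for the example.
upload = io.BytesIO(b"\x00" * (1024 * 1024))
out = io.BytesIO()
save_upload(upload, out)
print(out.getbuffer().nbytes)  # 1048576 -- all bytes copied
```

If several workers each buffer a 100+ MB request at once, the total can approach a t2.medium's 4 GiB, which would fit the "upstream prematurely closed connection" symptom.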

I also made the following changes compared to what I have in my original post, since the larger numbers seemed unnecessary:

  • Number of workers in Gunicorn: from 5 to 3
  • Gunicorn timeout: from 900 to 300
  • Nginx timeout: from 900s to 300s
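For the worker count, Gunicorn's documentation suggests (2 × CPU cores) + 1 as a starting point; whether memory per worker allows that many is a separate question. A quick sketch (the 2-vCPU figure is the t2.medium/t2.large spec mentioned above):

```python
def suggested_workers(cpu_cores):
    # Gunicorn's rule-of-thumb starting point: (2 x cores) + 1
    return 2 * cpu_cores + 1

# Both t2.medium and t2.large have 2 vCPUs.
print(suggested_workers(2))  # 5
```

So 5 workers matched the rule of thumb for 2 vCPUs; dropping to 3 trades some concurrency for headroom on memory-heavy uploads.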

The Gunicorn and Nginx timeouts could probably go down to their defaults, but I haven't tested that yet, given the awful upload speed where I live.

Upvotes: 1
