Manjunath Bhat

Reputation: 11

proxy_cache_valid directive caching error in nginx

I have an nginx server which serves live video content to users. This nginx server gets the content from an upstream server, caches it, and serves all future requests for that content from the cache. Some requests from users get a '503 Service Unavailable' because that part of the live content has not been generated yet. However, the 503 response also gets cached on my nginx server, so even after the content becomes available, nginx keeps serving the cached 503 to my users.

I checked https://stackoverflow.com/questions/28994863/nginx-proxy-cache-caches-502-errors.

This says: "If only caching time is specified

proxy_cache_valid 5m;

then only 200, 301, and 302 responses are cached."

My relevant configuration is:

proxy_pass http://MDVR;
proxy_cache_valid    10d;
proxy_cache   a12251;

I am expecting that the 503 responses I get from the upstream should not be cached. Does this mean that my upstream is telling my nginx server to cache the 503 error response?

I did find alternate ways to handle this at the nginx server end. One was to cache these error codes for only 1 minute, after which nginx would re-check with the upstream server (a sketch of this is shown after the example below). Another was to use an error_page directive and then handle that location separately with an added header, e.g. in nginx:

server {
  error_page 404 /404.html;
  location = /404.html {
    root /usr/share/nginx/html;
    # mark the error page as not cacheable
    add_header Cache-Control "no-cache" always;
  }
}
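
For reference, a minimal sketch of the first alternative (caching errors only briefly), assuming the cache zone a12251 and the upstream MDVR from my configuration above:

proxy_cache a12251;
proxy_pass  http://MDVR;

# Cache successful responses for a long time, but cache errors for only
# 1 minute so nginx re-checks the upstream shortly afterwards.
proxy_cache_valid 200 301 302 10d;
proxy_cache_valid 404 503 1m;

Note that caching parameters sent by the upstream (X-Accel-Expires, Expires, Cache-Control) take priority over proxy_cache_valid unless they are ignored with proxy_ignore_headers.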

However, I don't want to add another location block just to handle error codes. Can this be handled on the upstream server instead?

Upvotes: 0

Views: 2848

Answers (1)

VBart

Reputation: 15110

You can use the proxy_no_cache directive in combination with the map directive. For example:

map $upstream_status $no_cache {
    200     "";
    default 1;
}

proxy_pass http://MDVR;
proxy_no_cache $no_cache;
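
For context, a minimal sketch of where these pieces typically sit, assuming a single server and location, a hypothetical cache path /var/cache/nginx/mdvr, and that the upstream MDVR is defined elsewhere; this is only the http-level portion of the configuration:

http {
    proxy_cache_path /var/cache/nginx/mdvr keys_zone=a12251:10m;

    # A non-empty, non-zero value tells proxy_no_cache not to store the
    # response; the empty string lets 200 responses be cached as usual.
    map $upstream_status $no_cache {
        200     "";
        default 1;
    }

    server {
        listen 80;

        location / {
            proxy_cache       a12251;
            proxy_cache_valid 10d;
            proxy_pass        http://MDVR;
            proxy_no_cache    $no_cache;
        }
    }
}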

Alternatively, fix your backend so that it does not send caching headers (such as Expires or Cache-Control) with error responses.
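
To illustrate that second suggestion (the status line and header values below are hypothetical), an upstream response like

HTTP/1.1 503 Service Unavailable
Cache-Control: max-age=3600

can end up in the proxy cache even though 503 is not listed in proxy_cache_valid, because caching headers from the upstream take priority over that directive, while a response like

HTTP/1.1 503 Service Unavailable
Cache-Control: no-store

will not be stored.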

Upvotes: 1
