Xeos

Reputation: 6477

Adding header to response for specific URLs with HAproxy

I have a simple condition in my HAproxy config (I tried this for frontend and backend):

acl no_index_url path_end .pdf .doc .xls .docx .xlsx
rspadd X-Robots-Tag:\ noindex if no_index_url

It should add the no-robots header to content that should not be indexed. However, HAProxy gives me this WARNING when parsing the config:

acl 'no_index_url' will never match because it only involves keywords
    that are incompatible with 'backend http-response header rule'

and

acl 'no_index_url' will never match because it only involves keywords
    that are incompatible with 'frontend http-response header rule'

According to the documentation, rspadd can be used in both frontend and backend, and path_end is used in examples within a frontend. Why am I getting this warning and what does it mean?

Upvotes: 16

Views: 14663

Answers (3)

rvh

Reputation: 183

If using HAProxy below v1.6, create a new backend block (it could be a duplicate of the default backend) and add the special headers there. Then, in the frontend, route to that backend conditionally, i.e.:

use_backend alt_backend if { some_condition } 

Admittedly not an ideal solution, but it does the job.
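A minimal sketch of that workaround for this question's case. The backend names, bind address, and server lines are placeholders, assuming both backends point at the same application server:

```
frontend main
    bind *:80
    # requests for document files go to the backend that adds the header
    acl no_index_url path_end .pdf .doc .xls .docx .xlsx
    use_backend noindex_backend if no_index_url
    default_backend normal_backend

backend normal_backend
    server app1 127.0.0.1:8080

backend noindex_backend
    # duplicate of the default backend, plus the extra response header
    rspadd X-Robots-Tag:\ noindex
    server app1 127.0.0.1:8080
```

Because the routing decision is made in the frontend from the request path, the backend can add the response header unconditionally, avoiding the incompatible-ACL warning entirely.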

Upvotes: -1

Adam

Reputation: 6107

Starting in HAProxy 1.6 you can no longer simply ignore the warning. To get this working, use the temporary-variable feature to capture the request path so it can be tested in an http-response rule:

frontend main
   http-request set-var(txn.path) path

backend local
   http-response set-header X-Robots-Tag noindex if { var(txn.path) -m end .pdf .doc }
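The same rule extends to all of the extensions from the question. A sketch, assuming the frontend/backend names from the answer above:

```
frontend main
   # store the request path in a transaction-scoped variable
   http-request set-var(txn.path) path

backend local
   # the variable survives into response processing, so it can be matched here
   http-response set-header X-Robots-Tag noindex if { var(txn.path) -m end .pdf .doc .xls .docx .xlsx }
```

The txn. scope is what makes this work: variables in that scope persist for the whole transaction, so a value captured from the request is still available when response rules run.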

Upvotes: 23

Xeos

Reputation: 6477

Apparently, even with the warning, having the acl within the frontend works perfectly fine: all resources ending in .pdf, .doc, etc. get the correct X-Robots-Tag header added.

In other words, this WARNING is misleading, and in reality the acl does match.

Upvotes: 2
