Reputation: 311
I have a PHP script that performs a MySQL insert query based on parameters passed in the URL via GET. I noticed today that two users who appear to be on the same LAN both have duplicate records: the first recorded with an IP that resolves to proxy.organization.tld, and another record with identical values recorded a couple of minutes later from a different IP address.
I am guessing this has something to do with the proxy server making an HTTP request and possibly caching the content. Does anyone have any ideas or strategies for dealing with this? I want to prevent duplicate entries.
Thanks.
Upvotes: 3
Views: 1867
Reputation: 19905
There are other reasons why you can get unwanted duplicate inserts. For instance, a browser can prefetch a page, or a spider can follow a link it should not.
There is a design flaw here: a GET request should never modify data. GET should only be used to read; POST should be used to modify data. Any browser, proxy, or spider knows that a POST request is likely to have side effects and takes precautions not to repeat it, whereas a GET request is assumed to have NO side effects. It can therefore be repeated or cached as needed to improve performance.
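A minimal sketch of what that change looks like, assuming a hypothetical `entries` table and `name`/`value` parameters (your schema and connection details will differ):

```php
<?php
// Reject anything that is not a POST, so prefetchers, spiders and
// caching proxies repeating the GET can never trigger the insert.
if ($_SERVER['REQUEST_METHOD'] !== 'POST') {
    header('Allow: POST');
    http_response_code(405); // Method Not Allowed
    exit;
}

// Prepared statement via PDO; placeholders avoid SQL injection.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO entries (name, value) VALUES (?, ?)');
$stmt->execute([$_POST['name'], $_POST['value']]);
```

On the client side this means submitting the parameters through a `<form method="post">` (or an equivalent POST request) instead of encoding them in the URL.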
Upvotes: 0
Reputation: 1954
Add a unique value to each URL. That way, if the proxy server "re-calls" the URL, you can detect that it is a duplicate call.
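One way to sketch this, assuming you can add a hypothetical `token` column with a UNIQUE index to the table (e.g. `ALTER TABLE entries ADD COLUMN token CHAR(32), ADD UNIQUE KEY (token)`), is to let MySQL itself reject the replay:

```php
<?php
// When generating the link/form, attach a one-time token
// (random_bytes requires PHP 7+):
$token = bin2hex(random_bytes(16));

// When handling the request, INSERT IGNORE skips the row if the
// token was already used, so a proxy re-calling the same URL
// cannot create a second record.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->prepare(
    'INSERT IGNORE INTO entries (name, value, token) VALUES (?, ?, ?)'
);
$stmt->execute([$_GET['name'], $_GET['value'], $_GET['token']]);

if ($stmt->rowCount() === 0) {
    // Duplicate token: the insert was silently skipped.
}
```

This deduplicates even if the repeated request arrives minutes later from a different IP, as in the question.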
Upvotes: 2