Reputation: 11
I developed a website where a script fetches a movie's subtitles from another site and uploads them to my web hosting server for streaming videos. That worked for more than three months, so I decided to develop a second, different site, and it worked for weeks. Finally I decided to develop a third one that fetches links from another website (it seems I like file_get_contents). On my local machine it works fine, but on the server it could not extract just the block I want (the links); it returned the whole page instead. After some tests to track down the issue, every call now fails with:
file_get_contents("the url"): failed to open stream: HTTP request failed! HTTP/1.1 403 Forbidden
Even the first and second websites have stopped working (I mean the ones where I use file_get_contents). If I change the URL to another one it works, and if I change the server (web hosting) it works too. Please help me at least understand my situation.
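One way to see exactly how the remote server is responding, rather than just getting `false` back, is to set the HTTP wrapper's `ignore_errors` option and inspect `$http_response_header`, which PHP populates after each `file_get_contents` HTTP call. A minimal sketch, assuming a placeholder URL (substitute the one that is failing):

```php
<?php
// Ask the HTTP wrapper not to treat 4xx/5xx responses as failures,
// so the body and the status line can be inspected.
$context = stream_context_create([
    'http' => [
        'ignore_errors' => true,
        'timeout'       => 3,
    ],
]);

// Placeholder URL; replace with the one returning 403.
$url  = 'http://example.com/';
$body = @file_get_contents($url, false, $context);

if (isset($http_response_header)) {
    // The first element is the status line, e.g. "HTTP/1.1 403 Forbidden".
    echo $http_response_header[0], "\n";
} else {
    echo "No HTTP response (DNS failure, timeout, ...)\n";
}
```

Comparing the status line seen from the server with the one seen from your local machine should confirm whether the remote site is treating the two source IPs differently.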
Upvotes: 0
Views: 214
Reputation: 73
If you changed the filename parameter in your file_get_contents call to another website and then got the '403 Forbidden' error, your code is working fine, but the website is rejecting your request, most likely as a security measure to prevent web scraping, which is what you are doing: extracting data from another website. You'll have to rethink how you fetch data for your website. Don't scrape data from other websites even if it's working; use a subtitles API such as these.
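For context on why one server gets a 403 while another does not: some sites reject requests that carry no browser-like User-Agent header, or block the IP ranges of known hosting providers. A stream context lets file_get_contents send custom headers; a minimal sketch (the User-Agent string and the commented URL are just example values, and even if this works, the site's terms of service should still be respected):

```php
<?php
// Build a stream context that sends a browser-like User-Agent.
// Some servers return 403 Forbidden to requests without one.
$context = stream_context_create([
    'http' => [
        'method'  => 'GET',
        // Example User-Agent value; any descriptive string can be used.
        'header'  => "User-Agent: Mozilla/5.0 (compatible; SubtitleFetcher/1.0)\r\n",
        'timeout' => 5,
    ],
]);

// Usage (placeholder URL):
// $html = file_get_contents('https://example.com/subtitles', false, $context);

// Show the header line this context is configured to send.
$opts = stream_context_get_options($context);
echo $opts['http']['header'];
```

Note that if the block is based on the requesting IP rather than the headers (which would explain why changing the web host makes it work), no header tweak will help, and an official API is the only reliable route.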
Upvotes: 1