Reputation: 55
I really hate to ask these vague questions, but I could use some help figuring out the best method to go about this idea.
What I'm doing is fairly simple. I'm creating a page that will display data from several sportsbooks as accurately as possible. There are several different sites, all with their own APIs, but the data is pretty much the same. Some of them have restrictions on the number of times they can be accessed per day, so I'm trying to figure out how to work within those limits.
Because of the cross-domain issue with JavaScript, I was looking into setting something up with PHP to retrieve the data locally. Would it be possible for a script to access each API say 4 times a day at certain times, parse the data locally and then serve it up with AJAX or something else?
Would it make sense to store the data in a MySQL database and then query it? What I'm looking to do seems as simple as displaying live XML data; I just need to figure out a way not to exceed the daily API call limits. I'm fairly new to server-side code in general, so I apologize if this is blatantly obvious.
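For context, here is a rough sketch of the kind of AJAX endpoint I have in mind for the serving side. The table and column names (an odds_cache table) and the database credentials are just placeholders, not anything I've built yet:

```php
<?php
// get_odds.php -- hypothetical AJAX endpoint: reads the locally stored data
// from an assumed odds_cache table and returns it as JSON, so the browser
// never has to hit the sportsbook APIs directly.

header('Content-Type: application/json');

$pdo  = new PDO('mysql:host=localhost;dbname=sports', 'user', 'password');
$rows = $pdo->query('SELECT book, payload, fetched_at FROM odds_cache')
            ->fetchAll(PDO::FETCH_ASSOC);

echo json_encode($rows);
```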
If anyone has any ideas, I would greatly appreciate it.
Thanks!
Upvotes: 0
Views: 155
Reputation: 28541
You do indeed need to store the data locally. It can be either a database or flat files.
The set-up would go like this:
Have a PHP script that fetches the data from each API, parses it, and stores it locally (in MySQL or flat files).
Have a cron job that runs the above script 3 times a day. A rough sketch of both pieces follows below.
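As a minimal sketch (the endpoint URLs, database credentials, and odds_cache table are placeholders, and it assumes a unique key on the book column), the cron-driven script could look something like this:

```php
<?php
// fetch_odds.php -- run by cron a few times a day, e.g. with the crontab entry:
//   0 6,12,18 * * * /usr/bin/php /path/to/fetch_odds.php

$apis = [
    'bookA' => 'https://api.example-book-a.com/odds.xml',
    'bookB' => 'https://api.example-book-b.com/odds.xml',
];

$pdo = new PDO('mysql:host=localhost;dbname=sports', 'user', 'password');

foreach ($apis as $book => $url) {
    // One request per book per run keeps you well under the daily limits.
    $xml = @simplexml_load_file($url);
    if ($xml === false) {
        continue; // skip this book if the call failed; keep the old data
    }

    // Overwrite whatever was stored for this book with the fresh payload
    // (REPLACE relies on book being a primary or unique key).
    $stmt = $pdo->prepare(
        'REPLACE INTO odds_cache (book, payload, fetched_at) VALUES (?, ?, NOW())'
    );
    $stmt->execute([$book, $xml->asXML()]);
}
```

Your frontend page then only ever reads from odds_cache, so the number of visitors has no effect on how often the sportsbook APIs are called.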
If your server gets overloaded, you might also consider caching the output of the frontend PHP file to a static HTML file each time the API data gets refreshed.
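That caching step could be as simple as the following sketch, appended to the end of the cron script above (the file paths and the frontend.php name are placeholders):

```php
<?php
// cache_page.php -- regenerate a static HTML copy after each data refresh.

ob_start();
include '/path/to/frontend.php';      // the page that reads odds_cache and renders HTML
$html = ob_get_clean();

// Serve this file directly instead of hitting PHP/MySQL on every page view.
file_put_contents('/var/www/html/odds.html', $html);
```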
Upvotes: 1