Reputation:
I am building a comparison shopping site that takes in multiple XML feeds and displays the best deals. I parse the feeds with PHP's SimpleXML and then sort the results in PHP when the page loads. I use a library like this to process the feeds in parallel.
Our application has little database logic; we just need these feeds processed as quickly as possible in PHP. It's decently fast now, but obviously I'd like to make it quicker. I'm also worried that PHP will slow down dramatically once we start getting traffic.
We are using eAccelerator, but I don't think this part of the code gets much of a boost from it. I can't really use caching because the deals need to be fresh when the page loads.
If you guys were designing a system like this, what would you do to get the best performance? How can we get PHP to process these XML feeds as quickly as possible?
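For reference, the parse-and-sort step described above looks roughly like this. This is only a sketch: the feed bodies are inlined strings standing in for the downloaded XML (the real site fetches them over HTTP in parallel), and the `<deal>`/`<title>`/`<price>` structure is an assumption, not the actual feed format.

```php
<?php
// Two stand-in feed bodies; in production these arrive over HTTP.
$feeds = [
    '<deals><deal><title>Widget</title><price>9.99</price></deal>
      <deal><title>Gadget</title><price>4.50</price></deal></deals>',
    '<deals><deal><title>Gizmo</title><price>7.25</price></deal></deals>',
];

$deals = [];
foreach ($feeds as $body) {
    $xml = simplexml_load_string($body);
    if ($xml === false) {
        continue; // skip feeds that fail to parse
    }
    foreach ($xml->deal as $deal) {
        $deals[] = [
            'title' => (string) $deal->title,
            'price' => (float) $deal->price,
        ];
    }
}

// Sort cheapest first, as the page does at load time.
usort($deals, function ($a, $b) {
    return $a['price'] <=> $b['price'];
});

echo $deals[0]['title'], "\n"; // Gadget
```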
Upvotes: 0
Views: 180
Reputation: 27854
I like xml_parse_into_struct(); it is much faster than the "easier to use" classes like DOMDocument. In this case it took only about 2/100ths of the time.
And of course, as already suggested, you should also store the processed data instead of re-parsing everything on every request.
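A minimal sketch of xml_parse_into_struct() on a made-up feed fragment. It fills a flat array of tag events plus an index of tag positions rather than building an object tree, which is why it can beat DOMDocument on large feeds:

```php
<?php
// Sample XML is invented for illustration.
$xml = '<deals><deal price="4.50">Gadget</deal></deals>';

$parser = xml_parser_create();
// Keep tag names as-is instead of upper-casing them.
xml_parser_set_option($parser, XML_OPTION_CASE_FOLDING, 0);

$values = [];
$index  = [];
xml_parse_into_struct($parser, $xml, $values, $index);
xml_parser_free($parser);

// $values is a flat list of tag events, not an object tree;
// $index['deal'] lists the positions of all <deal> entries in $values.
$first = $values[$index['deal'][0]];
echo $first['value'], ' @ ', $first['attributes']['price'], "\n"; // Gadget @ 4.50
```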
Upvotes: 0
Reputation: 321598
You're downloading the feeds at every page hit?
You should be using cron to dump them into a database - it'll be much faster.
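For example, a crontab entry along these lines would refresh the feeds every five minutes, so page requests only read already-processed rows from the database. The script path and interval are placeholders; pick whatever freshness window your deals actually need:

```
*/5 * * * * php /path/to/refresh_feeds.php
```

The fetch-and-store script then does the slow work (download, parse, insert) out of the request path, and the page itself becomes a cheap database query.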
Upvotes: 4