Reputation: 10261
I am trying to retrieve an RSS feed from the Yahoo YQL API and store it in a file through PHP. Here is the code:
// Refresh the cache if it is missing or older than three hours
if ( !file_exists($cache) || filemtime($cache) < (time() - 10800) ) {
    if ( !file_exists(dirname(__FILE__) . '/cache') ) {
        mkdir(dirname(__FILE__) . '/cache', 0777);
    }
    $path = "http://query.yahooapis.com/v1/public/yql?q=";
    $path .= urlencode("SELECT * FROM feed WHERE url='http://url'");
    $path .= "&format=json";
    $feed = file_get_contents($path);
    // file_get_contents() returns a string, so decode it before inspecting it
    $data = json_decode($feed);
    if ( is_object($data) && $data->query->count ) {
        $cachefile = fopen($cache, 'wb');
        fwrite($cachefile, $feed);
        fclose($cachefile);
    }
} else {
    $feed = file_get_contents($cache);
}
The problem is that I now want to store the results in the database. Since I am caching the feed in a file and refreshing it every three hours, I am certain that I will run into duplicate RSS entries, and I don't want to store those duplicates in the DB.
Is there an efficient approach to getting this done?
Upvotes: 1
Views: 122
Reputation: 533
Alternatively, add a unique index on the url column; that way an insert will fail if the url already exists.
You can ignore the error MySQL gives you by using INSERT IGNORE INTO ...
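A minimal sketch of that approach in MySQL. The table and column names (`entries`, `url`, `title`) are assumptions for illustration, not from the question:

```sql
-- Assumed table; the UNIQUE index on url is what blocks duplicates
CREATE TABLE entries (
    id    INT UNSIGNED NOT NULL AUTO_INCREMENT,
    url   VARCHAR(255) NOT NULL,
    title VARCHAR(255) NOT NULL,
    PRIMARY KEY (id),
    UNIQUE KEY uniq_url (url)
);

-- INSERT IGNORE downgrades the duplicate-key error to a warning,
-- so re-inserting an already-stored entry is silently skipped
INSERT IGNORE INTO entries (url, title)
VALUES ('http://example.com/item-1', 'Item 1');
```

Note that INSERT IGNORE suppresses other errors besides duplicate keys; if you instead want to refresh an existing row, INSERT ... ON DUPLICATE KEY UPDATE is the usual alternative.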
Upvotes: 3
Reputation: 70728
Check if the RSS entry already exists; if it does, don't insert it, otherwise insert the new RSS entry.
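A sketch of that check-then-insert flow, again assuming a hypothetical `entries` table with a `url` column:

```sql
-- Step 1: see whether the entry is already stored
SELECT COUNT(*) FROM entries WHERE url = 'http://example.com/item-1';

-- Step 2: run the insert only when the count above was 0
INSERT INTO entries (url, title)
VALUES ('http://example.com/item-1', 'Item 1');
```

Be aware that a separate SELECT followed by an INSERT is racy if two writers run concurrently; pairing this check with a unique index on url is the safe way to guarantee no duplicates slip through.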
Upvotes: 1