Reputation: 8682
I'm working on a project where I'm asked to aggregate a number of feeds from various websites into a local, searchable database. The project/site is Drupal based, and old feed entries are key data for the project. My question is: what is the best way to pull these feeds into Drupal and keep the old entries around permanently?
Thanks.
Upvotes: 2
Views: 211
Reputation: 2185
Another option is the mature http://drupal.org/project/feedapi or the newer http://drupal.org/project/feeds, both of which are designed for parsing feeds into Drupal nodes, users, etc.
Feeds (the next generation of FeedAPI) can import or aggregate data as nodes, users, taxonomy terms or simple database records, and has the following useful features:
* One-off imports and periodic aggregation of content
* Import or aggregate RSS/Atom feeds
* Import or aggregate CSV files
* Import or aggregate OPML files
* PubSubHubbub support
* Create nodes, users, taxonomy terms or simple database records from import
* Extensible to import any other kind of content
* Granular mapping of input elements to Drupal content elements
* Exportable configurations
* Batched import for large files
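If you need to trigger imports from code (for example, to backfill older entries), Feeds also has a programmatic entry point. Here is a minimal sketch, assuming the Drupal 7 branch of Feeds and a hypothetical standalone importer with the machine name 'my_feed_importer'; check the module's API for your version before relying on it.
// Minimal sketch: run a Feeds import programmatically (Feeds 7.x assumed).
// 'my_feed_importer' is a hypothetical importer ID - use your own machine name.
function mymodule_run_feed_import($source_url) {
  // Load the FeedsSource for the importer (0 = standalone, not attached to a node).
  $source = feeds_source('my_feed_importer', 0);
  // Point the HTTP fetcher at the feed URL to pull in.
  $source->addConfig(array('FeedsHTTPFetcher' => array('source' => $source_url)));
  $source->save();
  // Run the import; Feeds batches large imports internally.
  $source->startImport();
}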
Good Luck!
Upvotes: 2
Reputation: 1172
Drupal's core aggregator module is the official way to do this.
In the settings there is a 'Discard items older than:' select list. It only goes up to 6 weeks, but that limit can be overridden in a custom module with a form alter:
function MYMODULE_form_aggregator_admin_settings_alter(&$form, &$form_state) {
  // Add an extra option (roughly 5 years, in seconds) to the
  // 'Discard items older than' select list.
  $form['aggregator_clear']['#options'][157784630] = "Nearly Never aka 5 years";
}
As mentioned here
The aggregator module will fetch your feeds automatically (on cron runs) and store the items in the aggregator_item table.
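Since the items end up in aggregator_item, you can search them with an ordinary database query. A minimal sketch, assuming Drupal 7's database API and a hypothetical $keyword variable holding the search term:
// Search stored feed items by title (Drupal 7 database API assumed).
$results = db_query(
  "SELECT iid, title, link, timestamp FROM {aggregator_item}
   WHERE title LIKE :keyword ORDER BY timestamp DESC",
  array(':keyword' => '%' . db_like($keyword) . '%')
);
foreach ($results as $item) {
  // $item->title and $item->link hold the stored feed entry.
}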
Upvotes: 3