Reputation: 41
So I've written my first scraper with Scrapy and I'm having some trouble with the next steps. I want to run the scraper daily, probably with cron, and track the changes in the values I've scraped. When I export to a JSON or CSV file and then run the scraper again, the new data gets appended to the same file. Is there a way to make each scrape export into a separate file? Any insight would be great, thanks!
Upvotes: 1
Views: 392
Reputation: 11396
Tell Scrapy the name of the file to write to using -o:
$ scrapy crawl -h | grep output=
--output=FILE, -o FILE dump scraped items into FILE (use - for stdout)
You can use the current date as the file name, for example:
$ scrapy crawl <spider-name> -t json -o "$(date '+%Y-%m-%d').json"
(use -t csv and a .csv extension if you prefer CSV output)
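For the daily run you mentioned, a crontab entry along these lines should work. This is just a sketch: the project path and spider name are placeholders, and note that % must be escaped as \% inside crontab:
# m h dom mon dow   command
# run the spider every day at 02:00 and write a dated JSON file
0 2 * * * cd /path/to/your/project && scrapy crawl <spider-name> -t json -o "$(date '+\%Y-\%m-\%d').json"
Since the output file name changes every day, each run ends up in its own file instead of appending to the previous one.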
Upvotes: 2