Reputation: 33
I am building a project where I need a web crawler which crawls a list of different webpages. This list can change at any time. How is this best implemented with scrapy? Should I create one spider for all websites or dynamically create spiders?
I have read about scrapyd, and I guess that dynamically creating spiders is the best approach. I would need a hint about how to implement it, though.
Upvotes: 3
Views: 1303
Reputation: 925
If the parsing logic is the same for every site, you don't need to create spiders dynamically: one spider that receives its start URL as an argument is enough. There are two ways to pass that argument.

From the command line, use -a:

scrapy crawl spider_name -a start_url=your_url

When scheduling the spider through scrapyd's schedule.json endpoint, pass the argument with -d instead of -a.
Upvotes: 2