Reputation: 19
I want to pass command line arguments to scrapy and read them (the way I would with sys.argv[]) inside the spider, to check which URLs contain that argument. How can I do something like this for a spider named urls?
$scrapy crawl urls "August 01,2018"?
Upvotes: 0
Views: 2557
Reputation: 28246
You can pass arguments to a spider's __init__() with the -a option, as described in the docs: https://doc.scrapy.org/en/latest/topics/spiders.html#spider-arguments
The default __init__() turns every -a argument into a spider attribute, but you can override it if you need to process the arguments yourself.
Upvotes: 2