vrnair

Reputation: 21

Scrapy: crawl multiple spiders sequentially

I'm a little new to Scrapy and I'm stuck on one point: I want to restart a spider after it gets closed.

What I'm trying to implement here: I'm getting URLs from a database, and I wrote my view so that whenever I send a "scrapy crawl xyz" request, start_requests gets one URL from the database (the next one), different from the one passed in the previous request.

The problem is that if there are four URLs in the database, I need to run "scrapy crawl xyz" four times. I want to avoid that, so I'm trying to fire "scrapy crawl xyz" again when spider_closed gets called at the end of the spider. Please help.
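For illustration, this is roughly the shape of such a start_requests. The sqlite3 database, table, and column names below are hypothetical placeholders, not an actual schema:

import sqlite3

import scrapy


class XyzSpider(scrapy.Spider):
    name = "xyz"

    def start_requests(self):
        # Hypothetical schema: urls(url TEXT, processed INTEGER DEFAULT 0)
        conn = sqlite3.connect("urls.db")
        row = conn.execute(
            "SELECT url FROM urls WHERE processed = 0 LIMIT 1"
        ).fetchone()
        if row:
            # Mark the URL as used so the next run picks a different one
            conn.execute("UPDATE urls SET processed = 1 WHERE url = ?", (row[0],))
            conn.commit()
            yield scrapy.Request(row[0], callback=self.parse)
        conn.close()

    def parse(self, response):
        # Scraping logic for the fetched page goes here
        pass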

Upvotes: 2

Views: 1499

Answers (2)

vrnair

Reputation: 21

Hi guys, I found a solution to my question. I wanted to run the same Scrapy command repeatedly, so what I did was create my own shell script on Linux, put my "scrapy crawl xyz" in a loop, and it worked.

#!/bin/bash

for i in $(seq 1 3); do
    scrapy crawl taleo
done
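To use it, save the script (the file name is up to you), make it executable with chmod +x, and run it from the Scrapy project directory. Note that seq 1 3 runs the crawl three times; adjust the upper bound to match the number of URLs in the database.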

Upvotes: 0

Granitosaurus

Reputation: 21446

If you want to crawl multiple spiders from one script, you probably want to run the spiders from a script rather than via the scrapy crawl command. See the official documentation on how to do that.

To expand on the example provided in the docs, yours should look something like:

from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings

process = CrawlerProcess(get_project_settings())

# Schedule all spiders, then start the reactor once; start() blocks until all crawls finish
process.crawl(MySpider)
process.crawl(MySpider2)
process.crawl(MySpider3)
process.start()
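Note that with CrawlerProcess all the scheduled spiders run inside the same process and start() is called only once. If you need the crawls to run strictly one after another, the same documentation page also shows a CrawlerRunner-based approach; a sketch along those lines, assuming MySpider, MySpider2, and MySpider3 are importable spider classes:

from twisted.internet import reactor, defer
from scrapy.crawler import CrawlerRunner
from scrapy.utils.log import configure_logging

configure_logging()
runner = CrawlerRunner()

@defer.inlineCallbacks
def crawl():
    # Each yield waits for the previous crawl to finish before starting the next one
    yield runner.crawl(MySpider)
    yield runner.crawl(MySpider2)
    yield runner.crawl(MySpider3)
    reactor.stop()

crawl()
reactor.run()  # the script blocks here until the last crawl has finished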

Upvotes: 0
