Reputation: 191
How can I yield multiple items at the same time? I am scraping a list of URLs where each of these URLs has about 10-20 nested URLs. I scrape each nested URL for 10 items of information that I need to yield. Is there a way to yield 10 items at the same time? Maybe through a list or something that I append each item to and then yield them all at the end? I am not totally sure how to do this. Any suggestions?
Example of Code:
class OdSpider(scrapy.Spider):
    name = 'od'
    allowed_domains = []
    start_urls = ["url1, url2, . . . . ."]

    def parse(self, response):
        # scrape nested urls
        yield scrapy.Request(nested_url, callback=self.parsenestedgame)

    def parsenestedgame(self, response):
        i1 = item1()
        i2 = item2()
        # 9 other items then adding info to items
        yield item1(**i1)
        yield item2(**i2)
        # how can I yield all of these items at the same time?
Upvotes: 0
Views: 3402
Reputation: 191
I actually figured it out. I just appended all of the items to a list like:

item_list.append(item1(**i1))
. . . .etc

Then I iterated over the items like:

for item in item_list:
    yield item
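A minimal, self-contained sketch of this collect-then-yield pattern, using plain dicts in place of the question's Item classes (Scrapy also accepts dicts as items; the function and field names here are illustrative, not from the original spider):

```python
def parse_nested(scraped_values):
    """Mimics a Scrapy callback: collect every item, then yield them all."""
    item_list = []
    for index, value in enumerate(scraped_values):
        # In a real spider this would be an Item subclass instance.
        item_list.append({"index": index, "value": value})
    # Yield each collected item at the end of the callback.
    for item in item_list:
        yield item

# Usage: iterating the generator produces every collected item.
items = list(parse_nested(["a", "b", "c"]))
```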
Upvotes: 2
Reputation: 2564
Given the information provided in the comments:
You can have any number of Items and yield them as soon as you populate each of them, in whatever order you want. Since the yield
statement doesn't terminate the code execution, they can even follow each other as you presented in your sample code.
They will all reach the ItemPipelines, and there you can make the distinction and treat them differently if you need to.
If that doesn't answer your question, I may not have fully understood it. Please explain what you are trying to achieve and what you tried that didn't work, so I can be more helpful.
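A quick illustration of the point about yield not terminating execution: consecutive yield statements in one callback each hand an item to the caller and then resume on the next line (plain dicts stand in for Item instances here):

```python
def parse_callback():
    # Each yield emits one item; execution resumes right after it.
    yield {"type": "item1", "value": 1}
    yield {"type": "item2", "value": 2}
    # Code after a yield still runs, so any number of items can follow.
    yield {"type": "item3", "value": 3}

# Usage: all three items come out of the one callback.
results = list(parse_callback())
```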
Upvotes: 1