Sameer Shaikh

Reputation: 265

Scrapy get data from mongodb in the spider

I have created a spider which scrapes products from a listing page. Is there any way to connect to MongoDB from inside my spider, get the list of URLs stored there, and scrape those URLs?

Thanks..

Upvotes: 0

Views: 1461

Answers (1)

Jithin

Reputation: 1712

You can fetch the URLs from MongoDB in the spider itself:

from pymongo import MongoClient
import scrapy

class Myspider(scrapy.Spider):
    name = "myspider"

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        # pass a host and port to MongoClient() if your db is not on localhost:27017
        self.db = MongoClient()
        # use finding criteria appropriate to the structure of the data in that collection
        self.urls = self.db.db_name.collection.find()

    def parse(self, response):
        # other code
        for doc in self.urls:  # each document fetched from the db
            # do operations with the urls, e.g.:
            # yield scrapy.Request(doc['url'], callback=self.parse_product)
            pass
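Note that `find()` returns documents (dicts), not bare strings, so you usually need to pull the URL out of each document before requesting it. A minimal sketch of that extraction step, assuming a hypothetical `url` field on each document (adjust the field name to match your collection's schema):

```python
def docs_to_urls(docs, field="url"):
    """Extract the URL string from each MongoDB document.

    `field` is a hypothetical field name -- change it to match
    how the URLs are actually stored in your collection.
    """
    return [doc[field] for doc in docs if field in doc]

# Inside the spider's parse() you would then do something like:
#   for url in docs_to_urls(self.urls):
#       yield scrapy.Request(url, callback=self.parse_product)

# Works the same on plain dicts, which is what a pymongo cursor yields:
sample = [{"url": "http://example.com/p/1"}, {"name": "no url here"}]
print(docs_to_urls(sample))  # → ['http://example.com/p/1']
```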

Upvotes: 3
