Reputation: 15251
from scrapy import project, signals
from scrapy.crawler import Settings
from scrapy.crawler import CrawlerProcess
from scrapy.xlib.pydispatch import dispatcher
from multiprocessing.queues import Queue
import multiprocessing


class CrawlerWorker(multiprocessing.Process):

    def __init__(self, spider, result_queue):
        multiprocessing.Process.__init__(self)
        self.result_queue = result_queue

        self.crawler = Crawler(Settings())
        if not hasattr(project, 'crawler'):
            self.crawler.install()
        self.crawler.configure()

        self.items = []
        self.spider = spider
        dispatcher.connect(self._item_passed, signals.item_passed)

    def _item_passed(self, item):
        self.items.append(item)

    def run(self):
        self.crawler.crawl(self.spider)
        self.crawler.start()
        self.crawler.stop()
        self.result_queue.put(self.items)
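For completeness, this is roughly how I intend to use the worker; MySpider below is just a placeholder for my actual spider class:

# Rough usage sketch; MySpider stands in for my real spider class
result_queue = Queue()
worker = CrawlerWorker(MySpider(), result_queue)
worker.start()
# result_queue.get() blocks until run() puts the scraped items on the queue
for item in result_queue.get():
    print(item)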
I got an error while trying to use just CrawlerProcess(settings) with the settings object from scrapy.conf; it seems there is a discrepancy between what the Scrapy docs say here http://doc.scrapy.org/en/latest/topics/practices.html and what my installed version provides.
The code I am following is from an older Scrapy version, and I am trying to make it work with Scrapy 0.16.
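For what it's worth, the example on the linked docs page builds the crawler roughly like this (paraphrased, so the exact imports may differ); as far as I can tell from the 0.16 source, the Settings class lives in scrapy.settings rather than scrapy.crawler, which is part of why I am confused about what to import:

# Rough paraphrase of the docs example, not my actual code
from scrapy.crawler import Crawler
from scrapy.settings import Settings  # note: scrapy.settings, not scrapy.crawler

crawler = Crawler(Settings())
crawler.configure()
crawler.crawl(spider)  # 'spider' stands in for an instance of my spider class
crawler.start()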
This is the error I get as soon as I run the Python script:
Traceback (most recent call last):
  File "server.py", line 5, in <module>
    from scraper import Scraper
  File "/home/me/spider/spider/scraper.py", line 6, in <module>
    from crawlerworker import CrawlerWorker
  File "/home/me/spider/spider/crawlerworker.py", line 2, in <module>
    from scrapy.crawler import Settings
ImportError: cannot import name Settings
Upvotes: 1
Views: 5157