Ooxie

Reputation: 23

Scrapy API - Spider class init argument turned to None

I installed Scrapy through a fresh install of Miniconda (64-bit .exe installer for Windows, Python 2.7) on Windows 7.

This minimal code, run with "python scrapy_test.py" (using the Scrapy API):

#!/usr/bin/env python2.7
# -*- coding: utf-8 -*-

import scrapy.spiders.crawl
import scrapy.crawler
import scrapy.utils.project

class MySpider(scrapy.spiders.crawl.CrawlSpider) :
    name = "stackoverflow.com"
    allowed_domains = ["stackoverflow.com"]
    start_urls = ["http://stackoverflow.com/"]
    download_delay = 1.5

    def __init__(self, my_arg = None) :
        print "def __init__"

        self.my_arg = my_arg
        print "self.my_arg"
        print self.my_arg

    def parse(self, response) :
        pass

def main() :
    my_arg = "Value"

    process = scrapy.crawler.CrawlerProcess(scrapy.utils.project.get_project_settings())
    process.crawl(MySpider(my_arg))
    process.start()

if __name__ == "__main__" :
    main()

gives this output:

[scrapy] INFO: Scrapy 1.1.1 started (bot: scrapy_project)
[scrapy] INFO: Overridden settings: {'NEWSPIDER_MODULE': 'scrapy_project.spiders', 'SPIDER_MODULES': ['scrapy_project.spiders'], 'ROBOTSTXT_OBEY': True, 'BOT_NAME': 'scrapy_project'}
def __init__
self.my_arg
Value
[scrapy] INFO: Enabled extensions:
['scrapy.extensions.logstats.LogStats',
 'scrapy.extensions.telnet.TelnetConsole',
 'scrapy.extensions.corestats.CoreStats']
def __init__
self.my_arg
None
[...]

Notice how the __init__ method was run twice, and how the stored argument was reset to None after the second run, which is not what I want. Is this supposed to happen?

If I change:

def __init__(self, my_arg = None) :

to:

def __init__(self, my_arg) :

the output is:

[...]
Unhandled error in Deferred:
[twisted] CRITICAL: Unhandled error in Deferred:


Traceback (most recent call last):
  File "scrapy_test.py", line 28, in main
    process.crawl(MySpider(my_arg))
  File "C:\Users\XYZ\Miniconda2\lib\site-packages\scrapy\crawler.py", line 163, in crawl
    return self._crawl(crawler, *args, **kwargs)
  File "C:\Users\XYZ\Miniconda2\lib\site-packages\scrapy\crawler.py", line 167, in _crawl
    d = crawler.crawl(*args, **kwargs)
  File "C:\Users\XYZ\Miniconda2\lib\site-packages\twisted\internet\defer.py", line 1331, in unwindGenerator
    return _inlineCallbacks(None, gen, Deferred())
--- <exception caught here> ---
  File "C:\Users\XYZ\Miniconda2\lib\site-packages\twisted\internet\defer.py", line 1185, in _inlineCallbacks
    result = g.send(result)
  File "C:\Users\XYZ\Miniconda2\lib\site-packages\scrapy\crawler.py", line 90, in crawl
    six.reraise(*exc_info)
  File "C:\Users\XYZ\Miniconda2\lib\site-packages\scrapy\crawler.py", line 71, in crawl
    self.spider = self._create_spider(*args, **kwargs)
  File "C:\Users\XYZ\Miniconda2\lib\site-packages\scrapy\crawler.py", line 94, in _create_spider
    return self.spidercls.from_crawler(self, *args, **kwargs)
  File "C:\Users\XYZ\Miniconda2\lib\site-packages\scrapy\spiders\crawl.py", line 96, in from_crawler
    spider = super(CrawlSpider, cls).from_crawler(crawler, *args, **kwargs)
  File "C:\Users\XYZ\Miniconda2\lib\site-packages\scrapy\spiders\__init__.py", line 50, in from_crawler
    spider = cls(*args, **kwargs)
exceptions.TypeError: __init__() takes exactly 2 arguments (1 given)
[twisted] CRITICAL:
Traceback (most recent call last):
  File "C:\Users\XYZ\Miniconda2\lib\site-packages\twisted\internet\defer.py", line 1185, in _inlineCallbacks
    result = g.send(result)
  File "C:\Users\XYZ\Miniconda2\lib\site-packages\scrapy\crawler.py", line 90, in crawl
    six.reraise(*exc_info)
  File "C:\Users\XYZ\Miniconda2\lib\site-packages\scrapy\crawler.py", line 71, in crawl
    self.spider = self._create_spider(*args, **kwargs)
  File "C:\Users\XYZ\Miniconda2\lib\site-packages\scrapy\crawler.py", line 94, in _create_spider
    return self.spidercls.from_crawler(self, *args, **kwargs)
  File "C:\Users\XYZ\Miniconda2\lib\site-packages\scrapy\spiders\crawl.py", line 96, in from_crawler
    spider = super(CrawlSpider, cls).from_crawler(crawler, *args, **kwargs)
  File "C:\Users\XYZ\Miniconda2\lib\site-packages\scrapy\spiders\__init__.py", line 50, in from_crawler
    spider = cls(*args, **kwargs)
TypeError: __init__() takes exactly 2 arguments (1 given)

I have no clue how to get around this problem. Any ideas?

Upvotes: 2

Views: 2314

Answers (1)

Sam

Reputation: 20486

Here is the documented signature of scrapy.crawler.CrawlerProcess.crawl():

crawl(crawler_or_spidercls, *args, **kwargs)

  • crawler_or_spidercls (Crawler instance, Spider subclass or string) – already created crawler, or a spider class or spider’s name inside the project to create it
  • args (list) – arguments to initialize the spider
  • kwargs (dict) – keyword arguments to initialize the spider

This means you should pass your Spider class itself (or its name), together with the keyword arguments needed to initialize it, rather than an already-created instance. In your script, MySpider(my_arg) builds one instance (the first __init__ call you see), and Scrapy then creates a second instance of the class on its own via from_crawler (as the traceback shows), passing it no arguments, which is why my_arg falls back to None. Call it like this instead:

process.crawl(MySpider, my_arg='Value')
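
Putting this together with the script from the question, a minimal sketch of the corrected scrapy_test.py could look like the following (same spider and module names as in the question; not tested against every Scrapy version, and the super().__init__ call is added so CrawlSpider can still do its own setup):

#!/usr/bin/env python2.7
# -*- coding: utf-8 -*-
# Sketch of the corrected scrapy_test.py (adapted from the question's script).

import scrapy.spiders.crawl
import scrapy.crawler
import scrapy.utils.project

class MySpider(scrapy.spiders.crawl.CrawlSpider):
    name = "stackoverflow.com"
    allowed_domains = ["stackoverflow.com"]
    start_urls = ["http://stackoverflow.com/"]
    download_delay = 1.5

    def __init__(self, my_arg=None, *args, **kwargs):
        # Let CrawlSpider run its own initialization, then store the argument.
        super(MySpider, self).__init__(*args, **kwargs)
        self.my_arg = my_arg
        print "self.my_arg =", self.my_arg

    def parse(self, response):
        pass

def main():
    process = scrapy.crawler.CrawlerProcess(scrapy.utils.project.get_project_settings())
    # Pass the class plus the keyword argument; Scrapy instantiates the spider itself.
    process.crawl(MySpider, my_arg="Value")
    process.start()

if __name__ == "__main__":
    main()

With this, __init__ runs only once (when Scrapy instantiates the spider), and my_arg keeps the value passed to process.crawl().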

Upvotes: 7
