Reputation: 682
In the interest of saving time and lines of repeated code on a very large project, I have been attempting to instantiate multiple spiders in Scrapy from a single class definition. I don't find in the docs that this is a standard practice, but I also don't find any indication that it cannot or should not be done. However, it is not working. Here is what I'm trying:
from scrapy.spiders import CrawlSpider, Rule
from scrapy.linkextractors import LinkExtractor

class ExampleSpider(CrawlSpider):
    def __init__(self, name, source, allowed_domains, starturls):
        self.name = name
        self.custom_settings = {'LOG_FILE': 'logs/' + name + '.txt'}
        self.source = source
        self.allowed_domains = allowed_domains
        self.start_urls = starturls
        self.rules = (Rule(LinkExtractor(allow=''), callback='parse_item', follow=True),)

    def parse_item(self, response):
        # do stuff here
        pass

SpiderInstance = ExampleSpider(
    'columbus',
    'Columbus Symphony',
    'columbussymphony.com',
    ['http://www.columbussymphony.com/events/'],
)
The error I get is:
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/Library/Frameworks/Python.framework/Versions/3.6/bin/scrapy", line 11, in <module>
sys.exit(execute())
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/scrapy/cmdline.py", line 150, in execute
_run_print_help(parser, _run_command, cmd, args, opts)
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/scrapy/cmdline.py", line 90, in _run_print_help
func(*a, **kw)
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/scrapy/cmdline.py", line 157, in _run_command
cmd.run(args, opts)
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/scrapy/commands/crawl.py", line 57, in run
self.crawler_process.crawl(spname, **opts.spargs)
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/scrapy/crawler.py", line 170, in crawl
crawler = self.create_crawler(crawler_or_spidercls)
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/scrapy/crawler.py", line 198, in create_crawler
return self._create_crawler(crawler_or_spidercls)
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/scrapy/crawler.py", line 202, in _create_crawler
spidercls = self.spider_loader.load(spidercls)
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/scrapy/spiderloader.py", line 71, in load
raise KeyError("Spider not found: {}".format(spider_name))
KeyError: 'Spider not found: columbus'
Is it possible to use Scrapy this way, and if so, what am I doing incorrectly?
Upvotes: 2
Views: 768
Reputation: 682
After reading @starrify's answer, here is a simple solution I hadn't considered before:
def class_factory(passed_name, passed_source, passed_allowed_domains, passed_start_urls):
    class ColumbusSpider(ExampleSpider):
        name = passed_name
        source = passed_source
        allowed_domains = passed_allowed_domains
        start_urls = passed_start_urls
        # ... other class attributes as needed

        def parse_item(self, response):
            # use any other passed parameters as needed
            pass

    return ColumbusSpider
columbus = class_factory(
    'columbustest',
    'Columbus Symphony',
    ['columbussymphony.com'],
    ['http://www.columbussymphony.com/events/'],
)  # call as many times as needed
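The factory above depends on Scrapy, but the pattern itself is plain Python. A minimal, Scrapy-free sketch (with a hypothetical StubSpider standing in for ExampleSpider) shows that each call to the factory produces a distinct class with its own attributes:

```python
# Scrapy-free sketch of the class-factory pattern above.
# StubSpider is a stand-in for ExampleSpider; attribute names mirror the answer.

class StubSpider:
    name = None
    source = None
    allowed_domains = ()
    start_urls = ()

def class_factory(passed_name, passed_source, passed_allowed_domains, passed_start_urls):
    # A new class object is created on every call to the factory.
    class GeneratedSpider(StubSpider):
        name = passed_name
        source = passed_source
        allowed_domains = passed_allowed_domains
        start_urls = passed_start_urls
    return GeneratedSpider

columbus = class_factory('columbustest', 'Columbus Symphony',
                         ['columbussymphony.com'],
                         ['http://www.columbussymphony.com/events/'])
cleveland = class_factory('clevelandtest', 'Cleveland Orchestra',
                          ['clevelandorchestra.com'],
                          ['http://www.clevelandorchestra.com/'])

print(columbus is cleveland)          # False: two distinct classes
print(columbus.name, cleveland.name)  # columbustest clevelandtest
```

Because each generated class is a real class (not an instance), Scrapy's crawler machinery can load and instantiate it like any hand-written spider.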
Upvotes: 0
Reputation: 14751
1. Scrapy looks for spider classes, not instances. In your code, ExampleSpider is a class, while SpiderInstance is an instance of it. You may need to do something like this instead:

    class ColumbusSpider(ExampleSpider):
        name = 'columbus'
        source = 'Columbus Symphony'
        allowed_domains = ['columbussymphony.com']
        start_urls = ['http://www.columbussymphony.com/events/']
2. It's also worth noting that the allowed_domains attribute of a spider is expected to be a list, tuple, or set of domains, while in your sample code it's a string.
3. Instead of subclassing ExampleSpider as shown in #1, you could also make ExampleSpider a metaclass, so that instantiating ExampleSpider gives you a class rather than a class instance.
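A minimal sketch of that metaclass idea, in plain Python with no Scrapy dependency (SpiderMeta and StubSpider are illustrative names, not Scrapy API): calling a metaclass with a name, bases, and a namespace dict yields a new class, not an instance.

```python
# Plain-Python sketch of point 3: instantiating a metaclass produces a class.
# SpiderMeta and StubSpider are hypothetical names for illustration only.

class SpiderMeta(type):
    """A metaclass: calling it creates a new class object."""
    pass

class StubSpider(metaclass=SpiderMeta):
    name = None

# Calling the metaclass with (name, bases, namespace) yields a class:
ColumbusSpider = SpiderMeta(
    'ColumbusSpider',
    (StubSpider,),
    {
        'name': 'columbus',
        'allowed_domains': ['columbussymphony.com'],
        'start_urls': ['http://www.columbussymphony.com/events/'],
    },
)

print(isinstance(ColumbusSpider, type))  # True: it is a class, not an instance
print(ColumbusSpider.name)               # columbus
```

The result is a genuine class that Scrapy's loader could discover and instantiate itself, which is exactly what the "Spider not found" error was complaining about.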
Upvotes: 2