Reputation: 9024
I am working with the Scrapy framework.
I have some common properties that I want in all of my spiders, so I made a BaseSpider.
BaseSpider
import scrapy
from src.LoggerFactory import get_logger
import ConfigParser
from redis import Redis


class BaseSpider(scrapy.Spider):
    logger = get_logger()

    def __init__(self, *args, **kwargs):
        super(scrapy.Spider, self).__init__(*args, **kwargs)
        config = ConfigParser.RawConfigParser()
        config.read('../../config.cfg')
        self.config = config
        self.redis = Redis(host=config.get('redis', 'host'), port=config.get('redis', 'port'))

    def parse(self, response):
        pass
And my EbaySpider is as follows:
EbaySpider
import scrapy
import json
from scrapper.items import Product
from BaseSpider import BaseSpider


class EbaySpider(BaseSpider):
    name = "ebay"
    allowed_domains = ["ebay.com"]

    def __init__(self, *args, **kwargs):
        super(BaseSpider, self).__init__(*args, **kwargs)
        print self.redis  # Throws AttributeError: 'EbaySpider' object has no attribute 'redis'
        exit()
The strangest part is that I can still access scrapy.Spider properties in my EbaySpider, even though it does not inherit from scrapy.Spider directly.
Also, if there is a Scrapy way to extend spiders, please suggest it, as I wasn't able to find that in the documentation.
Upvotes: 0
Views: 78
Reputation: 21436
Your super usage is wrong. A typical superclass call looks like this:
class C(B):
    def method(self, arg):
        super(C, self).method(arg)
Whereas yours looks like this:
class C(B):
    def method(self, arg):
        super(B, self).method(arg)
              ^
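The first argument to super() should be the class you are defining, not its parent. Applied to the spiders from the question, a minimal sketch of the corrected __init__ methods could look like this (the config/Redis setup is kept from the question and assumed to be valid; getint() is used so the port is an int rather than the string that get() returns):

import scrapy
import ConfigParser
from redis import Redis


class BaseSpider(scrapy.Spider):
    def __init__(self, *args, **kwargs):
        # Pass BaseSpider (the class being defined), not scrapy.Spider.
        super(BaseSpider, self).__init__(*args, **kwargs)
        config = ConfigParser.RawConfigParser()
        config.read('../../config.cfg')
        self.config = config
        self.redis = Redis(host=config.get('redis', 'host'),
                           port=config.getint('redis', 'port'))


class EbaySpider(BaseSpider):
    name = "ebay"
    allowed_domains = ["ebay.com"]

    def __init__(self, *args, **kwargs):
        # Same pattern: pass EbaySpider here, not BaseSpider.
        super(EbaySpider, self).__init__(*args, **kwargs)
        print self.redis  # set by BaseSpider.__init__, so no AttributeError

With the calls corrected, BaseSpider.__init__ actually runs, so self.redis exists by the time EbaySpider prints it.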
Upvotes: 4