Reputation: 501
I'm trying to change the settings for Scrapy. I've managed to do this successfully with CrawlerProcess before, but I can't get it to work with CrawlerRunner. Logging should be disabled, yet I'm still seeing log output. What am I doing wrong? Thanks.
import scrapy
from twisted.internet import reactor, defer
from scrapy.crawler import CrawlerRunner
from scrapy.utils.log import configure_logging
from scrapy.utils.project import get_project_settings
from scrapy.settings import Settings

class MySpider1(scrapy.Spider):
    name = "spider1"

class MySpider2(scrapy.Spider):
    name = "spider2"

configure_logging()

s = get_project_settings()
s.update({
    "LOG_ENABLED": "False"
})
runner = CrawlerRunner(s)

@defer.inlineCallbacks
def crawl():
    yield runner.crawl(MySpider1)
    yield runner.crawl(MySpider2)
    reactor.stop()

crawl()
reactor.run()
Upvotes: 2
Views: 2437
Reputation: 1247
According to the docs and the API, you should initialize the logger with your settings: configure_logging() called without arguments uses Scrapy's default settings (where LOG_ENABLED is True), so the root log handler gets installed regardless of what you set afterwards. Adjust your code like this:
# comment out this line
# configure_logging()

s = get_project_settings()
s.update({
    "LOG_ENABLED": "False"
})

# initialize the logger with the settings
configure_logging(s)

runner = CrawlerRunner(s)
Then you will get what you want.
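For completeness, here is a minimal sketch of the full script with that one change applied (the spiders are the bare placeholders from the question; a real project would give them start_urls and a parse method):

import scrapy
from twisted.internet import reactor, defer
from scrapy.crawler import CrawlerRunner
from scrapy.utils.log import configure_logging
from scrapy.utils.project import get_project_settings

class MySpider1(scrapy.Spider):
    name = "spider1"

class MySpider2(scrapy.Spider):
    name = "spider2"

s = get_project_settings()
s.update({
    "LOG_ENABLED": "False"
})

# Pass the settings so LOG_ENABLED (and the other LOG_* options) are honored
configure_logging(s)
runner = CrawlerRunner(s)

@defer.inlineCallbacks
def crawl():
    # Run the spiders sequentially, then stop the reactor
    yield runner.crawl(MySpider1)
    yield runner.crawl(MySpider2)
    reactor.stop()

crawl()
reactor.run()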
Upvotes: 2