Shouvik

Reputation: 23

Scrapy's Request callback is not being called

I'm trying to issue a Request from the main parse callback so that a second parse method handles its response, but that method is never called.

Here is the code:

class CodechefSpider(CrawlSpider):
    name = "codechef_crawler"
    allowed_domains = ["codechef.com"]
    start_urls = [
        "http://www.codechef.com/problems/easy/",
        "http://www.codechef.com/problems/medium/",
        "http://www.codechef.com/problems/hard/",
        "http://www.codechef.com/problems/challenege/",
    ]

    rules = (Rule(SgmlLinkExtractor(allow=('/problems/[A-Z,0-9,-]+')), callback='parse_item'),)

    def parse_solution(self,response):

        hxs = HtmlXPathSelector(response)
        x = hxs.select("//tr[@class='kol']//td[8]").exctract()
        f = open('test/'+response.url.split('/')[-1]+'.txt','wb')
        f.write(x.encode("utf-8"))
        f.close()



    def parse_item(self, response):
        hxs = HtmlXPathSelector(response)
        item = Problem()
        item['title'] = hxs.select("//table[@class='pagetitle-prob']/tr/td/h1/text()").extract()
        item['content'] = hxs.select("//div[@class='node clear-block']//div[@class='content']").extract()
        filename = str(item['title'][0])
        solutions_url = 'http://www.codechef.com/status/' + response.url.split('/')[-1] + '?language=All&status=15&handle=&sort_by=Time&sorting_order=asc'
        Request(solutions_url, callback = self.parse_solution)
        f = open('problems/'+filename+'.html','wb')
        f.write("<div style='width:800px;margin:50px'>")
        for i in item['content']:
            f.write(i.encode("utf-8"))
        f.write("</div>")
        f.close()

The parse_solution method is not being called. The spider runs without any errors.

Upvotes: 1

Views: 203

Answers (1)

nizam.sp

Reputation: 4072

You should write yield Request(solutions_url, callback=self.parse_solution) and not just Request(solutions_url, callback=self.parse_solution). Constructing a Request object does nothing on its own; Scrapy only schedules the request when your callback yields (or returns) it.
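
For illustration, here is a minimal sketch of parse_item with the fix applied (the file-writing part of the question is omitted for brevity; the Problem item and the selector come from the question's own project, and Request is importable from scrapy.http). Once the request is yielded, Scrapy downloads the page and invokes parse_solution with its response:

    def parse_item(self, response):
        hxs = HtmlXPathSelector(response)
        item = Problem()
        item['title'] = hxs.select("//table[@class='pagetitle-prob']/tr/td/h1/text()").extract()
        item['content'] = hxs.select("//div[@class='node clear-block']//div[@class='content']").extract()
        solutions_url = ('http://www.codechef.com/status/'
                         + response.url.split('/')[-1]
                         + '?language=All&status=15&handle=&sort_by=Time&sorting_order=asc')
        # Yielding hands the Request back to the Scrapy engine, which downloads
        # the page and then calls parse_solution with the downloaded response.
        yield Request(solutions_url, callback=self.parse_solution)

Note that adding the yield turns parse_item into a generator, which is exactly what Scrapy expects from a callback.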

Upvotes: 2
