Fekher Turki

Reputation: 11

Export Python data to a CSV file

I'm trying to export my scraped data via the command line:

scrapy crawl tunisaianet -o save.csv -t csv

but nothing is happening. Any help?

here is my code:

import scrapy
import csv
from tfaw.items import TfawItem


class TunisianetSpider(scrapy.Spider):
    name = "tunisianet"
    allowed_domains = ["tunisianet.com.tn"]
    start_urls = [
        'http://www.tunisianet.com.tn/466-consoles-jeux/',
    ]

    def parse(self, response):
        item = TfawItem()
        data= []
        out = open('out.csv', 'a')
        x = response.xpath('//*[contains(@class, "ajax_block_product")]')
        for i in range(0, len(x)):
            item['revendeur'] = response.xpath('//*[contains(@class, "center_block")]/h2/a/@href').re('tunisianet')[i]
            item['produit'] = response.xpath('//*[contains(@class, "center_block")]/h2/a/text()').extract()[i]
            item['url'] = response.xpath('//*[contains(@class, "center_block")]/h2/a/@href').extract()[i]
            item['description'] = response.xpath('//*[contains(@class, "product_desc")]/a/text()').extract()[i]
            item['prix'] = response.xpath('//*[contains(@class, "price")]/text()').extract()[i]
            data = item['revendeur'], item['produit'], item['url'], item['description'], item['prix']
            yield data
            out.write(str(data))
            out.write('\n')
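
For context, the TfawItem imported from tfaw.items is not shown in the question; presumably it declares the five fields used above, roughly like this:

import scrapy

class TfawItem(scrapy.Item):
    # fields assumed from their usage in the spider; the real items.py is not shown
    revendeur = scrapy.Field()
    produit = scrapy.Field()
    url = scrapy.Field()
    description = scrapy.Field()
    prix = scrapy.Field()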

Upvotes: 0

Views: 425

Answers (1)

eLRuLL

Reputation: 18799

I assume you are getting this error:

ERROR: Spider must return Request, BaseItem, dict or None, got 'tuple' in <GET http://www.tunisianet.com.tn/466-consoles-jeux>

which says exactly what's wrong: you are returning tuples instead of items. Change your yield code to:

...
item['prix'] = response.xpath('//*[contains(@class, "price")]/text()').extract()[i]
yield item
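
For completeness, here is a rough sketch of how the whole parse method could look with that fix applied, restructured to loop over each product block and build a fresh item per iteration; the selectors are copied from the question and not verified against the live site, and the manual out.csv writing is dropped because the -o save.csv feed export already writes the yielded items:

import scrapy
from tfaw.items import TfawItem


class TunisianetSpider(scrapy.Spider):
    name = "tunisianet"
    allowed_domains = ["tunisianet.com.tn"]
    start_urls = ['http://www.tunisianet.com.tn/466-consoles-jeux/']

    def parse(self, response):
        # one item per product block; Scrapy's feed export (-o save.csv) writes them out
        for product in response.xpath('//*[contains(@class, "ajax_block_product")]'):
            item = TfawItem()
            item['revendeur'] = product.xpath('.//*[contains(@class, "center_block")]/h2/a/@href').re_first('tunisianet')
            item['produit'] = product.xpath('.//*[contains(@class, "center_block")]/h2/a/text()').extract_first()
            item['url'] = product.xpath('.//*[contains(@class, "center_block")]/h2/a/@href').extract_first()
            item['description'] = product.xpath('.//*[contains(@class, "product_desc")]/a/text()').extract_first()
            item['prix'] = product.xpath('.//*[contains(@class, "price")]/text()').extract_first()
            yield item

With items yielded like this, the crawl should produce save.csv; note that the spider's name attribute is "tunisianet", so the command has to use that exact name, e.g. scrapy crawl tunisianet -o save.csv -t csv.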

Upvotes: 1
