Matt

Reputation: 85

Scrapy crawl and extract data into mysql

I am trying to scrape prices and save them to a database, but I can't figure out what's happening with my code. I can extract the data and save it using -o save.xml on the command line, but when I integrate the MySQL code into settings.py everything changes: running with -o save.xml again no longer shows the price results. I did notice that the table's auto-increment ID advances, but no data is inserted.

Can someone help me out? Here is my code.

test.py
------------------------
import scrapy
from scrapy.contrib.linkextractors.sgml import SgmlLinkExtractor
from scrapy.contrib.spiders import CrawlSpider, Rule
from scrapy.selector import HtmlXPathSelector
from getprice.items import GetPriceItem
from scrapy.log import *
from getprice.settings import *
from getprice.items import *

class MySpider(CrawlSpider):
    name = "getprice"
    allowed_domains = ["www.craigslist.ca"]
    start_urls = ["http://calgary.craigslist.ca/search/sss"]

    def parse(self, response):
        hxs = HtmlXPathSelector(response)
        titles = hxs.select("//div[@'sliderforward arrow']")
        items = []
        for title in titles:
            item = GetPriceItem()
            item["price"] = title.select("text()").extract()[0]
            insert_table(item)

settings.py
---------------------
BOT_NAME = 'getp'
BOT_VERSION = '1.0'

import sys
import MySQLdb

# SCRAPY SETTING
SPIDER_MODULES = ['getprice.spiders']
NEWSPIDER_MODULE = 'getprice.spiders'
USER_AGENT = '%s/%s' % (BOT_NAME, BOT_VERSION)

# SQL DATABASE SETTING
SQL_DB = 'test'
SQL_TABLE = 'testdata'
SQL_HOST = 'localhost'
SQL_USER = 'root'
SQL_PASSWD = 'testing'
SQL_LIST = 'price' 
# connect to the MySQL server
try:
    CONN = MySQLdb.connect(host=SQL_HOST,
                         user=SQL_USER,
                         passwd=SQL_PASSWD,
                         db=SQL_DB)
except MySQLdb.Error, e:
    print "Error %d: %s" % (e.args[0], e.args[1])
    sys.exit(1)

cursor = CONN.cursor()  # important MySQLdb Cursor object

def insert_table(item):
    sql = "INSERT INTO %s (%s) \
values('%s')" % (SQL_TABLE, SQL_LIST,
    MySQLdb.escape_string(item['price'].encode('utf-8')),
    )
    # print sql
    if cursor.execute(sql):
        print "Inserted"
    else:
        print "Something wrong"

Upvotes: 1

Views: 7623

Answers (1)

alecxe

Reputation: 474271

You need to do this the right way and follow Scrapy's control flow.

Create a "Pipeline" that would be responsible for persisting your items in the database.

MySQL pipeline examples:
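A minimal sketch of such a pipeline, reusing the database settings from the question. The class name `MySQLStorePipeline` and the table/column names are illustrative assumptions; the `open_spider`/`close_spider` hooks and `process_item` are the standard pipeline interface:

```python
# pipelines.py -- a minimal sketch of a MySQL item pipeline.
# Class name, credentials, and table name are illustrative assumptions.

class MySQLStorePipeline(object):

    def open_spider(self, spider):
        # Deferred import so the module loads even without MySQLdb installed.
        import MySQLdb
        self.conn = MySQLdb.connect(host='localhost', user='root',
                                    passwd='testing', db='test',
                                    charset='utf8')
        self.cursor = self.conn.cursor()

    def process_item(self, item, spider):
        # A parameterized query handles escaping for you -- no need for
        # MySQLdb.escape_string and manual string formatting.
        self.cursor.execute("INSERT INTO testdata (price) VALUES (%s)",
                            (item['price'],))
        self.conn.commit()
        return item

    def close_spider(self, spider):
        self.conn.close()
```

Enable it in settings.py with something like `ITEM_PIPELINES = {'getprice.pipelines.MySQLStorePipeline': 300}` (the module path is an assumption based on your project name), and have the spider `yield item` instead of calling `insert_table` directly -- Scrapy then passes each yielded item through the pipeline for you.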

Upvotes: 3
