mandar.gokhale

Reputation: 1875

How to write a crawler in Ruby?

I am working on a Ruby on Rails application where I need to implement a crawler that crawls other sites and stores data in my database. For example, suppose I want to crawl all deals from http://www.snapdeal.com and store them in my database. How can I implement this with a crawler?

Upvotes: 1

Views: 4271

Answers (3)

Bhushan Lodha

Reputation: 6862

There are a couple of options, depending on your use case.

I have used a combination of Nokogiri and Mechanize for a few of my projects, and I think they are good options.
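
A minimal sketch of that combination: Mechanize fetches the page and exposes the parsed Nokogiri document via page.parser. The '.deal-title' selector and the Deal model are assumptions for illustration; inspect the target site's markup and use your own ActiveRecord model.

```ruby
require 'mechanize'

agent = Mechanize.new
agent.user_agent_alias = 'Mac Safari'

# Fetch the listing page; Mechanize parses it with Nokogiri under the hood.
page = agent.get('http://www.snapdeal.com')

# '.deal-title' is a placeholder selector -- replace it with whatever
# actually wraps a deal on the page you are scraping.
page.parser.css('.deal-title').each do |node|
  title = node.text.strip
  puts title
  # In a Rails app you would persist it via a (hypothetical) model instead:
  # Deal.create!(title: title)
end
```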

Upvotes: 10

pguardiario

Reputation: 54984

You want to take a look at Mechanize. Also, from what you mention, you probably don't need Rails at all.
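
For example, a standalone script (no Rails) that follows links with Mechanize might look like this; the /deal/ href pattern and the limit of 10 links are assumptions for illustration:

```ruby
require 'mechanize'

agent   = Mechanize.new
visited = []

# Grab links whose href matches a pattern, then visit each one.
agent.get('http://www.snapdeal.com').links_with(href: /deal/).first(10).each do |link|
  next if visited.include?(link.href)
  visited << link.href

  deal_page = link.click
  puts "#{deal_page.title} -- #{deal_page.uri}"
end
```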

Upvotes: 3

Dan Wich

Reputation: 4943

As Sergio commented, you retrieve pages, parse them, and follow their links. In your case, it sounds like you're more focused on "screen scraping" than crawling deep link networks, so a library like Scrubyt will be helpful (although progress on it has died out). You can also use a lower-level parsing-focused library like Nokogiri.
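
As a sketch of that retrieve/parse/follow loop with plain Nokogiri and open-uri (the page limit and link filter here are arbitrary choices, and a real crawler should also respect robots.txt and rate-limit itself):

```ruby
require 'nokogiri'
require 'open-uri'

def crawl(start_url, max_pages: 5)
  queue   = [start_url]
  visited = []

  until queue.empty? || visited.size >= max_pages
    url = queue.shift
    next if visited.include?(url)
    visited << url

    # Retrieve and parse the page.
    doc = Nokogiri::HTML(URI.open(url))
    puts "#{url} -- #{doc.title}"

    # Follow its links: push absolute http(s) URLs onto the queue.
    doc.css('a[href]').each do |a|
      href = a['href']
      queue << href if href =~ %r{\Ahttps?://}
    end
  end
end

crawl('http://www.snapdeal.com')
```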

Upvotes: 0
