Wolfpack'08

Reputation: 4128

How do I transfer wget output to a file or DB?

I'm trying to use a small script to download a field from multiple pages. For one thing, I'm only able to get it from one page..., but the real problem is that I don't know how to hand the output off to a database table. How can I take the output from curl/lynx|grep (which will be all of the list items) and move it, list item by list item, into a table in my DB, or into a CSV that's ready to import into the DB?

#!/bin/bash

# Grab the page source, take the 8th double-quote-delimited field of each
# line, and keep only the lines that contain <li>.
lynx --source "http://www.thewebsite.com" | cut -d\" -f8 | grep "<li>"

The database I would connect to is MySQL; we can call the dummy table "listTable". Please try to stick to bash: I'm not allowed to compile anything on the server I'm using, and I can't seem to get curl working with PHP. Anyway, I'm thinking I need to put the output in a variable and then systematically pass its contents to the database, right?

Upvotes: 0

Views: 1506

Answers (2)

Tassos Bassoukos

Reputation: 16152

Use something like awk, sed or perl to create INSERT statements, then pipe that to your sql client (psql or mysql).
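For example, a rough sketch of that approach in bash, building on the lynx command from the question (the item column, dbuser and mydatabase names below are made-up placeholders):

#!/bin/bash
# Sketch only: pull out the <li> items, turn each one into an INSERT
# statement, and pipe the statements straight to the mysql client.
lynx --source "http://www.thewebsite.com" \
  | grep -o '<li>[^<]*</li>' \
  | sed -e 's/<[^>]*>//g' -e "s/'/''/g" \
  | awk -v q="'" '{ printf "INSERT INTO listTable (item) VALUES (%s%s%s);\n", q, $0, q }' \
  | mysql -u dbuser -p mydatabase

The sed step strips the tags and doubles any single quotes so the generated SQL doesn't break; for anything beyond a quick one-off import you'd want proper escaping or a bulk load.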

Upvotes: 2

Elalfer

Reputation: 5338

Just write a Python script that reads everything from stdin and puts it into the database, then do something like:

curl http://www.google.com | ./put_to_db.py
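(put_to_db.py is just a placeholder name for whatever script you write.) Since the question asks to stay in bash, the same read-from-stdin-and-insert idea can also be sketched as a plain shell filter; the item column, dbuser and mydatabase names are assumptions:

#!/bin/bash
# Rough bash stand-in for the put_to_db.py idea: read each line from stdin
# and turn it into an INSERT against listTable.
while IFS= read -r line; do
  safe=$(printf '%s' "$line" | sed "s/'/''/g")   # double single quotes for SQL
  echo "INSERT INTO listTable (item) VALUES ('$safe');"
done | mysql -u dbuser -p mydatabase

Saved as, say, put_to_db.sh, it would be used the same way: curl http://www.thewebsite.com | ./put_to_db.sh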

Upvotes: 0
