G Pochman

Reputation: 43

Creating an SQL database from a Python webscrape

I am trying to transfer some basic data scraped from a weather forecast with the Python script shown below into an SQL database. I have the data stored in lists and formatted as a pandas DataFrame.

import requests
import pandas
from bs4 import BeautifulSoup

# Fetch the ten-day forecast page and parse the HTML
page = requests.get('https://weather.com/weather/tenday/l/USOR0190:1:US')
soup = BeautifulSoup(page.content, 'html.parser')
feed = soup.select('main.region.region-main')

# Pull the date, description, and temperature text out of the forecast table
days = [i.get_text() for i in soup.select('span.day-detail.clearfix')]
descs = [i['title'] for i in soup.select('td.description')]
descs2 = [i.get_text() for i in soup.select('td.description span')]
temp = [i.get_text() for i in soup.select('td.temp div')]
temps = ["High: " + i[:3] + " / Low: " + i[3:] for i in temp]

# Collect everything into a DataFrame, one row per forecast day
frame = pandas.DataFrame({
    "Date": days,
    "Desc": descs2,
    "Temp": temps,
    "More": descs
})

What would the next steps be? Should I use SQLite, SQLAlchemy, or some other engine? I have no knowledge of SQL and am learning it on the fly. I believe I have set up the environment to allow for the use of these engines, so that shouldn't be a problem.

Upvotes: 0

Views: 100

Answers (1)

Rich Tier

Reputation: 9441

Getting an ORM like SQLAlchemy seems prudent.

https://pypi.org/project/SQLAlchemy/

No SQL needed. Lots of documentation and examples. You can work entirely in Python.
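For example, a minimal sketch of how this could look with the DataFrame from your question, assuming you want a local SQLite file (the weather.db filename and the forecast table name are just placeholders):

from sqlalchemy import create_engine
import pandas

# Create (or open) a local SQLite database file; any SQLAlchemy-supported
# database URL would work here instead.
engine = create_engine('sqlite:///weather.db')

# Write the scraped DataFrame to a table, replacing it if it already exists.
frame.to_sql('forecast', engine, if_exists='replace', index=False)

# Read it back to confirm the data landed in the table.
print(pandas.read_sql('SELECT * FROM forecast', engine))

to_sql creates the table and infers the column types for you, so you never have to write a CREATE TABLE statement by hand.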

Upvotes: 2
