Rolf12

Reputation: 771

SQL table to Python dataframe

I have a SQL database containing a very big table that I need to analyze, and I would like to use Python for that.

Unfortunately I cannot access the SQL database directly from Python.

Does anyone have a suggestion for what format to export the table to so that I can work with it?

  1. I tried exporting to a SQL file and importing that into a pandas DataFrame, but of course I ran out of memory.

  2. I tried to access the database directly from Python using pymysql:

     db = pymysql.connect(host="localhost", db="all_data")
    

but I get "Can't connect to MySQL server (WinError 10061)".

The file type is .sql.

Many thanks

Upvotes: 0

Views: 1191

Answers (2)

Mel

Reputation: 331

I had this problem before... Try this:

# import libraries
from sqlalchemy import create_engine
import pandas as pd

# set login parameters (fill in your own values)
db = "your_schema"    # your database schema name
user = "your_user"    # username to log in with
pw = "your_password"  # your password


# connect to the database
engine = create_engine("mysql+pymysql://{user}:{pw}@localhost/{db}"
                       .format(user=user, pw=pw, db=db))


# load data from MySQL into a DataFrame
df = pd.read_sql_query("SELECT * FROM your_table", engine)
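Since the question mentions running out of memory, note that `pd.read_sql_query` also accepts a `chunksize` parameter, which returns an iterator of smaller DataFrames instead of loading the whole table at once. A minimal sketch of that pattern, using an in-memory SQLite database as a stand-in for the MySQL engine above so it runs on its own:

```python
from sqlalchemy import create_engine
import pandas as pd

# stand-in engine; replace with the mysql+pymysql URL from above
engine = create_engine("sqlite://")

# create a small demo table (placeholder data)
pd.DataFrame({"x": range(10)}).to_sql("big_table", engine, index=False)

# stream the table in chunks instead of loading it all at once
total = 0
for chunk in pd.read_sql_query("SELECT * FROM big_table", engine,
                               chunksize=4):
    total += len(chunk)  # process each chunk here

print(total)  # all 10 rows were seen, 4 + 4 + 2
```

Each `chunk` is an ordinary DataFrame, so you can filter or aggregate it and only keep the reduced result in memory.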

Upvotes: 2

vukojevicf

Reputation: 827

I am pretty sure something like this would work:

import pyodbc
import pandas as pd

conn = pyodbc.connect(r'Driver={Microsoft Access Driver (*.mdb, *.accdb)};DBQ=C:\Users\Ron\Desktop\testdb.accdb;')

# read_sql_query already returns a DataFrame, so no further
# pd.DataFrame(...) wrapping is needed
df = pd.read_sql_query(
    '''SELECT
           product_name,
           product_price_per_unit,
           units_ordered,
           (units_ordered * product_price_per_unit) AS revenue
       FROM tracking_sales''', conn)

You can see a good example on this link: https://datatofish.com/sql-to-pandas-dataframe/
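As a side note, the `revenue` column computed inside the SQL above could equally be derived in pandas after loading. A small sketch with made-up values for the same column names:

```python
import pandas as pd

# demo frame with the columns the query selects (placeholder values)
df = pd.DataFrame({
    "product_name": ["a", "b"],
    "product_price_per_unit": [2.0, 3.0],
    "units_ordered": [5, 4],
})

# same computation as the SQL expression
# (units_ordered * product_price_per_unit) AS revenue
df["revenue"] = df["units_ordered"] * df["product_price_per_unit"]
print(df["revenue"].tolist())  # [10.0, 12.0]
```

Doing the arithmetic in pandas can be handy when the expression is easier to write there than in SQL, though pushing it into the query keeps less data moving over the connection.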

Upvotes: 0
