Reputation: 341
I am trying to read a table from my Postgres database into Python. The table has around 8 million rows and 17 columns, and is about 622 MB in the database.
I can export the entire table to CSV using psql and then read it in with pd.read_csv(). That works perfectly fine: the Python process only uses around 1 GB of memory and everything is good.
Now, this pull needs to be automated, so I thought I could read the table directly from the DB with pd.read_sql_table(), using the following code:
import pandas as pd
import sqlalchemy

engine = sqlalchemy.create_engine("postgresql://username:password@hostname:5432/db")
the_frame = pd.read_sql_table(table_name='table_name', con=engine, schema='schemaname')
This approach uses a lot of memory. When I track the memory usage in Task Manager, I can see the Python process's memory climb and climb until it hits 16 GB and freezes the computer.
Any ideas on why this might be happening are appreciated.
Upvotes: 11
Views: 13477
Reputation: 511
Under the hood, pandas calls cursor.execute(your_query), which receives the binary result data from the database over the socket (socket.recv() over TCP). That loads all of the data into memory, no matter what you do with pd.read_sql(chunksize=n).
Okay, so what is the solution? pandas.read_sql() on its own leaves you very little flexibility. Here are some ideas:
Use dtype_backend='pyarrow' in pd.read_sql(). I have tested it, and it actually takes more memory because of multiple copies: the query result is first loaded into a pandas DataFrame and then converted to a pyarrow Table. What a joke!
Define dtype={...} for the columns; it will certainly save you some memory, but not enough.
As you can see, read_sql() alone doesn't solve this. I would suggest checking out the connectorx library. It is very memory efficient and fast, and like pandas.read_sql() it gives you a pandas DataFrame back. Try it; there is a rough sketch below.
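A minimal sketch of that approach, reusing the connection string from the question; the SELECT query is just a placeholder for whatever rows and columns you actually need:

import connectorx as cx

# connectorx reads the result set straight into a DataFrame without building
# intermediate Python row objects, which keeps peak memory much lower.
df = cx.read_sql(
    "postgresql://username:password@hostname:5432/db",
    "SELECT * FROM schemaname.table_name",
    return_type="pandas",  # a regular pandas DataFrame comes back
)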
Upvotes: 1
Reputation: 61967
You need to set the chunksize argument so that pandas iterates over smaller chunks of data. See this post: https://stackoverflow.com/a/31839639/3707607
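A minimal sketch with the question's setup; the chunk size and the per-chunk processing function are placeholders to adjust for your workload:

import pandas as pd
import sqlalchemy

engine = sqlalchemy.create_engine("postgresql://username:password@hostname:5432/db")

# With chunksize set, read_sql_table returns an iterator of DataFrames
# instead of one large frame, so each piece can be processed and discarded.
chunks = pd.read_sql_table(
    table_name='table_name',
    con=engine,
    schema='schemaname',
    chunksize=50_000,  # rows per chunk; tune to your memory budget
)

for chunk in chunks:
    handle(chunk)  # placeholder for whatever you do with each piece

Depending on the driver, the full result set may still be buffered on the client side (as the other answer points out), but pandas at least never has to hold one 8-million-row DataFrame at once.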
Upvotes: 5