Kushal Bhola

Reputation: 67

Importing a very large file into Postgres without defining the table structure

I have a very large CSV file (about 500 MB) that I want to import into a Postgres database. I do not want to first create a table with more than 1000 columns and then run the insert, as with the conventional COPY command. Is there any way to use the header information of the CSV (the column names) to import this data directly without creating a table first? I am looking for an import similar to R's import.

Upvotes: 3

Views: 1538

Answers (1)

m.swiss

Reputation: 104

It is probably not the solution you're expecting, but with Python you can read the column headers from the CSV and create the table from it very easily:

import pandas as pd
from sqlalchemy import create_engine  # uses the psycopg2 driver under the hood

# Read the CSV into a DataFrame; the header row supplies the column names
df = pd.read_csv('path_to_csv_file', index_col='name_of_index_column', sep=",")

# Connect and upload; to_sql creates the table from the DataFrame's columns
engine = create_engine('postgresql+psycopg2://db_user_name:db_password@localhost:5432/db_name',
                       client_encoding='utf8')
df.to_sql('table_name', engine, if_exists='replace', index=True, index_label='name_of_index_column')
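For a 500 MB file you may not want to hold the whole CSV in memory at once. A minimal sketch of the same idea using pandas' chunksize option (the file path, table name, and connection string are placeholders) could look like this:

import pandas as pd
from sqlalchemy import create_engine

engine = create_engine('postgresql+psycopg2://db_user_name:db_password@localhost:5432/db_name')

# Stream the CSV in chunks so the whole file never sits in memory at once.
# The first chunk creates the table from the CSV header; later chunks append.
reader = pd.read_csv('path_to_csv_file', chunksize=50000)
for i, chunk in enumerate(reader):
    chunk.to_sql('table_name', engine,
                 if_exists='replace' if i == 0 else 'append',
                 index=False)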

Upvotes: 1
