CP002

Reputation: 51

Loading large dataframe to Vertica

I have a rather large dataframe (500k+ rows) that I'm trying to load to Vertica. I have the following code working, but it is extremely slow.

# convert the dataframe to a list of rows
lists = output_final.values.tolist()

# parameterized insert statement
insert_qry = "INSERT INTO SCHEMA.TABLE (DATE, ID, SCORE) VALUES (%s, %s, %s)"

# load into the database, one row per round trip
for row in lists:
    cur.execute(insert_qry, row)
conn_info.commit()

I have seen a few posts suggesting COPY rather than row-by-row EXECUTE for a load this large, but I haven't found a good working example.

Upvotes: 1

Views: 1113

Answers (1)

CP002

Reputation: 51

After a lot of trial and error... I found that the following worked for me.

import io

# COPY statement
copy_str = "COPY SCHEMA.TABLE (DATE, ID, SCORE) FROM STDIN DELIMITER ','"

# turn the dataframe into a csv-like in-memory object
stream = io.StringIO()
output_final.to_csv(stream, sep=",", index=False, header=False)

# reset the stream position back to the start so COPY reads from the beginning
stream.seek(0)

# load the data in a single COPY statement
with conn_info.cursor() as cursor:
    cursor.copy(copy_str, stream)
conn_info.commit()
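
If an in-memory COPY isn't an option, batching the inserts is also much faster than one execute per row. A minimal sketch, assuming the vertica_python driver (its executemany is, as I understand it, optimized for simple INSERT statements):

# send the rows in one batched call instead of one round trip per row
insert_qry = "INSERT INTO SCHEMA.TABLE (DATE, ID, SCORE) VALUES (%s, %s, %s)"
with conn_info.cursor() as cursor:
    cursor.executemany(insert_qry, lists)  # lists = output_final.values.tolist()
conn_info.commit()

COPY is still the fastest option at this scale, since the entire CSV stream is sent in a single statement.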

Upvotes: 4
