Reputation: 6842
I would like to import a large R data.frame
object into Postgres. I am saving the object as a CSV file using these commands:
> out_file <- paste(input_path, "data.csv", sep="")
> con<-file(out_file, encoding="UTF-8")
> write.csv(df, out_file)
No error messages are shown. Then, switching to psql, I issue the import with COPY, which results in this error:
# COPY data_in FROM 'data.csv' DELIMITER ',' CSV HEADER;
ERROR: invalid byte sequence for encoding "UTF8": 0xf8
CONTEXT: COPY data_in, line 74358
Which piece of software is at fault here, or what do I need to change to get the correct encoding?
Upvotes: 0
Views: 468
Reputation: 2017
From my comment:
write.csv(df, out_file, fileEncoding = "UTF-8")
# or: write.csv(df, con)
Either of the above will work. Note that fileEncoding takes an encoding name, not TRUE. Setting the encoding on the connection has no effect on the file unless you actually pass that connection to write.csv; in your code you write to out_file, so the encoding declared on con is never used, and the file is written in your native locale encoding instead. (The offending byte 0xf8 is "ø" in Latin-1, which fits that explanation.)
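Here is a minimal, self-contained sketch of the corrected call, with a quick validity check; the data frame contents and file name are made up for illustration:

```r
# fileEncoding takes an encoding *name* (a string), not TRUE/FALSE
df <- data.frame(name = c("Søren", "Ångström"), stringsAsFactors = FALSE)
out_file <- tempfile(fileext = ".csv")

# row.names = FALSE avoids write.csv's extra unnamed row-name column,
# which would otherwise misalign the columns under COPY ... CSV HEADER
write.csv(df, out_file, fileEncoding = "UTF-8", row.names = FALSE)

# Sanity check before running COPY: every line should be valid UTF-8
raw_lines <- readLines(out_file, encoding = "UTF-8")
all(validUTF8(raw_lines))  # TRUE when the file is safe for a UTF8 Postgres server
```

If this check returns TRUE, the COPY statement from the question should run without the "invalid byte sequence" error.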
Upvotes: 1