Reputation: 1308
I have a pandas DataFrame of size 21909 × 20037 (rows × columns) and I want to write it to CSV with the following code.
dpF.to_csv("outputFile.csv", encoding='utf-8')
After this line of code my computer hangs and I get no output. Is there any way to write such a huge DataFrame in an optimal way?
System configuration
- OS: ubuntu 16.04 LTS
- OS type: 64 bit
- Memory: 7.7 GiB
- Processor: Intel® Core™ i5-4590 CPU @ 3.30GHz × 4
Upvotes: 0
Views: 710
Reputation: 210982
You can use the chunksize parameter:
df.to_csv("outputFile.csv", encoding='utf-8', chunksize=2000)
chunksize : int or None
rows to write at a time
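If passing chunksize alone still exhausts memory, you can get the same effect with an explicit loop that writes one slice at a time and appends to the file. A minimal sketch, assuming the DataFrame dpF from the question and an arbitrary chunk size of 2000 rows (tune it to your available memory):

import pandas as pd

CHUNK = 2000  # assumed chunk size, not from the question

# dpF is the DataFrame from the question.
# Write it in row slices, overwriting on the first chunk and
# appending afterwards so the header is only written once.
for start in range(0, len(dpF), CHUNK):
    dpF.iloc[start:start + CHUNK].to_csv(
        "outputFile.csv",
        encoding='utf-8',
        mode='w' if start == 0 else 'a',
        header=(start == 0),
    )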
Upvotes: 5