Reputation: 491
I want to upload a DataFrame as a CSV file from Colab to Google Drive. I have tried a lot, but with no luck. I can upload a simple text file, but I fail to upload a CSV.
I tried the following code:
import pandas as pd
from google.colab import auth
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive
from oauth2client.client import GoogleCredentials

# Build a small DataFrame and write it to a local file.
df = pd.DataFrame({1: [1, 2, 3]})
df.to_csv('abc', sep='\t')

# Authenticate and create the PyDrive client.
auth.authenticate_user()
gauth = GoogleAuth()
gauth.credentials = GoogleCredentials.get_application_default()
drive = GoogleDrive(gauth)

# Upload the local file to Drive.
uploaded = drive.CreateFile({'title': 'sample.csv', 'mimeType': 'csv'})
uploaded.SetContentFile('abc')
uploaded.Upload()
Upvotes: 36
Views: 125870
Reputation: 6097
The other answers almost worked, but I needed a small tweak:
from google.colab import drive
drive.mount('drive')
df.to_csv('/content/drive/My Drive/filename.csv', encoding='utf-8', index=False)
The /content/ prefix proved necessary: Colab's working directory is /content, so mounting at the relative path 'drive' puts your Drive under /content/drive.
Upvotes: 9
Reputation: 678
If you want to save the file locally (download it to your own machine) instead, you can use this:
from google.colab import files

df.to_csv('sample.csv')
files.download('sample.csv')
Upvotes: 2
Reputation: 761
# Import the Drive helper; mounting will prompt you to authenticate.
from google.colab import drive
# Mount your Drive to the Colab VM.
drive.mount('/gdrive')
# Write the DataFrame to a CSV file on Drive.
with open('/gdrive/My Drive/foo.csv', 'w') as f:
    df.to_csv(f)
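This works because DataFrame.to_csv accepts an open file object as well as a path, so once Drive is mounted the target behaves like any local file.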
Upvotes: 11
Reputation: 753
Without using the !cp command:
from google.colab import drive
drive.mount('/drive')
df.to_csv('/drive/My Drive/folder_name/name_csv_file.csv')
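One caveat: to_csv does not create missing folders, so the target folder has to exist on Drive first. A minimal sketch that creates it before writing (folder_name and name_csv_file.csv are just the placeholders from the snippet above, and df is the DataFrame being saved):
import os
from google.colab import drive

drive.mount('/drive')

# Create the destination folder on Drive if it does not exist yet.
target_dir = '/drive/My Drive/folder_name'
os.makedirs(target_dir, exist_ok=True)

df.to_csv(os.path.join(target_dir, 'name_csv_file.csv'), index=False)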
Upvotes: 24
Reputation: 40773
It may be easier to use mounting instead of pydrive.
from google.colab import drive
drive.mount('drive')
After authentication, you can copy your CSV file:
df.to_csv('data.csv')
!cp data.csv "drive/My Drive/"
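For completeness, the same copy can be done from Python instead of the shell; a minimal sketch using the standard library, with the same paths as above:
import shutil

# Same effect as the !cp shell command above.
shutil.copy('data.csv', 'drive/My Drive/data.csv')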
Upvotes: 47
Reputation: 1629
If you don't want to work with Pandas, do this:
df_mini.coalesce(1)\
.write\
.format('com.databricks.spark.csv')\
.options(header='true', delimiter='\t')\
.save('gdrive/My Drive/base_mini.csv')
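Note that Spark's DataFrameWriter.save() treats that path as an output directory, so base_mini.csv will be a folder containing a single part file (because of coalesce(1)), not a single plain CSV file.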
Upvotes: 1