Reputation: 171
I have uploaded all the files I need to train (fine-tune) an object detection model to my Google Drive account. There, I have opened an ipynb file via Google Colab, and I need to run some scripts to initialize things and start training.
But I ran into the problem of not knowing how to access the uploaded files from the notebook. The notebook was created in the same directory as the scripts I need to run.
When I execute !ls, I get only one datalab folder in return, and !pwd returns /content.
I want to know if there's a way to access, from the notebook, all the files I have uploaded to my Google Drive account. I may be going about training a model the wrong way, but I don't know any other way. So, please help :).
Thank you in advance.
Upvotes: 3
Views: 2781
Reputation: 1535
Here is what I use to download a file from Google Drive to the Google Colab file system. Substitute the fileId value with your own Drive file id.
!pip install -U -q PyDrive

from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive
from google.colab import auth
from oauth2client.client import GoogleCredentials

# Authenticate and create the PyDrive client.
auth.authenticate_user()
gauth = GoogleAuth()
gauth.credentials = GoogleCredentials.get_application_default()
drive = GoogleDrive(gauth)

fileId = '1234567890dcz4qL-JQtYqX9ZXUt7JXe'
fileName = fileId  # saved locally under the id; you can use any name you like
downloaded = drive.CreateFile({'id': fileId})
downloaded.GetContentFile(fileName)  # writes the file to the Colab file system
To get the Drive file id, look at this answer.
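If you have a share link rather than a bare id, a small helper can pull the id out of it. This is just a sketch assuming the two common drive.google.com link formats (the .../file/d/<id>/view form and the ...open?id=<id> form); extract_drive_file_id is a name made up for this example:

```python
import re

def extract_drive_file_id(share_link):
    """Return the file id from a Drive share link, or None if none is found."""
    # Matches the id segment in links like
    #   https://drive.google.com/file/d/<id>/view?usp=sharing
    #   https://drive.google.com/open?id=<id>
    match = re.search(r'(?:/d/|[?&]id=)([\w-]+)', share_link)
    return match.group(1) if match else None
```

You can then pass the returned id straight into drive.CreateFile({'id': ...}) above.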
Upvotes: 2