sreenu padidapu

Reputation: 31

How to load a 30GB dataset in Google Colab

I have a 30GB dataset that I need to upload to Google Colab. What is the process to upload it?

Upvotes: 0

Views: 2816

Answers (3)

Hootan Alavizadeh

Reputation: 411

You can get more disk space by changing the hardware accelerator from GPU to TPU:

MENU > Runtime > Change runtime type > Hardware accelerator = TPU
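Either way, you can check how much disk space the current runtime actually provides by running a shell command in a notebook cell (a minimal check; the reported figure varies by runtime type):

!df -h /content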

Upvotes: 0

korakot

Reputation: 40928

If your problem is not enough disk space, you can change to a GPU runtime to get 350 GB of space.

MENU > Runtime > Change runtime type > Hardware accelerator = GPU

The process is the same as in @Anwarvic's answer.

Upvotes: 0

Anwarvic

Reputation: 13022

It depends on what you mean by "have a 30GB dataset". If the dataset is on your local machine, then you need to:

  • Upload your dataset to Google Drive first
  • Then mount your Google Drive to your Colab notebook.

If you have the dataset on an online server, then you need to:

  • Mount your Google Drive to your notebook
  • Then download it to your Google Drive directly

You can use this code to mount your Google Drive in your notebook:

import os
from google.colab import drive

# Mount your Google Drive at /content/gdrive; Colab will ask you to authorize access.
drive.mount('/content/gdrive')

# Work from the root of your Drive so relative paths resolve inside it.
ROOT = "/content/gdrive/My Drive/"
os.chdir(ROOT)
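To confirm the mount worked, you can list the contents of your Drive from the notebook (a minimal sanity check):

print(os.listdir(ROOT))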

If your data is on a server, then you can download it directly by running the following code in a notebook cell.

!wget [dataset_url]
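If you mounted your Drive first (as above), you can also point wget at a folder inside it with the -P flag, so the 30GB download lands in your Drive directly; the datasets folder name here is just an illustrative assumption:

!wget -P "/content/gdrive/My Drive/datasets" [dataset_url]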

Upvotes: 2
