rajan

Reputation: 485

GCP and Datalab with Python 3.6, Need to use Jupyter on GCP

I'm trying to use fastai's libraries, but some of the data access tools built into those libraries depend on HTML objects. For example, DataBunch.show_batch produces an HTML object that is easiest to view in Jupyter. I need to run my testing on GCP (or another cloud), but Datalab doesn't give me the Python 3.6 environment that fastai requires.

I see a few options:

  1. Develop my own data visualization by subclassing fastai's classes and skipping Jupyter altogether (a rough sketch of what I mean is below the list)
  2. Create a Jupyter-to-GCP interface a different way, basically redoing the steps from one of the existing setup guides myself.
  3. Use one of the Docker containers I keep hearing about for Datalab that would let me run my own version of Python

Does anyone have other options for how I can make this connection? If not, can anyone provide other links for how to accomplish 1, 2, or 3?
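
For option 1, here's the kind of thing I mean: pull a batch and dump it to a PNG on disk so nothing depends on Jupyter's HTML display. The dataset, loader, and file name below are just stand-ins, not my real DataBunch pipeline.

```python
# Minimal sketch of option 1: skip the HTML-based show_batch entirely and
# write a batch preview to an image file instead. FakeData is a stand-in
# for whatever actually feeds the DataBunch.
import torch
from torch.utils.data import DataLoader
from torchvision import datasets, transforms
from torchvision.utils import make_grid
import matplotlib
matplotlib.use("Agg")          # headless backend, no display needed on the VM
import matplotlib.pyplot as plt

# Stand-in dataset; replace with the real training data.
ds = datasets.FakeData(size=64, image_size=(3, 64, 64), transform=transforms.ToTensor())
dl = DataLoader(ds, batch_size=16, shuffle=True)

xb, yb = next(iter(dl))        # one batch of images and labels
grid = make_grid(xb, nrow=4)   # tile the batch into a single (3, H, W) tensor

plt.figure(figsize=(6, 6))
plt.imshow(grid.permute(1, 2, 0).numpy())  # CHW -> HWC for matplotlib
plt.axis("off")
plt.savefig("batch_preview.png", bbox_inches="tight")
```

I could then pull the PNG down with scp or copy it to a storage bucket to look at it, but that feels like a lot of plumbing compared to just having Jupyter.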

Upvotes: 0

Views: 660

Answers (2)

Zain Rizvi

Reputation: 24636

You could create a notebook using GCP's AI Platform Notebooks.

It should give you a one-click way to create a VM with all the libraries you need preinstalled. It'll even give you a URL that you can use to directly access your notebook.

Upvotes: 0

itroulli

Reputation: 2094

You can follow this guide from fast.ai to create a VM with all the required libraries pre-installed. Then, following the same guide, you can access JupyterLab or Jupyter Notebook on that VM. It's simple, fast, and the image comes with Python 3.7.3.
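
Once the instance is up and you've opened JupyterLab, a quick check like this (my own suggestion, not part of the guide) confirms the Python-version problem from the question is gone:

```python
# Sanity check on the new VM: confirm the interpreter version and that
# fastai imports (assumes the image ships with fastai preinstalled).
import sys
import fastai

print(sys.version)         # should report 3.6 or newer
print(fastai.__version__)  # the preinstalled fastai release
```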

Upvotes: 1
