Reputation: 21
I have developed a TensorFlow-based machine learning model on my local machine, and I want to deploy it to Google Cloud Platform (Cloud ML Engine) for predictions. The model reads its input data from Google BigQuery, and the output predictions have to be written back to Google BigQuery as well. There are some data preparation scripts that have to run before the model prediction runs. I have used Google Cloud Storage for model storage, used it for deployment, and the deployment succeeded. However, instead of using Google Cloud Storage to store the model (i.e. the .pb or .pkl model file), can I store it on a GCP VM (or my local machine) and have Cloud ML Engine call it for prediction? Is that possible, or is my only option to upload the model directory to a Cloud Storage bucket and use that for prediction?
Could you please help me with this?
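For reference, this is roughly the flow I have in mind (simplified sketch; the project, dataset, table, model and field names below are placeholders, and the prediction payload depends on the model's serving signature):

```python
from google.cloud import bigquery
from googleapiclient import discovery

bq = bigquery.Client()
ml = discovery.build('ml', 'v1')

# Data preparation: pull the input rows from BigQuery.
rows = bq.query(
    'SELECT feature_a, feature_b FROM `my_dataset.input_table`'
).result()
instances = [{'feature_a': r['feature_a'], 'feature_b': r['feature_b']} for r in rows]

# Online prediction against the model deployed on Cloud ML Engine.
response = ml.projects().predict(
    name='projects/my-project/models/my_model/versions/v1',
    body={'instances': instances},
).execute()

# Write the predictions back to BigQuery (assumes the predictions are
# JSON-serializable dicts matching the output table's schema).
bq.insert_rows_json('my_dataset.predictions_table', response['predictions'])
```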
Upvotes: 2
Views: 265
Reputation: 10058
For model deployment on AI Platform, the model has to be stored in Google Cloud Storage. Another option is to use AI Platform training (locally or on GCP), export the model in SavedModel format to a local folder or to Cloud Storage, and serve it from there with TF Serving on a Compute Engine instance.
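As a rough sketch of the first option (bucket, project, model and version names are placeholders, and the runtime/Python versions depend on how the model was trained):

```python
import tensorflow as tf
from googleapiclient import discovery

# Export the trained model in SavedModel format directly to Cloud Storage
# (a local path would also work, but AI Platform deployment needs a gs:// URI).
tf.saved_model.save(model, 'gs://my-bucket/model_dir/')

ml = discovery.build('ml', 'v1')

# Create the model resource once.
ml.projects().models().create(
    parent='projects/my-project',
    body={'name': 'my_model', 'regions': ['us-central1']},
).execute()

# Create a version that points at the SavedModel in Cloud Storage;
# deploymentUri must be a Cloud Storage path.
ml.projects().models().versions().create(
    parent='projects/my-project/models/my_model',
    body={
        'name': 'v1',
        'deploymentUri': 'gs://my-bucket/model_dir/',
        'runtimeVersion': '1.15',
        'framework': 'TENSORFLOW',
        'pythonVersion': '3.7',
    },
).execute()
```

The same SavedModel directory can also be pointed at by TF Serving on a Compute Engine instance if you prefer to manage the serving infrastructure yourself.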
Upvotes: 0