Randolfo

Reputation: 688

gcloud ml-engine local predict RuntimeError: Bad magic number in .pyc file

My objective is to make predictions on google cloud ml engine.

I installed the gcloud SDK on Linux Ubuntu 16.04 LTS following Google's instructions. I already have a trained machine learning model. I am using Anaconda Python 3.5.

I run:

gcloud ml-engine local predict --model-dir={MY_MODEL_DIR} --json-instances={MY_INPUT_JSON_INSTANCE}

I received this error message:

(gcloud.ml-engine.local.predict) RuntimeError: Bad magic number in .pyc file

Below is the full stack trace:

DEBUG: (gcloud.ml-engine.local.predict) RuntimeError: Bad magic number in .pyc file
Traceback (most recent call last):
  File "/usr/lib/google-cloud-sdk/lib/googlecloudsdk/calliope/cli.py", line 797, in Execute
    resources = calliope_command.Run(cli=self, args=args)
  File "/usr/lib/google-cloud-sdk/lib/googlecloudsdk/calliope/backend.py", line 757, in Run
    resources = command_instance.Run(args)
  File "/usr/lib/google-cloud-sdk/lib/surface/ml_engine/local/predict.py", line 65, in Run
    args.text_instances)
  File "/usr/lib/google-cloud-sdk/lib/googlecloudsdk/command_lib/ml_engine/local_utils.py", line 89, in RunPredict
    raise LocalPredictRuntimeError(err)
LocalPredictRuntimeError: RuntimeError: Bad magic number in .pyc file
ERROR: (gcloud.ml-engine.local.predict) RuntimeError: Bad magic number in .pyc file
Evaluation ended

Upvotes: 12

Views: 2653

Answers (6)

Dr. Fabien Tarrade

Reputation: 1696

The tricks above work, but I found another option which seems to be permanent: use the GCP SDK with Python 3.

Create a Python 3 environment (with Anaconda, for example) or use an existing Python 3 installation.

Create a file gcp-sdk.yaml:

name: env_gcp_sdk
channels:
- defaults
- conda-forge
dependencies:
  # core packages
  - python=3.7.5
  - pip=20.0.2
  - pip:
    - google-cloud-storage
    - google-cloud-bigquery
    - google-cloud-kms
    - google-cloud-pubsub

Then create the env:

conda env create -f gcp-sdk.yaml 

Now set the following environment variables; in my case I no longer need to delete the *.pyc files:

os.environ['CLOUDSDK_PYTHON']='path_to_env/env_gcp_sdk/bin/python'
os.environ['CLOUDSDK_GSUTIL_PYTHON']='path_to_env/env_gcp_sdk/bin/python'
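If you prefer to set these from the shell instead of from Python, the equivalent exports would look roughly like this (path_to_env is a placeholder for wherever conda created the env_gcp_sdk environment):

# Point the Cloud SDK (and gsutil) at the Python 3 environment created above
export CLOUDSDK_PYTHON=path_to_env/env_gcp_sdk/bin/python
export CLOUDSDK_GSUTIL_PYTHON=path_to_env/env_gcp_sdk/bin/python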

Upvotes: 0

qursaan

Reputation: 1

In a notebook, run the following cell:

%%bash
sudo find "/usr/lib/google-cloud-sdk/lib/googlecloudsdk/command_lib/ml_engine" -name '*.pyc' -delete

Upvotes: 0

Vikas Sharma

Reputation: 451

Below are the steps for fixing this issue in Ubuntu:

1. Navigate to the ml_engine path

cd /usr/lib/google-cloud-sdk/lib/googlecloudsdk/command_lib/ml_engine

2. Remove the files ending in .pyc

sudo rm -rf *.pyc

Upvotes: 0

intotecho

Reputation: 5684

Find and delete all the .pyc files in the Google Cloud SDK. They were compiled with the wrong Python environment. They will be recompiled automatically the next time they are needed.

%%bash
find "/tools/google-cloud-sdk/lib/" -name '*.pyc' -delete

Upvotes: 2

Randolfo

Reputation: 688

In fact, I posted this question myself to help people with the same problem, because I couldn't find an easy, concise answer.

There are other solutions, in my opinion even better than mine, but this is what solved it for me.

My finding was that the Google Cloud SDK doesn't work with Python 3, at least in my configuration. To solve it:

  1. Install an Anaconda virtual environment with Python 2 (in my case 2.7.14)
  2. Activate the environment
  3. Run the gcloud command again

If your exported ML model and inputs are OK, that will work.
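As a rough sketch of those steps (the environment name py27 is just an illustration; the model directory and JSON instance placeholders are the same as in the question):

# 1. create and activate a Python 2.7 conda environment
conda create -n py27 python=2.7.14
source activate py27
# 2. run the local prediction again from inside this environment
gcloud ml-engine local predict --model-dir={MY_MODEL_DIR} --json-instances={MY_INPUT_JSON_INSTANCE}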

A simple problem, but it caused me a lot of pain, just because I couldn't easily find this prerequisite, or I simply missed it.

I hope this helps somebody.

Upvotes: 7

Johnny
Johnny

Reputation: 481

Actually, it works with Python 3; you just need to delete the .pyc files in the Google Cloud SDK folders so the prediction call can recompile them with Python 3.

To find the location of the .pyc files, I enabled the --verbosity debug flag in the prediction call:

gcloud ml-engine local predict --model-dir=${MODEL_LOCATION} --json-instances=data/new-data2.json --verbosity debug

The traceback will give you the path of the gcloud ml-engine files; on my machine it was:

/usr/local/Caskroom/google-cloud-sdk/latest/google-cloud-sdk/lib/googlecloudsdk/command_lib/ml_engine/

Go to that directory and delete the .pyc files.
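For example (the SDK path below is the one from my machine; adjust it to whatever path your traceback shows):

# delete the stale .pyc files; gcloud regenerates them on the next run
find /usr/local/Caskroom/google-cloud-sdk/latest/google-cloud-sdk/lib/googlecloudsdk/command_lib/ml_engine/ -name '*.pyc' -delete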

Upvotes: 21
