coding

Reputation: 689

Error when loading ML model from the remote MLflow instance

I tried to load a model from a remote MLflow instance using the load_model function:

import mlflow

model = mlflow.pyfunc.load_model("http://remote_IP_address:5000/runs:/<run_id>/model")

I found the run_id by using the REST API:

import requests

requests.get("http://remote_IP_address:5000/api/2.0/preview/mlflow/runs/search",params={"experiment_ids":[0,1]})

But I am receiving an error:

ValueError: not enough values to unpack (expected 2, got 1)

I suppose the error is in the URI that I am using. Can you tell me the correct way to access the remote MLflow instance and load the model?

p.s. I also tried:

mlflow.pyfunc.load_model("http://remote_Ip_address:5000/models:/<model_name>/production")

but I received the same error.

Thank you in advance!

Upvotes: 1

Views: 1723

Answers (1)

coding

Reputation: 689

I found the solution, so I hope it will be helpful to others.

In a Python script or Jupyter notebook, write:

import mlflow

# Point the MLflow client at the remote tracking server
mlflow.set_tracking_uri("http://remote_IP_address:5000/")

# Load the "production" stage of a registered model from the Model Registry
model_test = mlflow.pyfunc.load_model("models:/name_of_the_model/production")

This is an example of loading a registered model from the MLflow Model Registry. Similarly, you can load models that were logged to MLflow runs by using the runs:/ URI scheme, as shown below.
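A minimal sketch of the runs:/ variant, assuming the model was logged under the artifact path "model" (replace <run_id> and the artifact path with your own):

import mlflow

# Point the client at the remote tracking server
mlflow.set_tracking_uri("http://remote_IP_address:5000/")

# <run_id> and the "model" artifact path are placeholders for your own run
model_run = mlflow.pyfunc.load_model("runs:/<run_id>/model")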

Upvotes: 2
