Reputation: 3954
I am trying to deploy a model via Azure ML that was logged with MLflow in a fairly simple setup.
I followed the guide here, using this snippet:
from azure.ai.ml import MLClient
from azure.identity import DefaultAzureCredential
import mlflow
ml_client = MLClient.from_config(credential=DefaultAzureCredential())
azureml_mlflow_uri = ml_client.workspaces.get(ml_client.workspace_name).mlflow_tracking_uri
mlflow.set_tracking_uri(azureml_mlflow_uri)
and ran the experiment with:
mlflow.tensorflow.autolog()
with mlflow.start_run():
    ...
That all worked fine, and the model ended up under "Jobs" in the Azure ML Studio web UI. When I then followed the guide here, the deployment failed with:
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
tensorflow 2.9.1 requires protobuf<3.20,>=3.9.2, but you have protobuf 4.21.7 which is incompatible.
tensorboard 2.9.1 requires protobuf<3.20,>=3.9.2, but you have protobuf 4.21.7 which is incompatible.
This is not a dependency that I configured; it is coming from Azure or MLflow. Any idea what to do about it?
Upvotes: 0
Views: 498
Reputation: 1683
I reproduced the same problem and completed the task. The model was generated, and the dependency warnings did not block the operation. To avoid local library conflicts, I suggest running the code in an editor hosted in the Azure portal, such as a notebook in Azure ML Studio.
Connect to the workspace using your subscription and resource group information, reusing the code block from the question.
Get the run details, then register the model with the code block below:
model_path = "model"
model_uri = "runs:/{}/{}".format(run_id, model_path)
mlflow.register_model(model_uri, "registered_model_name")
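As for the protobuf conflict itself: the deployment environment pulls in protobuf 4.21.7, which is newer than the `<3.20,>=3.9.2` range that tensorflow 2.9.1 and tensorboard 2.9.1 declare. One possible workaround (a sketch; the package list and versions are assumptions you should adapt to your training environment) is to supply a custom conda file for the deployment that pins protobuf inside tensorflow's declared range:

```python
# Sketch: write a custom conda environment file for the deployment that pins
# protobuf to the range tensorflow 2.9.1 requires (<3.20,>=3.9.2).
# Package names/versions below are assumptions; match them to your own setup.
conda_yaml = """\
name: mlflow-deploy-env
channels:
  - conda-forge
dependencies:
  - python=3.9
  - pip
  - pip:
      - mlflow
      - tensorflow==2.9.1
      - "protobuf<3.20,>=3.9.2"
      - azureml-inference-server-http
"""

with open("conda.yaml", "w") as f:
    f.write(conda_yaml)
```

Because the pin is declared explicitly, pip's resolver installs a compatible protobuf instead of the 4.x release that triggered the error.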
Go to the workspace you created and check the Jobs and Models entries in the left panel.
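Once the model is registered, you can deploy it with a custom environment rather than the auto-generated one. A hypothetical managed online deployment spec for the Azure ML CLI v2 might look like the following; every name here (endpoint, deployment, model version, conda file, base image) is a placeholder to adjust:

```
# deployment.yaml — hypothetical managed online deployment spec; all names are placeholders
$schema: https://azuremlschemas.azureedge.net/latest/managedOnlineDeployment.schema.json
name: blue
endpoint_name: my-endpoint
model: azureml:registered_model_name:1
environment:
  conda_file: conda.yaml
  image: mcr.microsoft.com/azureml/openmpi4.1.0-ubuntu20.04:latest
instance_type: Standard_DS3_v2
instance_count: 1
```

Here `conda.yaml` is assumed to be a conda file that pins `protobuf<3.20`, which sidesteps the incompatibility reported in the question.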
Upvotes: 0