Mostafa M. Galal

Reputation: 11

Whisper model Real-time endpoint container deployment failed on Azure ML

I tried to deploy Whisper on Azure ML, using the openai-whisper-large-v3 model. The endpoint creation succeeds, but the deployment fails with the error ResourceOperationFailed, so the deployment is never created.

I think the problem is that the model requires older versions of some packages, but I don't know how to downgrade them in Azure ML.

This is the log:

Liveness Probe: GET   127.0.0.1:31311/
Score:          POST  127.0.0.1:31311/score

2025-01-30 00:02:33,483 W [379] azmlinfsrv - Found extra keys in the config file that are not supported by the server.
Extra keys = ['AZUREML_ENTRY_SCRIPT', 'AZUREML_MODEL_DIR', 'HOSTNAME']
2025-01-30 00:02:33,755 W [379] azmlinfsrv - AML_FLASK_ONE_COMPATIBILITY is set. However, compatibility patch for Flask 1 has failed. This is only a problem if you use @rawhttp and relies on deprecated methods such as has_key().
Traceback (most recent call last):
  File "/opt/miniconda/envs/userenv/lib/python3.10/site-packages/azureml_inference_server_http/server/create_app.py", line 58, in <module>
    patch_flask()
  File "/opt/miniconda/envs/userenv/lib/python3.10/site-packages/azureml_inference_server_http/server/create_app.py", line 33, in patch_flask
    patch_werkzeug = LooseVersion(werkzeug.__version__) >= LooseVersion("2.1")
AttributeError: module 'werkzeug' has no attribute '__version__'

Initializing logger
2025-01-30 00:02:33,757 I [379] azmlinfsrv - Starting up app insights client
WARNING:entry_module:No signature information provided for model. If no sample information was provided with the model the deployment's swagger will not include input and output schema and typing information.For more information, please see: https://aka.ms/aml-mlflow-deploy.
2025/01/30 00:02:35 WARNING mlflow.utils.requirements_utils: Detected one or more mismatches between the model's dependencies and the current Python environment:
 - scikit-learn (current: 1.2.2, required: scikit-learn<=1.1.3)
To fix the mismatches, call mlflow.pyfunc.get_model_dependencies(model_uri) to fetch the model's environment and install dependencies using the resulting environment file.
Traceback (most recent call last):
  File "/opt/miniconda/envs/userenv/lib/python3.10/site-packages/azureml/evaluate/mlflow/__init__.py", line 67, in <module>
    import mlflow.gluon as gluon
ModuleNotFoundError: No module named 'mlflow.gluon'
2025-01-30 00:02:38,617 E [379] azmlinfsrv - Traceback (most recent call last):
  File "/opt/miniconda/envs/userenv/lib/python3.10/site-packages/azureml_inference_server_http/server/user_script.py", line 77, in load_script
    main_module_spec.loader.exec_module(user_module)
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/var/mlflow_resources/mlflow_score_script.py", line 378, in <module>
    model = load_model(model_path)
  File "/opt/miniconda/envs/userenv/lib/python3.10/site-packages/mlflow/tracing/provider.py", line 383, in wrapper
    is_func_called, result = True, f(*args, **kwargs)
  File "/opt/miniconda/envs/userenv/lib/python3.10/site-packages/mlflow/pyfunc/__init__.py", line 1120, in load_model
    model_impl = importlib.import_module(conf[MAIN])._load_pyfunc(data_path)
  File "/opt/miniconda/envs/userenv/lib/python3.10/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "/opt/miniconda/envs/userenv/lib/python3.10/site-packages/azureml/evaluate/mlflow/hftransformers/__init__.py", line 30, in <module>
    from azureml.evaluate.mlflow import pyfunc, aml
ImportError: cannot import name 'pyfunc' from 'azureml.evaluate.mlflow' (/opt/miniconda/envs/userenv/lib/python3.10/site-packages/azureml/evaluate/mlflow/__init__.py)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/opt/miniconda/envs/userenv/lib/python3.10/site-packages/azureml_inference_server_http/server/aml_blueprint.py", line 91, in setup
    self.user_script.load_script(config.app_root)
  File "/opt/miniconda/envs/userenv/lib/python3.10/site-packages/azureml_inference_server_http/server/user_script.py", line 79, in load_script
    raise UserScriptImportException(ex) from ex
azureml_inference_server_http.server.user_script.UserScriptImportException: Failed to import user script because it raised an unhandled exception

2025-01-30 00:02:38,617 I [379] gunicorn.error - Worker exiting (pid: 379)
2025-01-30 00:02:39,347 E [9] gunicorn.error - Worker (pid:379) exited with code 3
2025-01-30 00:02:39,348 E [9] gunicorn.error - Shutting down: Master
2025-01-30 00:02:39,348 E [9] gunicorn.error - Reason: Worker failed to boot.
Azure ML Inferencing HTTP server v1.3.4

Server Settings
---------------
Entry Script Name: /var/mlflow_resources/mlflow_score_script.py
Model Directory: /var/azureml-app/azureml-models/openai-whisper-large-v3/5
Config File: None
Worker Count: 1
Worker Timeout (seconds): 300
Server Port: 31311
Health Port: 31311
Application Insights Enabled: false
Application Insights Key: None
Inferencing HTTP server version: azmlinfsrv/1.3.4
CORS for the specified origins: None
Create dedicated endpoint for health: None

Upvotes: 0

Views: 47

Answers (1)

JayashankarGS

Reputation: 8020

You can find the model's artifacts on the model page in Azure ML studio.

Download all of them and update the pinned dependencies in conda.yaml or requirements.txt.
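If you prefer to download the artifacts from code rather than the studio UI, a minimal sketch (assuming an authenticated `azure.ai.ml.MLClient` named `ml_client`; the model name and version here are taken from the log in the question):

```python
# Hedged sketch: download a registered model's artifacts to a local folder
# so the conda.yaml / requirements.txt inside can be edited.
# `ml_client` is assumed to be an authenticated azure.ai.ml.MLClient.

def download_model_artifacts(ml_client, name, version, download_path="."):
    """Download all artifacts of a registered model to `download_path`."""
    ml_client.models.download(name=name, version=version, download_path=download_path)
    return download_path

# Usage (assumes an authenticated MLClient):
# download_model_artifacts(ml_client, "openai-whisper-large-v3", "5", "./mlflow-model")
```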

Then use the code below to register the model again with the updated files.

from azure.ai.ml.entities import Model
from azure.ai.ml.constants import AssetTypes

mlflow_model = Model(
    path="mlflow-model",  # path to the downloaded folder, with the dependencies updated
    type=AssetTypes.MLFLOW_MODEL,
    name="local-mlflow-example",
    description="MLflow model created from local path",
)
ml_client.create_or_update(mlflow_model)

This registers a new model (or a new version of the existing one), which you can then deploy.
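As a sketch of that deployment step (the deployment name, endpoint name, and instance type below are placeholders, not values from your workspace):

```python
# Hedged sketch: deploy the re-registered MLflow model to a managed
# online endpoint. MLflow models do not need a custom scoring script.
# `ml_client` is assumed to be an authenticated azure.ai.ml.MLClient.
from azure.ai.ml.entities import ManagedOnlineDeployment

deployment = ManagedOnlineDeployment(
    name="whisper-deployment",                    # hypothetical deployment name
    endpoint_name="whisper-endpoint",             # your existing endpoint
    model="azureml:local-mlflow-example@latest",  # the model registered above
    instance_type="Standard_DS4_v2",              # choose a SKU with enough memory
    instance_count=1,
)
ml_client.online_deployments.begin_create_or_update(deployment).result()
```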

You are also getting an import error from the scoring script, so make sure you have a compatible mlflow version:

ModuleNotFoundError: No module named 'mlflow.gluon'

Below are the recommended dependencies to add to your conda.yaml or requirements.txt file.

      - werkzeug<2.1.0
      - scikit-learn<=1.1.3
      - mlflow==2.3.1
      - azureml-inference-server-http==1.3.4
      - azureml-evaluate-mlflow==0.1.0
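For reference, a minimal conda.yaml carrying these pins might look like the following (the environment name, channel, and Python version are assumptions, not requirements of the model):

```yaml
name: whisper-inference-env
channels:
  - conda-forge
dependencies:
  - python=3.10
  - pip
  - pip:
      - werkzeug<2.1.0
      - scikit-learn<=1.1.3
      - mlflow==2.3.1
      - azureml-inference-server-http==1.3.4
      - azureml-evaluate-mlflow==0.1.0
```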

Upvotes: 0
