Reputation: 91
I have been trying to serve a model using MLflow, to no avail. Here is what I'm doing:
Step 1: Run all data prep steps in my Jupyter notebook
Step 2: start an Anaconda command prompt and go to the same directory as the notebook
Step 3: start the MLflow tracking server as follows:
mlflow server --backend-store-uri sqlite:///mlflow.db --default-artifact-root ./artifacts
Step 4: set tracking uri in the notebook as follows:
mlflow.set_tracking_uri('http://localhost:5000')
Step 5: run experiments in the notebook
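For context, a run in step 5 looks roughly like this (the dataset, experiment name, and hyperparameters below are just placeholders, not my real ones):

import mlflow
import mlflow.sklearn
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor

mlflow.set_tracking_uri('http://localhost:5000')
mlflow.set_experiment('random-forest-experiment')  # placeholder experiment name

X, y = load_diabetes(return_X_y=True)

with mlflow.start_run():
    params = {'n_estimators': 100, 'max_depth': 5}
    model = RandomForestRegressor(**params).fit(X, y)
    mlflow.log_params(params)
    mlflow.sklearn.log_model(model, artifact_path='model')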
Step 6: register the best experiment as production (in the notebook)
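The registration in step 6 is roughly the following (the registered model name random-forest is the one referenced in step 8; <run_id> stands in for the id of the best run from step 5):

import mlflow
from mlflow.tracking import MlflowClient

mlflow.set_tracking_uri('http://localhost:5000')

# register the model logged by the best run under the name used in step 8
result = mlflow.register_model('runs:/<run_id>/model', 'random-forest')

# move that version to the Production stage
client = MlflowClient()
client.transition_model_version_stage(
    name='random-forest',
    version=result.version,
    stage='Production',
)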
Step 7: start another command prompt and go to the same directory as the notebook
Step 8: serve the registered model as follows:
mlflow models serve --model-uri models:/random-forest/Production -p 1234 --no-conda
At this stage I get the following error:
Model Registry features are not supported by the store with URI: 'file:///C:/localpath/mlruns'. Stores with the following URI schemes are supported: ['databricks', 'http', 'https', 'postgresql', 'mysql', 'sqlite', 'mssql'].
However, I am using a SQLite database (as seen in step 3), and MLflow is clearly using it: I can see the SQLite file grow when I run experiments. Everything (including the UI) works fine except serving the model. Can anyone tell me what I'm doing wrong?
Upvotes: 2
Views: 2655
Reputation: 91
Solved: right before executing the model serve command at step 8, you need to create a new environment variable (in Windows): open Environment Variables, click New under System variables, and add the following entry:
Variable: MLFLOW_TRACKING_URI
Value: http://localhost:5000
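Alternatively, if you don't want a system-wide variable, setting it just for the command-prompt session right before step 8 should work as well:

set MLFLOW_TRACKING_URI=http://localhost:5000
mlflow models serve --model-uri models:/random-forest/Production -p 1234 --no-conda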
Upvotes: 3