Spark Monkay

Reputation: 442

Artifact storage and MLflow on a remote server

I am trying to get MLflow running on another machine on the local network, and I would like to ask for some help because I don't know what to do now.

I have an MLflow server running on a remote machine. The server runs under my user account and has been started like this:

mlflow server --host 0.0.0.0 --port 9999 --default-artifact-root sftp://<MYUSERNAME>@<SERVER>:<PATH/TO/DIRECTORY/WHICH/EXISTS>

My program, which should log all the data to the MLflow server, looks like this:

from mlflow import log_metric, log_param, log_artifact, set_tracking_uri

if __name__ == "__main__":
    remote_server_uri = '<SERVER>' # this value has been replaced
    set_tracking_uri(remote_server_uri)
    # Log a parameter (key-value pair)
    log_param("param1", 5)

    # Log a metric; metrics can be updated throughout the run
    log_metric("foo", 1)
    log_metric("foo", 2)
    log_metric("foo", 3)

    # Log an artifact (output file)
    with open("output.txt", "w") as f:
        f.write("Hello world!")
    log_artifact("output.txt")

The parameters and metrics get transferred to the server, but the artifacts do not. Why is that?

Note on the SFTP part: I can log in via SFTP, and the pysftp package is installed.

Upvotes: 9

Views: 6821

Answers (2)

sklingel

Reputation: 179

I guess your problem is that you also need to create the experiment with the SFTP remote storage as its artifact location:

mlflow.create_experiment("my_experiment", artifact_location=sftp_uri)

This fixed it for me.
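For context, a minimal sketch of how this could look end to end. The experiment name and the placeholder URIs below are only illustrative, not values from the question, and create_experiment will raise if the experiment already exists:

import mlflow

# Placeholders; substitute your own tracking server and SFTP artifact location
remote_server_uri = "http://<SERVER>:9999"
sftp_uri = "sftp://<MYUSERNAME>@<SERVER>:<PATH/TO/DIRECTORY/WHICH/EXISTS>"

mlflow.set_tracking_uri(remote_server_uri)

# Create the experiment once with the SFTP artifact location, then select it for logging
mlflow.create_experiment("my_experiment", artifact_location=sftp_uri)
mlflow.set_experiment("my_experiment")

with mlflow.start_run():
    mlflow.log_param("param1", 5)
    mlflow.log_metric("foo", 1)
    with open("output.txt", "w") as f:
        f.write("Hello world!")
    # The artifact now goes to the experiment's sftp:// artifact root
    mlflow.log_artifact("output.txt")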

Upvotes: 4

Spark Monkay

Reputation: 442

I don't know if I will get an answer to my problem, but I solved it this way.

On the server I created the directory /var/mlruns and pass it to mlflow via --backend-store-uri file:///var/mlruns.

Then I mount this directory on my local machine under the same path, e.g. via sshfs.

I don't like this solution, but it solved the problem well enough for now.
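Roughly, this workaround looks like the sketch below; the username, host, and port are placeholders from the question, and the mount point must match the path used on the server so file:// artifact locations resolve on both machines:

# On the server: create the store and start MLflow against it
mkdir -p /var/mlruns
mlflow server --host 0.0.0.0 --port 9999 --backend-store-uri file:///var/mlruns

# On the local machine: mount the same directory under the same path
mkdir -p /var/mlruns
sshfs <MYUSERNAME>@<SERVER>:/var/mlruns /var/mlruns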

Upvotes: 2
