Aniket Walse

Reputation: 1

Storing mlflow artifacts to S3, while having SQL database as backend

When using a SQL database as backend for mlflow are the artifacts stored in the same database or in default ./mlruns directory?

Is it possible to store them in different location as in AWS S3?

Upvotes: 0

Views: 1325

Answers (2)

errorParser

Reputation: 671

Yes. You can make use of the mlflow server command as below.

mlflow server --backend-store-uri=sqlite:///mlflow.db --default-artifact-root="s3://<bucket_name>" --host 0.0.0.0 --port 80

Also, don't forget to install boto3 and configure the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables so that mlflow can read from and write to the bucket.
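As a minimal sketch of using that server from a client (the host and bucket name are placeholders matching the command above), a run could look like this:

    import mlflow

    # Point the client at the tracking server started above
    mlflow.set_tracking_uri("http://<server_host>:80")

    with mlflow.start_run():
        mlflow.log_param("alpha", 0.5)
        mlflow.log_metric("rmse", 0.8)
        # Artifacts logged here end up under s3://<bucket_name> because of the
        # server's --default-artifact-root; the client machine also needs boto3
        # and the AWS credentials to upload to the bucket
        mlflow.log_artifact("model_summary.txt")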

Upvotes: 0

Jules

Reputation: 54

Yes, you can use different artifact locations for each experiment while keeping the same backend registry. Here is an example that shows it (see the sketch below).

In this example, my backend registry is "mlruns.db" and the artifacts will be stored in their respective locations.
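A minimal sketch of that setup, assuming placeholder bucket and experiment names, could look like this:

    import mlflow

    # Use the SQLite file "mlruns.db" as the backend registry
    mlflow.set_tracking_uri("sqlite:///mlruns.db")

    # Each experiment gets its own artifact location
    mlflow.create_experiment("experiment_a", artifact_location="s3://<bucket_a>/artifacts")
    mlflow.create_experiment("experiment_b", artifact_location="s3://<bucket_b>/artifacts")

    mlflow.set_experiment("experiment_a")
    with mlflow.start_run():
        # Params and metrics are recorded in mlruns.db; artifacts go to
        # this experiment's own S3 location
        mlflow.log_param("lr", 0.01)
        mlflow.log_artifact("plot.png")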

Upvotes: 1
