Reputation: 1290
Within our Azure Machine Learning inference server we want to be able to make calls to a password-protected database. What is the best way to provide a secret to the running inference server?
I have tried following this tutorial: https://learn.microsoft.com/en-us/azure/machine-learning/how-to-use-secrets-in-runs.
However, the call `run = Run.get_context()` always returns an offline run instance.
Presumably that mechanism is only intended for passing secrets to training runs?
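For reference, the pattern from that tutorial looks roughly like this (a sketch, assuming a secret named `mysecret` was added to the workspace Key Vault beforehand; the helper name is illustrative):

```python
def fetch_secret(name: str) -> str:
    """Fetch a secret from the workspace Key Vault via the run context.

    Illustrative helper; requires the azureml-core package. This only
    works inside a submitted (online) run -- in the inference server,
    Run.get_context() returns an offline run and get_secret() fails.
    """
    from azureml.core import Run

    run = Run.get_context()           # offline run inside the inference server
    return run.get_secret(name=name)  # raises for offline runs
```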
I was also thinking I could pass my secrets in as environment variables, but I'm not sure whether that is safe:
from azureml.core import Environment

env = Environment.from_pip_requirements(env_name, "requirements-azure.txt")
env.environment_variables.update({"mysecret": "password"})
Any suggestions are greatly appreciated!
Upvotes: 2
Views: 494
Reputation: 2754
Here is a link to Access Azure resources from a managed online endpoint (preview) with a managed identity, and to the Azure Machine Learning inference HTTP server.
Currently, the following feature is on the roadmap: provide the deployed web service some kind of easy access path to the associated Key Vault, similar to how an experimental run of a model can use `Run.get_context().get_secret()`.
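A minimal sketch of the managed-identity approach inside scoring code, assuming the endpoint's managed identity has been granted `get` permission on the vault's secrets and that the `azure-identity` and `azure-keyvault-secrets` packages are installed (vault URL and secret name below are placeholders):

```python
def get_db_password(vault_url: str, secret_name: str) -> str:
    """Read a secret from Key Vault using the endpoint's managed identity.

    Requires a system- or user-assigned managed identity on the endpoint
    with 'get' permission on the vault's secrets, plus the azure-identity
    and azure-keyvault-secrets packages in the environment.
    """
    from azure.identity import ManagedIdentityCredential
    from azure.keyvault.secrets import SecretClient

    credential = ManagedIdentityCredential()
    client = SecretClient(vault_url=vault_url, credential=credential)
    return client.get_secret(secret_name).value


# Example (placeholder names):
# password = get_db_password("https://my-vault.vault.azure.net", "db-password")
```

In practice you would call this once from `init()` in your `score.py` and cache the value, so each request does not hit Key Vault.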
Upvotes: 1