Reputation: 61
I am trying to prepare and then submit a new experiment to Azure Machine Learning from an Azure Function in Python. To do so, I register a new dataset, which contains the training data for my ML model, with my Azure ML workspace using dataset.register(...). However, when I try to create this dataset with the following line of code
dataset = Dataset.Tabular.from_delimited_files(path = datastore_paths)
I get a Failure Exception: OSError: [Errno 30] Read-only file system ....
I only reference the training data via datastore_paths and then register the resulting dataset with my Azure Machine Learning workspace, so nothing should need to be written to the local file system. But it seems that the method from_delimited_files is trying to write to the file system anyway (maybe some caching?). I already tried switching the current working directory to a temporary one with os.chdir(tempfile.gettempdir()), but that didn't help. Any other ideas? I don't think I am doing anything particularly unusual...
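For reference, here is a simplified sketch of what I am doing (the workspace setup and datastore name are placeholders; the real code authenticates with a service principal, as the stack trace below shows):
from azureml.core import Workspace, Datastore, Dataset

ws = Workspace.from_config()  # placeholder; the real code uses service principal authentication
datastore = Datastore.get(ws, "my_datastore")  # placeholder datastore name

# (datastore, relative path) tuples pointing at the training CSVs
datastore_paths = [(datastore, "training-data/train.csv")]

# this is the call that fails with OSError: [Errno 30] inside the Function
dataset = Dataset.Tabular.from_delimited_files(path = datastore_paths)

# registering the dataset with the workspace -- no local write should be needed for this
dataset = dataset.register(workspace = ws, name = "training_data", create_new_version = True)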
I am using Python 3.7 and azureml-sdk 1.9.0, and I can run the Python script locally without problems. I currently deploy from VS Code using the Azure Functions extension version 0.23.0 (and an Azure DevOps pipeline for CI/CD).
Here is my full stack trace:
Microsoft.Azure.WebJobs.Host.FunctionInvocationException: Exception while executing function: Functions.HttpTrigger_Train
---> Microsoft.Azure.WebJobs.Script.Workers.Rpc.RpcException: Result: Failure
Exception: OSError: [Errno 30] Read-only file system: '/home/site/wwwroot/.python_packages/lib/site-packages/dotnetcore2/bin/deps.lock'
Stack: File "/azure-functions-host/workers/python/3.7/LINUX/X64/azure_functions_worker/dispatcher.py", line 345, in _handle__invocation_request
self.__run_sync_func, invocation_id, fi.func, args)
File "/usr/local/lib/python3.7/concurrent/futures/thread.py", line 57, in run
result = self.fn(*self.args, **self.kwargs)
File "/azure-functions-host/workers/python/3.7/LINUX/X64/azure_functions_worker/dispatcher.py", line 480, in __run_sync_func
return func(**params)
File "/home/site/wwwroot/HttpTrigger_Train/__init__.py", line 11, in main
train()
File "/home/site/wwwroot/shared_code/train.py", line 70, in train
dataset = Dataset.Tabular.from_delimited_files(path = datastore_paths)
File "/home/site/wwwroot/.python_packages/lib/site-packages/azureml/data/_loggerfactory.py", line 126, in wrapper
return func(*args, **kwargs)
File "/home/site/wwwroot/.python_packages/lib/site-packages/azureml/data/dataset_factory.py", line 308, in from_delimited_files
quoting=support_multi_line)
File "/home/site/wwwroot/.python_packages/lib/site-packages/azureml/dataprep/api/readers.py", line 100, in read_csv
df = Dataflow._path_to_get_files_block(path, archive_options)
File "/home/site/wwwroot/.python_packages/lib/site-packages/azureml/dataprep/api/dataflow.py", line 2387, in _path_to_get_files_block
return datastore_to_dataflow(path)
File "/home/site/wwwroot/.python_packages/lib/site-packages/azureml/dataprep/api/_datastore_helper.py", line 41, in datastore_to_dataflow
datastore, datastore_value = get_datastore_value(source)
File "/home/site/wwwroot/.python_packages/lib/site-packages/azureml/dataprep/api/_datastore_helper.py", line 83, in get_datastore_value
_set_auth_type(workspace)
File "/home/site/wwwroot/.python_packages/lib/site-packages/azureml/dataprep/api/_datastore_helper.py", line 134, in _set_auth_type
get_engine_api().set_aml_auth(SetAmlAuthMessageArgument(AuthType.SERVICEPRINCIPAL, json.dumps(auth)))
File "/home/site/wwwroot/.python_packages/lib/site-packages/azureml/dataprep/api/engineapi/api.py", line 18, in get_engine_api
_engine_api = EngineAPI()
File "/home/site/wwwroot/.python_packages/lib/site-packages/azureml/dataprep/api/engineapi/api.py", line 55, in __init__
self._message_channel = launch_engine()
File "/home/site/wwwroot/.python_packages/lib/site-packages/azureml/dataprep/api/engineapi/engine.py", line 300, in launch_engine
dependencies_path = runtime.ensure_dependencies()
File "/home/site/wwwroot/.python_packages/lib/site-packages/dotnetcore2/runtime.py", line 141, in ensure_dependencies
with _FileLock(deps_lock_path, raise_on_timeout=timeout_exception):
File "/home/site/wwwroot/.python_packages/lib/site-packages/dotnetcore2/runtime.py", line 113, in __enter__
self.acquire()
File "/home/site/wwwroot/.python_packages/lib/site-packages/dotnetcore2/runtime.py", line 72, in acquire
self.lockfile = os.open(self.lockfile_path, os.O_CREAT | os.O_EXCL | os.O_RDWR)
at Microsoft.Azure.WebJobs.Script.Description.WorkerFunctionInvoker.InvokeCore(Object[] parameters, FunctionInvocationContext context) in /src/azure-functions-host/src/WebJobs.Script/Description/Workers/WorkerFunctionInvoker.cs:line 85
at Microsoft.Azure.WebJobs.Script.Description.FunctionInvokerBase.Invoke(Object[] parameters) in /src/azure-functions-host/src/WebJobs.Script/Description/FunctionInvokerBase.cs:line 85
at Microsoft.Azure.WebJobs.Script.Description.FunctionGenerator.Coerce[T](Task`1 src) in /src/azure-functions-host/src/WebJobs.Script/Description/FunctionGenerator.cs:line 225
at Microsoft.Azure.WebJobs.Host.Executors.FunctionInvoker`2.InvokeAsync(Object instance, Object[] arguments) in C:\projects\azure-webjobs-sdk-rqm4t\src\Microsoft.Azure.WebJobs.Host\Executors\FunctionInvoker.cs:line 52
at Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor.InvokeAsync(IFunctionInvoker invoker, ParameterHelper parameterHelper, CancellationTokenSource timeoutTokenSource, CancellationTokenSource functionCancellationTokenSource, Boolean throwOnTimeout, TimeSpan timerInterval, IFunctionInstance instance) in C:\projects\azure-webjobs-sdk-rqm4t\src\Microsoft.Azure.WebJobs.Host\Executors\FunctionExecutor.cs:line 587
at Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor.ExecuteWithWatchersAsync(IFunctionInstanceEx instance, ParameterHelper parameterHelper, ILogger logger, CancellationTokenSource functionCancellationTokenSource) in C:\projects\azure-webjobs-sdk-rqm4t\src\Microsoft.Azure.WebJobs.Host\Executors\FunctionExecutor.cs:line 532
at Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor.ExecuteWithLoggingAsync(IFunctionInstanceEx instance, ParameterHelper parameterHelper, IFunctionOutputDefinition outputDefinition, ILogger logger, CancellationTokenSource functionCancellationTokenSource) in C:\projects\azure-webjobs-sdk-rqm4t\src\Microsoft.Azure.WebJobs.Host\Executors\FunctionExecutor.cs:line 470
at Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor.ExecuteWithLoggingAsync(IFunctionInstanceEx instance, FunctionStartedMessage message, FunctionInstanceLogEntry instanceLogEntry, ParameterHelper parameterHelper, ILogger logger, CancellationToken cancellationToken) in C:\projects\azure-webjobs-sdk-rqm4t\src\Microsoft.Azure.WebJobs.Host\Executors\FunctionExecutor.cs:line 278
--- End of inner exception stack trace ---
at Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor.ExecuteWithLoggingAsync(IFunctionInstanceEx instance, FunctionStartedMessage message, FunctionInstanceLogEntry instanceLogEntry, ParameterHelper parameterHelper, ILogger logger, CancellationToken cancellationToken) in C:\projects\azure-webjobs-sdk-rqm4t\src\Microsoft.Azure.WebJobs.Host\Executors\FunctionExecutor.cs:line 325
at Microsoft.Azure.WebJobs.Host.Executors.FunctionExecutor.TryExecuteAsyncCore(IFunctionInstanceEx functionInstance, CancellationToken cancellationToken) in C:\projects\azure-webjobs-sdk-rqm4t\src\Microsoft.Azure.WebJobs.Host\Executors\FunctionExecutor.cs:line 117
Upvotes: 2
Views: 1773
Reputation: 61
The issue was an incompatible OS version in the environment hosting my function.
A huge thanks goes to PramodValavala-MSFT for his idea to create a Docker container! Following his suggestion, I suddenly got the following error message for the dataset = Dataset.Tabular.from_delimited_files(path = datastore_paths) command:
Exception: NotImplementedError: Unsupported Linux distribution debian 10.
which reminded me of the following warning in the Azure Machine Learning documentation:
Some dataset classes have dependencies on the azureml-dataprep package, which is only compatible with 64-bit Python. For Linux users, these classes are supported only on the following distributions: Red Hat Enterprise Linux (7, 8), Ubuntu (14.04, 16.04, 18.04), Fedora (27, 28), Debian (8, 9), and CentOS (7).
Choosing the predefined Docker image 2.0-python3.7 (running Debian 9) instead of 3.0-python3.7 (running Debian 10) solved the issue (see https://hub.docker.com/_/microsoft-azure-functions-python).
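To double-check which distribution a given image (or the default environment) is actually running, a quick sanity check is to read /etc/os-release from inside the function, for example:
# should print something like PRETTY_NAME="Debian GNU/Linux 9 (stretch)" on the 2.0-python3.7 image
with open("/etc/os-release") as f:
    for line in f:
        if line.startswith("PRETTY_NAME"):
            print(line.strip())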
I suspect that the default hosting environment, which I was using originally, also runs on an incompatible OS.
Upvotes: 3