Reputation: 41
I've deployed a model using AzureML's inference cluster. I recently found that some of the requests to the model's API endpoint resulted in a 404 HTTP error involving a missing swagger.json file.
So I followed this guide in order to auto-generate the swagger.json file. But now all requests to the endpoint result in a "list index out of range" error, and it seems to be related to the input_schema
decorator. I just can't seem to pinpoint what the problem is exactly.
Here is a minimal recreation of my scoring script:
from inference_schema.schema_decorators import input_schema, output_schema
from inference_schema.parameter_types.standard_py_parameter_type import StandardPythonParameterType

def inference(input_1, input_2, input_3):
    # inference logic here
    return model_output

def init():
    global model
    model = get_model()

input_sample = StandardPythonParameterType({
    'input_1': 'some text',
    'input_2': 'some other text',
    'input_3': 'other text'
})

sample_global_parameters = StandardPythonParameterType(1.0)

output_sample = StandardPythonParameterType({
    'Results': {
        'text': 'some text',
        'model_output': [
            {
                'entity_type': 'date',
                'value': '05/04/2022'
            }
        ]
    }
})

@input_schema('Inputs', input_sample)
@input_schema('GlobalParameters', sample_global_parameters)
@output_schema(output_sample)
def run(Inputs, GlobalParameters):
    try:
        return inference(Inputs['input_1'], Inputs['input_2'], Inputs['input_3'])
    except Exception as e:
        error = str(e)
        return error
I've checked out this and this question, but they didn't seem to help.
I tried looking at the code on GitHub as well, but I still can't narrow down the exact problem.
I'm calling the API from Postman with the default headers (I'm not adding anything). The request body looks like this:
{
    "Inputs": {
        "input_1": "some text",
        "input_2": "some other text",
        "input_3": "different text"
    },
    "GlobalParameters": 1.0
}
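For reference, here is a minimal sketch of the same request made from Python instead of Postman, with the Content-Type header set explicitly rather than left to the client's defaults. The scoring URL and API key below are placeholders, not real values:

```python
import json
import urllib.request

# Hypothetical endpoint URL and key, for illustration only.
scoring_url = "https://my-endpoint.azureml.net/score"
api_key = "<api-key>"

payload = {
    "Inputs": {
        "input_1": "some text",
        "input_2": "some other text",
        "input_3": "different text",
    },
    "GlobalParameters": 1.0,
}

body = json.dumps(payload).encode("utf-8")
req = urllib.request.Request(
    scoring_url,
    data=body,
    headers={
        "Content-Type": "application/json",  # set explicitly, not relying on client defaults
        "Authorization": f"Bearer {api_key}",
    },
)
# response = urllib.request.urlopen(req)  # left commented; requires a live endpoint
```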
This is the error message from the endpoint logs:
2022-04-05 06:33:22,536 | root | ERROR | Encountered Exception: Traceback (most recent call last):
  File "/var/azureml-server/synchronous/routes.py", line 65, in run_scoring
    response, time_taken_ms = invoke_user_with_timer(service_input, request_headers)
  File "/var/azureml-server/synchronous/routes.py", line 110, in invoke_user_with_timer
    result, time_taken_ms = capture_time_taken(user_main.run)(**params)
  File "/var/azureml-server/synchronous/routes.py", line 92, in timer
    result = func(*args, **kwargs)
  File "/var/azureml-app/main.py", line 21, in run
    return_obj = driver_module.run(**arguments)
  File "/azureml-envs/azureml_e63c7c0baf9bf3d861ce5992975a467b/lib/python3.7/site-packages/inference_schema/schema_decorators.py", line 61, in decorator_input
    return user_run(*args, **kwargs)
  File "/azureml-envs/azureml_e63c7c0baf9bf3d861ce5992975a467b/lib/python3.7/site-packages/inference_schema/schema_decorators.py", line 55, in decorator_input
    args[param_position] = _deserialize_input_argument(args[param_position], param_type, param_name)
IndexError: list index out of range

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/azureml-envs/azureml_e63c7c0baf9bf3d861ce5992975a467b/lib/python3.7/site-packages/flask/app.py", line 1832, in full_dispatch_request
    rv = self.dispatch_request()
  File "/azureml-envs/azureml_e63c7c0baf9bf3d861ce5992975a467b/lib/python3.7/site-packages/flask/app.py", line 1818, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "/var/azureml-server/synchronous/routes.py", line 44, in score_realtime
    return run_scoring(service_input, request.headers, request.environ.get('REQUEST_ID', '00000000-0000-0000-0000-000000000000'))
  File "/var/azureml-server/synchronous/routes.py", line 74, in run_scoring
    raise RunFunctionException(str(exc))
run_function_exception.RunFunctionException
Upvotes: 2
Views: 660
Reputation: 1683
Try setting "GlobalParameters" to a floating-point number other than 1.0, or remove it entirely and execute again. Sometimes the global parameters cause this issue.
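To see why a missing GlobalParameters argument can surface as the IndexError in the traceback, here is a simplified, hypothetical sketch of the decorator mechanics. fake_input_schema is illustrative only, not the library's real code; the real decorator computes each parameter's position from the function signature, then indexes into the positional arguments it actually received:

```python
def fake_input_schema(param_name, param_position):
    """Toy stand-in for inference_schema's input_schema decorator."""
    def decorator(user_run):
        def wrapper(*args, **kwargs):
            args = list(args)
            # The real decorator does roughly:
            #   args[param_position] = _deserialize_input_argument(args[param_position], ...)
            # which raises IndexError when that positional argument never arrived.
            args[param_position] = args[param_position]
            return user_run(*args, **kwargs)
        return wrapper
    return decorator

@fake_input_schema('Inputs', 0)
@fake_input_schema('GlobalParameters', 1)
def run(Inputs, GlobalParameters=None):
    return Inputs

result = run({'input_1': 'x'}, 1.0)  # both arguments present: works

try:
    run({'input_1': 'x'})  # GlobalParameters missing: position 1 lookup fails
    failed = False
except IndexError:
    failed = True
```

This mirrors the failing frame in the logs (`args[param_position] = _deserialize_input_argument(...)` raising `IndexError: list index out of range`), which is why dropping GlobalParameters from the request, or from the decorator stack, changes the behavior.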
Upvotes: 1