Chandanraj C L

Reputation: 1

Error when deploying an LLM model to a SageMaker endpoint. Please share a solution if anyone knows one.

2023-07-31T06:58:11.298494Z ERROR text_generation_launcher: Download encountered an error: Traceback (most recent call last):
  File "/opt/conda/bin/text-generation-server", line 8, in <module>
    sys.exit(app())
  File "/opt/conda/lib/python3.9/site-packages/text_generation_server/cli.py", line 151, in download_weights
    utils.convert_files(local_pt_files, local_st_files)
  File "/opt/conda/lib/python3.9/site-packages/text_generation_server/utils/convert.py", line 84, in convert_files
    convert_file(pt_file, sf_file)
  File "/opt/conda/lib/python3.9/site-packages/text_generation_server/utils/convert.py", line 62, in convert_file
    save_file(pt_state, str(sf_file), metadata={
    "format": "pt"
})
  File "/opt/conda/lib/python3.9/site-packages/safetensors/torch.py", line 232, in save_file
    serialize_file(_flatten(tensors), filename, metadata=metadata)

We are trying to deploy the Falcon LLM model to an AWS SageMaker endpoint by following the code provided in the Hugging Face model deploy option.

Please provide a solution for this issue.
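For context, the Hugging Face "Deploy → Amazon SageMaker" option generates a snippet along these lines. This is a hedged sketch, not the exact code used: the model ID (`tiiuae/falcon-7b-instruct`), the TGI image version, and the instance type are assumptions and should be replaced with the values from the actual deployment.

```python
# Sketch of the Hugging Face -> SageMaker deploy snippet (assumed values below).
import sagemaker
from sagemaker.huggingface import HuggingFaceModel, get_huggingface_llm_image_uri

role = sagemaker.get_execution_role()  # IAM role of the SageMaker session

# TGI (text-generation-inference) container; version is an assumption
image_uri = get_huggingface_llm_image_uri("huggingface", version="0.8.2")

hub = {
    "HF_MODEL_ID": "tiiuae/falcon-7b-instruct",  # assumed model ID
    "SM_NUM_GPUS": "1",                          # GPUs per replica
}

model = HuggingFaceModel(image_uri=image_uri, env=hub, role=role)

predictor = model.deploy(
    initial_instance_count=1,
    instance_type="ml.g5.2xlarge",  # assumed instance type
    container_startup_health_check_timeout=300,
)
```

The traceback above fails inside `safetensors.torch.save_file` while the container converts the downloaded PyTorch weights to safetensors, which commonly points to the instance running out of disk space or memory during conversion rather than to the snippet itself.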

Upvotes: 0

Views: 323

Answers (1)
