Sanpreet

Reputation: 103

Why is the inference API (serverless) for the custom model from Hugging Face not functioning?

I have pushed a custom model named Orcawise/eu-ai-act-align to Hugging Face (a popular platform for sharing and discovering natural language processing models and other AI-related resources).

I created this model by fine-tuning google/gemma-2b on custom data. When I try to use this model with the Inference API (serverless) using the code below:

import requests

API_URL = "https://api-inference.huggingface.co/models/Orcawise/eu-ai-act-align"
headers = {"Authorization": "Bearer xxxxxxxxxxxxxxxxxxxxxxxx"}

def query(payload):
    # POST the payload to the serverless Inference API and decode the JSON reply.
    response = requests.post(API_URL, headers=headers, json=payload)
    return response.json()

output = query({
    "inputs": "Can you please let us know more details about your ",
})
print("Response from custom model is {}".format(output))

I am getting the following response:

{'error': 'You are trying to access a gated repo.\nMake sure to request access at https://huggingface.co/google/gemma-2b and pass a token having permission to this repo either by logging in with `huggingface-cli login` or by passing `token=<your_token>`.'}
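To make the failure mode clearer, it can help to surface the HTTP status code (401/403 usually indicates a token/gating problem) instead of only decoding the JSON body. A minimal stdlib-only sketch of the same call; the token is a placeholder and an actual run needs network access:

```python
import json
import urllib.error
import urllib.request

API_URL = "https://api-inference.huggingface.co/models/Orcawise/eu-ai-act-align"

def build_request(payload, token):
    # Assemble the POST request the serverless endpoint expects.
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

def query(payload, token):
    # Return (status, body) so auth failures are visible alongside the error JSON.
    req = build_request(payload, token)
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status, json.loads(resp.read())
    except urllib.error.HTTPError as err:
        # 401/403 here typically means the token lacks access to the gated base repo.
        return err.code, json.loads(err.read())
```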

Solutions tried

  1. From my Hugging Face account I requested access to google/gemma-2b, and access was granted. Running the code above again still produces the same error. (Screenshot: access grant confirmation.)
  2. I pointed the code at the base model's API URL (google/gemma-2b), hoping it would load the base model first and then the custom model, but with no success. (Screenshot: inference API using the base model with the custom model.)
  3. I have also asked this question on the Hugging Face community forum, so anyone who has solved a similar issue can reply there as well.
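Related to item 1 above: one way to confirm that a token actually carries the granted access is to request the Hub's model-metadata endpoint for the gated repo directly. A stdlib sketch; the token is a placeholder and a real check needs network access:

```python
import json
import urllib.error
import urllib.request

HUB_API = "https://huggingface.co/api/models"

def metadata_url(repo_id):
    # Hub metadata endpoint; a gated repo answers 401/403 without valid access.
    return f"{HUB_API}/{repo_id}"

def can_access(repo_id, token):
    # Return (True, metadata) on success, (False, status_code) on an HTTP error.
    req = urllib.request.Request(
        metadata_url(repo_id),
        headers={"Authorization": f"Bearer {token}"},
    )
    try:
        with urllib.request.urlopen(req) as resp:
            return True, json.loads(resp.read())
    except urllib.error.HTTPError as err:
        return False, err.code

# Example (needs network and a real token):
# ok, detail = can_access("google/gemma-2b", "hf_xxx")
```

If this returns False with a 401/403 for google/gemma-2b, the token itself is what the gate is rejecting, regardless of the access grant shown in the account UI.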

My question is: why does this happen when access to the pretrained model has already been granted, and what is the best way to use a pretrained model together with a custom model via the Inference API (serverless)? Any ideas are welcome.

Upvotes: 0

Views: 434

Answers (0)
