maryam esmaeili

Reputation: 11

Use HuggingFace models locally

I would like to use transformers, especially Hugging Face models, as part of my programming.

My question is: can I use and implement transformers and Hugging Face models offline, in Spyder (or any other IDE that runs locally), after downloading and installing all the needed packages?

Thanks in advance.

Upvotes: 1

Views: 469

Answers (2)

MD-ML

Reputation: 442

If I understand your question correctly, you want to do something like this:

import requests
import numpy as np

# The URL encodes both the task and the model (see below)
API_URL = "https://api-inference.huggingface.co/pipeline/feature-extraction/deepset/roberta-base-squad2"
# HUGGINGFACE_TOKEN must be set to your Hugging Face access token beforehand
headers = {"Authorization": f"Bearer {HUGGINGFACE_TOKEN}"}

def query(payload):
    response = requests.post(API_URL, headers=headers, json=payload)
    return response.json()

data = query({"inputs": "This is a more detailed example."})
data = np.array(data)[0]           # token-level embeddings for the input
embedding = np.mean(data, axis=0)  # mean-pool into a single sentence embedding
len(embedding), embedding

Note that the API URL contains

  1. the task (here: feature-extraction) and
  2. the model (here: deepset/roberta-base-squad2).
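To make that structure explicit, here is a small helper that assembles the URL from a task and a model ID, mirroring the pattern in the snippet above (the helper name `build_api_url` is my own, not part of any Hugging Face library):

```python
def build_api_url(task: str, model: str) -> str:
    """Build an Inference API URL from a task name and a model ID."""
    return f"https://api-inference.huggingface.co/pipeline/{task}/{model}"

# Reproduces the URL used in the example above
url = build_api_url("feature-extraction", "deepset/roberta-base-squad2")
```

Swapping in a different task or model is then a one-line change.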

You can look up available tasks here: https://huggingface.co/docs/transformers/en/main_classes/pipelines

And you can choose an appropriate model there: https://huggingface.co/models

Usually, you don't want to expose your HUGGINGFACE_TOKEN. Here is a video that demonstrates how to hide it: https://www.youtube.com/watch?v=_tYNm5nqpxE
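One common way to keep the token out of the source code is to read it from an environment variable. A minimal sketch (the variable name `HUGGINGFACE_TOKEN` is just a convention, not something the API requires):

```python
import os

def build_headers() -> dict:
    """Build the auth header from an environment variable instead of a hard-coded token."""
    token = os.environ.get("HUGGINGFACE_TOKEN")
    if token is None:
        raise RuntimeError("Set the HUGGINGFACE_TOKEN environment variable first")
    return {"Authorization": f"Bearer {token}"}
```

The token then lives in your shell profile or a `.env` file that stays out of version control.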

I hope this helps.

P.s.: Here is a link to a Colab notebook.

Upvotes: 0

hafedh

Reputation: 79

Yes. Once you use a model, it gets downloaded into your cache directory, so the next time you load it, it works even without an internet connection.

The default cache directory is ~/.cache/huggingface/hub
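You can check whether a model is already in that cache before going offline. A sketch, assuming the hub's cache layout of one `models--{org}--{name}` folder per model (the helper names are my own):

```python
import os

def model_cache_path(model_id: str, cache_dir: str = "~/.cache/huggingface/hub") -> str:
    """Map a model ID like 'deepset/roberta-base-squad2' to its expected cache folder."""
    folder = "models--" + model_id.replace("/", "--")
    return os.path.join(os.path.expanduser(cache_dir), folder)

def is_cached(model_id: str) -> bool:
    """True if the model's cache folder exists locally."""
    return os.path.isdir(model_cache_path(model_id))
```

Once the model is cached, something like `pipeline("question-answering", model="deepset/roberta-base-squad2")` from the transformers library should load it locally, in Spyder or any other IDE, without a network connection.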

Upvotes: -1
