Đặng Huy

Reputation: 51

What is "language modeling head" in BertForMaskedLM

I have recently read about BERT and want to use BertForMaskedLM for fill_mask task. I know about BERT architecture. Also, as far as I know, BertForMaskedLM is built from BERT with a language modeling head on top, but I have no idea about what language modeling head means here. Can anyone give me a brief explanation.

Upvotes: 5

Views: 6737

Answers (2)

Minh

Reputation: 17

In addition to @Ashwin Geet D'Sa's answer, here is Huggingface's definition of an LM head:

The model head refers to the last layer of a neural network that accepts the raw hidden states and projects them onto a different dimension.

You can find Huggingface's definitions of other terms on this page: https://huggingface.co/docs/transformers/glossary
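To make the glossary definition concrete: a head is simply a final layer that takes the raw hidden states and projects them onto whatever output dimension the task needs. A minimal sketch (the dimensions below assume BERT-base; the layer names are illustrative, not Huggingface's actual module names):

```python
import torch
import torch.nn as nn

hidden_size = 768  # BERT-base hidden-state dimension
# Dummy encoder output: batch of 2 sequences, 10 tokens each.
hidden_states = torch.randn(2, 10, hidden_size)

# A classification head projects the [CLS] hidden state onto, say, 2 labels...
cls_head = nn.Linear(hidden_size, 2)
class_logits = cls_head(hidden_states[:, 0, :])  # shape: (2, 2)

# ...while an LM head projects every position onto the vocabulary.
lm_head = nn.Linear(hidden_size, 30522)          # 30522 = BERT vocab size
token_logits = lm_head(hidden_states)            # shape: (2, 10, 30522)
```

Same encoder, different head: that is all that distinguishes BertForSequenceClassification from BertForMaskedLM at the output end.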

Upvotes: 1

Ashwin Geet D'Sa

Reputation: 7379

As you have understood correctly, BertForMaskedLM uses a language modeling (LM) head.

Generally, and in this case as well, the LM head is a linear layer whose input dimension is the hidden-state size (768 for BERT-base) and whose output dimension is the vocabulary size. Thus, it maps each hidden-state output of the BERT model to a score for every token in the vocabulary. The loss is then calculated from the scores predicted for a masked position with respect to the target token.
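This can be sketched in a few lines of PyTorch. Note this is a simplification: Huggingface's actual implementation adds a small transform (dense layer + activation + LayerNorm) before the projection and ties the projection weights to the input embeddings, but the core idea is a hidden-size-to-vocab-size linear layer:

```python
import torch
import torch.nn as nn

hidden_size, vocab_size = 768, 30522  # BERT-base dimensions

# The LM head: one score per vocabulary token, at every position.
lm_head = nn.Linear(hidden_size, vocab_size)

# Dummy BERT output: batch of 2 sequences, 10 tokens each.
hidden_states = torch.randn(2, 10, hidden_size)
logits = lm_head(hidden_states)       # shape: (2, 10, 30522)

# Loss for a masked position, e.g. position 3 holds [MASK]:
# cross-entropy between predicted scores and the target token id.
targets = torch.tensor([42, 7])       # target token ids, one per sequence
masked_logits = logits[:, 3, :]       # shape: (2, 30522)
loss = nn.functional.cross_entropy(masked_logits, targets)
```

For fill-mask, you would take `masked_logits.argmax(-1)` (or a softmax for probabilities) to get the predicted token at the masked position.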

Upvotes: 3
