spadel

Reputation: 1036

Fine-tune Bert for specific domain (unsupervised)

I want to fine-tune BERT on texts that are related to a specific domain (in my case related to engineering). The training should be unsupervised since I don't have any labels or anything. Is this possible?

Upvotes: 7

Views: 2155

Answers (1)

Jindřich

Reputation: 11240

What you actually want is to continue pre-training BERT on text from your specific domain. In practice, this means continuing to train the model with the masked language modeling (MLM) objective, but on your domain-specific data, so no labels are needed.

You can use the run_mlm.py script from Hugging Face's Transformers library.
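A minimal sketch of such an invocation, assuming your domain texts are collected in a plain-text file (the file paths and hyperparameter values here are placeholders, not prescriptions):

```shell
# Continue MLM pre-training of bert-base-uncased on domain text.
# engineering_corpus.txt is a hypothetical file: one document (or line) of raw text per line.
python run_mlm.py \
    --model_name_or_path bert-base-uncased \
    --train_file engineering_corpus.txt \
    --do_train \
    --per_device_train_batch_size 8 \
    --num_train_epochs 3 \
    --output_dir ./bert-engineering
```

The resulting checkpoint in `./bert-engineering` can then be loaded like any other BERT model and fine-tuned further once you do have labeled data.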

Upvotes: 9
