Jain26

Reputation: 73

Make sure BERT model does not load pretrained weights?

I want to make sure my BertModel does not load pre-trained weights. I am using the Auto classes (Hugging Face), which load the model automatically.

My question is: how do I load a BERT model without the pre-trained weights?

Upvotes: 2

Views: 4891

Answers (3)

Phi Machine

Reputation: 41

If the model is already loaded with pre-trained weights, you can re-initialize it with the init_weights method of the PreTrainedModel class (Huggingface Documentation).
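A minimal sketch of this approach (note that whether init_weights actually re-runs the initialization over already-loaded weights can depend on your transformers version, so verify on yours):

from transformers import AutoModel

# Load the model with its pre-trained weights first...
model = AutoModel.from_pretrained('bert-base-uncased')

# ...then overwrite them with the model's own random initialization scheme.
model.init_weights()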

Upvotes: 1

Matthew Cox

Reputation: 1194

Use AutoConfig instead of AutoModel:

from transformers import AutoConfig, AutoModel

# Load only the configuration, not the weights.
config = AutoConfig.from_pretrained('bert-base-uncased')

# Build the model from the config; its weights are randomly initialized.
model = AutoModel.from_config(config)

This sets up the model architecture from the config without downloading or loading the pre-trained weights.

Documentation here and here

Upvotes: 6

abe

Reputation: 987

Alternatively, you can load the model with the pre-trained weights, iterate over the model parameters, and re-set them randomly with whatever initialization technique you prefer.
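A minimal sketch of this idea (the std of 0.02 matches BERT's default initializer_range; swap in any initializer you prefer):

import torch
from transformers import AutoModel

model = AutoModel.from_pretrained('bert-base-uncased')

# Overwrite every parameter in place with random values.
# no_grad() is needed because the parameters require gradients.
with torch.no_grad():
    for param in model.parameters():
        param.normal_(mean=0.0, std=0.02)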

Upvotes: 0
