Reputation: 73
I want to make sure my BertModel does not load pre-trained weights. I am using the Auto classes (Hugging Face), which load the model automatically.
My question is: how do I load a BERT model without the pretrained weights?
Upvotes: 2
Views: 4891
Reputation: 41
You can re-initialize a PreTrainedModel with the init_weights
method (see the Hugging Face documentation), if the model has already been loaded with pre-trained weights.
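A minimal sketch of that approach (note that this still downloads the checkpoint first and only discards the learned values afterwards; the model name is just an example):
from transformers import AutoModel

# Load the model with its pre-trained weights...
model = AutoModel.from_pretrained('bert-base-uncased')
# ...then re-initialize all weights, discarding the pre-trained values
model.init_weights()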
Upvotes: 1
Reputation: 1194
Use AutoConfig together with AutoModel.from_config instead of AutoModel.from_pretrained:
from transformers import AutoConfig, AutoModel
config = AutoConfig.from_pretrained('bert-base-uncased')
model = AutoModel.from_config(config)
This sets up the model architecture without downloading or loading the pretrained weights; the parameters are randomly initialized according to the config.
Upvotes: 6
Reputation: 987
Alternatively, you can load the model with the pretrained weights, iterate over the model parameters, and reset them randomly with whatever initialization technique you prefer.
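A rough sketch of that idea, assuming a simple normal initialization (std=0.02 matches BERT's default initializer_range; swap in whatever scheme you like):
import torch
from transformers import AutoModel

model = AutoModel.from_pretrained('bert-base-uncased')
# Overwrite every parameter in place with random values
with torch.no_grad():
    for param in model.parameters():
        param.normal_(mean=0.0, std=0.02)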
Upvotes: 0