Reputation: 11
I want to use a different pretrained BERT model's embeddings for BERTScore. How can I do that?

P, R, F1 = score(cand, ref, lang="bn", model_type="distilbert-base-uncased", verbose=True)

If I pass my own pretrained model in model_type, it raises a KeyError.
Upvotes: 1
Views: 517
Reputation: 76
You need to pass the num_layers parameter as well. If it is not given, the library looks up a predefined default for the model name in its utils.py file, and an unknown (custom) model name is what raises the KeyError.
bert_score.score(['Hello world'], ['Whats up'], model_type='/home/user/bart_large', num_layers=10)
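For reference, here is a minimal sketch of the original call adapted to a custom checkpoint; the path and the num_layers value are placeholders and should match your own model (the layer index cannot exceed the model's number of hidden layers, e.g. 6 for a DistilBERT-sized model, 12 for bert-base). When model_type is given explicitly, lang can usually be omitted unless you also rescale with a baseline.

from bert_score import score

# "/home/user/my_pretrained_bert" is a hypothetical path to your own checkpoint.
# num_layers selects which hidden layer's embeddings are used for scoring and
# must not exceed the model's depth.
cand = ["The cat sat on the mat."]
ref = ["A cat was sitting on the mat."]
P, R, F1 = score(cand, ref, model_type="/home/user/my_pretrained_bert", num_layers=6, verbose=True)
print(P.mean().item(), R.mean().item(), F1.mean().item())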
Upvotes: 0