Fabian

Reputation: 83

Large Language Model Perplexity

I am currently using GPT-3 and I am trying to compare its capabilities to related language models for my master's thesis. Unfortunately, GPT-3 is an API-based application, so I am not really able to extract metrics such as perplexity.

Over the API I have access to these three metrics and, of course, the model's outputs:

Is there any way to calculate the perplexity of my model using Python?

Thank you.

Upvotes: 1

Views: 609

Answers (1)

mostafa amiri

Reputation: 410

The perplexity of a model is the exponential of its cross-entropy loss: pp = e^loss when the loss is measured in nats, or equivalently pp = 2^loss when it is measured in bits. Since the loss is the mean negative log-likelihood of the tokens, you can compute it from per-token log probabilities.
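A minimal sketch of this, assuming you can obtain per-token natural-log probabilities from the API (the OpenAI completions endpoint exposes these via its `logprobs` option; the list below is made-up example data):

```python
import math

def perplexity_from_logprobs(token_logprobs):
    """Compute perplexity from a list of per-token natural-log
    probabilities, e.g. the `logprobs` values returned by the API."""
    # Cross-entropy in nats: mean negative log-likelihood per token
    nll = -sum(token_logprobs) / len(token_logprobs)
    # Perplexity is the exponential of the cross-entropy
    return math.exp(nll)

# Hypothetical logprobs for a three-token completion
print(perplexity_from_logprobs([-0.1, -2.3, -0.5]))
```

A useful sanity check: if every token has probability 0.5 (logprob = ln 0.5), the perplexity is exactly 2.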

Upvotes: 0
