Hatter The Mad

Reputation: 140

ImportError: cannot import name 'AutoModelWithLMHead' from 'transformers'

This is literally all the code that I am trying to run:

from transformers import AutoModelWithLMHead, AutoTokenizer
import torch

tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-small")
model = AutoModelWithLMHead.from_pretrained("microsoft/DialoGPT-small")

I am getting this error:

---------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)
<ipython-input-14-aad2e7a08a74> in <module>
----> 1 from transformers import AutoModelWithLMHead, AutoTokenizer
      2 import torch
      3 
      4 tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-small")
      5 model = AutoModelWithLMHead.from_pretrained("microsoft/DialoGPT-small")

ImportError: cannot import name 'AutoModelWithLMHead' from 'transformers' (c:\python38\lib\site-packages\transformers\__init__.py)

What do I do about it?

Upvotes: 6

Views: 17597

Answers (1)

Hatter The Mad

Reputation: 140

I solved it! Apparently AutoModelWithLMHead was removed in my version of transformers.

You now need to use AutoModelForCausalLM for causal language models, AutoModelForMaskedLM for masked language models, and AutoModelForSeq2SeqLM for encoder-decoder models.

So in my case the code looks like this:

from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-small")
model = AutoModelForCausalLM.from_pretrained("microsoft/DialoGPT-small")
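
If it helps, here is a minimal sketch of how you can then generate a reply with the loaded model. It follows the usual DialoGPT usage pattern with model.generate; the prompt text and generation settings are just example values, not anything required:

from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-small")
model = AutoModelForCausalLM.from_pretrained("microsoft/DialoGPT-small")

# Encode an example user message, appending the end-of-sequence token
input_ids = tokenizer.encode("Hello, how are you?" + tokenizer.eos_token, return_tensors="pt")

# Generate a response (greedy decoding; max_length=100 is an arbitrary example value)
output_ids = model.generate(input_ids, max_length=100, pad_token_id=tokenizer.eos_token_id)

# Decode only the newly generated tokens (everything after the input)
reply = tokenizer.decode(output_ids[:, input_ids.shape[-1]:][0], skip_special_tokens=True)
print(reply)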

Upvotes: 2
