ImportError: cannot import name 'AutoModelWithLMHead' from 'transformers'

Problem description · 0 votes · 1 answer

This is literally all the code I am trying to run:

from transformers import AutoModelWithLMHead, AutoTokenizer
import torch

tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-small")
model = AutoModelWithLMHead.from_pretrained("microsoft/DialoGPT-small")

I get this error:

---------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)
<ipython-input-14-aad2e7a08a74> in <module>
----> 1 from transformers import AutoModelWithLMHead, AutoTokenizer
      2 import torch
      3 
      4 tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-small")
      5 model = AutoModelWithLMHead.from_pretrained("microsoft/DialoGPT-small")

ImportError: cannot import name 'AutoModelWithLMHead' from 'transformers' (c:\python38\lib\site-packages\transformers\__init__.py)

What should I do?

python pytorch huggingface-transformers
1 Answer
2 votes

I solved it! Apparently AutoModelWithLMHead was removed in the transformers version I have installed.

You now need to use:

- AutoModelForCausalLM for causal language models,
- AutoModelForMaskedLM for masked language models,
- AutoModelForSeq2SeqLM for encoder-decoder models.
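As a quick sketch of the split, each replacement class reads the checkpoint's config and instantiates the matching architecture (the checkpoint names in the comments are illustrative examples, not part of the original question):

```python
from transformers import (
    AutoModelForCausalLM,   # decoder-only / causal LMs, e.g. GPT-2, DialoGPT
    AutoModelForMaskedLM,   # masked LMs, e.g. BERT, RoBERTa
    AutoModelForSeq2SeqLM,  # encoder-decoder models, e.g. T5, BART
)

# Pick the class that matches the checkpoint's architecture, for example:
# causal_lm  = AutoModelForCausalLM.from_pretrained("gpt2")
# masked_lm  = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")
# seq2seq_lm = AutoModelForSeq2SeqLM.from_pretrained("t5-small")
```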

So in my case, the code looks like this:

from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-small")
model = AutoModelForCausalLM.from_pretrained("microsoft/DialoGPT-small")
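To check that the loaded model actually works, here is a minimal single-turn chat sketch in the style of the DialoGPT model card (the prompt string and generation parameters are illustrative choices, not from the original question):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-small")
model = AutoModelForCausalLM.from_pretrained("microsoft/DialoGPT-small")

# Encode a user message, appending the EOS token as DialoGPT expects.
input_ids = tokenizer.encode("Hello, how are you?" + tokenizer.eos_token,
                             return_tensors="pt")

# Generate a reply; pad_token_id is set explicitly because DialoGPT
# does not define a pad token of its own.
reply_ids = model.generate(input_ids, max_length=100,
                           pad_token_id=tokenizer.eos_token_id)

# Decode only the newly generated tokens (everything after the prompt).
print(tokenizer.decode(reply_ids[0, input_ids.shape[-1]:],
                       skip_special_tokens=True))
```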