Huggingface - pipeline with a fine-tuned pretrained model


I have a pretrained model, facebook/bart-large-mnli, and I used Trainer to fine-tune it on my own dataset.

from transformers import BartForSequenceClassification

model = BartForSequenceClassification.from_pretrained("facebook/bart-large-mnli", num_labels=14, ignore_mismatched_sizes=True)
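Since the error later on complains about the label2id mapping in the model config, one way to make that mapping explicit is to build id2label/label2id dicts and hand them to from_pretrained. A minimal sketch, using hypothetical placeholder names for the 14 classes (substitute your real label names):

```python
# Hypothetical label names for a 14-class dataset; replace with your own.
labels = [f"label_{i}" for i in range(14)]
id2label = {i: name for i, name in enumerate(labels)}
label2id = {name: i for i, name in enumerate(labels)}

# These mappings can be passed straight to from_pretrained, e.g.:
# model = BartForSequenceClassification.from_pretrained(
#     "facebook/bart-large-mnli",
#     num_labels=14,
#     id2label=id2label,
#     label2id=label2id,
#     ignore_mismatched_sizes=True,
# )
```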

Then, after training, I tried the following to create a pipeline from the fine-tuned model:

# Import the Transformers pipeline library
from transformers import pipeline

# Initializing Zero-Shot Classifier
classifier = pipeline("zero-shot-classification", model=model, tokenizer=tokenizer, id2label=id2label)

From this I get the following error:

Failed to determine 'entailment' label id from the label2id mapping in the model config. Setting to -1. Define a descriptive label2id mapping in the model config to ensure correct outputs.
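For context on what this warning means: the zero-shot-classification pipeline tries to locate an "entailment" entry in the config's label2id mapping, roughly like the sketch below (an approximation of the lookup, not the exact transformers source). A model fine-tuned to 14 custom class labels has no such entry, so the lookup falls back to -1, which is exactly the warning above:

```python
# Sketch of how the pipeline resolves the entailment id from label2id.
def find_entailment_id(label2id: dict) -> int:
    for label, ind in label2id.items():
        if label.lower().startswith("entail"):
            return ind
    return -1  # no entailment-like label found

# An NLI-style mapping resolves cleanly:
nli_mapping = {"contradiction": 0, "neutral": 1, "entailment": 2}

# A hypothetical 14-class fine-tuned mapping does not:
custom_mapping = {f"label_{i}": i for i in range(14)}
```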

I tried searching online for a solution but couldn't find anything.

For context, you can refer to my earlier question, from when I had trouble training, here


After applying this, I get the following error:

RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cuda:0 and cpu! (when checking argument for argument index in method wrapper__index_select)
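This error means the model weights and the input tensors ended up on different devices. A common way to keep them together is to tell the pipeline which device to run on. A minimal sketch of the device-selection convention only (the pipeline call is commented out because it needs the fine-tuned model and tokenizer from above):

```python
def pick_device(cuda_available: bool) -> int:
    # transformers pipeline convention: -1 means CPU, >= 0 is a CUDA device index
    return 0 if cuda_available else -1

# import torch
# classifier = pipeline(
#     "zero-shot-classification",
#     model=model,
#     tokenizer=tokenizer,
#     device=pick_device(torch.cuda.is_available()),
# )
```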

I tried removing my custom metrics, which fixed it for a while, but the fix didn't last and this error keeps coming back.

The error comes from here:

sequences = "Some text sequence"

classifier(sequences, list(id2label.values()), multi_label=False)
Tags: python, pipeline, huggingface-transformers, text-classification, huggingface