Here is the code:
from transformers import AutoTokenizer
import torch

# Load the pickled model object and the tokenizer saved in ./tmp
model = torch.load("./tmp/gptj.pt")
model.eval()
tokenizer = AutoTokenizer.from_pretrained("./tmp")

device = "cpu"
model.to(device)
inputs = tokenizer.encode("function helloWorld():", return_tensors="pt").to(device)
outputs = model.generate(inputs)
print(tokenizer.decode(outputs[0]))
Environment:
Python: 3.9.15, torch: 2.0.0, transformers: 4.28.1
I have tried changing the torch and transformers versions, but to no avail.
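One possible culprit, assuming the failure happens at `torch.load`: pickling a whole model object with `torch.save(model)` ties the checkpoint to the exact class definitions and library versions used when saving, so loading it under a different transformers/torch version can break. Saving and loading only the `state_dict` (or using `from_pretrained` on a directory saved with `save_pretrained`) is more robust. Below is a minimal sketch of the state-dict pattern using a tiny stand-in module instead of GPT-J; the path `/tmp/model_state.pt` is just an example.

```python
import torch
import torch.nn as nn

# A tiny stand-in model; for GPT-J this would be the model class
# rebuilt via the library, not unpickled from a .pt file.
model = nn.Linear(4, 2)
model.eval()

# Save only the weights, not the pickled model object.
torch.save(model.state_dict(), "/tmp/model_state.pt")  # example path

# Rebuild the architecture first, then load the weights into it.
restored = nn.Linear(4, 2)
restored.load_state_dict(torch.load("/tmp/model_state.pt"))
restored.eval()

# Both copies now produce identical outputs.
x = torch.ones(1, 4)
assert torch.allclose(model(x), restored(x))
```

For the GPT-J case specifically, if `./tmp` was produced by `model.save_pretrained("./tmp")`, then `AutoModelForCausalLM.from_pretrained("./tmp")` is the usual way to reload it and avoids the pickle-compatibility issue entirely.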