Could not parse LLM output

Problem description
from langchain.agents import AgentType, initialize_agent
from langchain.chat_models import ChatOpenAI
from langchain.memory import ConversationBufferMemory

llm = ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0.3)
memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)
# `tools` is the tool list defined elsewhere in the script (not shown in the question)
agent_chain = initialize_agent(tools, llm, agent=AgentType.CHAT_CONVERSATIONAL_REACT_DESCRIPTION, verbose=True, memory=memory)

human_message = "hi"

# Simple chat loop: send the user's message to the agent and print the reply
while True:
    bot_message = agent_chain.run(input=human_message)
    print(">>>>> Assist: ", bot_message)
    print(">>>>> Human:", end=" ")
    human_message = input()

Traceback (most recent call last):
  File "D:\work\gpt ooking_bot pp.py", line 48, in
    bot_message = agent_chain.run(input=human_message)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\work\gpt ooking_bot nv\Lib\site-packages\langchai
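The truncated traceback points at the agent's output parser failing on the model's reply (LangChain's "Could not parse LLM output" / OutputParserException). As a minimal sketch of one possible workaround, assuming a 0.0.x-era langchain where initialize_agent forwards extra keyword arguments to the AgentExecutor, the executor can be asked to handle parsing errors itself instead of raising:

# Sketch (assumption, not from the original post): let the executor recover from
# unparseable model output instead of raising. Reuses tools, llm and memory from above.
agent_chain = initialize_agent(
    tools,
    llm,
    agent=AgentType.CHAT_CONVERSATIONAL_REACT_DESCRIPTION,
    verbose=True,
    memory=memory,
    handle_parsing_errors=True,  # surface the parse failure to the agent as an observation so it can retry
)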

python chatbot openai-api langchain llm