Why doesn't the LangChain ConversationalRetrievalChain remember the chat history, even though I add it via the chat_history argument?

0 votes · 2 answers

I'm learning AI and LangChain and trying to build a conversational chatbot. So far so good: I managed to feed it custom text and it answers questions based on that text, but for some reason it doesn't remember previous answers. From what I've read, ConversationalRetrievalChain needs a chat_history argument to retain memory, but even though I supply it, it still doesn't remember anything. Here is my code:

history = []
def ask(question: str):
    chat = ConversationalRetrievalChain.from_llm(llm, vectorstore.as_retriever(), memory=memory)
    answer = chat({"question": question, "chat_history": history})["answer"]
    history.append((question, answer))
    print(answer)
    return answer


ask("Who is Bound by this Agreement?") #Answers correctly
ask("What did I ask in previous question?") #Doesn't remember

I have verified that the chat history is indeed being recorded in the history list. So why doesn't the model remember what came before?

python artificial-intelligence chatbot langchain
2 Answers
1 vote

ConversationalRetrievalChain performs several steps (a conceptual sketch follows the list):

  1. Rephrase the input into a standalone question
  2. Retrieve documents
  3. Answer the question using the provided context
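A dependency-free Python sketch of that flow; the three helper functions are hypothetical stand-ins for the chain's internal components, not the library's actual code:

def question_generator(question: str, chat_history: list) -> str:
    # Step 1: in the real chain this is an LLM call using the
    # question_generator_template shown further below.
    return question if not chat_history else f"(rephrased) {question}"

def retrieve(standalone_question: str) -> list:
    # Step 2: vector-store lookup for the standalone question.
    return ["<documents relevant to the standalone question>"]

def answer_from_context(context: list, question: str) -> str:
    # Step 3: by default the QA prompt sees only {context} and {question},
    # not the chat history; that is the crux of this answer.
    return f"<answer to {question!r} given {len(context)} documents>"

def conversational_retrieval_call(question: str, chat_history: list) -> str:
    standalone = question_generator(question, chat_history)
    docs = retrieve(standalone)
    return answer_from_context(docs, standalone)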

If you configure it with memory, it also updates the memory with each question and answer. Since I couldn't find anything in the documentation about the prompts being used, I looked for them in the repository, where two of them are key:

const question_generator_template = `Given the following conversation and a follow up question, rephrase the follow up question to be a standalone question.
Chat History:
{chat_history}
Follow Up Input: {question}
Standalone question:`;

export const DEFAULT_QA_PROMPT = /*#__PURE__*/ new PromptTemplate({
  template:
    "Use the following pieces of context to answer the question at the end. " +
    "If you don't know the answer, just say that you don't know, " +
    "don't try to make up an answer.\n\n{context}\n\nQuestion: {question}\nHelpful Answer:",
  inputVariables: ["context", "question"],
});

As you can see, only question_generator_template receives the chat_history context. I ran into the same problem as you, and I changed the prompt of the qaChain: within the chain, every part has access to all the input variables, so you only need to modify the prompt and add a chat_history input, like this:

const QA_PROMPT = new PromptTemplate({
  template:
    "Use the following pieces of context and chat history to answer the " +
    "question at the end.\n" +
    "If you don't know the answer, just say that you don't know, " +
    "don't try to make up an answer.\n\n" +
    "{context}\n\nChat history: {chat_history}\n\nQuestion: {question}\nHelpful Answer:",
  inputVariables: ["context", "question", "chat_history"],
});

Then pass it to the fromLLM() function:

const chat = ConversationalRetrievalQAChain.fromLLM(llm, vectorstore.asRetriever(), {
  memory,
  qaChainOptions: { type: "stuff", prompt: QA_PROMPT },
});

Now the final prompt that actually asks the question is given the chat_history, and it should work as you expect. You can also pass verbose: true in the configuration so that every call is logged together with its prompt, which makes debugging easier. Let me know if this helps.


0 votes

You don't need to explicitly append the question and answer to the history; ConversationalRetrievalChain picks that up automatically through its memory.
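For that to work, the chain must be constructed with a memory object. A minimal setup (an assumption; the question doesn't show how memory was created) could look like this:

from langchain.memory import ConversationBufferMemory

# memory_key must match the "chat_history" variable the chain expects;
# return_messages=True stores the history as message objects for chat models.
memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)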

You are creating the ConversationalRetrievalChain object inside the ask method and passing the question to it.

What happens is that every time you ask a question, a new chat object is created from ConversationalRetrievalChain, which overwrites the previous memory and starts fresh.

To fix this, create the ConversationalRetrievalChain chat object outside the ask function and pass it in as an argument.

Like this:

# Create the chain once, outside the function, so its memory persists
# across calls.
chat = ConversationalRetrievalChain.from_llm(llm, vectorstore.as_retriever(), memory=memory)

def ask(question: str, chat: ConversationalRetrievalChain):
    # The chain reads and updates its memory itself, so only the
    # question is passed in.
    answer = chat({"question": question})["answer"]
    print(answer)
    return answer

ask("Who is Bound by this Agreement?", chat)  # Answers correctly
ask("What did I ask in previous question?", chat)  # Now remembers the history