Returning source documents with LangChain's ConversationalRetrievalChain.from_llm

Question (votes: 0, answers: 1)

I'm trying to return source documents using LangChain's ConversationalRetrievalChain.from_llm, but I keep getting an error saying the chain expects exactly one output key, yet it got two: "answer" and "source_documents".

I've looked at other Stack Overflow posts and the LangChain docs (which are a bit confusing), and I think I may be using the class incorrectly. In any case, here is the code:

vectorstore = Pinecone(
    index, embeddings.embed_query, text_field
)

def chat(user_id):
    user_message = request.form.get('message')
    
    # Load the conversation history from session
    conversation_history = session.get(f'conversation_history_{user_id}', [])
    
    bot_temperature = get_bot_temperature(user_id)
    custom_prompt = get_custom_prompt(user_id)

    # Initialize the chatbot with the bot_temperature
    llm = ChatOpenAI(
        openai_api_key=openai_api_key,
        model_name='gpt-3.5-turbo',
        temperature=bot_temperature
    )

    # Define the prompt template with placeholders for context and question
    prompt_template = f"""
        {custom_prompt}

        CONTEXT: {{context}}

        QUESTION: {{question}}"""
    
    # Create a PromptTemplate object with input variables for context and question
    TEST_PROMPT = PromptTemplate(input_variables=["context", "question"], template=prompt_template)

    # Create a ConversationBufferWindowMemory object to store the last k turns of chat history
    memory = ConversationBufferWindowMemory(memory_key="chat_history", return_messages=True, k=8)

    # Create a ConversationalRetrievalChain object with the modified prompt template and chat history memory
    conversation_chain = ConversationalRetrievalChain.from_llm(
            llm=llm,
            retriever=vectorstore.as_retriever(search_kwargs={'filter': {'user_id': f"{user_id}"}}),
            memory=memory,
            combine_docs_chain_kwargs={"prompt": TEST_PROMPT},
            return_source_documents=True
        )
    # Handle the user input and get the response
    response = conversation_chain.run({'question': user_message})
    source_document = response['source_documents'][0]
    print(f"Source document: {source_document}")
    # Save the user message and bot response to session
    conversation_history.append({'input': user_message, 'output': response})
    session[f'conversation_history_{user_id}'] = conversation_history
    
    # print(f"User: {user_message} | Bot:{response}")  # This will print the conversation history
    print(conversation_history)
    print(session)
    print("*"*100)
    
    return jsonify(response=response)

I tried accessing the values in the source-documents dictionary and using a method call instead of run, but it still doesn't work. Here is the error I get:

Traceback (most recent call last):
  File "/opt/homebrew/lib/python3.11/site-packages/flask/app.py", line 1455, in wsgi_app
    response = self.full_dispatch_request()
  File "/opt/homebrew/lib/python3.11/site-packages/flask/app.py", line 869, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/opt/homebrew/lib/python3.11/site-packages/flask/app.py", line 867, in full_dispatch_request
    rv = self.dispatch_request()
  File "/opt/homebrew/lib/python3.11/site-packages/flask/app.py", line 852, in dispatch_request
    return self.ensure_sync(self.view_functions[rule.endpoint])(**view_args)
  File "/Users/philiphess_1/Desktop/Coding/HR_bot/hr_bot_demo/app.py", line 334, in chat
    response = conversation_chain.run({'question': user_message})
  File "/opt/homebrew/lib/python3.11/site-packages/langchain/chains/base.py", line 500, in run
    _output_key = self._run_output_key
  File "/opt/homebrew/lib/python3.11/site-packages/langchain/chains/base.py", line 449, in _run_output_key
    raise ValueError(
ValueError: `run` not supported when there is not exactly one output key. Got ['answer', 'source_documents'].

Tags: flask, langchain
1 Answer (score: 0)

I ran into a similar issue a few weeks ago using the same chain, but with a ConversationBufferMemory. I just added input_key and output_key as shown below, and it worked:

memory = ConversationBufferMemory(
    memory_key="chat_history",
    input_key="question",   # which chain input to record
    output_key="answer",    # which of the two outputs to store in memory
    return_messages=True,
    chat_memory=message_manager,  # the answerer's own message store
)