ConversationChain and context in langchain


I want to build a chatbot based on langchain. In the first message of the conversation, I want to pass in some initial context.

Is there any way to do this? I'm struggling with it because, from what I can see, I could use a prompt template. Going by their example:

template = """The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.

Current conversation:
{history}
Human: {input}
AI Assistant:"""
PROMPT = PromptTemplate(input_variables=["history", "input"], template=template)
conversation = ConversationChain(
    prompt=PROMPT,
    llm=llm,
    verbose=True,
    memory=ConversationBufferMemory(ai_prefix="AI Assistant"),
)
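
I'd then invoke it the usual way, with predict (assuming llm is already set up):

conversation.predict(input="Hi there!")
conversation.predict(input="Tell me about yourself.")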

But the problem is that my usual way of working with the model is to use a SystemMessage, which gives the bot its context and guidance. I'm not sure whether this template is the recommended way to handle system messages in langchain. If it isn't, could you please explain the correct approach?
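
For reference, this is roughly what my current approach looks like, as a minimal sketch where the system prompt text is just a placeholder:

from langchain.chat_models import ChatOpenAI
from langchain.schema import HumanMessage, SystemMessage

# The SystemMessage carries the context and instructions for the bot;
# each user turn is sent as a HumanMessage.
chat = ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0)

messages = [
    SystemMessage(content="You are a helpful assistant that answers questions about our product."),
    HumanMessage(content="Which plans do you offer?"),
]
print(chat(messages).content)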

python-3.x chatbot langchain
1 Answer

You can use a ChatPromptTemplate; to set the context you can use HumanMessage and AIMessage prompts. Below is a working code example:

from langchain.chains import LLMChain
from langchain.chat_models import ChatOpenAI

from langchain.prompts.chat import (
    ChatPromptTemplate,
    SystemMessagePromptTemplate,
    AIMessagePromptTemplate,
    HumanMessagePromptTemplate,
)

# The system message carries the initial context and instructions for the bot.
template = """The following is a friendly conversation between a human and an AI.
The AI is talkative and provides lots of specific details from its context.
If the AI does not know the answer to a question, it truthfully says it does not know."""

system_message_prompt = SystemMessagePromptTemplate.from_template(template)

# Optional "seed" history: an example human turn and an example AI reply.
example_human_history = HumanMessagePromptTemplate.from_template("Hi")
example_ai_history = AIMessagePromptTemplate.from_template("hello, how are you today?")

# The actual user question is filled in as the final human message.
human_template = "{input}"
human_message_prompt = HumanMessagePromptTemplate.from_template(human_template)

chat_prompt = ChatPromptTemplate.from_messages(
    [system_message_prompt, example_human_history, example_ai_history, human_message_prompt]
)

chat = ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0)

chain = LLMChain(llm=chat, prompt=chat_prompt)

print(chain.run("What is the future of generative AI ? Explain in two sentences ?"))
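
If you also want the per-conversation memory that ConversationChain provides, the same system message can be combined with a MessagesPlaceholder that receives the buffered history. A minimal sketch, assuming the same legacy langchain API as above (the system prompt text and the example inputs are placeholders):

from langchain.chains import ConversationChain
from langchain.chat_models import ChatOpenAI
from langchain.memory import ConversationBufferMemory
from langchain.prompts import MessagesPlaceholder
from langchain.prompts.chat import (
    ChatPromptTemplate,
    HumanMessagePromptTemplate,
    SystemMessagePromptTemplate,
)

# System message first (the initial context), then the buffered history,
# then the current user input.
chat_prompt = ChatPromptTemplate.from_messages([
    SystemMessagePromptTemplate.from_template(
        "The following is a friendly conversation between a human and an AI. "
        "If the AI does not know the answer, it truthfully says it does not know."
    ),
    MessagesPlaceholder(variable_name="history"),
    HumanMessagePromptTemplate.from_template("{input}"),
])

# return_messages=True makes the memory return Message objects,
# which is what MessagesPlaceholder expects.
memory = ConversationBufferMemory(return_messages=True)

conversation = ConversationChain(
    llm=ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0),
    prompt=chat_prompt,
    memory=memory,
    verbose=True,
)

print(conversation.predict(input="Hi, I'm planning a trip to Japan."))
print(conversation.predict(input="What was I just asking about?"))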