I'm building a Q&A chatbot powered by LLMs. I've seen chatbots such as Bing Chat predict the top three questions the user is likely to ask next.
My question is: how can I do the same thing in my chatbot?
I have already implemented the QA chatbot using LangChain.
Approaches I have considered:
Are there other methods/tools that can accomplish this task (I couldn't find any)?
I tried the two options below, with and without history, and was able to predict the next question successfully. I just made sure the role and context were set correctly. You can adapt the code and build chains where the first chain answers from your domain knowledge (the question answering for your actual use case) and the second chain predicts the next question.
from langchain.chat_models import ChatOpenAI
from langchain.prompts.chat import (
    ChatPromptTemplate,
    SystemMessagePromptTemplate,
    HumanMessagePromptTemplate,
)
from langchain.chains import LLMChain, ConversationChain
from langchain.memory import ConversationBufferMemory
from langchain.prompts import PromptTemplate
#Option 1 Without history
template = """You are a helpful assistant that predicts the next question based on the current question.
You always provide the predicted question on a new line"""
system_message_prompt = SystemMessagePromptTemplate.from_template(template)
human_template = "{text}"
human_message_prompt = HumanMessagePromptTemplate.from_template(human_template)
chat_prompt = ChatPromptTemplate.from_messages([system_message_prompt, human_message_prompt])
chain = LLMChain(
    llm=ChatOpenAI(temperature=1),
    prompt=chat_prompt,
    verbose=True,
)
print(chain.run("Is computer science the right field?"))
#Option 2 With History
template = """You are a helpful assistant that predicts the next question based on the chat history.
Don't forget: you always provide the predicted question on a new line with the "Predicted Question" prefix.
Current conversation:
{history}
Human: {input}
AI Assistant:"""
PROMPT = PromptTemplate(input_variables=["history", "input"], template=template)
conversation = ConversationChain(
    prompt=PROMPT,
    llm=ChatOpenAI(model_name="gpt-3.5-turbo", temperature=1),
    verbose=True,
    memory=ConversationBufferMemory(ai_prefix="AI Assistant"),
)
print(conversation.predict(input="Is computer science a good field?"))
print(conversation.predict(input="Is computer science a complicated field?"))
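Since the prompt asks the model to emit suggestions prefixed with "Predicted Question", you will also want a small parser that pulls those lines out of the raw completion before showing them as clickable suggestions. Here is a minimal sketch (the prefix string, the separator stripping, and the top-3 cutoff are my assumptions, not anything LangChain provides):

```python
def extract_predicted_questions(completion: str,
                                prefix: str = "Predicted Question",
                                limit: int = 3) -> list[str]:
    """Collect lines that start with the given prefix from a raw LLM completion."""
    questions = []
    for line in completion.splitlines():
        line = line.strip()
        if line.lower().startswith(prefix.lower()):
            # Drop the prefix plus any separator characters such as ':' or '-'
            question = line[len(prefix):].lstrip(" :-")
            if question:
                questions.append(question)
    return questions[:limit]


raw = """Sure, here is my answer.
Predicted Question: Is a CS degree required to work as a developer?
Predicted Question: How much math does computer science involve?"""
print(extract_predicted_questions(raw))
```

Parsing defensively like this matters because the model will not always format its output perfectly; anything that doesn't match the prefix is simply ignored rather than shown to the user.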