I'm new to LangChain and have recently been learning about multi-route chains. While practising on my own, I wondered why we can't use ChatPromptTemplate instead of PromptTemplate to build the router prompt.
Router_temp = MULTI_PROMPT_ROUTER_TEMPLATE.format(destinations=destination_str)
router_prompt = ChatPromptTemplate(template=Router_temp, input_variables=["input"], output_parser=RouterOutputParser())
router_chain = LLMRouterChain.from_llm(llm=llm, prompt=router_prompt)
Here is the error it throws: KeyError: 'messages'
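My current understanding (which may well be wrong, hence the question) is that ChatPromptTemplate is built around a list of messages, so its constructor expects a `messages` argument and its output is a list of role/content pairs, while PromptTemplate formats to a single string that an output parser can work on. Here is a minimal pure-Python sketch of the difference I mean, using hypothetical toy classes, not the real LangChain ones:

```python
# Hypothetical toy classes (NOT the real LangChain classes) illustrating
# the difference between a string template and a chat template.

class StringTemplate:
    """Formats to a single string, the way PromptTemplate does."""
    def __init__(self, template):
        self.template = template

    def format(self, **kwargs):
        return self.template.format(**kwargs)


class ChatTemplate:
    """Built around a list of (role, template) messages, like ChatPromptTemplate."""
    def __init__(self, messages):
        # Note: this requires `messages` -- passing only a `template`
        # string leaves `messages` missing, which seems to be what the
        # KeyError is complaining about.
        self.messages = messages

    def format(self, **kwargs):
        return [(role, t.format(**kwargs)) for role, t in self.messages]


s = StringTemplate("Route this input: {input}").format(input="gravity")
c = ChatTemplate([("human", "Route this input: {input}")]).format(input="gravity")
print(type(s).__name__)  # str  -- a single string a parser can run over
print(type(c).__name__)  # list -- a list of messages, not a plain string
```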
However, when I used ChatPromptTemplate earlier in the code for a similar situation, it worked, as shown below:
beginner_phy_teach = """You are a Physics teacher who specialises in basic physics concepts. Your name is Feynman; always introduce yourself. Explain these concepts as if your audience is five years old.
The topic is as follows:\n{input}"""
advanced_phy_teach = """You are a Physics teacher who specialises in advanced physics concepts. Your name is Oppenheimer; always introduce yourself. Explain these concepts as if your audience already knows the basics.
The topic is as follows:\n{input}"""
prompt_info = [
    {"Name": "Beginner Physics Teacher",
     "Description": "Good for answering questions related to beginner Physics concepts",
     "Template": beginner_phy_teach},
    {"Name": "Advanced Physics Teacher",
     "Description": "Good for answering questions related to advanced Physics concepts",
     "Template": advanced_phy_teach},
]
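For completeness, the `destination_str` used in the router snippet at the top isn't shown above; I build it from `prompt_info` roughly like this (the exact formatting is my own choice, and `Template` is omitted here for brevity):

```python
# Minimal stand-in for prompt_info with the same keys as in my code
# (the "Template" entries are omitted for brevity).
prompt_info = [
    {"Name": "Beginner Physics Teacher",
     "Description": "Good for answering questions related to beginner Physics concepts"},
    {"Name": "Advanced Physics Teacher",
     "Description": "Good for answering questions related to advanced Physics concepts"},
]

# One "Name: Description" line per destination chain.
destinations = [f"{p['Name']}: {p['Description']}" for p in prompt_info]
destination_str = "\n".join(destinations)
print(destination_str)
```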
destination_chains = {}
for p in prompt_info:
    name = p["Name"]
    template = p["Template"]
    chat_prompt = ChatPromptTemplate.from_template(template=template)
    chain = LLMChain(llm=llm, prompt=chat_prompt)
    destination_chains[name] = chain
Creating a template with ChatPromptTemplate and then passing it to the LLM also worked:
llm = ChatOpenAI(temperature=0, openai_api_key=openai_api_key)
# Create a chat prompt template
prompt_template = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a smart AI agent."),
        ("human", "Respond to the question: {question}"),
    ]
)
# Insert a question into the template and call the model
full_prompt = prompt_template.format_messages(question='Why is the sky blue?')
llm(full_prompt)