# Set up your supervisor, conditional logic, and integrate the nodes into the workflow
```python
members = ["Web_Searcher", "Web_Searcher_Quality", "Blog_Searcher",
           "Blog_Searcher_Quality", "Content_Writer"]
system_prompt = (
    """As a supervisor, your role is to oversee a dialogue between these
    workers: {members}. Execute all the agents one by one and
    determine which worker should take the next action. Each worker is responsible for
    executing a specific task and reporting back their findings and progress. Once all tasks are complete,
    indicate with 'FINISH'."""
)
options = ["FINISH"] + members
function_def = {
    "name": "route",
    "description": "Select the next role.",
    "parameters": {
        "title": "routeSchema",
        "type": "object",
        "properties": {"next": {"title": "Next", "anyOf": [{"enum": options}]}},
        "required": ["next"],
    },
}
prompt = ChatPromptTemplate.from_messages([
    ("system", system_prompt),
    MessagesPlaceholder(variable_name="messages"),
    ("system", "Given the conversation above, who should act next? Or should we FINISH? Select one of: {options}"),
]).partial(options=str(options), members=", ".join(members))
manager_chain = (
    prompt
    | llm.bind_functions(functions=[function_def], function_call="route")
    | JsonOutputFunctionsParser()
)
```
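The `route` function definition above constrains the model to answer with a single `next` field whose value is one of the allowed options. The validation that `routeSchema` expresses can be sketched in pure Python, without langchain (the `parse_route` helper is hypothetical, for illustration only):

```python
import json

members = ["Web_Searcher", "Web_Searcher_Quality", "Blog_Searcher",
           "Blog_Searcher_Quality", "Content_Writer"]
options = ["FINISH"] + members

def parse_route(arguments_json: str) -> str:
    """Validate a route function-call payload against the schema's rules:
    'next' is required and must be one of the enumerated options."""
    args = json.loads(arguments_json)
    if "next" not in args:
        raise ValueError("missing required field 'next'")
    if args["next"] not in options:
        raise ValueError(f"invalid route: {args['next']!r}")
    return args["next"]

print(parse_route('{"next": "Web_Searcher"}'))  # Web_Searcher
```

This is effectively what `JsonOutputFunctionsParser` relies on the model honoring: the `enum` in the schema keeps the supervisor's answer inside the set of known workers plus `FINISH`.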
I expected this to run, but I don't understand why I'm stuck. This is the output I get:

```
    manager_chain = (prompt | llm.bind_functions(functions=[function_def], function_call="route") | JsonOutputFunctionsParser())
                              ^^^^^^^^^^^^^^^^^^^
AttributeError: 'Ollama' object has no attribute 'bind_functions'
```
As the error indicates, the `Ollama` class has no `bind_functions()` method; that method exists on the `ChatOpenAI` class. For Ollama, `langchain_experimental` provides the `OllamaFunctions` wrapper class, which gives it the same API as OpenAI Functions.
Example:

```python
from langchain_experimental.llms.ollama_functions import OllamaFunctions

model = OllamaFunctions(model="llama2")
model = model.bind(
    functions=[
        {
            "name": "route",
            "description": "Select the next role.",
            "parameters": {
                "title": "routeSchema",
                "type": "object",
                "properties": {"next": {"title": "Next", "anyOf": [{"enum": options}]}},
                "required": ["next"],
            },
        }
    ],
    function_call={"name": "route"},
)
```
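Once the router returns valid decisions, the supervisor loop itself is simple: ask which worker acts next, run it, and stop on `FINISH`. A minimal pure-Python sketch of that control flow, using stub agents in place of the real langchain nodes (all names here are illustrative, not part of any library API):

```python
def make_stub(name):
    """Stand-in for a real worker node: appends its result to the state."""
    def agent(state):
        state.append(f"{name} done")
        return state
    return agent

agents = {m: make_stub(m) for m in
          ["Web_Searcher", "Web_Searcher_Quality", "Blog_Searcher",
           "Blog_Searcher_Quality", "Content_Writer"]}

def run_supervisor(router, agents):
    """Drive the loop: router picks the next worker until it says FINISH."""
    state = []
    while True:
        nxt = router(state)
        if nxt == "FINISH":
            break
        state = agents[nxt](state)
    return state

# A real router would call the LLM chain; here a fixed order stands in.
order = iter(list(agents) + ["FINISH"])
result = run_supervisor(lambda state: next(order), agents)
print(result)  # each worker ran once, in order
```

In the actual graph, `manager_chain` plays the role of `router`, and the conditional edges route the state to the chosen worker node.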
This implementation may work for your function-calling use case. However, as the documentation states, "larger and more capable models will perform better with complex schemas and/or multiple functions."