LangChain Python: OllamaFunctions with structured output


I am following this guide to set up Self-RAG: https://github.com/langchain-ai/langgraph/blob/main/examples/rag/langgraph_self_rag.ipynb

I'm not currently allowed to use OpenAI models, so I have been using the ChatOllama model. I would like to be able to pipe output through the with_structured_output() function using OllamaFunctions instead of ChatOllama, as demonstrated here: https://python.langchain.com/docs/integrations/chat/ollama_functions/

Essentially, the code is:

from langchain_core.prompts import PromptTemplate
from langchain_core.pydantic_v1 import BaseModel, Field
from langchain_experimental.llms.ollama_functions import OllamaFunctions


# Schema for structured response
class Person(BaseModel):
    name: str = Field(description="The person's name", required=True)
    height: float = Field(description="The person's height", required=True)
    hair_color: str = Field(description="The person's hair color")


# Prompt template
prompt = PromptTemplate.from_template(
    """Alex is 5 feet tall. 
Claudia is 1 feet taller than Alex and jumps higher than him. 
Claudia is a brunette and Alex is blonde.

Human: {question}
AI: """
)

# Chain
llm = OllamaFunctions(model="phi3", format="json", temperature=0)
structured_llm = llm.with_structured_output(Person)
chain = prompt | structured_llm
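
For reference, the linked docs then invoke the chain roughly like this (a minimal sketch; the exact question string and the printed result are assumptions, and actual values depend on the model):

# Invoke the chain; with_structured_output(Person) should return a Person instance.
response = chain.invoke({"question": "Describe Alex"})
print(response)
# e.g. Person(name='Alex', height=5.0, hair_color='blonde')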

I'm running into two errors that have me at a dead end. The first is:

ValidationError: 1 validation error for OllamaFunctions
__root__
  langchain_community.chat_models.ollama.ChatOllama() got multiple values for keyword argument 'format' (type=type_error)

So I changed

llm = OllamaFunctions(model="phi3", format="json", temperature=0)

to

llm = OllamaFunctions(model="phi3", temperature=0)

That at least gets me to the next line. Then the with_structured_output(Person) line fails with this error:

File ~/anaconda3/envs/pytorch_p310/lib/python3.10/site-packages/langchain_core/language_models/base.py:208, in BaseLanguageModel.with_structured_output(self, schema, **kwargs)
    204 def with_structured_output(
    205     self, schema: Union[Dict, Type[BaseModel]], **kwargs: Any
    206 ) -> Runnable[LanguageModelInput, Union[Dict, BaseModel]]:
    207     """Implement this if there is a way of steering the model to generate responses that match a given schema."""  # noqa: E501
--> 208     raise NotImplementedError()

NotImplementedError:

I don't know where to go from here. Any help would be appreciated. Thanks!

Tags: langchain, llama
1 Answer

I ran into the same problem as you. After checking the code on git (https://github.com/langchain-ai/langchain/blob/master/libs/experimental/langchain_experimental/llms/ollama_functions.py) and comparing it with the code included in the package in my environment, it appears that a large chunk of code that should support .with_structured_output() is missing. I replaced my local copy with the code from git and it seems to work fine.
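
For what it's worth, rather than patching the installed files by hand, upgrading the package should pull in the same code (a sketch; which released version first includes the .with_structured_output() support is an assumption, so verify after upgrading):

pip install --upgrade langchain-experimental

After the upgrade, the snippet from the question, including llm.with_structured_output(Person), should run without raising NotImplementedError.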
