How do I specify the conversation history in Microsoft AutoGen?


Backstory: I'm integrating AutoGen into my application as a way to carry a conversation forward (rather than calling OpenAI's chat completions directly).

How do I inject my own messages into the user_proxy chat history?

Here is what I have so far:

from autogen import AssistantAgent, UserProxyAgent

assistant = AssistantAgent(
    "assistant",
    human_input_mode="NEVER",
    llm_config={"config_list": config_list},
    system_message=str(system_message))

user_proxy = UserProxyAgent(
    "user_proxy",
    human_input_mode="NEVER",
    max_consecutive_auto_reply=0,
)

# TODO: populate chat messages from existing conversation here

user_proxy.initiate_chat(
    assistant,
    clear_history=True,
    message=prompt,
)

last_message = user_proxy.last_message()
logger.info("AUTOGEN: last_message: %s", last_message)
return last_message['content']

Note: I had to set max_consecutive_auto_reply=0, otherwise AutoGen loops forever, sending blank prompts to the assistant.

1 Answer

I think a dirty but effective solution is to modify the _oai_messages of both the assistant and the user_proxy. If you look at the source code, you can see that self._oai_system_message + self._oai_messages is what gets sent to the OpenAI API:

    def generate_oai_reply(
        self,
        messages: Optional[List[Dict]] = None,
        sender: Optional[Agent] = None,
        config: Optional[OpenAIWrapper] = None,
    ) -> Tuple[bool, Union[str, Dict, None]]:
        """Generate a reply using autogen.oai."""
        client = self.client if config is None else config
        if client is None:
            return False, None
        if messages is None:
            messages = self._oai_messages[sender]

        # TODO: #1143 handle token limit exceeded error
        response = client.create(
            context=messages[-1].pop("context", None), messages=self._oai_system_message + messages
        )

        # TODO: line 301, line 271 is converting messages to dict. Can be removed after ChatCompletionMessage_to_dict is merged.
        extracted_response = client.extract_text_or_completion_object(response)[0]
        if not isinstance(extracted_response, str):
            extracted_response = extracted_response.model_dump(mode="dict")
        return True, extracted_response
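For example, here is a rough sketch of seeding both agents before the chat starts. This pokes at autogen's private internals (as of the 0.2-era ConversableAgent), and the history list below is a hypothetical stand-in for your stored conversation:

# Hypothetical prior turns from your own store, as OpenAI-style chat dicts.
history = [
    {"role": "user", "content": "My name is Alice."},
    {"role": "assistant", "content": "Nice to meet you, Alice!"},
]

# Each agent keys _oai_messages by its chat partner and records its own
# turns as "assistant" and the partner's turns as "user", so the two
# copies have to be mirrored.
for msg in history:
    if msg["role"] == "user":
        # A turn the human sent: the proxy "said" it, the assistant "heard" it.
        user_proxy._oai_messages[assistant].append({"role": "assistant", "content": msg["content"]})
        assistant._oai_messages[user_proxy].append({"role": "user", "content": msg["content"]})
    else:
        # A turn the model sent: mirrored the other way around.
        user_proxy._oai_messages[assistant].append({"role": "user", "content": msg["content"]})
        assistant._oai_messages[user_proxy].append({"role": "assistant", "content": msg["content"]})

Also note that initiate_chat(..., clear_history=True), as in the question's snippet, wipes both agents' message logs before the first turn, so you would need to pass clear_history=False for the injected history to survive.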