Custom prompt with LlamaIndex

Question

I have built a chatbot with LlamaIndex that answers questions from a PDF. I also want to add a custom prompt so that, if the user's message is about booking an appointment, the bot responds with "booknow!".

Here is my basic implementation:

    upload_dir = 'uploads/machinebuilt'
    file_paths = [
        os.path.join(upload_dir, filename)
        for filename in os.listdir(upload_dir)
        if os.path.isfile(os.path.join(upload_dir, filename))
    ]
    documents = SimpleDirectoryReader(input_files=file_paths).load_data()
    index = VectorStoreIndex.from_documents(documents)
    chat_engine = index.as_chat_engine(
        response_mode="compact",
        text_qa_template=PromptTemplate(text_qa_template_str),
    )
    response = chat_engine.chat(question)
    json_response = json.dumps({"response": response}, default=custom_serializer)
    response_dict = json.loads(json_response)
    final_response = response_dict['response']

How can I add this prompt without affecting the existing performance?

I tried the following, but the booking behaviour does not work:

    # Imports shown for completeness; paths assume llama_index >= 0.10
    # (older versions import these names from `llama_index` directly).
    from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
    from llama_index.core.llms import ChatMessage, MessageRole
    from llama_index.core.prompts import ChatPromptTemplate

    question = request.json.get('question')

    qa_prompt_str = (
        "Context information is below.\n"
        "---------------------\n"
        "{context_str}\n"
        "---------------------\n"
        "Given the context information and not prior knowledge, "
        "answer the question: {query_str}\n"
    )

    refine_prompt_str = (
        "We have the opportunity to refine the original answer "
        "(only if needed) with some more context below.\n"
        "------------\n"
        "{context_msg}\n"
        "------------\n"
        "Given the new context, refine the original answer to better "
        "answer the question: {query_str}. "
        "If the question is about or related to booking an appointment, output the Appointment Answer.\n"
        "Appointment Answer: booknow!"
    )

    # Text QA prompt
    chat_text_qa_msgs = [
        ChatMessage(
            role=MessageRole.SYSTEM,
            content="Always answer the question, even if the context isn't helpful.",
        ),
        ChatMessage(role=MessageRole.USER, content=qa_prompt_str),
    ]
    text_qa_template = ChatPromptTemplate(chat_text_qa_msgs)

    # Refine prompt
    chat_refine_msgs = [
        ChatMessage(
            role=MessageRole.SYSTEM,
            content="Always answer the question, even if the context isn't helpful.",
        ),
        ChatMessage(role=MessageRole.USER, content=refine_prompt_str),
    ]
    refine_template = ChatPromptTemplate(chat_refine_msgs)

    upload_dir = 'uploads/machinebuilt'
    file_paths = [
        os.path.join(upload_dir, filename)
        for filename in os.listdir(upload_dir)
        if os.path.isfile(os.path.join(upload_dir, filename))
    ]
    documents = SimpleDirectoryReader(input_files=file_paths).load_data()
    index = VectorStoreIndex.from_documents(documents)
    chat_engine = index.as_chat_engine(
        response_mode="compact",
        text_qa_template=text_qa_template,
        refine_template=refine_template,
    )
    response = chat_engine.chat(question)
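
For reference, a sketch of an alternative wiring that may help (untested here, reusing the variables above): with the "compact" response mode the refine template is only used when the retrieved context does not fit in a single LLM call, so a rule that lives only in `refine_prompt_str` may never fire, and it is worth adding the booking instruction to `qa_prompt_str` as well. Attaching the templates to an explicit query engine also guarantees they reach the response synthesizer; `CondenseQuestionChatEngine` and `get_prompts()` are standard LlamaIndex APIs, but this exact combination is an assumption, not the original poster's setup.

    # Sketch: attach the templates to an explicit query engine, then wrap it
    # as a chat engine so the chat layer cannot drop the custom prompts.
    from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
    from llama_index.core.chat_engine import CondenseQuestionChatEngine

    documents = SimpleDirectoryReader(input_files=file_paths).load_data()
    index = VectorStoreIndex.from_documents(documents)

    query_engine = index.as_query_engine(
        response_mode="compact",
        text_qa_template=text_qa_template,   # templates defined above
        refine_template=refine_template,
    )

    # Optional sanity check: confirm the custom templates were registered.
    print(query_engine.get_prompts().keys())

    chat_engine = CondenseQuestionChatEngine.from_defaults(query_engine=query_engine)
    response = chat_engine.chat(question)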
Tags: chatbot, large-language-model, llama-index
1 Answer

Did you solve this? I also want to add extra information through the prompt, but I haven't been able to get it to work.
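
Not a full answer, but one documented pattern that may help here: the `context` chat mode accepts a `system_prompt`, which is applied on every turn, so standing instructions (extra information, the booking rule, and so on) can be injected without touching the QA/refine templates. A minimal sketch, assuming the `index` built above; the prompt wording is illustrative only:

    # Sketch: inject standing instructions via a system prompt on a context chat engine.
    chat_engine = index.as_chat_engine(
        chat_mode="context",
        system_prompt=(
            "Answer questions using the provided PDF context. "
            "If the user asks about booking an appointment, reply with 'booknow!'."
        ),
    )
    response = chat_engine.chat(question)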
