Azure GPT API streaming response

Asked · Votes: 0 · Answers: 2

I want to stream responses from the GPT API in Node.js. I can stream a response from the OpenAI GPT API with the following code:

import OpenAI from "openai";

const openai = new OpenAI({
    apiKey: 'my_api_key',
});
    
const response = await openai.chat.completions.create({
    model: "gpt-3.5-turbo",
    messages: [
        {
            "role": "user",
            "content": "Generate Lorem Ipsum text."
        }
    ],
    temperature: 0,
    stream: true
});


for await (const chunk of response) {
    // The final chunk may carry no content, so fall back to an empty string
    process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
}

How can I rewrite this code for Azure? I tried something similar, but it doesn't work:

const {OpenAIClient, AzureKeyCredential} = require("@azure/openai");
const endpoint = 'my_endpoint';
const azureApiKey = 'my_azure_gpt_api_key';


const messages = [
    {
        "role": "user",
        "content": "Generate Lorem Ipsum text."
    }
];


async function main() {
    const client = new OpenAIClient(endpoint, new AzureKeyCredential(azureApiKey));
    const deploymentId = "gpt35-turbo-deploy";
    const response = await client.getChatCompletions(deploymentId, messages, {
        temperature: 0,
        stream: true
    })

    // TODO: how do I print the streamed response here? console.log(response['choices'][0]['message']['content']) doesn't work for a stream.
}

main().catch((err) => {
    console.error("The sample encountered an error:", err);
});

module.exports = {main};

Should I use the library mentioned above, or can I use something different?

javascript node.js azure openai-api
2 Answers

1 vote

The Azure OpenAI SDK does not currently seem to support streaming responses directly.

You can implement streaming chat completions based on the readableStream.js sample in the documentation, which uses listChatCompletions with a maxTokens value.

Here is a sample code snippet:

const { OpenAIClient, AzureKeyCredential } = require("@azure/openai");

require("dotenv").config();

const endpoint = process.env["ENDPOINT"] || "<endpoint>";
const azureApiKey = process.env["AZURE_API_KEY"] || "<api key>";

const messages = [
  { role: "system", content: "You are a helpful assistant. " },
  { role: "user", content: "Can you help me?" },
  { role: "user", content: "Generate Lorem Ipsum text." },
];

async function main() {
  console.log("== Streaming Chat Completions Sample ==");

  const client = new OpenAIClient(endpoint, new AzureKeyCredential(azureApiKey));
  const deploymentId = "<Deployment Name>";
  const events = await client.listChatCompletions(deploymentId, messages, { maxTokens: 128 });
  const stream = new ReadableStream({
    async start(controller) {
      for await (const event of events) {
        controller.enqueue(event);
      }
      controller.close();
    },
  });
 
  const reader = stream.getReader();
  while (true) {
    const { done, value } = await reader.read();
    if (done) {
      break;
    }
    for (const choice of value.choices) {
      if (choice.delta?.content !== undefined) {
        console.log(choice.delta?.content);
      }
    }
  }
}

main().catch((err) => {
  console.error("The sample encountered an error:", err);
});
module.exports = { main };


Note: This is a sample; you can modify and reconfigure it as needed.
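The ReadableStream pattern above can be exercised without Azure credentials. Below is a minimal sketch of the same wrapping-and-reading logic, with a hypothetical mock async generator standing in for `listChatCompletions` (the event shape `{ choices: [{ delta: { content } }] }` mirrors the sample). It assumes Node 18+, where `ReadableStream` is a global:

```javascript
// Mock stand-in for client.listChatCompletions(...) -- yields events in the
// same shape the Azure SDK sample iterates over.
async function* mockChatEvents() {
  yield { choices: [{ delta: { content: "Lorem " } }] };
  yield { choices: [{ delta: { content: "ipsum" } }] };
  yield { choices: [{ delta: {} }] }; // final event may carry no content
}

// Wrap an async iterable of events in a ReadableStream, then drain it with
// a reader, accumulating the delta text -- the pattern from the sample above.
async function collectStream(events) {
  const stream = new ReadableStream({
    async start(controller) {
      for await (const event of events) controller.enqueue(event);
      controller.close();
    },
  });

  let text = "";
  const reader = stream.getReader();
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    for (const choice of value.choices) {
      if (choice.delta?.content !== undefined) text += choice.delta.content;
    }
  }
  return text;
}

collectStream(mockChatEvents()).then((text) => console.log(text)); // "Lorem ipsum"
```

With the real SDK you would pass the result of `listChatCompletions` where `mockChatEvents()` appears; the draining loop stays the same.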


0 votes

Azure has added an official way to stream chat completion responses. You can refer to this sample in their repository for implementation details.

Sample snippet:

const { OpenAIClient, AzureKeyCredential } = require("@azure/openai");

const endpoint = process.env["ENDPOINT"] || "<endpoint>";
const azureApiKey = process.env["AZURE_API_KEY"] || "<api key>";

const client = new OpenAIClient(endpoint, new AzureKeyCredential(azureApiKey));
const deploymentId = "gpt-35-turbo";
const events = await client.streamChatCompletions(
  deploymentId,
  [
    {
      role: "system",
      content: "You are a helpful assistant. You will talk like a pirate.",
    },
    { role: "user", content: "Can you help me?" },
    {
      role: "assistant",
      content: "Arrrr! Of course, me hearty! What can I do for ye?",
    },
    { role: "user", content: "What's the best way to train a parrot?" },
  ],
  { maxTokens: 128 }
);

for await (const event of events) {
  for (const choice of event.choices) {
    // Guard against empty deltas, which would otherwise print "undefined"
    if (choice.delta?.content !== undefined) {
      console.log(choice.delta.content);
    }
  }
}
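Since `console.log` adds a newline after every chunk, the reply renders as one fragment per line. A hedged sketch of writing the deltas with `process.stdout.write` instead, so the stream prints as continuous text (a hypothetical mock generator replaces the SDK call so this runs standalone):

```javascript
// Mock stand-in for client.streamChatCompletions(...), same event shape.
async function* mockEvents() {
  yield { choices: [{ delta: { content: "Arrr, " } }] };
  yield { choices: [{ delta: { content: "matey!" } }] };
  yield { choices: [{ delta: {} }] }; // final event may carry no content
}

// Write each delta without a trailing newline; also return the full text.
async function printStream(events) {
  let out = "";
  for await (const event of events) {
    for (const choice of event.choices) {
      const piece = choice.delta?.content ?? "";
      out += piece;
      process.stdout.write(piece);
    }
  }
  return out;
}

printStream(mockEvents()); // prints "Arrr, matey!" as one continuous string
```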