Consuming a Prophet time-series forecasting model deployed on Azure-ML to predict the future?


I am new to ML model deployment and Azure-ML. I have trained a Prophet model and deployed it in Azure using MLflow, which generated an endpoint for it. However, I am stuck on the structure (or format) of the input_data for the model endpoint. The problem is that Prophet takes a DataFrame as input for producing a forecast, whereas the endpoint consumption code expects JSON as its input.

If my training DataFrame is:

import pandas as pd
# Dataframe for Time-series model training
df = pd.DataFrame({'ds': ['2023-01-11 01:00:00', '2023-01-11 02:00:00'], 'y': [21, 22]})

from prophet import Prophet
# Training the model
model = Prophet()
model.fit(df)

# For predicting the forecast
future = model.make_future_dataframe(periods=2)
model.predict(future)

Since Prophet's predict() needs a DataFrame while the endpoint only accepts JSON, what should I do? What am I missing, and how can I get further? Please advise; I am new to model deployment and Azure.
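For reference, here is a minimal sketch of how the future DataFrame might be turned into the JSON body, assuming the endpoint was created from an MLflow model with a DataFrame-based signature (Azure ML's MLflow deployments commonly accept a pandas DataFrame in "split" orientation under the "input_data" key; the exact schema depends on the signature logged with the model):

import json

# 'model' is the Prophet model trained above.
future = model.make_future_dataframe(periods=2)
# Timestamps are not JSON-serializable, so convert them to strings first.
future['ds'] = future['ds'].dt.strftime('%Y-%m-%d %H:%M:%S')

payload = {
    "input_data": {                            # assumed key for MLflow-based deployments
        "columns": future.columns.tolist(),    # e.g. ["ds"]
        "index": future.index.tolist(),        # e.g. [0, 1, 2, 3]
        "data": future.values.tolist(),        # e.g. [["2023-01-11 01:00:00"], ...]
    }
}
body = json.dumps(payload).encode("utf-8")     # candidate for the empty "input_data" in the script below

If the model signature was logged differently (for example as a tensor), the shape of "input_data" would presumably need to change accordingly.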

Here is the code I use to consume the endpoint generated in Azure:

import urllib.request
import json
import os
import ssl

def allowSelfSignedHttps(allowed):
    # bypass the server certificate verification on client side
    if allowed and not os.environ.get('PYTHONHTTPSVERIFY', '') and getattr(ssl, '_create_unverified_context', None):
        ssl._create_default_https_context = ssl._create_unverified_context

allowSelfSignedHttps(True) # this line is needed if you use self-signed certificate in your scoring service.

# Request data goes here
# The example below assumes JSON formatting which may be updated
# depending on the format your endpoint expects.
# More information can be found here:
# https://docs.microsoft.com/azure/machine-learning/how-to-deploy-advanced-entry-script

data =  {                   ## I want to know about the format of data to be given here
  "input_data": {}
}

body = str.encode(json.dumps(data))

url = 'The_Model_End-points'
# Replace this with the primary/secondary key or AMLToken for the endpoint
api_key = 'The-API-Key'
if not api_key:
    raise Exception("A key should be provided to invoke the endpoint")

# The azureml-model-deployment header will force the request to go to a specific deployment.
# Remove this header to have the request observe the endpoint traffic rules
headers = {'Content-Type':'application/json', 'Authorization':('Bearer '+ api_key), 'azureml-model-deployment':'MymodelName' }

req = urllib.request.Request(url, body, headers)

try:
    response = urllib.request.urlopen(req)

    result = response.read()
    print(result)
except urllib.error.HTTPError as error:
    print("The request failed with status code: " + str(error.code))

    # Print the headers - they include the request ID and the timestamp, which are useful for debugging the failure
    print(error.info())
    print(error.read().decode("utf8", 'ignore'))
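For completeness, a small sketch of how the response might be parsed back into a DataFrame, assuming the MLflow-backed endpoint returns the output of model.predict() serialized as JSON; the column names used here ("ds", "yhat", "yhat_lower", "yhat_upper") are Prophet's usual forecast columns and are an assumption about the response shape, not confirmed output:

import json
import pandas as pd

def parse_forecast(result_bytes):
    # 'result_bytes' is the raw bytes returned by response.read() in the script above.
    rows = json.loads(result_bytes)
    return pd.DataFrame(rows)

# Hypothetical usage, if the call above succeeds:
# forecast = parse_forecast(result)
# print(forecast[["ds", "yhat", "yhat_lower", "yhat_upper"]])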
Tags: python, azure, machine-learning, azure-deployment, facebook-prophet