TensorFlow Serving REST API - JSON parse error

Question

I have frozen and exported a SavedModel which, according to saved_model_cli, takes as input a batch of videos with the following format:

The given SavedModel SignatureDef contains the following input(s):
inputs['ims_ph'] tensor_info:
    dtype: DT_UINT8
    shape: (1, 248, 224, 224, 3)
    name: Placeholder:0
inputs['samples_ph'] tensor_info:
    dtype: DT_FLOAT
    shape: (1, 173774, 2)
    name: Placeholder_1:0
The given SavedModel SignatureDef contains the following output(s):
... << OUTPUTS >> ......
Method name is: tensorflow/serving/predict
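
(For reference, signature output like the above is typically obtained with a command along these lines; the export directory path is a placeholder:)

saved_model_cli show --dir /path/to/export_dir --tag_set serve --signature_def serving_default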

I have a TF Serving (HTTP/REST) server running successfully on my local machine. In my Python client code, I have two populated objects of type numpy.ndarray: ims with shape (1, 248, 224, 224, 3) and samples with shape (1, 173774, 2).

I am trying to run inference against my TF model server (see the client code below), but I get the following error:

{u'error': u'JSON Parse error: Invalid value. at offset: 0'}

I have tried the following combinations without success:

data = {"instances": [{"ims_ph": ims.tolist()}, {"samples_ph": samples.tolist()}]}
data = {"inputs": {"ims_ph": ims, "samples_ph": samples}}
r = requests.post(url="http://localhost:9000/v1/models/multisensory:predict", data=data)

The TF Serving docs don't seem to indicate that these two input tensors need any extra escaping/encoding. Since this isn't binary data, I also don't think base64 encoding is the right approach. Any pointers towards a working approach here would be greatly appreciated!
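
For what it's worth, an error at offset 0 is consistent with the server not receiving JSON at all: requests.post(data=some_dict) form-encodes the dictionary rather than sending a JSON body. Below is a minimal sketch of how such a predict request is commonly serialized, assuming the arrays just need to be converted to nested lists and posted as an explicit JSON string; the URL and tensor names are copied from the question, and the zero-filled arrays are placeholders, so this is an illustration rather than a confirmed fix.

import json

import numpy as np
import requests

# Placeholder arrays matching the shapes and dtypes reported by saved_model_cli.
ims = np.zeros((1, 248, 224, 224, 3), dtype=np.uint8)
samples = np.zeros((1, 173774, 2), dtype=np.float32)

# ndarray objects are not JSON-serializable, so convert them to nested lists
# and serialize the whole payload explicitly before posting.
payload = {"inputs": {"ims_ph": ims.tolist(), "samples_ph": samples.tolist()}}

r = requests.post(
    url="http://localhost:9000/v1/models/multisensory:predict",
    data=json.dumps(payload),                      # or equivalently: json=payload
    headers={"Content-Type": "application/json"},
)
print(r.json())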

python rest tensorflow tensorflow-serving
1 Answer