How do I query the REST API served by tensorflow_model_server?


I am trying to run this tutorial with this code, which implements the simple TensorFlow estimators example: official Iris classification problem, and to save the model.

TensorFlow provides a command-line tool to inspect the exported model, as follows:

$ saved_model_cli show --dir export/1550568903/ \
> --tag_set serve --signature_def serving_default
The given SavedModel SignatureDef contains the following input(s):
  inputs['inputs'] tensor_info:
      dtype: DT_STRING
      shape: (-1)
      name: input_example_tensor:0
The given SavedModel SignatureDef contains the following output(s):
  outputs['classes'] tensor_info:
      dtype: DT_STRING
      shape: (-1, 3)
      name: dnn/head/Tile:0
  outputs['scores'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1, 3)
      name: dnn/head/predictions/probabilities:0
Method name is: tensorflow/serving/classify
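
For context, the export step follows the tutorial's pattern; a simplified sketch (not my exact code) is below. build_parsing_serving_input_receiver_fn is what makes the signature take a single DT_STRING tensor of serialized tf.Example protos:

import tensorflow as tf

# Feature columns matching the four Iris features used in the tutorial.
feature_columns = [
    tf.feature_column.numeric_column(key)
    for key in ["SepalLength", "SepalWidth", "PetalLength", "PetalWidth"]
]

classifier = tf.estimator.DNNClassifier(
    feature_columns=feature_columns,
    hidden_units=[10, 10],
    n_classes=3,
    model_dir="iris_model")  # assumes a trained checkpoint already exists here

# A parsing serving input receiver expects serialized tf.Example protos,
# which is why the exported signature has a single DT_STRING input tensor.
feature_spec = tf.feature_column.make_parse_example_spec(feature_columns)
serving_input_receiver_fn = (
    tf.estimator.export.build_parsing_serving_input_receiver_fn(feature_spec))
classifier.export_savedmodel("export", serving_input_receiver_fn)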

I installed tensorflow-model-server and started the model server with the REST API enabled:

$ apt-get install tensorflow-model-server
$ tensorflow_model_server --port=9000 --rest_api_port=9001 --model_base_path=/home/deploy/export
...
2019-02-22 11:36:44.989600: I tensorflow_serving/model_servers/server.cc:302] Exporting HTTP/REST API at:localhost:9001 ...
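
Since --model_name was not set, the model is served under the default name "default". A quick check against the model-status endpoint (a small sketch using Python's requests, assuming the ports above) confirms it loaded:

import requests

# Model status API; "default" is the model name when --model_name is not given.
resp = requests.get("http://localhost:9001/v1/models/default")
print(resp.json())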

Then I called the REST API as follows:

$ curl -d '{"inputs":[{"SepalLength":[5.1],"SepalWidth":[3.3],"PetalLength":[1.7],"PetalWidth":[0.5]}]}' \
   -X POST http://localhost:9001/v1/models/default:predict

{"error": "JSON Value: {\n \"SepalLength\": [\n 5.1\n ],\n \"SepalWidth\": [\n 3.3\n ],\n \"PetalLength\": [\n 1.7\n ],\n \"PetalWidth\": [\n 0.5\n ]\n} Is not formatted correctly for base64 data"}

The error says the data is "not formatted correctly for base64 data", so I base64-encoded the input as follows:

$ curl -d '{"inputs": [{"b64": "W3siU2VwYWxMZW5ndGgiOls1LjFdLCJTZXBhbFdpZHRoIjpbMy4zXSwiUGV0YWxMZW5ndGgiOlsxLjddLCJQZXRhbFdpZHRoIjpbMC41XX1d"}]}' \
   -X POST http://localhost:9001/v1/models/default:predict

However, it still fails with the following error:

{"error": "Could not parse example input, value: '[{\"SepalLength\": [5.1], \"SepalWidth\": [3.3], \"PetalLength\": [1.7], \"PetalWidth\": [0.5]}]'\n\t [[{{node ParseExample/ParseExample}} = ParseExample[Ndense=4, Nsparse=0, Tdense=[DT_FLOAT, DT_FLOAT, DT_FLOAT, DT_FLOAT], _output_shapes=[[?,1], [?,1], [?,1], [?,1]], dense_shapes=[[1], [1], [1], [1]], sparse_types=[], _device=\"/job:localhost/replica:0/task:0/device:CPU:0\"](_arg_input_example_tensor_0_0, ParseExample/ParseExample/names, ParseExample/ParseExample/dense_keys_0, ParseExample/ParseExample/dense_keys_1, ParseExample/ParseExample/dense_keys_2, ParseExample/ParseExample/dense_keys_3, ParseExample/Const, ParseExample/Const, ParseExample/Const, ParseExample/Const)]]"}
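
Looking at the signature again, I suspect the inputs tensor wants base64-encoded serialized tf.Example protos rather than base64-encoded JSON. Something like this sketch (untested) is what I think it expects:

import base64
import json
import tensorflow as tf

# Build a tf.Example containing the four Iris features.
example = tf.train.Example(features=tf.train.Features(feature={
    "SepalLength": tf.train.Feature(float_list=tf.train.FloatList(value=[5.1])),
    "SepalWidth":  tf.train.Feature(float_list=tf.train.FloatList(value=[3.3])),
    "PetalLength": tf.train.Feature(float_list=tf.train.FloatList(value=[1.7])),
    "PetalWidth":  tf.train.Feature(float_list=tf.train.FloatList(value=[0.5])),
}))

# Base64-encode the serialized proto for the REST predict API's "b64" field.
payload = {
    "inputs": [{"b64": base64.b64encode(example.SerializeToString()).decode("utf-8")}]
}
print(json.dumps(payload))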

What am I doing wrong? How do I call the REST API without errors?

tensorflow tensorflow-serving tensorflow-estimator
1 Answer

I tried to reproduce your error and got a similar error with the curl Predict call.

But when I used Classify instead, I got output.

The command looks like this:

curl -d '{"examples":[{"SepalLength":[5.1],"SepalWidth":[3.3],"PetalLength":[1.7],"PetalWidth":[0.5]}]}' -X POST http://localhost:8501/v1/models/export:classify

The output is:

 {"results": [[["0", 0.998091], ["1", 0.00190929], ["2", 1.46236e-08]]]}