What is wrong with this code that connects to a ResNet TensorFlow Serving service for inference?

Votes: 0 · Answers: 0

I downloaded the TensorFlow Serving ResNet SavedModel archive with the following commands, extracted it, and created a version 1 directory:

wget -c https://storage.googleapis.com/download.tensorflow.org/models/official/20181001_resnet/savedmodels/resnet_v2_fp16_savedmodel_NHWC_jpg.tar.gz
tar -zxvf resnet_v2_fp16_savedmodel_NHWC_jpg.tar.gz
mkdir -p ./resnet_v2_fp16_savedmodel_NHWC_jpg/1538687108/1
cp ./resnet_v2_fp16_savedmodel_NHWC_jpg/1538687108/saved_model.pb ./resnet_v2_fp16_savedmodel_NHWC_jpg/1538687108/1
cp -r ./resnet_v2_fp16_savedmodel_NHWC_jpg/1538687108/variables ./resnet_v2_fp16_savedmodel_NHWC_jpg/1538687108/1
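TensorFlow Serving expects the model base path to contain integer-named version subdirectories, which is why the commands above create a 1 directory. A minimal sketch (paths hard-coded to match the commands above) of the layout the server will look for:

```python
import os

# Layout produced by the commands above: the directory that gets mounted
# into the container must contain integer-named version subdirectories,
# each holding saved_model.pb plus a variables/ directory.
base = "resnet_v2_fp16_savedmodel_NHWC_jpg/1538687108"
expected = [
    os.path.join(base, "1", "saved_model.pb"),
    os.path.join(base, "1", "variables"),
]
for path in expected:
    # os.path.isfile / os.path.isdir can be used here to verify the copies.
    print(path)
```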

Then I pulled and started the TensorFlow Serving Docker service with the following commands:

docker pull tensorflow/serving
docker run -p 8500:8500 -p 8501:8501 --mount type=bind,source=$(pwd)/resnet_v2_fp16_savedmodel_NHWC_jpg/1538687108,target=/models/tf-serving-resnet -e MODEL_NAME=tf-serving-resnet -t tensorflow/serving
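Once the container is up, the model's status can be checked over the REST port. A small sketch, assuming the server is reachable on localhost:8501 as started above:

```python
import json
import urllib.error
import urllib.request

MODEL_NAME = "tf-serving-resnet"  # matches -e MODEL_NAME in the docker run
status_url = f"http://localhost:8501/v1/models/{MODEL_NAME}"

try:
    with urllib.request.urlopen(status_url, timeout=2) as resp:
        # With the container running, this reports the loaded version's state.
        print(json.load(resp))
except (urllib.error.URLError, OSError):
    print("server unreachable:", status_url)
```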

The following output indicates that the service started correctly:

2023-03-21 19:05:02.754287: I tensorflow_serving/model_servers/server.cc:74] Building single TensorFlow model file config:  model_name: tf-serving-resnet model_base_path: /models/tf-serving-resnet
2023-03-21 19:05:02.754609: I tensorflow_serving/model_servers/server_core.cc:465] Adding/updating models.
2023-03-21 19:05:02.754620: I tensorflow_serving/model_servers/server_core.cc:594]  (Re-)adding model: tf-serving-resnet
2023-03-21 19:05:02.969747: I tensorflow_serving/core/basic_manager.cc:739] Successfully reserved resources to load servable {name: tf-serving-resnet version: 1}
2023-03-21 19:05:02.969814: I tensorflow_serving/core/loader_harness.cc:66] Approving load for servable version {name: tf-serving-resnet version: 1}
2023-03-21 19:05:02.969835: I tensorflow_serving/core/loader_harness.cc:74] Loading servable version {name: tf-serving-resnet version: 1}
2023-03-21 19:05:02.969990: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:45] Reading SavedModel from: /models/tf-serving-resnet/1
2023-03-21 19:05:02.978847: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:89] Reading meta graph with tags { serve }
2023-03-21 19:05:02.978873: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:130] Reading SavedModel debug info (if present) from: /models/tf-serving-resnet/1
2023-03-21 19:05:02.979085: I external/org_tensorflow/tensorflow/core/platform/cpu_feature_guard.cc:193] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations:  AVX2 FMA
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2023-03-21 19:05:03.042311: I external/org_tensorflow/tensorflow/compiler/mlir/mlir_graph_optimization_pass.cc:357] MLIR V1 optimization pass is not enabled
2023-03-21 19:05:03.051532: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:229] Restoring SavedModel bundle.
2023-03-21 19:05:03.271342: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:213] Running initialization op on SavedModel bundle at path: /models/tf-serving-resnet/1
2023-03-21 19:05:03.286821: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:305] SavedModel load for tags { serve }; Status: success: OK. Took 316826 microseconds.
2023-03-21 19:05:03.288695: I tensorflow_serving/servables/tensorflow/saved_model_warmup_util.cc:62] No warmup data file found at /models/tf-serving-resnet/1/assets.extra/tf_serving_warmup_requests
2023-03-21 19:05:03.403667: I tensorflow_serving/core/loader_harness.cc:95] Successfully loaded servable version {name: tf-serving-resnet version: 1}
2023-03-21 19:05:04.725077: I tensorflow_serving/model_servers/server_core.cc:486] Finished adding/updating models
2023-03-21 19:05:04.725315: I tensorflow_serving/model_servers/server.cc:118] Using InsecureServerCredentials
2023-03-21 19:05:04.725338: I tensorflow_serving/model_servers/server.cc:383] Profiler service is enabled
2023-03-21 19:05:04.726380: I tensorflow_serving/model_servers/server.cc:409] Running gRPC ModelServer at 0.0.0.0:8500 ...
[warn] getaddrinfo: address family for nodename not supported
2023-03-21 19:05:04.802082: I tensorflow_serving/model_servers/server.cc:430] Exporting HTTP/REST API at:localhost:8501 ...
[evhttp_server.cc : 245] NET_LOG: Entering the event loop ...

The model's inputs and outputs can be inspected with the following command:

saved_model_cli show --all --dir ./resnet_v2_fp16_savedmodel_NHWC_jpg/1538687108/1

It prints:

MetaGraphDef with tag-set: 'serve' contains the following SignatureDefs:

signature_def['predict']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['image_bytes'] tensor_info:
        dtype: DT_STRING
        shape: (-1)
        name: input_tensor:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['classes'] tensor_info:
        dtype: DT_INT64
        shape: (-1)
        name: ArgMax:0
    outputs['probabilities'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 1001)
        name: softmax_tensor:0
  Method name is: tensorflow/serving/predict

signature_def['serving_default']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['image_bytes'] tensor_info:
        dtype: DT_STRING
        shape: (-1)
        name: input_tensor:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['classes'] tensor_info:
        dtype: DT_INT64
        shape: (-1)
        name: ArgMax:0
    outputs['probabilities'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 1001)
        name: softmax_tensor:0
  Method name is: tensorflow/serving/predict
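The same signature information can also be retrieved from the running server through the REST metadata endpoint. A sketch, assuming the container above is reachable on port 8501:

```python
import json
import urllib.error
import urllib.request

MODEL_NAME = "tf-serving-resnet"
metadata_url = f"http://localhost:8501/v1/models/{MODEL_NAME}/metadata"

try:
    with urllib.request.urlopen(metadata_url, timeout=2) as resp:
        # The response mirrors saved_model_cli: the 'predict' and
        # 'serving_default' signatures with their inputs and outputs.
        print(json.load(resp)["metadata"]["signature_def"])
except (urllib.error.URLError, OSError):
    print("server unreachable:", metadata_url)
```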

However, the following Python script gets an error response from the TensorFlow Serving Docker server:

import base64

import requests

IMAGE_URL = 'https://tensorflow.org/images/blogs/serving/cat.jpg'
SERVER_URL = 'http://localhost:8501/v1/models/tf-serving-resnet:predict'

# Download the test image.
dl_request = requests.get(IMAGE_URL, stream=True)
dl_request.raise_for_status()

# Base64-encode the JPEG bytes and build the request body.
jpeg_bytes = base64.b64encode(dl_request.content).decode('utf-8')
predict_request = '{"inputs" : [{"image_bytes": "%s"}]}' % jpeg_bytes

response = requests.post(SERVER_URL, data=predict_request)
print(response.json().keys())

Running response.json() shows the error response:

{'error': 'JSON Value: {\n    "image_bytes": "/9j/4......//Z"\n} not formatted correctly for base64 data'}

How can I correct this error? Please provide the correct code so that I get a proper response. Thanks in advance.
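Based on the TensorFlow Serving REST API documentation, I suspect binary (DT_STRING) tensor values must be wrapped in a {"b64": ...} object rather than sent as a plain string under the input name, which would explain the "not formatted correctly for base64 data" error. A sketch of that request format, not verified against this exact server (jpeg_data is a placeholder for dl_request.content):

```python
import base64
import json

# Placeholder for the JPEG bytes downloaded in the script above.
jpeg_data = b"\xff\xd8\xff\xe0-fake-jpeg-bytes"

# Binary values go inside a {"b64": ...} object; for a model with a single
# string input, the row format does not need the input name at all.
jpeg_b64 = base64.b64encode(jpeg_data).decode("utf-8")
predict_request = json.dumps({"instances": [{"b64": jpeg_b64}]})
print(predict_request)
# POSTing this body to SERVER_URL should return a 'predictions' list with
# 'classes' and 'probabilities' for each instance.
```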

tensorflow resnet inference serving