TensorFlow Serving object detection client

Question · Votes: 0 · Answers: 1

I have a TensorFlow Serving Docker container running an object detector model. To export this model, I used the following code:

  # Look up the input placeholder and the two output tensors in the graph
  input_img = sess.graph.get_tensor_by_name('Placeholder:0')
  output_cls_prob = sess.graph.get_tensor_by_name('Reshape_2:0')
  output_box_pred = sess.graph.get_tensor_by_name('rpn_bbox_pred/Reshape_1:0')

  builder = tf.saved_model.builder.SavedModelBuilder('./export/1')

  # Wrap the tensors in TensorInfo protos for the serving signature
  imageplaceholder_info = tf.saved_model.utils.build_tensor_info(input_img)
  cls_prob_info = tf.saved_model.utils.build_tensor_info(output_cls_prob)
  box_pred_info = tf.saved_model.utils.build_tensor_info(output_box_pred)
  prediction_signature = (
    tf.saved_model.signature_def_utils.build_signature_def(
      inputs={
        'image': imageplaceholder_info
      },
      outputs={
        'output_cls_prob': cls_prob_info,
        'output_box_pred': box_pred_info
      },
      method_name=tf.saved_model.signature_constants.PREDICT_METHOD_NAME
    )
  )
  init_op = tf.group(tf.tables_initializer(), name='legacy_init_op')
  # Clients must request this signature by its map key, 'ctpn_recs_predict'
  builder.add_meta_graph_and_variables(sess, [tf.saved_model.tag_constants.SERVING],
       signature_def_map={'ctpn_recs_predict': prediction_signature}, legacy_init_op=init_op)
  builder.save()

Docker is serving this TensorFlow model on localhost port 9000. How can I send an image to that port and get the appropriate response (in my case, output_cls_prob and output_box_pred)?

So far I have this gRPC client code, which reads the image:

import argparse

import grpc
import numpy as np
from scipy.misc import imread
from tensorflow_serving.apis import prediction_service_pb2_grpc


def run(host, port, image, model, signature_name):

    channel = grpc.insecure_channel('{host}:{port}'.format(host=host, port=port))
    stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)

    # Read an image
    data = imread(image)
    data = data.astype(np.float32)
    print(data)


if __name__ == '__main__':
    parser = argparse.ArgumentParser()
    parser.add_argument('--host', help='Tensorflow server host name', default='localhost', type=str)
    parser.add_argument('--port', help='Tensorflow server port number', default=9000, type=int)
    parser.add_argument('--image', help='input image', type=str, default='1.jpg')
    parser.add_argument('--model', help='model name', type=str, default='serve/test')
    # Must match the key used in signature_def_map at export time
    parser.add_argument('--signature_name', help='Signature name of saved TF model',
                        default='ctpn_recs_predict', type=str)

    args = parser.parse_args()
    run(args.host, args.port, args.image, args.model, args.signature_name)

What should I do to send the image I have read to the TensorFlow Serving Docker container and get a prediction back?

python docker tensorflow deep-learning tensorflow-serving
1 Answer
0 votes

You can use a Python script that queries TF Serving over REST and gets the response back from the model.
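As a concrete illustration, a minimal REST client might look like the sketch below. It assumes the container also exposes TF Serving's REST port (8501 by default, mapped with `-p 8501:8501`) and that the model is loaded under a name such as `test`; both are assumptions about your setup, not facts from the question. The input key `image` and signature key `ctpn_recs_predict` come from the export code above.

```python
import json


def build_predict_body(image, signature_name="ctpn_recs_predict"):
    """JSON body for POST http://localhost:8501/v1/models/<name>:predict."""
    # Each instance names the input key declared in the export signature
    return json.dumps({"signature_name": signature_name,
                       "instances": [{"image": image}]})


# Stand-in for the real imread() result: a tiny 2x2 grayscale "image".
body = build_predict_body([[0.0, 1.0], [2.0, 3.0]])
print(json.loads(body)["signature_name"])  # ctpn_recs_predict

# With the container running, the call itself would be (requires `requests`):
# import requests
# resp = requests.post("http://localhost:8501/v1/models/test:predict", data=body)
# predictions = resp.json()["predictions"]  # output_cls_prob / output_box_pred
```

The response JSON carries one entry per instance, with the output names from the signature (`output_cls_prob`, `output_box_pred`) as keys.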

See the following blog post from TensorFlow, which covers in full detail how to host a TF Serving model in Docker and send it prediction requests:

https://medium.com/tensorflow/serving-ml-quickly-with-tensorflow-serving-and-docker-7df7094aa008
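For completeness, the gRPC path the question started down can also work. Below is a sketch, assuming the `tensorflow-serving-api` package (which provides `predict_pb2` and `prediction_service_pb2_grpc`) and the `ctpn_recs_predict` signature key from the export code in the question; the model name `serve/test` is the question's own default.

```python
def make_grpc_request(data, model_name, signature_name):
    """Build a TF Serving PredictRequest carrying the image tensor.

    Imports are deferred so the sketch reads standalone even without
    tensorflow / tensorflow-serving-api installed.
    """
    import tensorflow as tf
    from tensorflow_serving.apis import predict_pb2

    request = predict_pb2.PredictRequest()
    request.model_spec.name = model_name
    request.model_spec.signature_name = signature_name
    # 'image' is the input key declared in the export signature
    request.inputs['image'].CopyFrom(tf.make_tensor_proto(data, dtype=tf.float32))
    return request


# With the container listening on 9000, the call would look like:
# import grpc
# from tensorflow_serving.apis import prediction_service_pb2_grpc
# channel = grpc.insecure_channel('localhost:9000')
# stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)
# result = stub.Predict(make_grpc_request(data, 'serve/test', 'ctpn_recs_predict'), 10.0)
# cls_prob = tf.make_ndarray(result.outputs['output_cls_prob'])
# box_pred = tf.make_ndarray(result.outputs['output_box_pred'])
```

The second argument to `stub.Predict` is the request timeout in seconds.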
