TensorFlow Object Detection - converting a .pb file to tflite

Problem description (4 votes, 2 answers)

I am trying to convert a frozen SSD MobileNet v2 model to TFLITE format for use on Android. These are all my steps:

  1. I retrained the ssd_mobilenet_v2_coco_2018_03_29 model with the train.py file of the TF Object Detection API. (OK)

  2. Exported the trained model.ckpt to a frozen inference graph with export_inference_graph.py, also provided by the TF Object Detection API. (OK)

  3. Tested the frozen graph in Python, both with GPU and restricted to CPU only. It works. (OK)
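For reference, the frozen-graph check in step 3 looks roughly like this in TF 1.x (the .pb path and the 832x832 dummy input are just examples; the tensor names match the SavedModel signature shown in the log further down):

import numpy as np
import tensorflow as tf

# Load the frozen graph written by export_inference_graph.py
graph_def = tf.GraphDef()
with tf.gfile.GFile('inference_graph/frozen_inference_graph.pb', 'rb') as f:
    graph_def.ParseFromString(f.read())

with tf.Graph().as_default() as graph:
    tf.import_graph_def(graph_def, name='')

with tf.Session(graph=graph) as sess:
    # Dummy uint8 image batch of shape (1, H, W, 3)
    image = np.zeros((1, 832, 832, 3), dtype=np.uint8)
    boxes, scores, classes, num = sess.run(
        ['detection_boxes:0', 'detection_scores:0',
         'detection_classes:0', 'num_detections:0'],
        feed_dict={'image_tensor:0': image})
    print(num)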

And here comes the problem. I tried to convert it with the following code:

import tensorflow as tf
tf.enable_eager_execution()
saved_model_dir = 'inference_graph/saved_model/'
# Tensor names taken from the SavedModel's serving_default signature (see the log below)
input_arrays = ['image_tensor']
output_arrays = ['detection_boxes', 'detection_scores', 'detection_classes', 'num_detections']
converter = tf.contrib.lite.TFLiteConverter.from_saved_model(saved_model_dir, input_arrays=input_arrays, output_arrays=output_arrays, input_shapes={"image_tensor": [1, 832, 832, 3]})
converter.post_training_quantize = True

At first I tried it without passing the input_shapes argument to the function, but it did not work. Since then I have read that you can write anything there, it does not really matter.

The output up to this line:

INFO:tensorflow:Saver not created because there are no variables in the graph to restore
INFO:tensorflow:The specified SavedModel has no variables; no checkpoints were restored.
INFO:tensorflow:The given SavedModel MetaGraphDef contains SignatureDefs with the following keys: {'serving_default'}
INFO:tensorflow:input tensors info: 
INFO:tensorflow:Tensor's key in saved_model's tensor_map: inputs
INFO:tensorflow: tensor name: image_tensor:0, shape: (-1, -1, -1, 3), type: DT_UINT8
INFO:tensorflow:output tensors info: 
INFO:tensorflow:Tensor's key in saved_model's tensor_map: num_detections
INFO:tensorflow: tensor name: num_detections:0, shape: (-1), type: DT_FLOAT
INFO:tensorflow:Tensor's key in saved_model's tensor_map: detection_boxes
INFO:tensorflow: tensor name: detection_boxes:0, shape: (-1, 100, 4), type: DT_FLOAT
INFO:tensorflow:Tensor's key in saved_model's tensor_map: detection_scores
INFO:tensorflow: tensor name: detection_scores:0, shape: (-1, 100), type: DT_FLOAT
INFO:tensorflow:Tensor's key in saved_model's tensor_map: detection_classes
INFO:tensorflow: tensor name: detection_classes:0, shape: (-1, 100), type: DT_FLOAT
INFO:tensorflow:Saver not created because there are no variables in the graph to restore
INFO:tensorflow:The specified SavedModel has no variables; no checkpoints were restored.
INFO:tensorflow:Froze 0 variables.
INFO:tensorflow:Converted 0 variables to const ops.

Then I try to convert:

tflite_quantized_model = converter.convert()

And this is the output:

---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
<ipython-input-6-61a136476642> in <module>
----> 1 tflite_quantized_model = converter.convert()

~/.local/lib/python3.5/site-packages/tensorflow/contrib/lite/python/lite.py in convert(self)
    451           input_tensors=self._input_tensors,
    452           output_tensors=self._output_tensors,
--> 453           **converter_kwargs)
    454     else:
    455       # Graphs without valid tensors cannot be loaded into tf.Session since they

~/.local/lib/python3.5/site-packages/tensorflow/contrib/lite/python/convert.py in toco_convert_impl(input_data, input_tensors, output_tensors, *args, **kwargs)
    340   data = toco_convert_protos(model_flags.SerializeToString(),
    341                              toco_flags.SerializeToString(),
--> 342                              input_data.SerializeToString())
    343   return data
    344 

~/.local/lib/python3.5/site-packages/tensorflow/contrib/lite/python/convert.py in toco_convert_protos(model_flags_str, toco_flags_str, input_data_str)
    133     else:
    134       raise RuntimeError("TOCO failed see console for info.\n%s\n%s\n" %
--> 135                          (stdout, stderr))
    136 
    137 

RuntimeError: TOCO failed see console for info.

I cannot copy the console output here because it exceeds the 30000 character limit, but you can see it here: https://pastebin.com/UyT2x2Vk

Please help me at this point, what should I do to make it work :(

My configuration: Ubuntu 16.04, TensorFlow-GPU 1.12

Thank you!

python tensorflow converters tensorflow-lite toco
2 Answers
5 votes

I ran into the same problem last week and solved it by following the steps described here.

Basically, the problem is that their main script does not support SSD models. I did not use bazel for this, but the tflite_convert utility instead.

Be careful with the export_tflite_ssd_graph.py script; read all of its options before using it (mainly --max_detections, which saved my life).

Hope this helps.

Edit: your step 2 is not valid. A saved_model containing an SSD cannot be converted to a tflite model that way. You need to export the trained model.ckpt with the export_tflite_ssd_graph.py script, and then convert the .pb file it creates to tflite with the tflite_convert utility.
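As an illustration, that conversion step can also be done from Python with TF 1.12, roughly as sketched here (the tensor names are the ones export_tflite_ssd_graph.py is documented to produce, and the 300x300 input size is typical for SSD MobileNet v2; adjust both to your export and pipeline config):

import tensorflow as tf

# Frozen graph written by export_tflite_ssd_graph.py (example path)
graph_def_file = 'tflite_graph.pb'

converter = tf.contrib.lite.TFLiteConverter.from_frozen_graph(
    graph_def_file,
    input_arrays=['normalized_input_image_tensor'],
    output_arrays=['TFLite_Detection_PostProcess',
                   'TFLite_Detection_PostProcess:1',
                   'TFLite_Detection_PostProcess:2',
                   'TFLite_Detection_PostProcess:3'],
    input_shapes={'normalized_input_image_tensor': [1, 300, 300, 3]})
converter.allow_custom_ops = True  # the SSD post-processing op is a custom TFLite op
tflite_model = converter.convert()
open('detect.tflite', 'wb').write(tflite_model)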


0 votes

Your .pb file is not in the correct format. Here is a solution: https://github.com/peace195/tensorflow-lite-yolo-v3

We need to do 2 steps:

  1. Convert the weights to a SavedModel.

  2. Convert from the SavedModel to tflite format using tflite_convert.

Please use docker to set up the environment and follow the instructions carefully.
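Step 2 can also be done from Python, roughly as sketched below (assuming TF 1.12 as in the question, where the converter lives under tf.contrib.lite; in newer releases it is tf.lite.TFLiteConverter, and the paths here are just examples):

import tensorflow as tf

# SavedModel directory produced in step 1
saved_model_dir = 'saved_model'

converter = tf.contrib.lite.TFLiteConverter.from_saved_model(saved_model_dir)
tflite_model = converter.convert()
with open('model.tflite', 'wb') as f:
    f.write(tflite_model)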
