Loading a SavedModel with C++ - SavedModel load for tags { serve }; Status: fail: Not found


I have a SavedModel in the folder generator_model_final, which contains the following:

- saved_model.pb
- variables
  |- variables.data-00000-of-00002
  |- variables.data-00001-of-00002
  |- variables.index

At the root of the directory, I have my .cc and BUILD files:

- gan_loader.cc
- BUILD
- generator_model_final

I want to load the SavedModel using TensorFlow's C++ API. My C++ code is as follows:

#include <fstream>
#include <iostream>  // needed for std::cerr below
#include <utility>
#include <vector>

#include "tensorflow/cc/ops/const_op.h"
#include "tensorflow/cc/ops/image_ops.h"
#include "tensorflow/cc/ops/standard_ops.h"
#include "tensorflow/core/framework/graph.pb.h"
#include "tensorflow/core/framework/tensor.h"
#include "tensorflow/core/graph/default_device.h"
#include "tensorflow/core/graph/graph_def_builder.h"
#include "tensorflow/core/lib/core/errors.h"
#include "tensorflow/core/lib/core/stringpiece.h"
#include "tensorflow/core/lib/core/threadpool.h"
#include "tensorflow/core/lib/io/path.h"
#include "tensorflow/core/lib/strings/str_util.h"
#include "tensorflow/core/lib/strings/stringprintf.h"
#include "tensorflow/core/platform/env.h"
#include "tensorflow/core/platform/init_main.h"
#include "tensorflow/core/platform/logging.h"
#include "tensorflow/core/platform/types.h"
#include "tensorflow/core/public/session.h"
#include "tensorflow/core/util/command_line_flags.h"
#include "tensorflow/cc/saved_model/loader.h"
#include "tensorflow/cc/saved_model/tag_constants.h"

// These are all common classes it's handy to reference with no namespace.
using tensorflow::Flag;
using tensorflow::int32;
using tensorflow::Status;
using tensorflow::string;
using tensorflow::Tensor;
using tensorflow::tstring;
using tensorflow::SavedModelBundle;
using tensorflow::SessionOptions;
using tensorflow::RunOptions;
using tensorflow::kSavedModelTagServe;


int main(int argc, char* argv[]) {
  // These are the command-line flags the program can understand.
  // They define where the graph and input data is located, and what kind of
  // input the model expects. 

  // Input/Output names
  string input_layer = "dense_1_input";
  string output_layer = "conv2d_2";

  string root_dir = "";

  // Arguments
  std::vector<Flag> flag_list = {
      Flag("input_layer", &input_layer, "name of input layer"),
      Flag("output_layer", &output_layer, "name of output layer"),
      Flag("root_dir", &root_dir, "interpret image and graph file names relative to this directory"),
  };
  string usage = tensorflow::Flags::Usage(argv[0], flag_list);
  const bool parse_result = tensorflow::Flags::Parse(&argc, argv, flag_list);
  if (!parse_result) {
    LOG(ERROR) << usage;
    return -1;
  }

  // We need to call this to set up global state for TensorFlow.
  tensorflow::port::InitMain(argv[0], &argc, &argv);
  if (argc > 1) {
    LOG(ERROR) << "Unknown argument " << argv[1] << "\n" << usage;
    return -1;
  }

  // TODO: First we load and initialize the model.
  SavedModelBundle model;
  SessionOptions session_options;
  RunOptions run_options;

  auto status = tensorflow::LoadSavedModel(session_options, run_options, "generator_model_final/", {kSavedModelTagServe}, &model);
  if (!status.ok()) {
    std::cerr << "Failed: " << status.ToString() << "\n";
    return -1;
  }
  return 0;
}

In the last part of the code, I use the loader.h provided by TF to load the SavedModel with C++. I believe this should load the SavedModel correctly. When I build it with Bazel (bazel build tensorflow/gan_loader/...), it builds fine. However, when I run the resulting executable (./bazel-bin/tensorflow/gan_loader/gan_loader), I get the following error:

2020-06-20 11:12:45.925247: I tensorflow/cc/saved_model/reader.cc:31] Reading SavedModel from: generator_model_final/
2020-06-20 11:12:45.925312: I tensorflow/cc/saved_model/loader.cc:364] SavedModel load for tags { serve }; Status: fail: Not found: Could not find SavedModel .pb or .pbtxt at supplied export directory path: generator_model_final/. Took 77 microseconds.
Failed: Not found: Could not find SavedModel .pb or .pbtxt at supplied export directory path: generator_model_final/

This is strange, because there is a .pb file, and it does contain the tag serve.
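One thing worth checking before blaming the model: LoadSavedModel resolves a relative export path against the process's current working directory, not against the source tree. A small diagnostic sketch (not part of the original code; assumes a C++17 toolchain with std::filesystem) that prints what the loader would actually probe:

```cpp
#include <filesystem>
#include <iostream>
#include <string>

namespace fs = std::filesystem;

// Diagnostic helper (hypothetical name): report where the process is running
// from and whether export_dir/saved_model.pb exists from that location.
std::string SavedModelPbPath(const std::string& export_dir) {
  fs::path pb = fs::path(export_dir) / "saved_model.pb";
  std::cout << "cwd:     " << fs::current_path() << "\n";
  std::cout << "probing: " << fs::absolute(pb) << "\n";
  std::cout << "exists:  " << (fs::exists(pb) ? "yes" : "no") << "\n";
  return pb.string();
}
```

Calling SavedModelPbPath("generator_model_final/") right before LoadSavedModel shows immediately whether the relative path points at the directory you think it does.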

Some information about my SavedModel:

Running

$ saved_model_cli show --dir <path_to_saved_model_folder>
gives:

The given SavedModel contains the following tag-sets: 
serve

Running

$ saved_model_cli show --dir <path_to_saved_model_folder> --tag_set serve
gives:

The given SavedModel MetaGraphDef contains SignatureDefs with the following keys:
SignatureDef key: "__saved_model_init_op"
SignatureDef key: "serving_default"

Finally, running

$ saved_model_cli show --dir <path_to_saved_model_folder> --tag_set serve --signature_def serving_default
gives:

The given SavedModel SignatureDef contains the following input(s):
  inputs['dense_1_input'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1, 100)
      name: serving_default_dense_1_input:0
The given SavedModel SignatureDef contains the following output(s):
  outputs['conv2d_2'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1, 28, 28, 1)
      name: StatefulPartitionedCall:0
Method name is: tensorflow/serving/predict
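Once loading succeeds, note that feeding and fetching must use the signature tensor names printed above (serving_default_dense_1_input:0 and StatefulPartitionedCall:0), not the Keras layer names held in the input_layer/output_layer flags. A sketch of preparing a batch for the (-1, 100) input, with the TF calls shown as comments since they need the loaded bundle (helper name and fixed seed are mine, not from the post):

```cpp
#include <cstddef>
#include <random>
#include <vector>

// Sketch: the serving_default signature takes a float tensor of shape
// (-1, 100), so a batch of n latent vectors is n * 100 floats.
std::vector<float> MakeLatentBatch(int n, int latent_dim = 100) {
  std::mt19937 rng(42);  // fixed seed, for repeatability only
  std::normal_distribution<float> dist(0.0f, 1.0f);
  std::vector<float> batch(static_cast<std::size_t>(n) * latent_dim);
  for (float& v : batch) v = dist(rng);
  return batch;
}

// With the bundle loaded, Run() is keyed by the *signature* tensor names
// from saved_model_cli, not "dense_1_input" / "conv2d_2":
//
//   tensorflow::Tensor input(tensorflow::DT_FLOAT,
//                            tensorflow::TensorShape({n, 100}));
//   std::copy(batch.begin(), batch.end(), input.flat<float>().data());
//   std::vector<tensorflow::Tensor> outputs;
//   TF_CHECK_OK(model.session->Run(
//       {{"serving_default_dense_1_input:0", input}},
//       {"StatefulPartitionedCall:0"}, {}, &outputs));
//   // outputs[0] should then have shape (n, 28, 28, 1)
```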

Do you have any idea why this is happening? Could the directory path be wrong? Is the SavedModel missing something?

Thanks!

c++ tensorflow bazel tensorflow-serving
1 Answer

This is annoying. First, export_dir should be the folder that contains the .pb model, and second, the model file must be named saved_model.pb.
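Since both conditions already hold in the question, the remaining suspect is the relative path: ./bazel-bin/... resolves "generator_model_final/" against whatever directory the binary is run from. A sketch of one way to make the path robust (the helper name is mine; tensorflow::MaybeSavedModelDirectory is the real pre-check declared in cc/saved_model/loader.h):

```cpp
#include <filesystem>
#include <string>

// Hypothetical helper: anchor the export dir on root_dir (or make it
// absolute) instead of relying on the process's working directory.
std::string ResolveExportDir(const std::string& root_dir,
                             const std::string& model_dir) {
  namespace fs = std::filesystem;
  fs::path p = root_dir.empty() ? fs::absolute(model_dir)
                                : fs::path(root_dir) / model_dir;
  return p.string();
}

// In main(), before loading, the call could then be guarded:
//
//   const std::string export_dir =
//       ResolveExportDir(root_dir, "generator_model_final");
//   if (!tensorflow::MaybeSavedModelDirectory(export_dir)) {
//     LOG(ERROR) << "No SavedModel at " << export_dir;
//     return -1;
//   }
//   auto status = tensorflow::LoadSavedModel(
//       session_options, run_options, export_dir,
//       {kSavedModelTagServe}, &model);
```

This also gives the existing --root_dir flag a purpose: pass the absolute path of the directory holding generator_model_final, and the binary works regardless of where it is launched.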
