How to save a TensorFlow model with tf.estimator


I have the following example code that trains and evaluates a CNN MNIST model using TensorFlow's Estimator API:

def model_fn(features, labels, mode):
    images = tf.reshape(features, [-1, 28, 28, 1])
    model = Model()
    logits = model(images)

    predicted_logit = tf.argmax(input=logits, axis=1, output_type=tf.int32)

    if mode == tf.estimator.ModeKeys.PREDICT:
        probabilities = tf.nn.softmax(logits)

        predictions = {
            'predicted_logit': predicted_logit,
            'probabilities': probabilities
        }
        return tf.estimator.EstimatorSpec(mode=mode, predictions=predictions)

    else:
        ...

def mnist_train_and_eval(_):
    train_data, train_labels, eval_data, eval_labels, val_data, val_labels = get_mnist_data()

    # Create an input function for training
    train_input_fn = tf.estimator.inputs.numpy_input_fn(
        x=train_data,
        y=train_labels,
        batch_size=_BATCH_SIZE,
        num_epochs=1,
        shuffle=True)

    # Create an input function for evaluation
    eval_input_fn = tf.estimator.inputs.numpy_input_fn(
        x=eval_data,
        y=eval_labels,
        batch_size=_BATCH_SIZE,
        num_epochs=1,
        shuffle=False)

    # Create an estimator with model_fn
    image_classifier = tf.estimator.Estimator(model_fn=model_fn, model_dir=_MODEL_DIR)

    # Finally, train and evaluate the model after each epoch
    for _ in range(_NUM_EPOCHS):
        image_classifier.train(input_fn=train_input_fn)
        metrics = image_classifier.evaluate(input_fn=eval_input_fn)

How can I save the trained model with estimator.export_savedmodel for later inference? How should I write the serving_input_receiver_fn?

Thanks a lot for your help!

tensorflow mnist
1 Answer

You can create a function that builds a dict of input features. The placeholder should match the shape of your images, with the first dimension left as None for the batch size.

def serving_input_receiver_fn():
  x = tf.placeholder(tf.float32, [None, Shape])
  inputs = {'x': x}
  return tf.estimator.export.ServingInputReceiver(features=inputs, receiver_tensors=inputs)
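Note that whatever you pass as features here is exactly what model_fn receives at export time. Since the model_fn in the question treats features as a single tensor, either pass the placeholder itself as features or use the single-tensor receiver shown next.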

Alternatively, you can use TensorServingInputReceiver, which does not need the dict mapping:

inputs = tf.placeholder(tf.float32, [None, 32*32*3])
tf.estimator.export.TensorServingInputReceiver(inputs, inputs)
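
For the MNIST model in the question, a minimal sketch of such a receiver (assuming get_mnist_data() returns flattened 784-dimensional float features, which model_fn then reshapes to 28x28x1) could look like this:

def serving_input_receiver_fn():
  # Single-tensor receiver: the placeholder must match whatever model_fn
  # expects as `features` (here assumed to be flat 28*28 float vectors).
  images = tf.placeholder(tf.float32, [None, 28 * 28], name='images')
  return tf.estimator.export.TensorServingInputReceiver(images, images)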

The function returns a new instance of ServingInputReceiver, which you pass to export_savedmodel or to a tf.estimator.FinalExporter:

...
image_classifier.export_savedmodel(saved_dir, serving_input_receiver_fn)
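
If you train with tf.estimator.train_and_evaluate instead of the manual loop, a hedged sketch using FinalExporter (the exporter name 'mnist' is arbitrary here) would be:

# The exported SavedModel ends up under <model_dir>/export/mnist/<timestamp>
exporter = tf.estimator.FinalExporter('mnist', serving_input_receiver_fn)

train_spec = tf.estimator.TrainSpec(input_fn=train_input_fn)
eval_spec = tf.estimator.EvalSpec(input_fn=eval_input_fn, exporters=[exporter])

tf.estimator.train_and_evaluate(image_classifier, train_spec, eval_spec)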
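For later inference in TF 1.x, one option is to capture the directory returned by export_savedmodel and load it with tf.contrib.predictor; the sketch below assumes the dict-based receiver above (input key 'x') and reuses eval_data from the question:

from tensorflow.contrib import predictor

# export_savedmodel returns the timestamped export directory (as bytes)
export_dir = image_classifier.export_savedmodel(saved_dir, serving_input_receiver_fn)

# decode() so from_saved_model gets a str path on Python 3
predict_fn = predictor.from_saved_model(export_dir.decode())

# The input key 'x' matches receiver_tensors in serving_input_receiver_fn;
# the output keys come from the predictions dict in model_fn.
outputs = predict_fn({'x': eval_data[:1]})
print(outputs['predicted_logit'], outputs['probabilities'])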