Why do I have to reshape `inputs` from `tf.train.batch()` to use them with `slim.fully_connected()`?


Why am I getting this error from slim.fully_connected()?

ValueError: Input 0 of layer fc1 is incompatible with the layer: : expected min_ndim=2, found ndim=1. Full shape received: [32]

My inputs come from tf.train.batch(): Tensor("batch:0", shape=(32,), dtype=float32)

inputs, labels = tf.train.batch(
    [input, label],
    batch_size=batch_size,
    num_threads=1,
    capacity=2 * batch_size)

If I reshape the inputs to (32, 1), it works fine:

inputs, targets = load_batch(train_dataset)
print("inputs:", inputs, "targets:", targets)
# inputs: Tensor("batch:0", shape=(32,), dtype=float32) targets: Tensor("batch:1", shape=(32,), dtype=float32)

inputs = tf.reshape(inputs, [-1,1])
targets = tf.reshape(targets, [-1,1])
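
For reference, this is roughly how the reshaped batch then goes into the layer (a minimal sketch assuming TF 1.x and tf.contrib.slim; the layer sizes and scope names are illustrative, not my exact network):

import tensorflow as tf
import tensorflow.contrib.slim as slim

inputs, targets = load_batch(train_dataset)   # each Tensor has shape (32,)

# Add a trailing feature axis so both tensors are rank 2: (32,) -> (32, 1).
inputs = tf.reshape(inputs, [-1, 1])
targets = tf.reshape(targets, [-1, 1])

# fc1 now receives a (batch, features) tensor, so min_ndim=2 is satisfied.
net = slim.fully_connected(inputs, 32, scope='fc1')
predictions = slim.fully_connected(net, 1, activation_fn=None, scope='prediction')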

Following the slim walkthrough, though, the load_batch() example doesn't seem to do this reshaping explicitly.

tensorflow tf-slim
1 Answer

tf.train.batch expects array-like inputs, because scalar features are rare in practice. So you have to reshape your input. I think the following snippet makes the idea clear:

>>> import numpy as np
>>> a = np.array([1,2,3,4])
>>> a.shape
(4,)
>>> a = np.reshape(a,[4,1])
>>> a
array([[1],
       [2],
       [3],
       [4]])
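The same thing in TensorFlow, as a sketch (the placeholder is only there to make the example self-contained): tf.expand_dims adds a trailing feature axis without hard-coding the batch size and has the same effect as the reshape above.

import tensorflow as tf

inputs = tf.placeholder(tf.float32, shape=[32])   # rank-1 batch of scalars, shape (32,)
inputs_2d = tf.expand_dims(inputs, axis=-1)       # -> shape (32, 1), equivalent to tf.reshape(inputs, [-1, 1])
print(inputs_2d.shape)                            # (32, 1)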