MultiRNNCell and static_rnn error: Dimensions must be equal, but are 256 and 129

Question (votes: 3, answers: 1)

I want to build an LSTM network with 3 layers. Here is the code:

# Assuming TensorFlow 1.x, where static_rnn and the cell classes live under tf.nn:
import tensorflow as tf
rnn = tf.nn
rnn_cell = tf.nn.rnn_cell

num_layers=3
time_steps=10
num_units=128
n_input=1
learning_rate=0.001
n_classes=1
...

x=tf.placeholder("float",[None,time_steps,n_input],name="x")
y=tf.placeholder("float",[None,n_classes],name="y")
input=tf.unstack(x,time_steps,1)

lstm_layer=rnn_cell.BasicLSTMCell(num_units,state_is_tuple=True)
network=rnn_cell.MultiRNNCell([lstm_layer for _ in range(num_layers)],state_is_tuple=True)

outputs,_=rnn.static_rnn(network,inputs=input,dtype="float")

With num_layers=1 it works fine, but with multiple layers I get an error on this line:

outputs,_=rnn.static_rnn(network,inputs=input,dtype="float")

ValueError: Dimensions must be equal, but are 256 and 129 for 'rnn/rnn/multi_rnn_cell/cell_0/cell_0/basic_lstm_cell/MatMul_1' (op: 'MatMul') with input shapes: [?,256], [129,512].

Can anyone explain where the values 129 and 512 come from?

python tensorflow deep-learning lstm recurrent-neural-network
1 Answer

3 votes

You should not reuse the same cell object for the first and the deeper layers, because their inputs differ and therefore their kernel matrices differ. A BasicLSTMCell builds a kernel of shape [input_size + num_units, 4 * num_units]: for the first layer that is [1 + 128, 4 * 128] = [129, 512], while a deeper layer receives the 128-dimensional output of the layer below and so needs [128 + 128, 512] = [256, 512]. When the single cell is reused, the [129, 512] kernel built for the first layer is applied to the 256-dimensional input of the second layer, which is exactly the mismatch in the error. Create a fresh cell for each layer instead:

# Extra function is for readability. No problem to inline it.
def make_cell(lstm_size):
  return tf.nn.rnn_cell.BasicLSTMCell(lstm_size, state_is_tuple=True)

network = rnn_cell.MultiRNNCell([make_cell(num_units) for _ in range(num_layers)], 
                                state_is_tuple=True)
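The shape arithmetic behind the error can be checked without TensorFlow at all. A minimal sketch (`lstm_kernel_shape` is a helper written here for illustration; the [input_size + num_units, 4 * num_units] kernel shape is how BasicLSTMCell fuses the four gate matmuls):

```python
num_units = 128
n_input = 1

def lstm_kernel_shape(input_size, num_units):
    """Kernel shape of a BasicLSTMCell: the input and the hidden state are
    concatenated (rows), and the four LSTM gates are fused into a single
    matmul (columns)."""
    return (input_size + num_units, 4 * num_units)

print(lstm_kernel_shape(n_input, num_units))     # first layer:  (129, 512)
print(lstm_kernel_shape(num_units, num_units))   # deeper layer: (256, 512)
```

The reused cell keeps its first-built (129, 512) kernel, so multiplying it by the second layer's [?, 256] input fails.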