Keras GRUCell missing 1 required positional argument: 'states'


I'm trying to build a 3-layer RNN with Keras. Here is part of the code:

    model = Sequential()
    model.add(Embedding(input_dim = 91, output_dim = 128, input_length =max_length))
    model.add(GRUCell(units = self.neurons, dropout = self.dropval,  bias_initializer = bias))
    model.add(GRUCell(units = self.neurons, dropout = self.dropval,  bias_initializer = bias))
    model.add(GRUCell(units = self.neurons, dropout = self.dropval,  bias_initializer = bias))
    model.add(TimeDistributed(Dense(target.shape[2])))

Then I ran into this error:

call() missing 1 required positional argument: 'states'

The full traceback is shown below:

~/anaconda3/envs/hw3/lib/python3.5/site-packages/keras/models.py in add(self, layer)
487                           output_shapes=[self.outputs[0]._keras_shape])
488         else:
--> 489             output_tensor = layer(self.outputs[0])
490             if isinstance(output_tensor, list):
491                 raise TypeError('All layers in a Sequential model '

 ~/anaconda3/envs/hw3/lib/python3.5/site-packages/keras/engine/topology.py in __call__(self, inputs, **kwargs)
601 
602             # Actually call the layer, collecting output(s), mask(s), and shape(s).
--> 603             output = self.call(inputs, **kwargs)
604             output_mask = self.compute_mask(inputs, previous_mask)
605 
python machine-learning keras rnn gated-recurrent-unit
1 Answer (4 votes)
  1. Don't use the Cell classes (i.e. GRUCell, LSTMCell) directly in Keras. They are the computation units that get wrapped by the corresponding layers. Instead, use the Layer classes (i.e. GRU, LSTM):

         model.add(GRU(units = self.neurons, dropout = self.dropval, bias_initializer = bias))
         model.add(GRU(units = self.neurons, dropout = self.dropval, bias_initializer = bias))
         model.add(GRU(units = self.neurons, dropout = self.dropval, bias_initializer = bias))

     LSTM and GRU use their corresponding cells to perform the computation over all the timesteps. Read this SO answer to learn more about their differences.
  2. When you stack multiple RNN layers on top of each other, you need to set their return_sequences argument to True so that they produce the output of each timestep, which is in turn consumed by the next RNN layer. Note that you may or may not do this on the last RNN layer (it depends on your architecture and the problem you are trying to solve); a complete corrected sketch follows this list:

         model.add(GRU(units = self.neurons, dropout = self.dropval, bias_initializer = bias, return_sequences=True))
         model.add(GRU(units = self.neurons, dropout = self.dropval, bias_initializer = bias, return_sequences=True))
         model.add(GRU(units = self.neurons, dropout = self.dropval, bias_initializer = bias))
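Putting both points together, here is a minimal runnable sketch of the corrected model. The values for neurons, dropval, max_length, and the output dimension are hypothetical placeholders (the question only shows self.neurons, self.dropval, and target.shape[2]), and bias_initializer is left at its default:

    from keras.models import Sequential
    from keras.layers import Embedding, GRU, TimeDistributed, Dense

    neurons = 128      # hypothetical hidden size (self.neurons in the question)
    dropval = 0.2      # hypothetical dropout rate (self.dropval)
    max_length = 50    # hypothetical sequence length
    n_outputs = 10     # hypothetical output dimension (target.shape[2])

    model = Sequential()
    model.add(Embedding(input_dim=91, output_dim=128, input_length=max_length))
    # GRU layers (not GRUCell); return_sequences=True keeps the per-timestep
    # outputs so each following layer receives a 3D (batch, time, features) tensor.
    model.add(GRU(units=neurons, dropout=dropval, return_sequences=True))
    model.add(GRU(units=neurons, dropout=dropval, return_sequences=True))
    # The last GRU also returns sequences here because TimeDistributed(Dense)
    # expects a per-timestep output.
    model.add(GRU(units=neurons, dropout=dropval, return_sequences=True))
    model.add(TimeDistributed(Dense(n_outputs)))
    model.compile(optimizer='adam', loss='mse')
    model.summary()

If you do want to work with a cell object directly, wrap it in the RNN layer, e.g. keras.layers.RNN(GRUCell(units)), which is roughly equivalent to GRU(units); the layer is what iterates the cell over the timesteps.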