How to reduce LSTM loss

Question · Votes: 0 · Answers: 1

I have a recurrent neural network (an LSTM), and it is giving me a huge loss. How can I fix this?

Shape of X_train_main = (549, 3, 1).
Shape of Y_train_main = (549, 6, 1).

I have tried adding more layers and changing the regularizers, optimizer, loss function, and activation functions, but the loss stays in the 500-700 range.

import keras
from keras import regularizers
from keras.models import Sequential
from keras.layers import LSTM, PReLU, Dropout, Dense

model = Sequential()
model.add(LSTM(units=50, return_sequences=True, input_shape=(X_train_main.shape[1], 1), activity_regularizer=regularizers.L1(1e-5)))
model.add(PReLU())
model.add(Dropout(0.3))

model.add(LSTM(units=50, return_sequences=True, bias_regularizer=regularizers.L2(1e-4)))
model.add(PReLU())
model.add(Dropout(0.3))

model.add(LSTM(units=50, return_sequences=True, bias_regularizer=regularizers.L2(1e-4)))
model.add(PReLU())
model.add(Dropout(0.3))

model.add(LSTM(units=50, return_sequences=True, bias_regularizer=regularizers.L2(1e-4)))
model.add(PReLU())
model.add(Dropout(0.3))

model.add(LSTM(units=50, return_sequences=False))
model.add(PReLU())
model.add(Dropout(0.3))

model.add(Dense(units=Y_train_main.shape[1]))

optimizer = keras.optimizers.SGD(learning_rate=0.001)
model.compile(optimizer=optimizer, loss='mean_squared_error', metrics=['accuracy'])
history = model.fit(X_train_main, Y_train_main, batch_size=4, epochs=40)
Epoch 1/40
138/138 [==============================] - 22s 15ms/step - loss: 9497.7529 - accuracy: 0.9945
Epoch 2/40
138/138 [==============================] - 2s 15ms/step - loss: 1382.4642 - accuracy: 1.0000
...
Epoch 39/40
138/138 [==============================] - 1s 10ms/step - loss: 601.1696 - accuracy: 1.0000
Epoch 40/40
138/138 [==============================] - 1s 11ms/step - loss: 599.6407 - accuracy: 1.0000
Tags: python, keras, lstm
1 Answer

0 votes

Try removing the activations after the LSTM layers. You don't need them, since LSTM layers have their own internal activations.
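The suggestion above could look like the following. This is only a minimal sketch of the change, assuming the same layer sizes, regularizers, and input/output shapes as in the question; it is an illustration, not a tuned solution:

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import regularizers
from tensorflow.keras.layers import LSTM, Dropout, Dense

# Same architecture as in the question, but with the PReLU layers removed:
# each LSTM already applies its own internal tanh/sigmoid activations.
model = keras.Sequential([
    keras.Input(shape=(3, 1)),  # (timesteps, features), as in X_train_main
    LSTM(50, return_sequences=True, activity_regularizer=regularizers.L1(1e-5)),
    Dropout(0.3),
    LSTM(50, return_sequences=True, bias_regularizer=regularizers.L2(1e-4)),
    Dropout(0.3),
    LSTM(50, return_sequences=False),
    Dropout(0.3),
    Dense(6),  # Y_train_main has 6 target values per sample
])
model.compile(optimizer=keras.optimizers.SGD(learning_rate=0.001),
              loss="mean_squared_error")
```

Note also that `accuracy` is not a meaningful metric for mean-squared-error regression, which is why it reports 1.0 while the loss is still large; and if the loss stays in the hundreds, scaling the targets (e.g. to zero mean and unit variance) usually helps more than adding layers.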
