Keras - why is the accuracy of my CNN model not affected by the hyperparameters?


As the title describes, the accuracy of my simple CNN model is not affected by the hyperparameters, or even by the presence of layers such as Dropout and MaxPooling. I implemented the model using Keras. What could be the reason behind this strange behavior? I have included the relevant part of the code below:

from keras.models import Sequential
from keras.layers import Conv1D, Dropout, MaxPooling1D, Flatten, Dense

input_dim = X_train.shape[1]
nb_classes = Y_train.shape[1]

model = Sequential()
model.add(Conv1D(filters=64, kernel_size=3, activation='relu', input_shape=(input_dim, 1)))
model.add(Dropout(0.5))
model.add(MaxPooling1D(pool_size=2))
model.add(Flatten())
model.add(Dense(40, activation='relu'))
model.add(Dense(nb_classes, activation='softmax'))

model.compile(loss='mean_squared_error', optimizer='adam', metrics=['accuracy'])

P.S. The input data (X_train and X_test) consists of vectors produced by Word2Vec. The output is binary.
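One thing worth checking in this setup: Conv1D expects 3-D input of shape (samples, steps, channels), while Word2Vec typically yields a 2-D matrix of vectors, so a trailing channel axis is usually needed. A minimal NumPy sketch (the sample count and vector size here are placeholder assumptions):

```python
import numpy as np

# Hypothetical data: 100 samples, each a 300-dimensional Word2Vec vector
X_train = np.random.rand(100, 300)

# Conv1D expects (samples, steps, channels), so add a trailing channel axis
X_train = X_train[..., np.newaxis]

print(X_train.shape)  # (100, 300, 1)
```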

Edit: you can find a sample training log below:

Train on 3114 samples, validate on 347 samples
Epoch 1/10
 - 1s - loss: 0.6917 - accuracy: 0.5363 - val_loss: 0.6901 - val_accuracy: 0.5476
Epoch 2/10
 - 1s - loss: 0.6906 - accuracy: 0.5369 - val_loss: 0.6896 - val_accuracy: 0.5476
Epoch 3/10
 - 1s - loss: 0.6908 - accuracy: 0.5369 - val_loss: 0.6895 - val_accuracy: 0.5476
Epoch 4/10
 - 1s - loss: 0.6908 - accuracy: 0.5369 - val_loss: 0.6903 - val_accuracy: 0.5476
Epoch 5/10
 - 1s - loss: 0.6908 - accuracy: 0.5369 - val_loss: 0.6899 - val_accuracy: 0.5476
Epoch 6/10
 - 1s - loss: 0.6909 - accuracy: 0.5369 - val_loss: 0.6901 - val_accuracy: 0.5476
Epoch 7/10
 - 1s - loss: 0.6905 - accuracy: 0.5369 - val_loss: 0.6896 - val_accuracy: 0.5476
Epoch 8/10
 - 1s - loss: 0.6909 - accuracy: 0.5369 - val_loss: 0.6897 - val_accuracy: 0.5476
Epoch 9/10
 - 1s - loss: 0.6905 - accuracy: 0.5369 - val_loss: 0.6892 - val_accuracy: 0.5476
Epoch 10/10
 - 1s - loss: 0.6909 - accuracy: 0.5369 - val_loss: 0.6900 - val_accuracy: 0.5476
Tags: keras, conv-neural-network, keras-layer, grid-search
2 Answers
Answer 1 (0 votes):
First, you need to change the last layer to

model.add(Dense(1, activation='sigmoid'))

You also need to change the loss function to

model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
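Putting both fixes together, the corrected binary setup might look like this (layer sizes copied from the question; the input dimension is a placeholder assumption, since in practice it comes from X_train.shape[1]):

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv1D, Dropout, MaxPooling1D, Flatten, Dense

input_dim = 300  # placeholder; use X_train.shape[1] in practice

model = Sequential()
model.add(Conv1D(filters=64, kernel_size=3, activation='relu', input_shape=(input_dim, 1)))
model.add(Dropout(0.5))
model.add(MaxPooling1D(pool_size=2))
model.add(Flatten())
model.add(Dense(40, activation='relu'))
# Single sigmoid unit: the model now outputs one probability per sample
model.add(Dense(1, activation='sigmoid'))
# binary_crossentropy matches the sigmoid output and binary labels
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
```

With this combination, the reported accuracy responds to the labels as expected, because the loss actually measures the binary classification error instead of a squared distance on softmax outputs.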

Answer 2 (0 votes):

I assume you have a multi-class classification problem, correct?

Then your loss is not appropriate: you should use 'categorical_crossentropy' instead of 'mean_squared_error'.

Additionally, try stacking multiple Conv + Dropout + MaxPooling blocks (e.g. three) to properly verify the robustness of the network.
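A sketch of that suggestion with three Conv + Dropout + MaxPooling stacks; the filter counts, input dimension, and class count are assumptions for illustration (in practice they come from X_train and Y_train):

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv1D, Dropout, MaxPooling1D, Flatten, Dense

input_dim = 300  # placeholder for X_train.shape[1]
nb_classes = 4   # placeholder for Y_train.shape[1]

model = Sequential()
# First Conv + Dropout + MaxPooling stack (as in the question)
model.add(Conv1D(64, 3, activation='relu', input_shape=(input_dim, 1)))
model.add(Dropout(0.5))
model.add(MaxPooling1D(2))
# Two further stacks with growing filter counts (assumed values)
for filters in (128, 256):
    model.add(Conv1D(filters, 3, activation='relu'))
    model.add(Dropout(0.5))
    model.add(MaxPooling1D(2))
model.add(Flatten())
model.add(Dense(40, activation='relu'))
model.add(Dense(nb_classes, activation='softmax'))
# categorical_crossentropy assumes one-hot encoded labels
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
```

Note that categorical_crossentropy expects one-hot labels; if the labels are integer class indices, 'sparse_categorical_crossentropy' is the matching alternative.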
