Val_loss keeps increasing while val_accuracy decreases after about 30 epochs in Keras


We are having some trouble with our val_loss and val_acc. After a number of epochs (around 30), val_acc drops to around 50-60% while val_loss rises to between 0.98 and 1.4 (see the plots below). At the end of this post is the training output from the end of epoch 45.

[Image: Loss and Val_Loss plots]

    import pickle
    import time  # used below to timestamp the TensorBoard log directory
    import matplotlib.pyplot as plt
    import numpy as np
    import tensorflow as tf
    from keras import optimizers
    from keras.models import Sequential
    from keras.layers import *
    from keras.callbacks import TensorBoard
    from keras.utils import np_utils

    pickle_in = open("X.pickle", "rb")
    X = pickle.load(pickle_in)

    pickle_in = open("y.pickle", "rb")
    y = pickle.load(pickle_in)

    pickle_in = open("PredictionData\\X_Test.pickle", "rb")
    X_Test = pickle.load(pickle_in)

    X = X/255.0
    X_Test = X_Test/255.0

    y = np_utils.to_categorical(y, 5)

    NAME = "Emotion Detection"

    model = Sequential()

    model.add(Conv2D(32, (1, 1), activation="relu", use_bias=True,
                     bias_initializer="Ones",
                     input_shape=(145, 65, 1),
                     # dim_ordering is the deprecated Keras 1 name for data_format;
                     # "th" (channels_first) treats the 145-pixel axis as channels here
                     dim_ordering="th"))

    model.add(Conv2D(64, (3, 3),
                     activation="relu"))

    model.add(Conv2D(128, (3, 3),
                     activation="relu"))
    model.add(Dropout(0.2))

    model.add(Conv2D(64, (3, 3),
                     activation="relu"))

    model.add(Flatten())  # this converts our 3D feature maps to 1D feature vectors

    model.add(Dense(128,
                    activation="relu"))
    model.add(Dropout(0.2))

    model.add(Dense(32,
                    activation="relu"))

    model.add(Dense(5,
                    activation='sigmoid'))

    # write logs to a timestamped subdirectory
    tensorboard = TensorBoard(log_dir="Tensorboard\\{}".format(int(time.time())))

    sgd = optimizers.SGD(lr=0.001, decay=1e-6,
                         momentum=0.9, nesterov=True)

    model.compile(loss="categorical_crossentropy",
                  optimizer=sgd,
                  metrics=['accuracy'])

    history = model.fit(X, y, batch_size=16,
                        epochs=45, validation_split=0.12,
                        callbacks=[tensorboard])


    plt.plot(history.history['accuracy'])
    plt.plot(history.history['val_accuracy'])
    plt.title('Model accuracy')
    plt.ylabel('Accuracy')
    plt.xlabel('Epoch')
    plt.legend(['Accuracy', 'Val_Accuracy'], loc='upper left')
    plt.show()

    plt.plot(history.history['loss'])
    plt.plot(history.history['val_loss'])
    plt.title('Model loss')
    plt.ylabel('Loss')
    plt.xlabel('Epoch')
    plt.legend(['Loss', 'Val_Loss'], loc='upper left')
    plt.show()

    classes = model.predict(X_Test)
    plt.bar(range(5), classes[0])
    plt.show()
    print("prediction: class", np.argmax(classes[0]))


    model.summary()

    model.save("TrainedModel\\emotionDetector.h5") 

2493/2493 [==============================] - 35s 14ms/step - loss: 0.2324 - accuracy: 0.9202 - val_loss: 1.3789 - val_accuracy: 0.6353

_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_1 (Conv2D)            (None, 32, 65, 1)         4672      
_________________________________________________________________
conv2d_2 (Conv2D)            (None, 30, 63, 64)        640       
_________________________________________________________________
conv2d_3 (Conv2D)            (None, 28, 61, 128)       73856     
_________________________________________________________________
dropout_1 (Dropout)          (None, 28, 61, 128)       0         
_________________________________________________________________
conv2d_4 (Conv2D)            (None, 26, 59, 64)        73792     
_________________________________________________________________
flatten_1 (Flatten)          (None, 98176)             0         
_________________________________________________________________
dense_1 (Dense)              (None, 128)               12566656  
_________________________________________________________________
dropout_2 (Dropout)          (None, 128)               0         
_________________________________________________________________
dense_2 (Dense)              (None, 32)                4128      
_________________________________________________________________
dense_3 (Dense)              (None, 5)                 165       
_________________________________________________________________
Total params: 12,723,909
Trainable params: 12,723,909
Non-trainable params: 0
_________________________________________________________________

We hope you can help us. Thanks in advance.

tensorflow keras loss acc
1 Answer

Judging just from the plots and the model architecture, it looks like you are overfitting. Your model has a very large number of parameters (over 12M) and very little regularization.

You could consider adding kernel and/or bias regularizers to the layers. Dropout layers with a higher rate (e.g. 0.5) may also help; see the sketch below.
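
As an illustration only (not the asker's code), here is a minimal sketch of how L2 kernel/bias regularizers and a higher dropout rate could be wired into a network like the one above; the penalty strength of 1e-4 is an assumed example value:

    from keras import regularizers
    from keras.models import Sequential
    from keras.layers import Conv2D, Dropout, Flatten, Dense

    # Illustrative sketch: same input shape and class count as the question,
    # but with L2 penalties on the kernels/biases and Dropout(0.5).
    regularized = Sequential()
    regularized.add(Conv2D(32, (3, 3), activation="relu",
                           kernel_regularizer=regularizers.l2(1e-4),
                           bias_regularizer=regularizers.l2(1e-4),
                           input_shape=(145, 65, 1)))
    regularized.add(Dropout(0.5))

    regularized.add(Flatten())
    regularized.add(Dense(128, activation="relu",
                          kernel_regularizer=regularizers.l2(1e-4)))
    regularized.add(Dropout(0.5))
    regularized.add(Dense(5, activation="sigmoid"))

    regularized.compile(loss="categorical_crossentropy",
                        optimizer="sgd",
                        metrics=["accuracy"])

The same `kernel_regularizer=`/`bias_regularizer=` arguments can be passed to any of the existing Conv2D and Dense layers without otherwise changing the architecture.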

Here is a nice section of the TensorFlow documentation that may shed more light on the topic.

Good luck!
