How can I modify a neural network during training with Keras?

Question (2 votes, 1 answer)

Suppose my network contains the following:

x = Conv2D(
            filters=256,
            kernel_size=5,
            strides=2,
            padding="same"
        )(x)
x = Dropout(0.5)(x)
x = BatchNormalization(momentum=0.8)(x)
x = LeakyReLU(alpha=0.2)(x)

By the way, I am using the TensorFlow backend.

During training, I would like to modify or reduce the rate of the Dropout layer. Eventually, is there any way to deactivate it entirely?

tensorflow neural-network deep-learning keras keras-layer
1 Answer

I finally found a way to do it:

import numpy as np

from keras.layers import Input, Conv2D, Dropout, BatchNormalization, LeakyReLU
from keras.models import Model


class MyModel():
    def __init__(self, init_dropout, dropout_decay):

        self.init_dropout  = init_dropout
        self.dropout_decay = dropout_decay

        input_layer = Input((64, 64, 1))
        x = Conv2D(
            filters=256,
            kernel_size=5,
            strides=2,
            padding="same"
        )(input_layer)
        x = Dropout(rate=init_dropout)(x)
        x = BatchNormalization(momentum=0.8)(x)
        x = LeakyReLU(alpha=0.2)(x)

        self.model = Model(input_layer, x)

    def decay_dropout(self, epoch, verbose=0):
        # exponential decay schedule for the dropout rate
        rate = max(0, self.init_dropout * np.exp(-self.dropout_decay * epoch))

        for layer in self.model.layers:
            if isinstance(layer, Dropout):
                if verbose >= 1:
                    print("Decaying Dropout from %.3f to %.3f" % (layer.rate, rate))
                layer.rate = rate

Then, of course, decay_dropout needs to be called after each epoch, for example from a callback as sketched below.
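Here is a minimal sketch of one way to wire that up with a Keras LambdaCallback. The hyperparameters, the compile settings, and the x_train / y_train arrays are placeholders for illustration, not part of the original answer:

from keras.callbacks import LambdaCallback

my_model = MyModel(init_dropout=0.5, dropout_decay=0.05)  # placeholder hyperparameters
my_model.model.compile(optimizer="adam", loss="mse")      # placeholder compile settings

# run decay_dropout at the end of every epoch
dropout_decay_cb = LambdaCallback(
    on_epoch_end=lambda epoch, logs: my_model.decay_dropout(epoch, verbose=1)
)

# x_train / y_train stand in for your own training data
my_model.model.fit(x_train, y_train, epochs=20, callbacks=[dropout_decay_cb])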
