Keras stops after fit_generator with no error

Problem description

I'm working on a ResNet architecture to compare CNN performance, and this is the first test I've run with this ResNet. I'm reusing my code for loading and preparing the data for the network, and that part works fine. The rest of the script also seems to run fine until it reaches fit_generator. At fit_generator it pauses for a while and then appears to simply exit, landing on a print statement I have that says "what happened?". I'm confused because I would expect an error message, a crash, or something. I'm on Windows 10 running the latest version of Anaconda. In my conda environment I'm using Python 3.6 with the latest Keras (2.3) and the latest version of TensorFlow. Any help would be appreciated.

# imports used by the snippets below (Keras 2.3 standalone API);
# get_IQsamples and the tensorboard callback come from elsewhere in the script
import numpy as np

from keras.models import Model
from keras.layers import (Input, Conv2D, Activation, Add, MaxPooling2D,
                          Flatten, Dense, Dropout)
from keras.initializers import glorot_uniform


def batch_generator(X_train, Y_train):
    while True:
        for fl, lb in zip(X_train, Y_train):
            sam, lam = get_IQsamples(fl, lb)
            max_iter = sam.shape[0]
            sample = []     # store all the generated data batches
            label = []   # store all the generated label batches

            i = 0
            for d, l in zip(sam, lam):
                sample.append(d)
                label.append(l)
                i += 1
                if i == max_iter:
                    break
            sample = np.asarray(sample)        
            label = np.asarray(label)
            yield sample, label
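
The fit_generator call further down uses train_gen and valid_gen, which this snippet never defines. Assuming they simply wrap batch_generator around the training and validation file and label lists (X_val and Y_val here are placeholder names, not from the original script), the wiring might look like:

# hypothetical wiring, assuming X_val / Y_val are the validation
# counterparts of X_train / Y_train
train_gen = batch_generator(X_train, Y_train)
valid_gen = batch_generator(X_val, Y_val)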


def residual_stack(x, f):
    
    # 1x1 conv linear
    x = Conv2D(f, (1, 1), strides=1, padding='same', data_format='channels_last')(x)
    x = Activation('linear')(x)


    # residual unit 1    
    x_shortcut = x
    x = Conv2D(f, (3, 2), strides=1, padding="same", data_format='channels_last')(x)
    x = Activation('relu')(x)
    x = Conv2D(f, 3, strides=1, padding="same", data_format='channels_last')(x)
    x = Activation('linear')(x)

    # add skip connection
    if x.shape[1:] == x_shortcut.shape[1:]:
      x = Add()([x, x_shortcut])

    else:
      raise Exception('Skip Connection Failure!')


    # residual unit 2    
    x_shortcut = x
    x = Conv2D(f, 3, strides=1, padding="same", data_format='channels_last')(x)
    x = Activation('relu')(x)
    x = Conv2D(f, 3, strides=1, padding="same", data_format='channels_last')(x)
    x = Activation('linear')(x)

    # add skip connection
    if x.shape[1:] == x_shortcut.shape[1:]:
      x = Add()([x, x_shortcut])

    else:
      raise Exception('Skip Connection Failure!')


    # max pooling layer
    x = MaxPooling2D(pool_size=2, strides=None, padding='valid', data_format='channels_last')(x)

    return x
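
For reference, a quick shape probe (just a sketch, assuming the 32x32x2 input used further down) shows that each stack keeps the filter count at f and halves the spatial dimensions through the 2x2 max pool:

# sketch: probe the output shape of a single residual stack
probe = Input((32, 32, 2))
print(residual_stack(probe, 40).shape)   # expect (None, 16, 16, 40)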

Define the ResNet model

# define resnet model

def ResNet(input_shape, classes):   

    # create input tensor
    x_input = Input(input_shape)
    x = x_input

    # residual stack
    num_filters = 40
    x = residual_stack(x, num_filters)
    x = residual_stack(x, num_filters)
    x = residual_stack(x, num_filters)
    x = residual_stack(x, num_filters)
    x = residual_stack(x, num_filters)


    # output layer
    x = Flatten()(x)
    x = Dense(128, activation="selu", kernel_initializer="he_normal")(x)
    x = Dropout(.5)(x)
    x = Dense(128, activation="selu", kernel_initializer="he_normal")(x)
    x = Dropout(.5)(x)
    x = Dense(classes, activation='softmax', kernel_initializer=glorot_uniform(seed=0))(x)


    # Create model
    model = Model(inputs=x_input, outputs=x)
    model.summary()

    return model


model = ResNet((32, 32, 2), 8)

model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])


print('Load complete!')
print('\n')


steps = val_length_train // batchsize
valid_steps = val_length // batchsize

history = model.fit_generator(
            generator=train_gen,
            epochs=3,
            verbose=0,
            steps_per_epoch=steps,
            validation_data=valid_gen,
            validation_steps=valid_steps,
            callbacks=[tensorboard])

print("what happened?")
Tags: keras, keras-2
1 Answer

If there were an error, it would still be thrown and printed even with verbose set to 0. That being said, verbose=0 does seem to have caused problems for some people. That post is from 2017, but I have seen the same issue reported as recently as November 2019: https://github.com/keras-team/keras/issues/5818. Whether I use 0 or 2 it behaves the same, and since the script never seems to start fetching data or training at all, none of that really matters anyway. Thanks for the feedback.
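
One quick way to rule out a silent failure (a sketch, reusing the generators and variables from the question) is to pull a single batch by hand and rerun the fit with verbose=1 inside a try/except so nothing disappears quietly:

# sketch: confirm the generator actually yields a batch
sample, label = next(train_gen)
print(sample.shape, label.shape)

# rerun the fit with visible progress so any exception or stall is obvious
try:
    history = model.fit_generator(
        generator=train_gen,
        epochs=3,
        verbose=1,
        steps_per_epoch=steps,
        validation_data=valid_gen,
        validation_steps=valid_steps,
        callbacks=[tensorboard])
except Exception as e:
    print("fit_generator raised:", e)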
