How do I use the Adam optimizer with a learning rate scheduler?


I want to use a learning rate scheduler together with the Adam optimizer, but I am not sure whether my code is correct. I want to reduce the learning rate every 50 epochs.

import math
import keras
from keras.callbacks import LearningRateScheduler

def step_decay(epochs):
    initial_lrate = 0.01
    drop = 0.1
    epochs_drop = 50.0
    lrate = initial_lrate * math.pow(drop, math.floor((1 + epochs) / epochs_drop))
    decay_rate = lrate / epochs
    return (lrate, decay_rate)

lrate = LearningRateScheduler(step_decay)
decay_rate = LearningRateScheduler(step_decay)
opt_adam = keras.optimizers.Adam(lr=lrate, beta_1=0.9, beta_2=0.999, epsilon=1e-08, decay=decay_rate)
model.compile(loss='categorical_crossentropy', optimizer=opt_adam, metrics=['accuracy'])
history=model.fit_generator(generate_arrays_for_training(indexPat, filesPath, end=75), 
                                validation_data=generate_arrays_for_training(indexPat, filesPath, start=75),
                                steps_per_epoch=int((len(filesPath)-int(len(filesPath)/100*25))),#*25), 
                                validation_steps=int((len(filesPath)-int(len(filesPath)/100*75))),#*75),
                                verbose=2,
                                epochs=300, max_queue_size=2, shuffle=True, callbacks=callbacks_list)
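
For reference, here is a minimal sketch of the usual pattern, adapted from the code above: the schedule function returns a single float, the LearningRateScheduler is passed to fit_generator through the callbacks list, and Adam is built with a plain numeric learning rate rather than a scheduler object. The model, the generator functions, and callbacks_list are assumed to exist as in the question; lr_scheduler is an illustrative variable name.

import math
import keras
from keras.callbacks import LearningRateScheduler

def step_decay(epoch):
    # Drop the learning rate by a factor of 10 every 50 epochs.
    initial_lrate = 0.01
    drop = 0.1
    epochs_drop = 50.0
    return initial_lrate * math.pow(drop, math.floor((1 + epoch) / epochs_drop))

# The schedule returns only the learning rate; the callback applies it
# to the optimizer at the start of each epoch.
lr_scheduler = LearningRateScheduler(step_decay)

opt_adam = keras.optimizers.Adam(lr=0.01, beta_1=0.9, beta_2=0.999, epsilon=1e-08)
model.compile(loss='categorical_crossentropy', optimizer=opt_adam, metrics=['accuracy'])

history = model.fit_generator(generate_arrays_for_training(indexPat, filesPath, end=75),
                              validation_data=generate_arrays_for_training(indexPat, filesPath, start=75),
                              steps_per_epoch=int(len(filesPath) - int(len(filesPath) / 100 * 25)),
                              validation_steps=int(len(filesPath) - int(len(filesPath) / 100 * 75)),
                              verbose=2, epochs=300, max_queue_size=2, shuffle=True,
                              callbacks=callbacks_list + [lr_scheduler])

With this wiring there is no need to pass decay to Adam: the callback overwrites the optimizer's learning rate at the start of every epoch, so using both would apply two decay schedules at once.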

python optimization keras epoch learning-rate