from tensorflow.keras.callbacks import TensorBoard
import tensorflow.keras.backend as K


class TensorBoardLR(TensorBoard):
    """A modified TensorBoard callback that also logs the learning rate
    and the KL weight as scalars."""

    def __init__(self, *args, **kwargs):
        # Pull our extra argument out before passing the rest to TensorBoard
        self.kl_weight = kwargs.pop('kl_weight')
        super().__init__(*args, **kwargs)
        self.count = 0

    def on_batch_end(self, batch, logs=None):
        logs = logs or {}
        logs.update({'lr': K.eval(self.model.optimizer.lr),
                     'kl_weight': K.eval(self.kl_weight)})
        super().on_batch_end(batch, logs)
The above is a customized TensorBoard callback cloned from this GitHub repository. It is meant to store the learning rate and a so-called KL weight at the end of every batch, and it works fine in tensorflow versions <= 1.x.

Change the method's name from on_batch_end to on_train_batch_end. The old method is kept only for legacy code: https://www.tensorflow.org/versions/r1.15/api_docs/python/tf/keras/callbacks/Callback. This is also one of the differences between keras and tensorflow.keras.
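To see why the rename matters: in TF 2.x the training loop invokes on_train_batch_end, and the base Callback keeps on_batch_end only as a backward-compatibility hook that the default on_train_batch_end forwards to. A minimal pure-Python sketch of that dispatch (MockCallback and RenamedLR are hypothetical stand-ins; the real base class is tf.keras.callbacks.Callback):

```python
class MockCallback:
    """Hypothetical mock of tf.keras.callbacks.Callback's
    backward-compatibility forwarding (sketch, not the real class)."""

    def on_train_batch_end(self, batch, logs=None):
        # Default implementation forwards to the legacy TF 1.x hook,
        # which is why old code kept working for a while.
        self.on_batch_end(batch, logs=logs)

    def on_batch_end(self, batch, logs=None):
        pass  # no-op unless a subclass overrides it


class RenamedLR(MockCallback):
    """Callback ported to the TF 2.x hook name."""

    def __init__(self):
        self.logged = []

    def on_train_batch_end(self, batch, logs=None):  # renamed from on_batch_end
        self.logged.append(batch)


cb = RenamedLR()
cb.on_train_batch_end(0)  # this is the hook a TF 2.x training loop calls
```

Overriding on_train_batch_end puts your code directly on the hook the TF 2.x loop calls, instead of relying on the legacy forwarding path.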