Keras loss is always 0.0000e+00


I implemented an MLP with a custom loss function; here is the code:

import tensorflow as tf
from tensorflow.keras import backend as K

def custom_loss(groups_id_count):
  print('Computing loss...')
  def listnet_loss(real_labels, predicted_labels):
    start_range = 0
    for group in groups_id_count:
      # group[1] holds the number of samples in this group
      end_range = start_range + group[1]
      batch_real_labels = real_labels[start_range:end_range]
      batch_predicted_labels = predicted_labels[start_range:end_range]
      loss = -K.sum(get_top_one_probability(batch_real_labels)) * tf.math.log(get_top_one_probability(batch_predicted_labels))
      start_range = end_range
    print('loss: ', loss)
    return loss
  return listnet_loss

The loss printed for every epoch is always 0.0000e+00, and the print statement for the loss variable shows Tensor("listnet_loss/mul_24:0", shape=(None, None), dtype=float32).
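(Side note: under graph mode a plain Python print inside the loss only ever shows the symbolic tensor, which is what appears above. A minimal sketch of two ways to see concrete values, assuming TF 2.x and a model variable called model, neither of which is shown in the question:)

# inside listnet_loss, tf.print evaluates the tensor at run time:
tf.print('loss:', loss)

# or compile in eager mode so ordinary print statements show concrete values
# (model and the optimizer choice are placeholders here):
model.compile(optimizer='adam', loss=custom_loss(groups_id_count), run_eagerly=True)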

Here is the get_top_one_probability function:

def get_top_one_probability(vector):
  # softmax over the vector: the probability of each entry being ranked first
  return (K.exp(vector) / K.sum(K.exp(vector)))

UPDATE

The output of get_top_one_probability(batch_predicted_labels) is always:

Tensor("listnet_loss/truediv_36:0", shape=(None, 1), dtype=float32)

The output of real_labels is:

Tensor("ExpandDims:0", shape=(None, 1), dtype=float32)

The outputs of batch_real_labels and batch_predicted_labels are always:

Tensor("listnet_loss/strided_slice:0", shape=(None, 1), dtype=float32)

UPDATE 2

Using K.shape(real_labels), I noticed that the shape is (2,), but I expected the shape to correspond to the number of labels passed to the fit function. Is that wrong?
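(For reference, K.shape returns the dynamic shape as a 1-D tensor, so for a (batch_size, 1) input the shape tensor itself has shape (2,). A minimal standalone check, with x standing in for real_labels:)

import tensorflow as tf
from tensorflow.keras import backend as K

x = tf.zeros((8, 1))     # stands in for a batch of real_labels
s = K.shape(x)           # dynamic shape: a 1-D int32 tensor holding [8, 1]
print(s.shape)           # (2,) -- the shape of the shape tensor, not of x
print(K.int_shape(x))    # (8, 1) -- the static shape of x itself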

Is there something wrong with my loss function? Thanks in advance.

tensorflow keras loss
1 Answer

I think the problem is the scope of the loss variable. You also never accumulate the loss on each iteration of the loop, so only the last group's term ends up being returned (see the sketch below).
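A minimal sketch of what the accumulation could look like, reusing the structure from the question (I have also moved K.sum so it wraps the whole product, making each group contribute a scalar in the usual ListNet top-one cross-entropy form; groups_id_count and get_top_one_probability are as defined in the question):

def custom_loss(groups_id_count):
  def listnet_loss(real_labels, predicted_labels):
    loss = 0.0                    # initialize once, before the loop
    start_range = 0
    for group in groups_id_count:
      end_range = start_range + group[1]
      batch_real_labels = real_labels[start_range:end_range]
      batch_predicted_labels = predicted_labels[start_range:end_range]
      # accumulate each group's contribution instead of overwriting loss
      loss += -K.sum(get_top_one_probability(batch_real_labels)
                     * tf.math.log(get_top_one_probability(batch_predicted_labels)))
      start_range = end_range
    return loss
  return listnet_loss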
