I'm training an LSTM model with TensorFlow. As we know, the loss and val_loss reported at each epoch are averages of the loss over the training and validation datasets. I want to track the loss of one specific sample (a particular date) in the training set. Note also that I shuffle the training data in the fit function.
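To see why the per-epoch numbers can't isolate one sample: the reported loss is the mean of the per-sample losses, and shuffling changes only the order of the samples, not that mean. A minimal NumPy sketch (the loss values below are made up for illustration):

```python
import numpy as np

# Hypothetical per-sample losses for a 5-sample training set.
per_sample_losses = np.array([0.9, 0.2, 1.5, 0.4, 0.5])

# What fit() reports as `loss` for the epoch: the mean over all samples.
epoch_loss = per_sample_losses.mean()

# Shuffling permutes the order but leaves the mean unchanged ...
rng = np.random.default_rng(0)
shuffled = rng.permutation(per_sample_losses)
assert np.isclose(shuffled.mean(), epoch_loss)

# ... while the loss of the sample we care about (say index 2) stays hidden
# inside that average, so it has to be computed separately.
tracked = per_sample_losses[2]
print(epoch_loss, tracked)
```

This is why the answer below evaluates a dedicated loss op on the tracked sample rather than reading it off the epoch average.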
Here is code for tracking the loss of a single sample:
import numpy as np
import tensorflow as tf
from tensorflow import keras

# x stands in for the model's output logits on the sample(s) you want to track
# (random values here; in practice feed the real model output).
x = tf.Variable(initial_value=np.random.rand(10, 10).astype(np.float32))
y = np.random.randint(0, 10, size=(10,))      # integer class labels in [0, 10)
y_labels = keras.utils.to_categorical(y, 10)  # one-hot encoded labels

# Loss op for those particular samples. Note the argument order: labels are
# the one-hot targets and logits are the model outputs.
loss = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits_v2(labels=y_labels, logits=x))
tf.summary.scalar('loss', loss)  # log the loss op in the summary
print('loss op', loss)

merge = tf.summary.merge_all()
saver = tf.train.Saver()

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # No feed_dict needed: the loss value computed is specific to these samples.
    loss_val, merge_val = sess.run([loss, merge])
    print('loss val', loss_val)
    # merge_val can be passed to a tf.summary.FileWriter
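As a sanity check on what the loss op computes: for a single sample, softmax cross-entropy is just the negative log-softmax of the logits weighted by the one-hot label. A plain-NumPy sketch (the logit values are arbitrary):

```python
import numpy as np

def softmax_xent(logits, one_hot):
    """Numerically stable softmax cross-entropy for one sample."""
    z = logits - logits.max()                 # shift for numerical stability
    log_softmax = z - np.log(np.exp(z).sum())
    return -(one_hot * log_softmax).sum()

logits = np.array([2.0, 1.0, 0.1])
one_hot = np.array([1.0, 0.0, 0.0])           # true class is index 0
loss = softmax_xent(logits, one_hot)
print(loss)
```

The TensorFlow op averages this quantity across the batch dimension when wrapped in `tf.reduce_mean`, which is why the snippet above yields one scalar per tracked batch.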