No gradients from GradientTape in a batched training loop


I'm new to TensorFlow and I'm trying to implement simple collaborative filtering in v2. Training on the entire training set at once works fine, but I run into trouble when I try to train in batches. Specifically, when computing grads, the returned gradients are [None, None]. A colab file with the full attempt can be found here.
    with tf.GradientTape() as tape:
        tape.watch(user_embedding)
        tape.watch(item_embedding)

        ## Compute the predicted ratings
        predicted_ratings = tf.reduce_sum(user_batch * item_batch, axis=1)

        ## Compute loss
        true_ratings = tf.cast(train_batch_st.values, tf.float32)
        loss = tf.losses.mean_squared_error(true_ratings, predicted_ratings) # batch loss
        # Cumulative epoch loss (across all batches)
        epoch_loss += loss

        ## Compute gradients of loss with respect to user and item embeddings
        grads = tape.gradient(loss, [user_embedding, item_embedding])
        print(grads) # prints [None, None], which makes apply_gradients below fail

        # Apply gradients
        optimizer.apply_gradients(zip(grads, [user_embedding, item_embedding]))
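
For context, the tape only records operations performed on the tensors it watches, and in the loop above predicted_ratings is built from user_batch and item_batch rather than from the watched user_embedding and item_embedding, so there is no recorded path from the loss back to the watched tensors and tape.gradient returns [None, None]. Below is a minimal sketch of one way to restore that path: gather the batch rows from the full embedding matrices inside the tape. The sizes, the toy data, and the assumption that train_batch_st's indices hold (user, item) pairs are illustrative, not taken from the colab:

    import tensorflow as tf

    # Illustrative sizes and data -- not from the original colab.
    num_users, num_items, embedding_dim = 100, 50, 8
    user_embedding = tf.Variable(tf.random.normal([num_users, embedding_dim]))
    item_embedding = tf.Variable(tf.random.normal([num_items, embedding_dim]))
    optimizer = tf.keras.optimizers.SGD(learning_rate=0.1)

    # Toy batch: four observed ratings at (user, item) positions.
    indices = tf.constant([[0, 1], [2, 3], [5, 7], [9, 0]], dtype=tf.int64)
    values = tf.constant([4.0, 3.0, 5.0, 1.0])
    train_batch_st = tf.sparse.SparseTensor(indices, values, [num_users, num_items])

    with tf.GradientTape() as tape:
        # Gather the batch rows inside the tape from the tensors we want
        # gradients for; tf.gather is differentiable, so the tape can trace
        # predicted_ratings back to the full embedding matrices. (Variables
        # are watched automatically, so tape.watch is not needed here.)
        user_batch = tf.gather(user_embedding, train_batch_st.indices[:, 0])
        item_batch = tf.gather(item_embedding, train_batch_st.indices[:, 1])

        predicted_ratings = tf.reduce_sum(user_batch * item_batch, axis=1)
        true_ratings = tf.cast(train_batch_st.values, tf.float32)
        loss = tf.losses.mean_squared_error(true_ratings, predicted_ratings)

    grads = tape.gradient(loss, [user_embedding, item_embedding])
    print(grads)  # two gradients now, instead of [None, None]
    optimizer.apply_gradients(zip(grads, [user_embedding, item_embedding]))

If the embeddings are plain tensors rather than tf.Variable, the tape.watch calls are still needed, but the gathers must likewise happen inside the with block so the tape sees them.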

Thanks for your help!

python tensorflow tensorflow2.0 collaborative-filtering