Using tf.keras.layers.AlphaDropout gives: "greater_equal() got an unexpected keyword argument 'seed'"


I need to apply alpha dropout to regularize my model. I am working in a Jupyter Notebook with the following relevant packages:

  • python 3.12.3
  • tensorflow 2.16.1 (with keras 3.3.2 and numpy 1.26.4)

import tensorflow as tf
from sklearn.model_selection import train_test_split
from tensorflow import keras
from tensorflow.keras import layers
from tensorflow.keras.callbacks import EarlyStopping
import pandas as pd
import matplotlib.pyplot as plt
from sklearn.preprocessing import StandardScaler
import numpy as np

(X_train_full, y_train_full), (X_test, y_test) = keras.datasets.cifar10.load_data()
X_train, X_valid, y_train, y_valid = train_test_split(X_train_full, y_train_full, test_size=0.15, random_state=11)

scaler = StandardScaler()

X_train_scaled = scaler.fit_transform(X_train.astype(np.float32).reshape(-1, 32*32*3)).reshape(-1, 32, 32, 3)
X_valid_scaled = scaler.transform(X_valid.astype(np.float32).reshape(-1, 32*32*3)).reshape(-1, 32, 32, 3)
X_test_scaled = scaler.transform(X_test.astype(np.float32).reshape(-1, 32*32*3)).reshape(-1, 32, 32, 3)

model = keras.models.Sequential()
model.add(keras.layers.Flatten(input_shape=[32, 32, 3]))
for _ in range(20):
    model.add(keras.layers.Dense(100, activation="selu", kernel_initializer="lecun_normal"))
    model.add(keras.layers.AlphaDropout(rate=0.5))
model.add(keras.layers.Dense(10, activation="softmax"))

s = 30 * len(X_train_scaled) // 32 # batch size = 32
learning_rate = keras.optimizers.schedules.ExponentialDecay(0.01, s, 0.1)
optimizer = keras.optimizers.Nadam(learning_rate)
model.compile(loss="sparse_categorical_crossentropy", optimizer=optimizer, metrics=["accuracy"])

early_stopping_cb = keras.callbacks.EarlyStopping(patience=10, restore_best_weights=True)

history = model.fit(X_train_scaled, y_train, epochs=30, 
                    validation_data=(X_valid_scaled, y_valid),
                    callbacks=[early_stopping_cb])

I get this error message:

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
Cell In[6], line 16
     12 model.compile(loss="sparse_categorical_crossentropy", optimizer=optimizer, metrics=["accuracy"])
     14 early_stopping_cb = keras.callbacks.EarlyStopping(patience=10, restore_best_weights=True)
---> 16 history = model.fit(X_train_scaled, y_train, epochs=30, 
     17                     validation_data=(X_valid_scaled, y_valid),
     18                     callbacks=[early_stopping_cb])

File ~\anaconda3\envs\rnapa7\Lib\site-packages\keras\src\utils\traceback_utils.py:122, in filter_traceback.<locals>.error_handler(*args, **kwargs)
    119     filtered_tb = _process_traceback_frames(e.__traceback__)
    120     # To get the full stack trace, call:
    121     # `keras.config.disable_traceback_filtering()`
--> 122     raise e.with_traceback(filtered_tb) from None
    123 finally:
    124     del filtered_tb

File ~\anaconda3\envs\rnapa7\Lib\site-packages\keras\src\legacy\layers.py:38, in AlphaDropout.call(self, inputs, training)
     36 else:
     37     noise_shape = self.noise_shape
---> 38 kept_idx = tf.greater_equal(
     39     backend.random.uniform(noise_shape),
     40     self.rate,
     41     seed=self.seed_generator,
     42 )
     43 kept_idx = tf.cast(kept_idx, inputs.dtype)
     45 # Get affine transformation params

TypeError: Exception encountered when calling AlphaDropout.call().

greater_equal() got an unexpected keyword argument 'seed'

Arguments received by AlphaDropout.call():
  • inputs=tf.Tensor(shape=(None, 100), dtype=float32)
  • training=True

The class appears to be deprecated in TensorFlow 2.16.1, but I really do need alpha dropout. Is there another way to get it?

I admit I have already searched Google, this site, and even asked Copilot and ChatGPT, without success. Do you have any suggestions? I also went through the TensorFlow documentation to see whether the plain Dropout layer has taken over this job via some parameter, but I am not expert enough to tell.

Can anyone help?

tensorflow keras deep-learning neural-network dropout
1 Answer
0 votes

You can downgrade to TensorFlow 2.15.1, which still ships the Keras 2 API, and use AlphaDropout there.
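If downgrading is not an option, the transform that AlphaDropout applies is simple enough to reimplement by hand and wrap in a custom layer, so that nothing ever passes `seed=` to `tf.greater_equal`. Below is a minimal NumPy sketch of the math only (not the Keras implementation itself); the constant `ALPHA_P = -scale * alpha` comes from the SELU paper (Klambauer et al., 2017), and the function name `alpha_dropout` is my own:

```python
import numpy as np

# SELU constants (Klambauer et al., 2017): dropped units are set to
# alpha_p = -scale * alpha instead of zero.
ALPHA_P = -1.7580993408473766

def alpha_dropout(x, rate, rng):
    """Alpha dropout: drop units to ALPHA_P, then apply the affine
    correction that restores zero mean / unit variance."""
    keep_prob = 1.0 - rate
    # Bernoulli keep-mask, P(keep) = 1 - rate
    mask = (rng.random(x.shape) < keep_prob).astype(x.dtype)
    # Affine parameters chosen so the output keeps mean 0 / variance 1
    a = (keep_prob * (1.0 + rate * ALPHA_P**2)) ** -0.5
    b = -a * rate * ALPHA_P
    return a * (x * mask + ALPHA_P * (1.0 - mask)) + b

# Demo on standard-normal input (SELU activations are approximately
# zero-mean / unit-variance, which is what the correction assumes)
rng = np.random.default_rng(11)
x = rng.standard_normal(100_000).astype(np.float32)
y = alpha_dropout(x, rate=0.5, rng=rng)
```

Because dropped units go to `ALPHA_P` rather than zero and the closing affine transform restores zero mean and unit variance, SELU's self-normalization is preserved; the same body can be placed in the `call()` of a `keras.layers.Layer` subclass (using TF ops) for use during training.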
