How do I pass parameters through the Keras wrapper (or in general)?

Problem description

I have the following model and want to tune several of its parameters via GridSearchCV(). So far this works, but I cannot tune the parameters of compile() in the same way:

from keras.models import Sequential
from keras.layers import Dense, Dropout
from keras.wrappers.scikit_learn import KerasRegressor
from sklearn.model_selection import GridSearchCV

def create_model(activation='relu',
                 init='he_normal',
                 loss='mean_squared_error',
                 optimizer='adam',
                 metrics=['accuracy'],
                 dropout_rate=0,
                 learning_rate=0.1,
                 decay_rate=0.005,
                 momentum=0.8):

    # Model definition
    model = Sequential()
    model.add(Dense(16, input_shape=(2, ), 
                    activation = activation,
                    kernel_initializer = init))         
    # ... (further layers elided) ...

    model.add(Dropout(dropout_rate))
    model.add(Dense(2, 
                    activation = "tanh"))

    # Compile model
    model.compile(loss=loss,
                  optimizer=optimizer,
                                      # (
                                      #  learning_rate=learning_rate,
                                      #  decay_rate=decay_rate,
                                      #  momentum=momentum
                                      # )
                  metrics=metrics)

    return model

# define the grid search parameters
batch_size = [2, 6]
epochs = [5, 10]
activation = ['relu', 'tanh']    
optimizer = ['Adam', 'sgd']    
metrics = [['mse'], ['acc']]   # wrapped in lists: compile() expects a list of metrics
loss = ['mse', 'mae'] 
dropout_rate = [0.1, 0.25]
learning_rate = [0.2, 0.3]
decay_rate = [0.001, 0.005]
momentum = [0.4, 0.7]
init = ['he_normal', 'normal']   # key must match the create_model() argument name

param_grid = dict(batch_size=batch_size,
                  epochs=epochs,
                  activation=activation,
                  optimizer=optimizer,
                  metrics=metrics,
                  loss=loss,
                  dropout_rate=dropout_rate,
                  learning_rate=learning_rate,
                  decay_rate=decay_rate,
                  momentum=momentum,
                  init=init)

model = KerasRegressor(build_fn=create_model, verbose=0)

model_cv = GridSearchCV(estimator=model,
                        param_grid=param_grid,
                        n_jobs=1,    # -1 uses all cores
                        cv=5)

model_cv_result = model_cv.fit(X, y)

I would now like to add learning_rate, decay_rate, and momentum to the hyperparameters being tuned as well, but it does not work the way shown above (which is why I have commented out the corresponding lines in compile()). What do I need to change, and how do I pass these parameters through to create_model()?

python keras scikit-learn neural-network
1 Answer

This may not be the most elegant solution, but the most obvious hammer here is to define another function that returns the optimizer. I have simplified your example.

from keras.layers import Dense
from keras.models import Sequential
from keras.wrappers.scikit_learn import KerasRegressor
from sklearn.model_selection import GridSearchCV
from keras.optimizers import Adam, SGD

def return_optimizer(name, learning_rate):
    # Map a grid-searched string to a configured optimizer instance
    if name == 'Adam':
        return Adam(lr=learning_rate)
    elif name == 'SGD':
        return SGD(lr=learning_rate)
    raise ValueError('Unknown optimizer: {}'.format(name))
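
As a quick sanity check you can call the helper directly; a hypothetical snippet (get_config() is the standard Keras way to inspect an optimizer's settings):

# Hypothetical sanity check: names are matched case-sensitively ('Adam'/'SGD')
opt = return_optimizer(name='SGD', learning_rate=0.2)
print(opt.get_config())   # e.g. {'lr': 0.2, 'momentum': 0.0, 'decay': 0.0, 'nesterov': False}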

Then add a line to your create_model() as follows:

def create_model(optimizer='Adam', activation='sigmoid', learning_rate=0.1):
    model = Sequential()

    model.add(Dense(1, activation=activation, input_shape=(1,)))
    model.add(Dense(1, activation=activation))

    # Build the optimizer object from the grid-searchable string and rate
    opt = return_optimizer(name=optimizer, learning_rate=learning_rate)
    model.compile(loss='mean_squared_error', optimizer=opt, metrics=['accuracy'])
    return model
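
This works because KerasRegressor forwards every parameter whose name matches an argument of build_fn on to create_model(), while fit-level parameters such as epochs and batch_size go to fit(). You can verify the wiring with a direct call; a hypothetical snippet:

# Hypothetical check: build one grid combination by hand
m = create_model(optimizer='SGD', learning_rate=0.2)
m.summary()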

Then the grid for your GridSearchCV() becomes:

param_grid = {
    'epochs': [2, 5],
    'optimizer': ['Adam', 'SGD'],
    'learning_rate': [0.1, 0.2]
}
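
To fold your remaining hyperparameters (momentum, decay_rate) into the same pattern, the helper just grows extra keyword arguments. A minimal sketch, assuming the old keras.optimizers signatures (SGD accepts momentum and decay; Adam accepts decay but has no momentum argument):

def return_optimizer(name, learning_rate, momentum=0.0, decay_rate=0.0):
    # Sketch: map grid-searched values onto optimizer constructor arguments
    if name == 'Adam':
        return Adam(lr=learning_rate, decay=decay_rate)
    elif name == 'SGD':
        return SGD(lr=learning_rate, momentum=momentum, decay=decay_rate)
    raise ValueError('Unknown optimizer: {}'.format(name))

create_model() would then take momentum and decay_rate as extra arguments and forward them here, and the grid simply gains 'momentum': [0.4, 0.7] and 'decay_rate': [0.001, 0.005].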

Finally:

model = KerasRegressor(build_fn=create_model, verbose=0)

model_cv = GridSearchCV(estimator=model,
                        param_grid=param_grid,
                        n_jobs=1,
                        cv=5)
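
To run the search and read off the winning combination, a hypothetical snippet with made-up toy data shaped for the simplified one-input model:

import numpy as np

# Hypothetical toy data matching input_shape=(1,)
X = np.random.rand(100, 1)
y = np.random.rand(100)

model_cv_result = model_cv.fit(X, y)
print(model_cv_result.best_params_)   # e.g. {'epochs': 5, 'learning_rate': 0.1, 'optimizer': 'SGD'}
print(model_cv_result.best_score_)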

Let me know how this works out for you.
