HyperOpt tuning of XGB returns best parameters outside the range I specified?


I am tuning an XGB model with hyperopt:

    import numpy as np
    import hyperopt as hp  # hyperopt's own `hp` submodule is reached as hp.hp

    xgb_space = {
        'max_depth': hp.hp.choice('max_depth', np.arange(5, 15, dtype=int)),
        'n_estimators': hp.hp.choice('n_estimators', np.arange(50, 800, 10)),
        'subsample': hp.hp.quniform('subsample', 0.5, 1.0, 0.05),
        'colsample_bytree': hp.hp.quniform('colsample_bytree', 0.8, 0.9, 0.02),
        'min_child_weight': hp.hp.quniform('min_child_weight', 100, 1000, 100),
        'reg_alpha': hp.hp.uniform('reg_alpha', 0.0, 1.0),   # L1 term
        'reg_lambda': hp.hp.uniform('reg_lambda', 0.0, 1.0), # L2 term
        # 'scale_pos_weight': hp.hp.uniform('scale_pos_weight', 0.1, 10.0),
        'learning_rate': hp.hp.quniform('learning_rate', 0.2, 0.5, 0.0001),
        'booster': 'gbtree',
        'objective': 'binary:logistic',
        'eval_metric': 'error',
        'nthread': 12,
        'verbosity': 0,
    }
    bayes_trials = hp.Trials()
    best_params = hp.fmin(fn=xgb_hyperopt, space=xgb_space,
                          algo=hp.tpe.suggest, max_evals=5000, trials=bayes_trials)

The best parameters returned for xgb_space fall outside the ranges I specified. For example:

max_depth is 6 in the best parameters, even though in xgb_space I wrote 'max_depth': hp.hp.choice('max_depth', np.arange(5, 15, dtype=int)). Am I doing something wrong?

best params {'colsample_bytree': 0.8500000000000001, 'learning_rate': 0.3985, 'max_depth': 6, 'min_child_weight': 100.0, 'n_estimators': 15, 'reg_alpha': 0.43947748576358814, 'reg_lambda': 0.31095806914217305, 'subsample': 0.9500000000000001}
machine-learning xgboost kaggle