The feature_weights parameter has no effect in XGBoost

Asked · Votes: 0 · Answers: 1

xgboost has a parameter, feature_weights, that is supposed to affect the probability with which the model selects each feature; that is, we can give individual features more or less weight during training. But the parameter appears to do nothing. Is it being ignored, or am I doing something wrong?

X <- as.matrix(iris[,-5])
Y <- ifelse(iris$Species=="setosa", 1, 0)

library(xgboost)
dm1 <- xgb.DMatrix(X, label = Y)
# Set a different selection weight for each of the four features
dm2 <- xgb.DMatrix(X, label = Y, feature_weights = c(1, 0, 0, 0.01))
params <- list(objective = "binary:logistic", eval_metric = "logloss")

set.seed(1)



xgb1 <- xgboost(data = dm1, params = params, nrounds = 10, print_every_n = 5)

[1] train-logloss:0.448305 
[6] train-logloss:0.090220 
[10] train-logloss:0.033148 



xgb2 <- xgboost(data = dm2, params = params, nrounds = 10, print_every_n = 5)

[1] train-logloss:0.448305 
[6] train-logloss:0.090220 
[10] train-logloss:0.033148 

The two models behave identically (identical logloss at every round), so the feature_weights parameter seems to be simply ignored.

r machine-learning xgboost
1 Answer

0 votes

It appears that this parameter only takes effect when at least one of the colsample_by* parameters (colsample_bytree, colsample_bylevel, colsample_bynode) is set below 1. By default they are all 1, meaning every feature is always available at every split, so the sampling weights are never consulted.
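A minimal sketch of the fix, reusing the question's setup: enable column subsampling (here colsample_bynode = 0.5, an arbitrary choice for illustration) so that features are actually sampled, at which point feature_weights should bias the sampling. With weights c(1, 0, 0, 0.01), the trees should rarely or never split on the second and third features.

```r
library(xgboost)

X <- as.matrix(iris[, -5])
Y <- ifelse(iris$Species == "setosa", 1, 0)

# Same weighted DMatrix as in the question
dm2 <- xgb.DMatrix(X, label = Y, feature_weights = c(1, 0, 0, 0.01))

# Key change: sample a subset of columns at each node,
# so the feature weights are consulted during sampling
params <- list(objective = "binary:logistic",
               eval_metric = "logloss",
               colsample_bynode = 0.5)

set.seed(1)
xgb3 <- xgboost(data = dm2, params = params, nrounds = 10, print_every_n = 5)

# Check which features the trees actually split on;
# zero-weight features should be absent or near-absent
xgb.importance(model = xgb3)
```

The training log should now differ from the unweighted run, and the importance table should be dominated by the first feature, confirming the weights are in effect.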
