Adjusting the threshold with cross_val_score in sklearn

Question · votes: 2 · answers: 1

Is there a way to set a classification threshold when using cross_val_score in sklearn?

I have trained a model and then adjusted the decision threshold to 0.22. The code below shows the model:

# Try with a custom threshold
from sklearn.metrics import classification_report

pred_proba = LGBM_Model.predict_proba(X_test)

# Adjust the decision threshold on the predicted probabilities:
# predict class 0 when P(class 0) > 0.22, otherwise class 1
prediction_with_threshold = []
for item in pred_proba[:, 0]:
    if item > 0.22:
        prediction_with_threshold.append(0)
    else:
        prediction_with_threshold.append(1)

print(classification_report(y_test, prediction_with_threshold))
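The loop above can also be written as a one-line vectorized expression. A minimal sketch (the toy probability matrix is illustrative, standing in for the question's pred_proba):

import numpy as np

# Toy class-probability matrix; column 0 is P(class 0)
pred_proba = np.array([[0.90, 0.10],
                       [0.10, 0.90],
                       [0.30, 0.70]])

# Vectorized form of the loop: class 0 when P(class 0) > 0.22, else class 1
prediction_with_threshold = np.where(pred_proba[:, 0] > 0.22, 0, 1)
print(prediction_with_threshold)  # [0 1 0]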

I would now like to validate this model with cross_val_score. I have searched but could not find a way to set a threshold for cross_val_score. The call I have been using looks like this:

from lightgbm import LGBMClassifier
from sklearn.model_selection import cross_val_score

F1Scores = cross_val_score(
    LGBMClassifier(random_state=101, learning_rate=0.01, max_depth=-1,
                   min_data_in_leaf=60, num_iterations=200, num_leaves=70),
    X, y, cv=5, scoring='f1')
F1Scores

### How can I adjust the threshold to 0.22 here?

Or is there another way to validate this model with a custom threshold?

python machine-learning scikit-learn cross-validation
1 Answer

1 vote

Assuming you are working on a two-class classification problem, you can subclass LGBMClassifier and override its predict method with a thresholded version, like this:

import numpy as np
from lightgbm import LGBMClassifier
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_features=10, random_state=0, n_classes=2,
                           n_samples=1000, n_informative=8)

class MyLGBClassifier(LGBMClassifier):
    def predict(self, X, threshold=0.22, **kwargs):
        proba = super().predict_proba(X, **kwargs)
        # Same rule as in the question: class 0 when P(class 0) > threshold
        return np.asarray([0 if p > threshold else 1 for p in proba[:, 0]])

clf = MyLGBClassifier()
clf.fit(X, y)
clf.predict(X, threshold=2)  # just testing the implementation
# [1, 1, 1, ..., 1]          # all ones, since P(class 0) can never exceed 2

F1Scores = cross_val_score(
    MyLGBClassifier(random_state=101, learning_rate=0.01, max_depth=-1,
                    min_data_in_leaf=60, num_iterations=2, num_leaves=5),
    X, y, cv=5, scoring='f1')
F1Scores
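An alternative that avoids subclassing is cross_val_predict, which returns out-of-fold probability estimates that you can threshold yourself. A self-contained sketch, using LogisticRegression so it runs without lightgbm (substitute the LGBMClassifier from the question):

import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score
from sklearn.model_selection import cross_val_predict

X, y = make_classification(n_samples=500, n_features=10, random_state=0)

# Out-of-fold probabilities: each sample is scored by a model that
# did not see it during training
proba = cross_val_predict(LogisticRegression(max_iter=1000), X, y,
                          cv=5, method="predict_proba")

# Apply the custom threshold to P(class 0), as in the question
preds = np.where(proba[:, 0] > 0.22, 0, 1)
print(f1_score(y, preds))

Note this yields a single F1 score over all out-of-fold predictions rather than one score per fold, which is a slightly different (but common) way to summarize cross-validation.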

Hope this helps!
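For completeness: scikit-learn ≥ 1.5 ships FixedThresholdClassifier, a wrapper that applies a fixed cut-off to the positive-class probability, so no subclassing is needed. A sketch under that version assumption (LogisticRegression stands in for the question's LGBMClassifier; a cut of P(class 0) > 0.22 corresponds to thresholding P(class 1) at 0.78, up to tie-handling):

from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import FixedThresholdClassifier, cross_val_score

X, y = make_classification(n_samples=500, n_features=10, random_state=0)

# Wraps the estimator; predict() labels a sample positive when
# P(class 1) clears the given threshold
clf = FixedThresholdClassifier(LogisticRegression(max_iter=1000),
                               threshold=0.78,
                               response_method="predict_proba")
scores = cross_val_score(clf, X, y, cv=5, scoring="f1")
print(scores)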
