Sklearn voting ensemble of models with different feature sets, tested with k-fold cross-validation


I have a dataframe with 4 different groups of features.

I need to create 4 different models, one per feature group, and combine them with a voting ensemble classifier. I also need to test the ensemble with k-fold cross-validation.

However, I am finding it hard to combine the different feature sets, the voting classifier, and k-fold cross-validation using sklearn. Below is the code I have so far.

from sklearn import preprocessing, svm
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.model_selection import KFold, cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier

y = df1.index
x = preprocessing.scale(df1)

SVM = svm.SVC(kernel='rbf', C=1)
rf = RandomForestClassifier(n_estimators=200)
ann = MLPClassifier(solver='lbfgs', alpha=1e-5, hidden_layer_sizes=(25, 2), random_state=1)
neigh = KNeighborsClassifier(n_neighbors=10)

# VotingClassifier expects a list of (name, estimator) tuples
models = [
    ('facial', SVM),
    ('posture', rf),
    ('computer', ann),
    ('physio', neigh),
]

ens = VotingClassifier(estimators=models)

cv = KFold(n_splits=10, shuffle=True, random_state=None)
scores = cross_val_score(ens, x, y, cv=cv, scoring='accuracy')

As you can see, this program uses the same features for all 4 models. How can I improve it to achieve my goal?

scikit-learn classification voting ensemble-learning k-fold
1 Answer

I managed to achieve this using pipelines:

from sklearn.compose import ColumnTransformer
from sklearn.ensemble import VotingClassifier
from sklearn.impute import SimpleImputer
from sklearn.model_selection import KFold
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

y = df1.index
# Keep x as a DataFrame: the ColumnTransformers below select columns by name,
# and each pipeline already does its own scaling, so preprocessing.scale()
# (which returns a NumPy array and drops the column names) must not be used here.
x = df1

phy_features = ['A', 'B', 'C']
phy_transformer = Pipeline(steps=[('imputer', SimpleImputer(strategy='median')), ('scaler', StandardScaler())])
phy_processer = ColumnTransformer(transformers=[('phy', phy_transformer, phy_features)])

fa_features = ['D', 'E', 'F']
fa_transformer = Pipeline(steps=[('imputer', SimpleImputer(strategy='median')), ('scaler', StandardScaler())])
fa_processer = ColumnTransformer(transformers=[('fa', fa_transformer, fa_features)])

# SVM is the svm.SVC estimator defined in the question
pipe_phy = Pipeline(steps=[('preprocessor', phy_processer), ('classifier', SVM)])
pipe_fa = Pipeline(steps=[('preprocessor', fa_processer), ('classifier', SVM)])

# VotingClassifier expects (name, estimator) tuples, not bare estimators
ens = VotingClassifier(estimators=[('phy', pipe_phy), ('fa', pipe_fa)])

cv = KFold(n_splits=10, shuffle=True, random_state=None)
for train_index, test_index in cv.split(x):
    x_train, x_test = x.iloc[train_index], x.iloc[test_index]
    y_train, y_test = y[train_index], y[test_index]
    ens.fit(x_train, y_train)
    print(ens.score(x_test, y_test))

If you get a TypeError when using ColumnTransformer, see sklearn Pipeline: argument of type 'ColumnTransformer' is not iterable.
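As an aside, since each pipeline encapsulates its own preprocessing, the manual KFold loop can also be replaced with cross_val_score. A minimal self-contained sketch of that variant, using synthetic data (the column names 'A'..'F', the SVC estimators, and the two-group split are assumptions for illustration):

```python
import numpy as np
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import VotingClassifier
from sklearn.impute import SimpleImputer
from sklearn.model_selection import KFold, cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for df1: 100 rows, 6 feature columns, binary labels
rng = np.random.default_rng(0)
x = pd.DataFrame(rng.normal(size=(100, 6)), columns=list('ABCDEF'))
y = rng.integers(0, 2, size=100)

def make_pipe(cols):
    # Each estimator sees (and scales) only its own feature group
    pre = ColumnTransformer(transformers=[('sel', Pipeline(steps=[
        ('imputer', SimpleImputer(strategy='median')),
        ('scaler', StandardScaler())]), cols)])
    return Pipeline(steps=[('preprocessor', pre), ('classifier', SVC())])

ens = VotingClassifier(estimators=[('phy', make_pipe(['A', 'B', 'C'])),
                                   ('fa', make_pipe(['D', 'E', 'F']))])

# cross_val_score clones and refits the whole ensemble per fold,
# so preprocessing is fitted on the training split only
cv = KFold(n_splits=10, shuffle=True, random_state=1)
scores = cross_val_score(ens, x, y, cv=cv, scoring='accuracy')
print(scores.mean())
```

This keeps all preprocessing inside the cross-validation loop, which avoids data leakage from scaling the full dataset up front.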
