Understanding precision, recall and the F-measure

Question

I am trying to display precision, recall and the F-measure, but the values come out extremely low. Do you know why?

import numpy as np
from sklearn.model_selection import KFold, cross_val_predict
from sklearn.svm import LinearSVC
from sklearn.naive_bayes import MultinomialNB
from sklearn.metrics import (confusion_matrix, accuracy_score,
                             precision_score, recall_score, f1_score)

# X is the feature matrix built earlier in the pipeline (not shown here)
total_verbatim = X.shape[0]
print(total_verbatim)

# label vector: used to inspect the well and badly labelled verbatims
# error with the configuration on the whole dataset
labels = np.zeros(total_verbatim)
labels[1:1315] = 0      # "motivations" class (already 0 from np.zeros)
labels[1316:1891] = 1   # "freins" class

cv_splitter = KFold(n_splits=10, shuffle=False, random_state=None)
model1 = LinearSVC()
model2 = MultinomialNB()
models = [model1, model2]
for model in models:
    # verbatim_preprocess = np.array(verbatim_train_remove_stop_words_lemmatize)
    y_pred = cross_val_predict(model, X, labels, cv=cv_splitter)
    print("Model: {}".format(model))
    print("matrice confusion: {}".format(confusion_matrix(labels, y_pred)))
    print("Accuracy: {}".format(accuracy_score(labels, y_pred)))
    print("Precision: {}".format(precision_score(labels, y_pred)))
    print("Recall: {}".format(recall_score(labels, y_pred)))
    print("F mesure: {}".format(f1_score(labels, y_pred)))

Here are the results. When I compute the metrics by hand, the precision and recall are much higher:

Model: LinearSVC(C=1.0, class_weight=None, dual=True, fit_intercept=True,
     intercept_scaling=1, loss='squared_hinge', max_iter=1000,
     multi_class='ovr', penalty='l2', random_state=None, tol=0.0001,
     verbose=0)
matrice confusion: [[963 353]
 [518  57]]
Accuracy: 0.5393971443680592
Precision: 0.13902439024390245
Recall: 0.09913043478260869
F mesure: 0.11573604060913706
Model: MultinomialNB(alpha=1.0, class_prior=None, fit_prior=True)
matrice confusion: [[1248   68]
 [ 574    1]]
Accuracy: 0.6604970914859862
Precision: 0.014492753623188406
Recall: 0.0017391304347826088
F mesure: 0.0031055900621118015
python python-3.x scikit-learn classification confusion-matrix
1 Answer

Oui. Your classification model is simply not good.

Just by looking at the confusion matrix:

matrice confusion: [[963 353]
                    [518  57]]

you can see that 353 samples of the first class (label 0, "motivations") and 518 samples of the second class (label 1, "freins") are misclassified.

Ideally, the counts should lie only on the diagonal.
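As a quick check (a sketch using the numbers from the LinearSVC matrix above), the reported precision and recall can be reproduced by hand; note that scikit-learn's precision_score and recall_score treat label 1 as the positive class by default:

# Confusion matrix for LinearSVC (rows = true labels, columns = predictions):
# [[TN FP]    [[963 353]
#  [FN TP]] =  [518  57]]
tn, fp, fn, tp = 963, 353, 518, 57

precision = tp / (tp + fp)                          # 57 / 410 ≈ 0.139
recall = tp / (tp + fn)                             # 57 / 575 ≈ 0.099
f1 = 2 * precision * recall / (precision + recall)  # ≈ 0.116
print(precision, recall, f1)

Only 57 of the 575 positive samples are retrieved, and only 57 of the 410 positive predictions are correct, so the low scores simply reflect the matrix. A per-class breakdown (including class 0) can be printed with sklearn.metrics.classification_report(labels, y_pred).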


To improve the model, try different hyperparameters and a different number of folds.

Run a GridSearch.
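A minimal sketch of what that could look like, assuming the same X and labels as in the question (the parameter grid is only illustrative):

from sklearn.model_selection import GridSearchCV, KFold
from sklearn.svm import LinearSVC

# Illustrative grid over the regularisation strength C of LinearSVC
param_grid = {"C": [0.01, 0.1, 1, 10]}
grid = GridSearchCV(
    LinearSVC(),
    param_grid,
    scoring="f1",  # optimise the F-measure directly
    cv=KFold(n_splits=10, shuffle=True, random_state=0),
)
grid.fit(X, labels)
print(grid.best_params_, grid.best_score_)

shuffle=True is used in this sketch because the labels in the question are laid out class by class; with shuffle=False some folds would contain only one class.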
