precision_score, recall_score, and f1_score for binary classification in sklearn

precision_score: precision
$P = \frac{TP}{TP + FP}$

# Assume the binary labels are 1 and 2
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score
precision_score(y_true, y_pred, average="binary", pos_label=1)
# pos_label=1 means samples labeled 1 are treated as positive and samples labeled 2 as negative.
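As a quick sanity check, here is a small runnable sketch; the y_true/y_pred values are toy data invented for illustration:

# Toy data, invented for illustration: label 1 = positive, label 2 = negative
from sklearn.metrics import precision_score

y_true = [1, 1, 1, 1, 2, 2, 2, 2]
y_pred = [1, 1, 2, 2, 1, 2, 2, 2]

# TP = 2 (true 1, predicted 1); FP = 1 (true 2, predicted 1)
# P = TP / (TP + FP) = 2 / 3
print(precision_score(y_true, y_pred, average="binary", pos_label=1))  # 0.666...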

accuracy_score: accuracy
$Acc = \frac{TP + TN}{TP + FP + TN + FN}$

# Assume the binary labels are 1 and 2
accuracy_score(y_true, y_pred)
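A minimal sketch on the same invented toy data; note that accuracy_score takes no pos_label, since it simply counts correct predictions over both classes:

from sklearn.metrics import accuracy_score

y_true = [1, 1, 1, 1, 2, 2, 2, 2]
y_pred = [1, 1, 2, 2, 1, 2, 2, 2]

# Acc = (TP + TN) / total = (2 + 3) / 8
print(accuracy_score(y_true, y_pred))  # 0.625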

recall_score: recall
$R = \frac{TP}{TP + FN}$

# Assume the binary labels are 1 and 2
recall_score(y_true, y_pred, average="binary", pos_label=1)
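Using the same invented toy data, the sketch below also shows that switching pos_label flips which class counts as positive:

from sklearn.metrics import recall_score

y_true = [1, 1, 1, 1, 2, 2, 2, 2]
y_pred = [1, 1, 2, 2, 1, 2, 2, 2]

# With pos_label=1: TP = 2, FN = 2, so R = 2 / 4
print(recall_score(y_true, y_pred, average="binary", pos_label=1))  # 0.5
# With pos_label=2 the roles flip: TP = 3, FN = 1, so R = 3 / 4
print(recall_score(y_true, y_pred, average="binary", pos_label=2))  # 0.75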

f1_score: F1 score
$F1 = \frac{2 \cdot P \cdot R}{P + R}$

# Assume the binary labels are 1 and 2
f1_score(y_true, y_pred, average="binary", pos_label=1)
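On the same invented toy data, F1 works out to the harmonic mean of the precision and recall computed above:

from sklearn.metrics import f1_score

y_true = [1, 1, 1, 1, 2, 2, 2, 2]
y_pred = [1, 1, 2, 2, 1, 2, 2, 2]

# P = 2/3, R = 1/2, so F1 = 2 * (2/3) * (1/2) / ((2/3) + (1/2)) = 4/7
print(f1_score(y_true, y_pred, average="binary", pos_label=1))  # 0.571...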

Appendix: the binary confusion matrix

Actual \ Predicted    Positive                 Negative
Positive              TP (true positive)       FN (false negative)
Negative              FP (false positive)      TN (true negative)
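sklearn's confusion_matrix can reproduce this table; passing labels=[1, 2] puts the positive class (label 1) in the first row and column (same invented toy data as above):

from sklearn.metrics import confusion_matrix

y_true = [1, 1, 1, 1, 2, 2, 2, 2]
y_pred = [1, 1, 2, 2, 1, 2, 2, 2]

# Rows are actual labels, columns are predicted labels:
# [[TP, FN],
#  [FP, TN]]
print(confusion_matrix(y_true, y_pred, labels=[1, 2]))
# [[2 2]
#  [1 3]]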


Source: blog.csdn.net/tailonh/article/details/112605018