Commit fe41424

passage flip
1 parent 5f99304 commit fe41424

File tree

1 file changed: +2 −2 lines changed


faq/computing-the-f1-score.md

```diff
@@ -37,8 +37,8 @@ If we write the two metrics PRE and REC in terms of true positives (TP), true ne
 - PRE = TP / (TP + FP)
 - REC = TP / (TP + FN)
 
-Thus, the recall score gives us an idea (expressed as a score from 1.0 to 0.0, from good to bad) of the proportion of how many actual spam emails (TP) we correctly classified as spam among all the emails we classified as spam (TP + FP).
-In contrast, the precision (also ranging from 1.0 to 0.0) tells us about how many of the actual spam emails (TP) we "retrieved" or "recalled" (TP + FN).
+Thus, the precision score gives us an idea (expressed as a score from 1.0 to 0.0, from good to bad) of the proportion of how many actual spam emails (TP) we correctly classified as spam among all the emails we classified as spam (TP + FP).
+In contrast, the recall (also ranging from 1.0 to 0.0) tells us about how many of the actual spam emails (TP) we "retrieved" or "recalled" (TP + FN).
 
 ---
 
```
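The commit above swaps the mislabeled definitions: PRE = TP / (TP + FP) is precision and REC = TP / (TP + FN) is recall. A minimal Python sketch of those two formulas, plus the F1 score the FAQ file covers, using made-up confusion-matrix counts (the numbers are illustrative, not from the commit):

```python
# Hypothetical confusion-matrix counts for a spam classifier.
tp, fp, fn = 80, 20, 10  # assumed example values

# Precision: of everything we classified as spam, how much was actual spam.
precision = tp / (tp + fp)   # PRE = TP / (TP + FP)

# Recall: of all actual spam, how much we "retrieved" or "recalled".
recall = tp / (tp + fn)      # REC = TP / (TP + FN)

# F1 is the harmonic mean of precision and recall.
f1 = 2 * precision * recall / (precision + recall)

print(f"precision={precision:.3f} recall={recall:.3f} f1={f1:.3f}")
```

The harmonic mean simplifies to F1 = 2·TP / (2·TP + FP + FN), so with these counts the code and the closed form agree.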
