Evaluation: Precision and Recall

Sat Apr 21 2018

Two metrics that come up again and again in classification are precision and recall.

Let us say we have an image classification algorithm that tries to decide whether an image is a hot-dog or not a hot-dog.

We have 70 images where 50 are hot-dogs and 20 are not hot-dogs.
After we run our program, it labels 40 images as hot-dogs and 30 as not hot-dogs.

After checking them:
Of the 40 that were labelled as hot-dogs, 27 are actually hot-dogs and 13 are not hot-dogs.
And of the 30 that were labelled as not hot-dogs, 23 are actually hot-dogs and 7 are actually not hot-dogs.

So for the hot-dogs:

True positive: 27
True negative: 7
False positive: 13
False negative: 23
—————————
Identified as true: 40
Identified as false: 30
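
To make the bookkeeping concrete, here is a minimal Python sketch. The lists `actual` and `predicted` are hypothetical boolean labels (not from the post itself), with True meaning "hot-dog":

```python
def confusion_counts(actual, predicted):
    """Count the four confusion-matrix cells for a binary classifier.

    `actual` and `predicted` are equal-length lists of booleans,
    where True means "hot-dog".
    """
    tp = sum(a and p for a, p in zip(actual, predicted))          # real hot-dogs labelled hot-dog
    fp = sum(p and not a for a, p in zip(actual, predicted))      # non-hot-dogs labelled hot-dog
    fn = sum(a and not p for a, p in zip(actual, predicted))      # real hot-dogs labelled not hot-dog
    tn = sum(not a and not p for a, p in zip(actual, predicted))  # non-hot-dogs labelled not hot-dog
    return tp, fp, fn, tn

# Tiny usage example:
# confusion_counts([True, True, False], [True, False, False]) -> (1, 0, 1, 1)
```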

Precision = (True positive) / (Identified as true) = TP / (TP + FP) = 27 / 40
Recall = (True positive) / (All actually true items) = TP / (TP + FN) = 27 / 50

So precision is the number of true positives divided by the total number of items selected as true.
Recall is the number of true positives divided by the total number of items that are actually true.
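
Plugging the example numbers into plain Python arithmetic:

```python
tp, fp, fn = 27, 13, 23  # counts from the hot-dog example above

precision = tp / (tp + fp)  # 27 / 40 = 0.675
recall = tp / (tp + fn)     # 27 / 50 = 0.54

print(f"precision = {precision}")  # 0.675
print(f"recall    = {recall}")     # 0.54
```

In practice you would usually get these from a library, for example scikit-learn's precision_score and recall_score, rather than computing them by hand.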
