Precision vs. Recall

Sanjay Kumar PhD
2 min read · May 19, 2023


Precision: Precision is the ratio of correctly predicted positive observations to the total predicted positives. It answers the question: of all the patients who were predicted as having a disease, how many of them actually had the disease? High precision means the model produces few false positives relative to its positive predictions. The formula to calculate precision is: Precision = True Positives / (True Positives + False Positives).
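
As a quick sketch of this formula (using scikit-learn and made-up disease labels, not data from this post), precision can be computed both by hand and with `precision_score`:

```python
from sklearn.metrics import precision_score

# Hypothetical labels: 1 = has disease, 0 = healthy
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 1, 0, 1, 0, 1, 0]

# Manual calculation straight from the definition
tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)  # true positives
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)  # false positives
print(tp / (tp + fp))                   # 3 / (3 + 1) = 0.75
print(precision_score(y_true, y_pred))  # same result via scikit-learn
```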

Precision is about being precise. So even if we manage to capture only a few of the actual positive cases, as long as the cases we do predict as positive are truly positive, we are precise. In scenarios where we want to be very sure of our prediction, this is the metric to use. For instance, if we are predicting whether a patient needs surgery, we want to be very sure about our prediction, or the consequences could be dire.

Recall (Sensitivity): Recall is the ratio of correctly predicted positive observations to all observations that actually belong to the positive class. It answers the question: of all the patients who truly have the disease, how many did we predict correctly? The formula to calculate recall is: Recall = True Positives / (True Positives + False Negatives).
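
A similar sketch for recall (again with hypothetical labels), computed manually and with scikit-learn's `recall_score`:

```python
from sklearn.metrics import recall_score

# Hypothetical labels: 1 = has disease, 0 = healthy
y_true = [1, 1, 1, 1, 0, 0, 0, 0]
y_pred = [1, 1, 0, 0, 1, 0, 0, 0]

# Manual calculation straight from the definition
tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)  # true positives
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)  # false negatives
print(tp / (tp + fn))                # 2 / (2 + 2) = 0.5
print(recall_score(y_true, y_pred))  # same result via scikit-learn
```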

Recall is not so much about predicting cases correctly as about capturing all cases that are actually positive. If your use case is more about not missing any positive cases (rather than occasionally making a wrong positive prediction), this is the metric to focus on. For instance, if we are predicting whether a person is a criminal, we would want our system to capture all possible criminals, even if it means we misclassify some innocent people.

#datascience #machinelearning #classification #precision #recall
