Micro-Average Precision and Recall, F-Score:
· Definition:
Ø Micro-average precision and recall are performance metrics for multiclass classification problems, where there are multiple classes to predict. They aggregate the counts (true positives, false positives, false negatives) across all classes before computing the metric, in contrast to macro-averaging, which averages per-class scores.
· Micro-Average Precision:
Ø Micro-average precision is calculated by pooling all instances and all classes collectively: each prediction contributes equally, irrespective of the class it belongs to.
Ø Formula:
Micro-Average Precision = Total True Positives / (Total True Positives + Total False Positives)
· Micro-Average Recall:
Ø Micro-average recall is likewise calculated over all instances and all classes collectively. It measures how well the model recovers the true instances, pooled across classes.
Ø Formula:
Micro-Average Recall = Total True Positives / (Total True Positives + Total False Negatives)
· Micro-Average F-Score:
Ø Micro-average F-score is the harmonic mean of micro-average precision and micro-average recall. It balances precision and recall while weighting every instance equally.
Ø Formula:
Micro-Average F-score = 2 * (Micro-Average Precision * Micro-Average Recall) / (Micro-Average Precision + Micro-Average Recall)
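As a minimal sketch, the three formulas above can be combined into one function. The function name `micro_average` and the per-class counts below are invented for illustration; the key point is that micro-averaging sums the counts over all classes first, then applies the usual formulas once.

```python
def micro_average(counts):
    """Compute micro-average precision, recall, and F-score.

    counts: dict mapping class -> (TP, FP, FN) for that class.
    Micro-averaging pools the counts across all classes before
    applying the precision/recall formulas.
    """
    tp = sum(c[0] for c in counts.values())
    fp = sum(c[1] for c in counts.values())
    fn = sum(c[2] for c in counts.values())
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f_score = 2 * precision * recall / (precision + recall)
    return precision, recall, f_score

# Hypothetical per-class counts: class -> (TP, FP, FN)
per_class = {"A": (30, 4, 2), "B": (25, 3, 2), "C": (25, 3, 1)}
p, r, f = micro_average(per_class)
```

Here the per-class counts sum to 80 true positives, 10 false positives, and 5 false negatives, giving precision ≈ 0.889, recall ≈ 0.941, and F-score ≈ 0.914.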
Example:
Suppose we have a multiclass classification model with three classes: "A," "B," and "C." After evaluation, we have the following totals, summed over all classes:
- Total True Positives: 80
- Total False Positives: 10
- Total False Negatives: 5
Now we can calculate micro-average precision, micro-average recall, and micro-average F-score:
- Micro-Average Precision = 80 / (80 + 10) = 80/90 ≈ 0.889
- Micro-Average Recall = 80 / (80 + 5) = 80/85 ≈ 0.941
- Micro-Average F-score = 2 * (0.889 * 0.941) / (0.889 + 0.941) ≈ 0.914
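The arithmetic above can be checked with a few lines of Python, plugging in the totals from the example:

```python
tp, fp, fn = 80, 10, 5  # totals from the example above

precision = tp / (tp + fp)  # 80 / 90
recall = tp / (tp + fn)     # 80 / 85
f_score = 2 * precision * recall / (precision + recall)

print(f"Micro-average precision: {precision:.3f}")  # 0.889
print(f"Micro-average recall:    {recall:.3f}")     # 0.941
print(f"Micro-average F-score:   {f_score:.3f}")    # 0.914
```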
