DL0047 Focal Loss II

Please compare focal loss and weighted cross-entropy.

Answer

Weighted Cross-Entropy (WCE) rescales the loss per class to correct prior class imbalance; it is simple and relatively robust to noisy labels. Focal Loss (FL) multiplies cross-entropy by a difficulty-dependent factor (1 - p_t)^\gamma, which suppresses the gradient contribution of easy examples and focuses learning on hard ones. FL is preferable when many easy negatives overwhelm training (e.g., dense object detection), but the focusing parameter \gamma requires careful tuning to avoid amplifying label noise.

\text{WeightedCE}(p_t) = -\alpha_t \log(p_t)
Where:
p_t is the model probability for the ground-truth class;
\alpha_t is the per-class weight for class t.
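As a minimal sketch of the formula above (function name and example weights are illustrative, not from a specific library), weighted cross-entropy simply scales the negative log-likelihood by the per-class weight:

```python
import numpy as np

def weighted_ce(p_t, alpha_t):
    """Weighted cross-entropy given the model probability p_t
    for the ground-truth class and its class weight alpha_t."""
    return -alpha_t * np.log(p_t)

# At the same confidence (p_t = 0.6), a rare class with alpha = 0.75
# incurs three times the loss of a common class with alpha = 0.25.
loss_rare = weighted_ce(0.6, 0.75)
loss_common = weighted_ce(0.6, 0.25)
```

With \alpha_t = 1 for every class this reduces to plain cross-entropy; the weight only changes the relative importance of classes, not the shape of the loss.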

\text{FocalLoss}(p_t) = -\alpha_t (1 - p_t)^\gamma \log(p_t)
Where:
p_t is the model probability for the ground-truth class;
\alpha_t is the optional per-class weight for class t;
\gamma \ge 0 is the focusing parameter that down-weights easy examples.

Here is a table comparing focal loss and weighted cross-entropy:

Aspect              | Weighted Cross-Entropy            | Focal Loss
--------------------|-----------------------------------|------------------------------------------
Reweighting basis   | Class frequency (prior imbalance) | Example difficulty, via (1 - p_t)^\gamma
Hyperparameters     | Per-class weights \alpha_t        | \gamma (plus optional \alpha_t)
Best suited for     | Moderate class imbalance          | Many easy negatives dominating training
Label noise         | Relatively robust                 | Can amplify noise; tune \gamma carefully
Reduces to plain CE | \alpha_t = 1 for all classes      | \gamma = 0 (with \alpha_t = 1)

The figure below compares Cross-Entropy, Weighted Cross-Entropy, and Focal Loss.

