nntoolbox.losses.smooth module
class nntoolbox.losses.smooth.SmoothedCrossEntropy(weight: Optional[torch.Tensor] = None, reduction: str = 'mean', eps: float = 0.1)
Bases: torch.nn.modules.module.Module
Drop-in replacement for cross entropy loss with label smoothing:
loss(y_hat, y) = -sum_c p_c * log y_hat_c
where p_c = 1 - eps if c = y, and p_c = eps / (C - 1) otherwise (C is the number of classes, eps the smoothing parameter).
Based on:
Note that the deprecated arguments of CrossEntropyLoss are not included.
training: bool
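The formula above already contains the full recipe, so a short PyTorch sketch may help illustrate it. This is not the library's actual implementation; the class name LabelSmoothedCE and everything in the code below are illustrative, assuming the target distribution described by the docstring (1 - eps on the true class, eps / (C - 1) on every other class):

import torch
import torch.nn.functional as F
from torch import nn
from typing import Optional


class LabelSmoothedCE(nn.Module):
    """Illustrative sketch of cross entropy with label smoothing: the
    target puts 1 - eps on the true class and eps / (C - 1) elsewhere."""

    def __init__(self, weight: Optional[torch.Tensor] = None,
                 reduction: str = 'mean', eps: float = 0.1):
        super().__init__()
        self.weight, self.reduction, self.eps = weight, reduction, eps

    def forward(self, logits: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        n_classes = logits.size(-1)
        log_probs = F.log_softmax(logits, dim=-1)

        # Build the smoothed target distribution p_c from the formula.
        with torch.no_grad():
            p = torch.full_like(log_probs, self.eps / (n_classes - 1))
            p.scatter_(-1, target.unsqueeze(-1), 1.0 - self.eps)

        loss = -(p * log_probs)
        if self.weight is not None:
            loss = loss * self.weight   # optional per-class weights
        loss = loss.sum(dim=-1)

        if self.reduction == 'mean':
            return loss.mean()
        if self.reduction == 'sum':
            return loss.sum()
        return loss                     # reduction == 'none'


# Usage: same call pattern as nn.CrossEntropyLoss
criterion = LabelSmoothedCE(eps=0.1)
logits = torch.randn(8, 10)             # batch of 8, 10 classes
targets = torch.randint(0, 10, (8,))
print(criterion(logits, targets))

Note that this sketch takes raw logits and applies log_softmax internally, mirroring the interface of nn.CrossEntropyLoss; the weighted-mean normalization of the real CrossEntropyLoss is not reproduced here.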