nntoolbox.losses.smooth module

class nntoolbox.losses.smooth.SmoothedCrossEntropy(weight: Optional[torch.Tensor] = None, reduction: str = 'mean', eps: float = 0.1)[source]

Bases: torch.nn.modules.module.Module

Drop-in replacement for cross entropy loss with label smoothing:

loss(y_hat, y) = -sum_c p_c * log y_hat_c

where p_c = 1 - eps if c = y, and p_c = eps / (C - 1) otherwise (C is the number of classes and eps is the smoothing parameter)

Based on:

http://openaccess.thecvf.com/content_CVPR_2019/papers/He_Bag_of_Tricks_for_Image_Classification_with_Convolutional_Neural_Networks_CVPR_2019_paper.pdf

Note that the deprecated arguments of CrossEntropyLoss are not included.
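For intuition, the loss above can be sketched as follows. This is an illustrative reimplementation under the stated formula (mean reduction, no class weights), not necessarily the library's exact code:

```python
import torch
import torch.nn.functional as F


def smoothed_cross_entropy(output: torch.Tensor, label: torch.Tensor,
                           eps: float = 0.1) -> torch.Tensor:
    """Label-smoothed cross entropy sketch: -sum_c p_c * log y_hat_c."""
    n_class = output.shape[1]
    log_probs = F.log_softmax(output, dim=1)  # log y_hat_c
    # Smoothed target distribution: eps / (C - 1) everywhere,
    # 1 - eps at the true class.
    smooth = torch.full_like(log_probs, eps / (n_class - 1))
    smooth.scatter_(1, label.unsqueeze(1), 1 - eps)
    return -(smooth * log_probs).sum(dim=1).mean()


scores = torch.randn(4, 10)          # (batch_size, C)
labels = torch.randint(0, 10, (4,))  # (batch_size,)
loss = smoothed_cross_entropy(scores, labels)
```

With eps = 0 this reduces to the ordinary cross entropy, which is why the class is a drop-in replacement.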

forward(output: torch.Tensor, label: torch.Tensor) → torch.Tensor[source]
Parameters
  • output – Predicted class scores. (batch_size, C, *)

  • label – The true label. (batch_size, *)

Returns

The label-smoothed cross entropy loss, reduced according to reduction.
smooth_label(label: torch.Tensor, n_class: int) → torch.Tensor[source]

Smooth the label

Parameters
  • label – (batch_size, *)

  • n_class – number of classes of the output

Returns

The smoothed label distribution. (batch_size, C, *)
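The label smoothing step can be sketched for the simple (batch_size,) case as follows; this is an illustrative standalone version with a hypothetical eps argument, not the method's exact signature:

```python
import torch


def smooth_label(label: torch.Tensor, n_class: int,
                 eps: float = 0.1) -> torch.Tensor:
    """Turn integer labels (batch_size,) into smoothed one-hot
    targets of shape (batch_size, n_class)."""
    # Every class starts at eps / (C - 1) ...
    smoothed = torch.full((label.shape[0], n_class), eps / (n_class - 1))
    # ... and the true class is set to 1 - eps.
    smoothed.scatter_(1, label.unsqueeze(1), 1 - eps)
    return smoothed


targets = smooth_label(torch.tensor([0, 2]), n_class=3)
# With eps = 0.1 and C = 3: the true class gets 0.9 and the
# remaining 0.1 is split evenly, so each row sums to 1.
```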

training: bool