nntoolbox.callbacks.warmup module

Learning rate warmup (UNTESTED)

class nntoolbox.callbacks.warmup.ConstantLRWarmup(min_lr, duration: int, timescale: str = 'iter')[source]

Bases: nntoolbox.callbacks.warmup.LRWarmup

Keep the learning rate at a fixed small value for several iterations/epochs; a standalone sketch of this schedule is shown after the method listing below.

get_lr() → float[source]
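
The sketch below illustrates the constant-warmup schedule described above; the helper name and the post-warmup base_lr argument are illustrative only and not part of the class API.

    # Hold a small, fixed learning rate for the first `duration` steps,
    # then fall back to the regular learning rate.
    def constant_warmup_lr(step: int, min_lr: float, duration: int, base_lr: float) -> float:
        return min_lr if step < duration else base_lr
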
class nntoolbox.callbacks.warmup.GradualLRWarmup(min_lr: float, max_lr: float, duration: int, timescale: str = 'iter')[source]

Bases: nntoolbox.callbacks.warmup.LRWarmup

Gradually increase the learning rate from a small initial value (min_lr) to a target value (max_lr) over several iterations/epochs; a sketch of a linear ramp is shown after the method listing below.

get_lr() → float[source]
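
The exact interpolation used by get_lr() is not documented here, so the sketch below assumes the common linear ramp from min_lr to max_lr over the warmup duration (as in Goyal et al.); the function name is hypothetical.

    # Assumed linear ramp from min_lr to max_lr over `duration` warmup steps.
    def gradual_warmup_lr(step: int, min_lr: float, max_lr: float, duration: int) -> float:
        progress = min(step / duration, 1.0)          # fraction of warmup completed
        return min_lr + (max_lr - min_lr) * progress  # linear interpolation

    # Example with min_lr=1e-5, max_lr=0.1, duration=1000:
    # step 0 -> 1e-5, step 500 -> ~0.05, step >= 1000 -> 0.1
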
class nntoolbox.callbacks.warmup.LRWarmup(duration: int, timescale: str = 'iter')[source]

Bases: nntoolbox.callbacks.callbacks.Callback

Base class for callbacks that start training with a small learning rate; a standalone illustration of applying warmup to an optimizer follows the method listing below.

References:

Priya Goyal et al. “Accurate, Large Minibatch SGD: Training ImageNet in 1 Hour.” https://arxiv.org/abs/1706.02677

get_lr() → float[source]
on_batch_begin(data: Dict[str, torch.Tensor], train) → Dict[str, torch.Tensor][source]
on_epoch_begin()[source]
update_lr()[source]
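
Outside the callback framework, the same warmup technique (Goyal et al.) amounts to writing the scheduled learning rate into the optimizer's parameter groups at each step. The sketch below is purely illustrative, with assumed values and helper names, and does not use the Learner/trainer machinery that on_batch_begin() and update_lr() hook into.

    from torch import nn, optim

    model = nn.Linear(10, 2)
    optimizer = optim.SGD(model.parameters(), lr=0.1)

    MIN_LR, MAX_LR, WARMUP_ITERS = 1e-5, 0.1, 1000  # illustrative values

    def apply_warmup(optimizer: optim.Optimizer, step: int) -> None:
        """Set every param group's LR according to a linear warmup schedule."""
        if step < WARMUP_ITERS:
            lr = MIN_LR + (MAX_LR - MIN_LR) * step / WARMUP_ITERS
        else:
            lr = MAX_LR
        for group in optimizer.param_groups:
            group["lr"] = lr

    # In a training loop, call apply_warmup(optimizer, step) before optimizer.step().
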