ml.lr_schedulers.cosine_decay

Defines a cosine learning rate scheduler with linear ramp-up and optional warm restarts.

class ml.lr_schedulers.cosine_decay.CosineDecayLRSchedulerConfig(name: str = '???', total_steps: int = '${task.max_steps}', num_resets: int = 0, phase: int = '???', ramp_up_percent: float = 0.05, ramp_up_steps: int = '???', eta_min: float = 0.01, eta_max: float = 1.0, min_decay: float = 0.0001)[source]

Bases: BaseLRSchedulerConfig

total_steps: int = '${task.max_steps}'
num_resets: int = 0
phase: int = '???'
ramp_up_percent: float = 0.05
ramp_up_steps: int = '???'
eta_min: float = 0.01
eta_max: float = 1.0
min_decay: float = 0.0001
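The `'???'` and `'${task.max_steps}'` defaults follow OmegaConf conventions (an assumption based on their form): `'???'` marks a mandatory field that must be supplied before use, and `'${...}'` is an interpolation resolved from another part of the config tree. A minimal standalone sketch of the fields, with plain Python stand-ins for those markers:

```python
from dataclasses import dataclass

# Hypothetical mirror of CosineDecayLRSchedulerConfig for illustration only;
# the real class inherits from BaseLRSchedulerConfig and uses OmegaConf-style
# '???' (missing) and '${...}' (interpolation) markers instead of sentinels.
@dataclass
class CosineDecayLRSchedulerConfig:
    name: str
    total_steps: int            # '${task.max_steps}': interpolated from the task config
    phase: int                  # '???': must be provided by the user
    num_resets: int = 0         # number of warm restarts over the schedule
    ramp_up_percent: float = 0.05
    ramp_up_steps: int = -1     # stand-in for '???'; filled in by resolve()
    eta_min: float = 0.01
    eta_max: float = 1.0
    min_decay: float = 0.0001

cfg = CosineDecayLRSchedulerConfig(
    name="cosine_decay",
    total_steps=100_000,
    phase=0,
)
```

Only `name`, `total_steps`, and `phase` lack usable defaults; everything else can be left at its default value.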
classmethod resolve(config: CosineDecayLRSchedulerConfig) None[source]

Runs post-construction config resolution.

Parameters:

config – The config to resolve
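Since `ramp_up_steps` defaults to the missing marker while `ramp_up_percent` has a concrete default, a plausible reading is that `resolve` derives the step count from the percentage when it was not set explicitly. A hedged sketch of that assumed behavior:

```python
# Hypothetical sketch of resolve(); the real implementation may differ.
# Assumes ramp_up_steps < 0 stands in for the '???' missing marker.
def resolve(config) -> None:
    """Fill ramp_up_steps from ramp_up_percent of total_steps if unset."""
    if config.ramp_up_steps < 0:
        config.ramp_up_steps = round(config.ramp_up_percent * config.total_steps)
```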

class ml.lr_schedulers.cosine_decay.CosineDecayLRScheduler(config: BaseConfigT)[source]

Bases: BaseLRScheduler[CosineDecayLRSchedulerConfig]

get_lr_scale(state: State) float[source]

Given a trainer state, computes the learning rate scale for the current step.

Parameters:

state – The current trainer state

Returns:

The learning rate scale to apply at the current step
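Putting the config fields together, the scale presumably ramps up linearly for the first `ramp_up_steps`, then follows a cosine curve from `eta_max` down to `eta_min`, with the post-warmup budget split into `num_resets + 1` cycles and `min_decay` acting as a floor. A self-contained sketch under those assumptions (the function name and exact cycle arithmetic are illustrative, not taken from the source):

```python
import math

def cosine_decay_lr_scale(step: int, total_steps: int, ramp_up_steps: int,
                          num_resets: int = 0, eta_min: float = 0.01,
                          eta_max: float = 1.0, min_decay: float = 0.0001) -> float:
    """Hypothetical sketch of the scale get_lr_scale might compute."""
    if step < ramp_up_steps:
        # Linear warmup from 0 to eta_max.
        return eta_max * step / max(1, ramp_up_steps)
    # Split the remaining budget into (num_resets + 1) cosine cycles.
    decay_steps = total_steps - ramp_up_steps
    cycle_len = decay_steps / (num_resets + 1)
    pos = ((step - ramp_up_steps) % cycle_len) / cycle_len  # position in [0, 1)
    scale = eta_min + 0.5 * (eta_max - eta_min) * (1 + math.cos(math.pi * pos))
    return max(scale, min_decay)
```

At the end of warmup the scale peaks near `eta_max`, decays toward `eta_min` over each cycle, and jumps back to the peak at every reset.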