When the classes in your dataset are highly similar, choosing the right callbacks is essential for model training and convergence. This utility provides the callbacks most commonly used while training deep learning models and returns them as a list. Pass this list as an argument when training your model.

Learning Rate Scheduler

There are 4 different types of learning rate schedulers.

  1. RAMPUP Learning Rate Scheduler (call through 'rampup')

  2. Simple Learning Rate Scheduler      (call through 'simple')

  3. Step-wise Learning Rate Scheduler   (call through 'stepped')

  4. Step Decay Learning Rate Scheduler (call through 'step_decay')
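To illustrate how two of these schedules behave, here is a minimal, framework-agnostic sketch. The formulas and parameter names below are illustrative assumptions; quick_ml's internal implementation may differ.

```python
import math

# Hypothetical illustration of a ramp-up and a step-decay schedule;
# the exact formulas quick_ml uses internally are not documented here.

def rampup_lr(epoch, base_lr=1e-3, ramp_epochs=5):
    """Linearly ramp the learning rate from 0 up to base_lr, then hold it."""
    if epoch < ramp_epochs:
        return base_lr * (epoch + 1) / ramp_epochs
    return base_lr

def step_decay_lr(epoch, base_lr=1e-3, drop=0.5, epochs_per_drop=10):
    """Multiply the learning rate by `drop` every `epochs_per_drop` epochs."""
    return base_lr * drop ** math.floor(epoch / epochs_per_drop)
```

Functions of this shape are what a Keras `LearningRateScheduler` callback typically wraps: they map an epoch index to a learning rate.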

Early Stopping Callback

Use the Early Stopping callback as a measure to prevent the model from overfitting. The default callback settings are as follows:
monitor = 'val_loss', min_delta = 0, patience = 0, verbose = 0,
mode = 'auto', baseline = None, restore_best_weights = False
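The patience / min_delta rule behind early stopping can be sketched in plain Python. This is a simplified illustration of the standard rule, not quick_ml's actual implementation:

```python
# Simplified sketch of the early-stopping rule (patience / min_delta),
# not quick_ml's actual implementation.

def should_stop(val_losses, min_delta=0.0, patience=0):
    """Return True if val_loss fails to improve by more than min_delta
    for more than `patience` consecutive epochs."""
    best = float("inf")
    wait = 0
    for loss in val_losses:
        if loss < best - min_delta:
            best = loss
            wait = 0
        else:
            wait += 1
            if wait > patience:
                return True
    return False
```

With the default patience = 0, training stops the first time the monitored loss does not improve.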


To use the default settings of the Early Stopping callback, pass early_stopping = 'default' to get_callbacks.

Reduce LR On Plateau

Prevent your model from getting stuck at a local minimum using the ReduceLROnPlateau callback. The default implementation has the following parameter settings:
monitor = 'val_loss', factor = 0.1, patience = 10, verbose = 0, mode = 'auto', min_delta = 0.0001, cooldown = 0, min_lr = 0
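The reduce-on-plateau rule can be sketched in plain Python as well. This is a simplified illustration of the standard behaviour (multiply the learning rate by `factor` after `patience` epochs without improvement), not quick_ml's actual implementation:

```python
# Simplified sketch of the reduce-on-plateau rule,
# not quick_ml's actual implementation.

def plateau_lrs(val_losses, lr=1e-3, factor=0.1, patience=10,
                min_delta=0.0001, min_lr=0.0):
    """Return the learning rate in effect after each epoch, multiplying
    by `factor` whenever val_loss stagnates for `patience` epochs."""
    best = float("inf")
    wait = 0
    lrs = []
    for loss in val_losses:
        if loss < best - min_delta:
            best = loss
            wait = 0
        else:
            wait += 1
            if wait >= patience:
                lr = max(lr * factor, min_lr)
                wait = 0
        lrs.append(lr)
    return lrs
```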






Combine Multiple Callbacks

First import the utility:

from quick_ml.callbacks import get_callbacks

Then request only the callbacks you need, or combine several:

callbacks = get_callbacks(lr_scheduler = 'rampup')

callbacks = get_callbacks(early_stopping = 'default')

callbacks = get_callbacks(reduce_lr_on_plateau = 'default')

callbacks = get_callbacks(lr_scheduler = 'rampup', early_stopping = 'default', reduce_lr_on_plateau = 'default')
