Understanding Keras Callbacks

What is a Callback?

According to the Keras documentation, callbacks are a set of functions applied at given stages of the training procedure. They can be used to stop training once a certain accuracy or loss is reached, save the model at checkpoints, adjust the learning rate over time, and write TensorBoard logs.

Why should we use Callbacks?

Callbacks give us a view of the internal states and statistics of the model during training and can help prevent overfitting. They are a powerful tool for customizing the behavior of a Keras model during training, evaluation, or inference: they report information about the training process, and custom callbacks can be written when the built-in ones are not enough.
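For instance, here is a minimal sketch of a custom callback that prints the loss at the end of every epoch (the class name and the printed message are illustrative, not part of the original post):

from keras.callbacks import Callback

# Keras calls on_epoch_end after every epoch and passes the current
# metrics (loss, accuracy, ...) in the `logs` dictionary.
class LossPrinter(Callback):
    def on_epoch_end(self, epoch, logs=None):
        logs = logs or {}
        print("Epoch %d ended with loss %.4f" % (epoch + 1, logs.get("loss", float("nan"))))

# It is passed to training like any built-in callback, e.g.:
# model.fit(x, y, epochs=10, callbacks=[LossPrinter()])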


What are some different Callback functions?

There are many callback functions in Keras. The following are some of the most useful ones:

 

  1. BaseLogger and History: Used to obtain the average accuracy and average loss for each epoch; it is the easiest one to apply.
  2. ModelCheckpoint: As the name suggests, this callback saves your model as a checkpoint so that the model weights can be accessed later on. It writes the model to an HDF5 checkpoint file after each successful epoch.
  3. EarlyStopping: Typically used to avoid overfitting by terminating training when the model stops learning, preventing any further overtraining of the model.
  4. TensorBoard: An excellent TensorFlow visualization tool that helps with machine learning experimentation using different controlling parameters. With this callback, logs are written directly to a directory and can later be used to examine how the model behaves.
  5. LearningRateScheduler: Helps adjust the learning rate over time, which determines the step size taken during gradient descent. It returns the desired learning rate based on the current epoch.
  6. LambdaCallback: Lets you create your own callback function, adding controls from high level to low level. As the documentation says, it is used to create simple, custom callbacks on-the-fly (a short sketch of the last two is shown right after this list).
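LearningRateScheduler and LambdaCallback are not covered in the implementation below, so here is a minimal, hypothetical sketch of both (the halving schedule and the printed message are assumptions for illustration only):

from keras.callbacks import LearningRateScheduler, LambdaCallback

# LearningRateScheduler: the schedule function receives the epoch index
# and returns the learning rate to use for that epoch.
def schedule(epoch):
    return 0.001 * (0.5 ** (epoch // 10))  # assumed base rate 0.001, halved every 10 epochs

lr_callback = LearningRateScheduler(schedule, verbose=1)

# LambdaCallback: build a simple callback on-the-fly from plain functions.
print_callback = LambdaCallback(
    on_epoch_end=lambda epoch, logs: print("epoch", epoch + 1, "loss", logs["loss"]))

# Both are passed to model.fit in the callbacks list, e.g.:
# model.fit(x, y, epochs=50, callbacks=[lr_callback, print_callback])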

Implementation of Callbacks in Keras

Here, we implement some of these callbacks in Keras on the Iris dataset.

Model Checkpoint

import keras
import pandas as pd
from keras import Sequential
from keras.layers import Dense
from keras.callbacks import ModelCheckpoint
import seaborn as sns
from sklearn import preprocessing
df = pd.read_csv('/content/sample_data/Iris.csv')
df.head()
#sns.pairplot(df, hue="Species")
targets = df['Species']
df.drop(['Species', 'Id'], axis = 1, inplace=True)
x = df.values
min_max_scaler = preprocessing.MinMaxScaler()
x_scaled = min_max_scaler.fit_transform(x)
df = pd.DataFrame(x_scaled)
targets = targets.replace(['Iris-setosa', 'Iris-versicolor', 'Iris-virginica'],[0,1,2])
targets = keras.utils.np_utils.to_categorical(targets, num_classes=3)
model = Sequential()
model.add(Dense(10, input_shape=(4,)))
model.add(Dense(10, activation="relu"))
model.add(Dense(3, activation="sigmoid"))
model.compile(optimizer='rmsprop',
              loss='categorical_crossentropy',
              metrics=['acc'])
filepath="weights-improvement-{epoch:02d}-{acc:.2f}.hdf5"
checkpoint = ModelCheckpoint(filepath, monitor='acc', verbose=1, save_best_only=True, mode='max')
callbacks_list = [checkpoint]
model.fit(x=df, y=targets, epochs=100, batch_size=10, callbacks=callbacks_list, verbose=0)

# Rebuild the same model and load the weights saved by the checkpoint callback
import keras
import pandas as pd
from keras import Sequential
from keras.layers import Dense
import seaborn as sns
from sklearn import preprocessing
df = pd.read_csv('/content/sample_data/Iris.csv')
df.head()
#sns.pairplot(df, hue="Species")
targets = df['Species']
df.drop(['Species', 'Id'], axis = 1, inplace=True)
x = df.values #returns a numpy array
min_max_scaler = preprocessing.MinMaxScaler()
x_scaled = min_max_scaler.fit_transform(x)
df = pd.DataFrame(x_scaled)
targets = targets.replace(['Iris-setosa', 'Iris-versicolor', 'Iris-virginica'],[0,1,2])
targets = keras.utils.np_utils.to_categorical(targets, num_classes=3)
model = Sequential()
model.add(Dense(10, input_shape=(4,)))
model.add(Dense(10, activation="relu"))
model.add(Dense(3, activation="sigmoid"))
model.compile(optimizer='rmsprop',
              loss='categorical_crossentropy',
              metrics=['acc'])
# load weights
model.load_weights("/content/weights-improvement-64-0.95.hdf5")
# Compile model (required to make predictions)
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])
print("Created model and loaded weights from file")
scores = model.evaluate(x=df, y=targets, verbose=0)
print("%s: %.2f%%" % (model.metrics_names[1], scores[1]*100))

Early Stopping

import keras
import pandas as pd
from keras import Sequential
from keras.layers import Dense
import seaborn as sns
from sklearn import preprocessing
df = pd.read_csv('/content/sample_data/Iris.csv')
df.head()
#sns.pairplot(df, hue="Species")
targets = df['Species']
df.drop(['Species', 'Id'], axis = 1, inplace=True)
x = df.values #returns a numpy array
min_max_scaler = preprocessing.MinMaxScaler()
x_scaled = min_max_scaler.fit_transform(x)
df = pd.DataFrame(x_scaled)
targets = targets.replace(['Iris-setosa', 'Iris-versicolor', 'Iris-virginica'],[0,1,2])
targets = keras.utils.np_utils.to_categorical(targets, num_classes=3)
model = Sequential()
model.add(Dense(10, input_shape=(4,)))
model.add(Dense(10, activation="relu"))
model.add(Dense(3, activation="sigmoid"))
model.compile(optimizer='rmsprop',
              loss='categorical_crossentropy',
              metrics=['acc'])
from keras.callbacks import EarlyStopping
earlystop = EarlyStopping(monitor = 'val_loss',
                          min_delta = 0,
                          patience = 3,
                          verbose = 1,
                          restore_best_weights = True)
callbacks_list = [earlystop]
model.fit(x=df, y=targets, validation_split=0.2, epochs=100, batch_size=10, callbacks=callbacks_list, verbose=0)
print("Training finished (possibly stopped early); evaluating the model")
scores = model.evaluate(x=df, y=targets, verbose=0)
print("%s: %.2f%%" % (model.metrics_names[1], scores[1]*100))

TensorBoard

# Load the TensorBoard notebook extension
%load_ext tensorboard
import tensorflow as tf
import datetime
# Clear any logs from previous runs
!rm -rf ./logs/
import keras
import pandas as pd
from keras import Sequential
from keras.layers import Dense
import seaborn as sns
from sklearn import preprocessing
from time import time
from keras.callbacks import TensorBoard
df = pd.read_csv('/content/sample_data/Iris.csv')
df.head()
#sns.pairplot(df, hue="Species")
targets = df['Species']
df.drop(['Species', 'Id'], axis = 1, inplace=True)
x = df.values #returns a numpy array
min_max_scaler = preprocessing.MinMaxScaler()
x_scaled = min_max_scaler.fit_transform(x)
df = pd.DataFrame(x_scaled)
targets = targets.replace(['Iris-setosa', 'Iris-versicolor', 'Iris-virginica'],[0,1,2])
targets = keras.utils.np_utils.to_categorical(targets, num_classes=3)
model = Sequential()
model.add(Dense(10, input_shape=(4,)))
model.add(Dense(10, activation="relu"))
model.add(Dense(3, activation="sigmoid"))
model.compile(optimizer='rmsprop',
              loss='categorical_crossentropy',
              metrics=['acc'])
tensorboard = TensorBoard(log_dir="logs/{}".format(time()))
callbacks_list = [tensorboard]
model.fit(x=df, y=targets, epochs=100, batch_size=10, callbacks=callbacks_list, verbose=0)
print("Created model and loaded weights from file")
scores = model.evaluate(x=df, y=targets, verbose=0)
print("%s: %.2f%%" % (model.metrics_names[1], scores[1]*100))
%tensorboard --logdir=logs/

 

BaseLogger and History

import keras
import pandas as pd
from keras import Sequential
from keras.layers import Dense
import seaborn as sns
import matplotlib.pyplot as plt
from sklearn import preprocessing
df = pd.read_csv('/content/sample_data/Iris.csv')
df.head()
#sns.pairplot(df, hue="Species")
targets = df['Species']
df.drop(['Species', 'Id'], axis = 1, inplace=True)
x = df.values #returns a numpy array
min_max_scaler = preprocessing.MinMaxScaler()
x_scaled = min_max_scaler.fit_transform(x)
df = pd.DataFrame(x_scaled)
targets = targets.replace(['Iris-setosa', 'Iris-versicolor', 'Iris-virginica'],[0,1,2])
targets = keras.utils.np_utils.to_categorical(targets, num_classes=3)
model = Sequential()
model.add(Dense(10, input_shape=(4,)))
model.add(Dense(10, activation="relu"))
model.add(Dense(3, activation="sigmoid"))
model.compile(optimizer='rmsprop',
              loss='categorical_crossentropy',
              metrics=['acc'])
# The History callback is applied automatically; model.fit returns it.
history = model.fit(x=df, y=targets, epochs=100, batch_size=10, verbose=0)
plt.plot(history.history['loss'], label='Loss')
plt.plot(history.history['acc'], label='Accuracy')
plt.ylabel('Loss/Acc value')
plt.xlabel('No of epoch')
plt.legend(loc="upper left")
plt.show()

 

Conclusion

In this blog, we have learned about callbacks and callback functions in Keras.

Feel free to leave a comment with any suggestions or questions.
