ReLU Layer in Keras | Python

Hello everyone! In this tutorial, we will learn about the ReLU layer in Keras with Python code examples.

ReLU stands for Rectified Linear Unit and acts as an activation layer in Keras. An activation layer in Keras applies an activation function to the output of the previous layer, which is equivalent to passing that activation function as an argument to the previous layer itself. The activation function performs a mathematical operation on each element of the given input and passes the result on as the layer's output.
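For instance, the following minimal sketch shows the two equivalent ways of applying ReLU (the Dense layer size of 3 and the sample input are arbitrary choices for illustration):

import tensorflow as tf

x = tf.constant([[-1.0, 2.0, -3.0, 4.0]])

# ReLU passed as an activation argument to a Dense layer
dense_with_relu = tf.keras.layers.Dense(3, activation='relu')

# The same thing written as two steps: a Dense layer followed by a ReLU layer
dense = tf.keras.layers.Dense(3)
relu_layer = tf.keras.layers.ReLU()

y1 = dense_with_relu(x)    # Dense + ReLU in one layer
y2 = relu_layer(dense(x))  # Dense, then ReLU as a separate layer
# (y1 and y2 differ numerically only because the two Dense layers
#  are initialised with independent random weights.)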

If this layer is used as the first layer in a Keras model, then the keyword argument input_shape (a tuple of integers, not including the samples axis) should be provided.
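For example, a minimal sketch of a ReLU layer used as the first layer (the input shape of (4,) and the Dense layer after it are made up for illustration; newer Keras versions may suggest using an Input layer instead of the input_shape argument):

import tensorflow as tf

# ReLU as the first layer, so input_shape is supplied
model = tf.keras.Sequential([
    tf.keras.layers.ReLU(input_shape=(4,)),
    tf.keras.layers.Dense(1)
])
model.summary()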

A ReLU Layer

tf.keras.layers.ReLU(max_value=None, negative_slope=0, threshold=0)

A ReLU layer accepts three arguments:

max_value: A float greater than or equal to 0. Its default value is None, which means the output is unbounded.
negative_slope: A float greater than or equal to 0. Its default value is 0.
threshold: A float. Its default value is 0.

An Example of a ReLU Layer using Keras

# Importing the libraries
import tensorflow as tf
# Creating a ReLU layer with the default arguments
relu_layer = tf.keras.layers.ReLU()
# Applying the layer element-wise to a list of floats
result = relu_layer([-5.0, -2.0, 0.0, 3.0])
print(list(result.numpy()))

First, we need to import TensorFlow and create a ReLU layer. Then, a list of floats is given as input to the ReLU layer. At last, the output is converted into a Python list and printed.

Output:

[0.0, 0.0, 0.0, 3.0]

Since we have not given any arguments in the ReLU layer, the default values of negative_slope and threshold are 0. The default value of max_value is None, i.e. unlimited.
Here, each element of the input given to the ReLU layer acts as x, and the activation function f(x) is computed with respect to the arguments. The computation is done element-wise and follows the mathematical function below.

f(x) = 0 if x <= 0
       x if x > 0

Since -5.0, -2.0, and 0.0 in the input are less than or equal to zero, they become 0 in the output. And since 3.0 is greater than zero, it remains the same in the output.
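To make the element-wise rule concrete, here is a minimal plain-Python sketch that reproduces the same output (this only illustrates the formula, it is not how Keras implements it internally):

inputs = [-5.0, -2.0, 0.0, 3.0]
outputs = [x if x > 0 else 0.0 for x in inputs]
print(outputs)  # [0.0, 0.0, 0.0, 3.0]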

ReLU layer with different arguments

Argument: max_value

import tensorflow as tf
# ReLU layer with the activation capped at max_value
relu_layer = tf.keras.layers.ReLU(max_value=2.0)
result = relu_layer([-5.0, -2.0, 0.0, 3.0])
print(list(result.numpy()))

Output:

[0.0, 0.0, 0.0, 2.0]

Just like the previous example, the computations are done element-wise, but the output is now capped at max_value. With negative_slope and threshold left at their defaults, the layer follows the mathematical function below.

f(x) = 0 if x <= 0
       x if 0 < x < max_value
       max_value if x >= max_value

Since -5.0, -2.0, and 0.0 in the input are less than or equal to zero, they become 0 in the output. And since 3.0 is greater than max_value (2.0), it is replaced with max_value in the output.
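In other words, with the other arguments at their defaults, the capped ReLU is the same as min(max(x, 0), max_value) applied element-wise. A minimal plain-Python sketch:

max_value = 2.0
inputs = [-5.0, -2.0, 0.0, 3.0]
outputs = [min(max(x, 0.0), max_value) for x in inputs]
print(outputs)  # [0.0, 0.0, 0.0, 2.0]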

Argument: negative_slope

import tensorflow as tf
# ReLU layer with a non-zero slope for negative inputs
relu_layer = tf.keras.layers.ReLU(negative_slope=2.0)
result = relu_layer([-5.0, -2.0, 0.0, 3.0])
print(list(result.numpy()))

Output:

[-10.0, -4.0, 0.0, 3.0]

Here, the mathematical function followed is:

f(x) = negative_slope * x if x <= 0
       x if x > 0

As we can see in the above output, all the negative values in the input are multiplied by the value of negative_slope, and the positive values remain the same.
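This is the same behaviour as a leaky ReLU. A minimal plain-Python sketch that reproduces the output:

negative_slope = 2.0
inputs = [-5.0, -2.0, 0.0, 3.0]
outputs = [x if x > 0 else negative_slope * x for x in inputs]
print(outputs)  # [-10.0, -4.0, 0.0, 3.0]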

Argument: threshold

import tensorflow as tf
# ReLU layer that zeroes out values at or below the threshold
relu_layer = tf.keras.layers.ReLU(threshold=2.5)
result = relu_layer([-5.0, -1.5, 2.5, 3.0])
print(list(result.numpy()))

Output:

[-0.0, -0.0, 0.0, 3.0]

Here, the mathematical function followed is:

f(x) = 0 if x <= threshold
       x if x > threshold

As we can see in the above output, the values in the input that are less than or equal to the threshold become zero (the -0.0 entries are floating-point negative zeros and are equal to 0), and the value greater than the threshold remains the same.
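A minimal plain-Python sketch of the same thresholding rule:

threshold = 2.5
inputs = [-5.0, -1.5, 2.5, 3.0]
outputs = [x if x > threshold else 0.0 for x in inputs]
print(outputs)  # [0.0, 0.0, 0.0, 3.0]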

I hope that this tutorial will help you in understanding the ReLU layer easily and efficiently. Thank you.
