Activation Functions in Keras
In this tutorial, we will learn about activation functions and how to use them with Keras. This is one of the more mathematical parts of deep learning.
Activation Functions
Suppose we have to classify an object into one of many categories, for example fruits. We look at the image and, based on its properties (color, shape, etc.), we decide whether the fruit is an apple, an orange, and so on. Obviously, we could do this with a bunch of "If-Else" statements, as in normal code, but that would be a very tedious process (consider 1000 categories). Alternatively, we can use activation functions in our neural networks to decide which neurons should be activated.
In a neural network, we have different weights for the neurons in each layer. On top of this, we have activation functions that decide whether a neuron should be activated or not. Essentially, activation functions bring nonlinearity into our transformation; without them, stacking layers would only ever produce a linear mapping. We can also use different activation functions for different layers.
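As a quick illustration, here is a minimal sketch of a small Keras model that uses a different activation function in each layer (the layer sizes and input shape are arbitrary placeholders):

import tensorflow as tf
from tensorflow.keras import layers

# A small Sequential model; each Dense layer uses a different activation.
model = tf.keras.Sequential([
    layers.Dense(16, activation='relu', input_shape=(8,)),   # hidden layer with ReLU
    layers.Dense(8, activation='tanh'),                      # hidden layer with tanh
    layers.Dense(1, activation='sigmoid'),                   # output layer with sigmoid
])
model.summary()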
Step Function
The step function is one of the simplest activation functions: if the input is greater than or equal to 0, the neuron is activated; otherwise it is not. You can also set a different threshold instead of 0. There is no built-in step function in Keras, as it is already very simple to write yourself.
# Mathematically:
# f(x) = 1 if x >= 0
# f(x) = 0 if x < 0

def step(x):
    if x >= 0:
        return 1
    else:
        return 0

print(step(3))
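Since the threshold does not have to be 0, here is a small sketch of the same function with the threshold as a parameter (the default of 0 reproduces the version above):

def step_threshold(x, threshold=0.0):
    # Activate only when the input reaches the chosen threshold.
    return 1 if x >= threshold else 0

print(step_threshold(3, threshold=5))   # 0, because 3 < 5
print(step_threshold(3))                # 1, default threshold of 0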
Sigmoid Function
One of the most important functions for neural networks. It squashes its input into the range (0, 1) and brings nonlinearity into the model.
import math

# Mathematically:
# Sigmoid function
# f(x) = 1 / (1 + exp(-x))

def sigmoid(x):
    k = math.exp(-x)
    return 1 / (1 + k)

print(sigmoid(4))
For Keras, we can use this directly as:
import tensorflow as tf

a = tf.constant([-20, -1.0, 0.0, 1.0, 20], dtype=tf.float32)
b = tf.keras.activations.sigmoid(a)
b.numpy()
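In a layer, sigmoid (like any built-in activation) can be passed either by its string name or as the function object; the two lines below are equivalent, and the layer size of 1 is just a placeholder:

import tensorflow as tf
from tensorflow.keras.layers import Dense

# Both layers apply the sigmoid activation to their output.
layer_by_name = Dense(1, activation='sigmoid')
layer_by_function = Dense(1, activation=tf.keras.activations.sigmoid)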
ReLU
The Rectified Linear Unit (ReLU) often trains better than sigmoid or tanh in practice, since it is cheap to compute and does not saturate for positive inputs. Besides, there are several modifications of this function, such as:
- Leaky ReLU
- Exponential Linear Unit (ELU)
import math

# For ReLU
# f(x) = max(0.0, x)
# A straight line in the first quadrant
def relu(x):
    return max(0.0, x)

# For Leaky ReLU
# f(x) = x          if x >= 0
# f(x) = alpha * x  if x < 0
# You can choose different values of alpha
def leakyrelu(x, alpha):
    if x >= 0:
        return x
    else:
        return alpha * x

# For Exponential Linear Units (ELU)
# f(x) = x                     if x >= 0
# f(x) = alpha * (exp(x) - 1)  if x < 0
def elu(x, alpha):
    if x >= 0:
        return x
    else:
        return alpha * (math.exp(x) - 1)

print(relu(3))
print(leakyrelu(-3, 0.5))
print(elu(-3, 0.5))
In Keras we can use:
import tensorflow as tf
from keras.layers import LeakyReLU, ELU

foo = tf.constant([-10, -5, 0.0, 5, 10], dtype=tf.float32)
tf.keras.activations.relu(foo).numpy()

# For LeakyReLU, instead of cnn_model.add(Activation('relu')):
cnn_model.add(LeakyReLU(alpha=0.1))

# For ELU
elu_alpha = 0.5
model.add(ELU(alpha=elu_alpha))
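Since cnn_model and model above refer to models you have already defined, here is a self-contained sketch (layer sizes and input shape are placeholders) that uses LeakyReLU and ELU as separate layers in a small Sequential model:

import tensorflow as tf
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Dense, LeakyReLU, ELU

model = Sequential([
    Dense(32, input_shape=(8,)),   # no activation here; applied as a separate layer below
    LeakyReLU(alpha=0.1),
    Dense(16),
    ELU(alpha=0.5),
    Dense(1, activation='sigmoid'),
])
model.summary()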
Softmax Function
It converts a vector of scores into a probability distribution: large scores are pushed toward 1 and small ones toward 0, and the outputs sum to 1. Note that it takes a vector as input, not a single number.
# Softmax function
# f(x) = exp(x) / sum(exp(x))
import numpy as np

def softmax(x):
    ex = np.exp(x - np.max(x))   # subtract the max for numerical stability
    return ex / ex.sum()

a = [1, 2, 3]
print(softmax(a))
In Keras we can use:
import tensorflow as tf

# Keras softmax expects at least a 2-D tensor (one row per sample).
a = tf.constant([[-1.0, 0.0, 1.0]], dtype=tf.float32)
b = tf.keras.activations.softmax(a)
b.numpy()

# In layers you can use:
from keras.layers import Activation, Input, Dense
from keras.models import Model

# x2 is the output tensor of a previous layer
x2 = Activation('softmax')(x2)
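Tying this back to the fruit example from the beginning, a classifier over many categories usually ends in a softmax output layer. Below is a minimal sketch; the input shape, hidden size, and the 1000 categories are placeholders:

from tensorflow.keras import Sequential
from tensorflow.keras.layers import Dense

num_classes = 1000   # e.g. 1000 fruit categories

model = Sequential([
    Dense(128, activation='relu', input_shape=(64,)),
    Dense(num_classes, activation='softmax'),   # one probability per category
])
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])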
Tanh Function
A nonlinear function with range -1 to 1. Its derivative is steeper than that of the sigmoid function.
import math

# For the tanh function
# f(x) = (exp(x) - exp(-x)) / (exp(x) + exp(-x))
def tanh(x):
    k = math.exp(x)
    l = math.exp(-x)
    return (k - l) / (k + l)

print(tanh(0.5))
In Keras we can use:
import tensorflow as tf
from tensorflow.keras.layers import Dense

a = tf.constant([-3.0, -1.0, 0.0, 1.0, 3.0], dtype=tf.float32)
b = tf.keras.activations.tanh(a)
b.numpy()

# For layers in a neural network (model is an existing Sequential model)
model.add(Dense(12, input_shape=(8,), activation='tanh'))
model.add(Dense(8, activation='tanh'))
Activation functions are one of the most important components of a neural network. There are a few more, less commonly used functions (such as softplus, softsign, and SELU), which you can look up in the Keras documentation.
You can ask your doubts in the comments section.
Machine Learning is ❤️