Emotion Detection Using CNN in Python Using Keras

In this article, we'll learn real-time emotion detection using a CNN. We'll use the FER2013 dataset, which you can download from Kaggle – Dataset-FER2013. It contains 48×48 px grayscale face images labeled with 7 emotion classes (0=angry, 1=disgust, 2=fear, 3=happy, 4=sad, 5=surprise, 6=neutral).
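For later use (e.g. labeling predictions), the class indices can be kept in a small lookup table; the names below come straight from the dataset description:

```python
# Map FER2013 label indices to emotion names (from the dataset description)
EMOTIONS = {0: "angry", 1: "disgust", 2: "fear", 3: "happy",
            4: "sad", 5: "surprise", 6: "neutral"}

print(EMOTIONS[3])  # -> happy
```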

Note: Use Google Colab with a GPU runtime enabled.

Since we are using Google Colab, upload the dataset to your Google Drive; we'll access it from there.

IMPORT REQUIRED LIBRARIES

Here we'll use the Keras deep learning API that ships with TensorFlow, along with a few other Python modules. Below we import everything we need.

import os
import numpy as np
import pandas as pd
import tensorflow as tf
import keras
from keras.models import Sequential
from keras.layers import Dense, Dropout, Flatten
from keras.layers import Conv2D, MaxPooling2D, BatchNormalization
from keras.losses import categorical_crossentropy
from keras.optimizers import Adam
from keras.utils import np_utils  # in newer Keras versions, use: from keras.utils import to_categorical

LOADING, READING AND PRE-PROCESSING THE DATASET

See the Python code below:

from google.colab import drive
drive.mount('/content/drive')  # this prompts you to authorize Drive access; once granted, the dataset can be read from there

os.chdir("ENTER PATH WHERE DATASET IS IN YOUR DRIVE")  # e.g. /content/drive/My Drive/Emotion Detection

df = pd.read_csv("archive.zip")  # pandas reads the zipped CSV directly
print(df)

OUTPUT:-

       emotion                                             pixels        Usage
0            0  70 80 82 72 58 58 60 63 54 58 60 48 89 115 121...     Training
1            0  151 150 147 155 148 133 111 140 170 174 182 15...     Training
2            2  231 212 156 164 174 138 161 173 182 200 106 38...     Training
3            4  24 32 36 30 32 23 19 20 30 41 21 22 32 34 21 1...     Training
4            6  4 0 0 0 0 0 0 0 0 0 0 0 3 15 23 28 48 50 58 84...     Training
...        ...                                                ...          ...
35882        6  50 36 17 22 23 29 33 39 34 37 37 37 39 43 48 5...  PrivateTest
35883        3  178 174 172 173 181 188 191 194 196 199 200 20...  PrivateTest
35884        0  17 17 16 23 28 22 19 17 25 26 20 24 31 19 27 9...  PrivateTest
35885        3  30 28 28 29 31 30 42 68 79 81 77 67 67 71 63 6...  PrivateTest
35886        2  19 13 14 12 13 16 21 33 50 57 71 84 97 108 122...  PrivateTest

[35887 rows x 3 columns]
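The Usage column is what drives the train/test split in the next step. On a toy frame with the same three columns (illustrative values, not the real dataset), the partition looks like this:

```python
import pandas as pd

# Toy frame mimicking fer2013.csv's columns
toy = pd.DataFrame({
    "emotion": [0, 3, 2, 5],
    "pixels": ["70 80", "24 32", "19 13", "4 0"],
    "Usage": ["Training", "Training", "PublicTest", "PrivateTest"],
})

# Rows are assigned to splits by matching the Usage column
train = toy[toy["Usage"] == "Training"]
public_test = toy[toy["Usage"] == "PublicTest"]
print(len(train), len(public_test))  # -> 2 1
```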

Now we'll build separate lists for the train and test image pixels. For each row, if Usage is 'Training', we append the pixels to X_train and the emotion to train_y; if Usage is 'PublicTest', we append to the test lists:

X_train,train_y,X_test,test_y=[],[],[],[]  


for index, row in df.iterrows():  
    val=row['pixels'].split(" ")  
    if 'Training' in row['Usage']:
      X_train.append(np.array(val,'float32'))  
      train_y.append(row['emotion'])  
    elif 'PublicTest' in row['Usage']:  
      X_test.append(np.array(val,'float32'))  
      test_y.append(row['emotion'])

Next we convert the lists to NumPy arrays, standardize and reshape the pixels of X_train and X_test, and one-hot encode the emotions in train_y and test_y:

num_features = 64  
num_labels = 7  
batch_size = 64  
epochs = 175
width, height = 48, 48  


X_train = np.array(X_train,'float32')  
train_y = np.array(train_y,'float32')  
X_test = np.array(X_test,'float32')  
test_y = np.array(test_y,'float32')  

train_y=np_utils.to_categorical(train_y, num_classes=num_labels)  
test_y=np_utils.to_categorical(test_y, num_classes=num_labels)  

X_train -= np.mean(X_train, axis=0)  
X_train /= np.std(X_train, axis=0)  

X_test -= np.mean(X_test, axis=0)  
X_test /= np.std(X_test, axis=0)  

X_train = X_train.reshape(X_train.shape[0], 48, 48, 1)  

X_test = X_test.reshape(X_test.shape[0], 48, 48, 1)
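The standardization and reshape above can be sanity-checked on dummy data. A minimal sketch, assuming the same 48×48 grayscale layout (the random array stands in for the real pixel lists):

```python
import numpy as np

# Fake data: 10 flattened 48x48 grayscale images
rng = np.random.default_rng(0)
X = rng.uniform(0, 255, size=(10, 48 * 48)).astype("float32")

# Per-pixel standardization, as in the code above
X -= np.mean(X, axis=0)
X /= np.std(X, axis=0)

# Reshape to (samples, height, width, channels) as Conv2D expects
X = X.reshape(X.shape[0], 48, 48, 1)

print(X.shape)  # -> (10, 48, 48, 1)
```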

BUILDING THE MODEL

model = Sequential()  

model.add(Conv2D(64, kernel_size=(3, 3), activation='relu', input_shape=(X_train.shape[1:]), padding='same'))  
model.add(Conv2D(64,kernel_size= (3, 3), activation='relu', padding='same'))  
model.add(BatchNormalization())  
model.add(MaxPooling2D(pool_size=(2,2)))  
model.add(Dropout(0.3))  

#2nd convolution layer  
model.add(Conv2D(64, (3, 3), activation='relu', padding='same'))  
model.add(Conv2D(64, (3, 3), activation='relu', padding='same'))  
model.add(BatchNormalization())  
model.add(MaxPooling2D(pool_size=(2,2)))  
model.add(Dropout(0.3))  

#3rd convolution layer  
model.add(Conv2D(128, (3, 3), activation='relu', padding='same'))  
model.add(Conv2D(128, (3, 3), activation='relu', padding='same'))  
model.add(BatchNormalization())  
model.add(MaxPooling2D(pool_size=(2,2)))  

model.add(Flatten())  
model.add(Dense(num_labels, activation='softmax'))  

model.summary()

OUTPUT:-

Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d (Conv2D)              (None, 48, 48, 64)        640       
_________________________________________________________________
conv2d_1 (Conv2D)            (None, 48, 48, 64)        36928     
_________________________________________________________________
batch_normalization (BatchNo (None, 48, 48, 64)        256       
_________________________________________________________________
max_pooling2d (MaxPooling2D) (None, 24, 24, 64)        0         
_________________________________________________________________
dropout (Dropout)            (None, 24, 24, 64)        0         
_________________________________________________________________
conv2d_2 (Conv2D)            (None, 24, 24, 64)        36928     
_________________________________________________________________
conv2d_3 (Conv2D)            (None, 24, 24, 64)        36928     
_________________________________________________________________
batch_normalization_1 (Batch (None, 24, 24, 64)        256       
_________________________________________________________________
max_pooling2d_1 (MaxPooling2 (None, 12, 12, 64)        0         
_________________________________________________________________
dropout_1 (Dropout)          (None, 12, 12, 64)        0         
_________________________________________________________________
conv2d_4 (Conv2D)            (None, 12, 12, 128)       73856     
_________________________________________________________________
conv2d_5 (Conv2D)            (None, 12, 12, 128)       147584    
_________________________________________________________________
batch_normalization_2 (Batch (None, 12, 12, 128)       512       
_________________________________________________________________
max_pooling2d_2 (MaxPooling2 (None, 6, 6, 128)         0         
_________________________________________________________________
flatten (Flatten)            (None, 4608)              0         
_________________________________________________________________
dense (Dense)                (None, 7)                 32263     
=================================================================
Total params: 366,151
Trainable params: 365,639
Non-trainable params: 512
_________________________________________________________________
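As a sanity check, each parameter count in the summary can be reproduced by hand: a Conv2D layer has (kernel_h × kernel_w × in_channels + 1 bias) parameters per filter, BatchNormalization has 4 per channel (2 trainable, 2 not), and Dense has (inputs + 1) × units:

```python
def conv_params(k, cin, cout):
    # (kernel_h * kernel_w * in_channels + 1 bias) per filter
    return (k * k * cin + 1) * cout

print(conv_params(3, 1, 64))     # -> 640    (conv2d)
print(conv_params(3, 64, 64))    # -> 36928  (conv2d_1, _2, _3)
print(conv_params(3, 64, 128))   # -> 73856  (conv2d_4)
print(conv_params(3, 128, 128))  # -> 147584 (conv2d_5)
print((6 * 6 * 128 + 1) * 7)     # -> 32263  (dense, on the flattened 4608 inputs)
```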

COMPILE AND TRAIN THE MODEL

model.compile(loss=categorical_crossentropy,  
              optimizer=Adam(),  
              metrics=['accuracy'])  


model.fit(X_train, train_y,
          batch_size=batch_size,  # with NumPy arrays, steps per epoch are inferred from batch_size
          epochs=epochs,
          verbose=1,
          validation_data=(X_test, test_y))

OUTPUT:-

Epoch 1/175
449/448 [==============================] - 21s 47ms/step - loss: 1.8567 - accuracy: 0.3392 - val_loss: 1.7904 - val_accuracy: 0.3413
Epoch 2/175
449/448 [==============================] - 20s 46ms/step - loss: 1.5181 - accuracy: 0.4424 - val_loss: 1.4303 - val_accuracy: 0.4921
Epoch 3/175
449/448 [==============================] - 20s 45ms/step - loss: 1.3464 - accuracy: 0.5004 - val_loss: 1.3037 - val_accuracy: 0.5127
...
(epochs 4–174 omitted; training accuracy climbs steadily toward ~0.98 while validation accuracy plateaus around 0.60)
...
Epoch 175/175
449/448 [==============================] - 21s 46ms/step - loss: 0.0479 - accuracy: 0.9857 - val_loss: 2.5053 - val_accuracy: 0.6069
<tensorflow.python.keras.callbacks.History at 0x7f643d069828>

We have successfully trained our model. The final training accuracy is about 98%, but note from the log above that the validation accuracy plateaus around 60%, so the model overfits the training data; the 98% figure only describes performance on images the model has already seen.
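The difference between those two numbers is the generalization gap. A quick sketch using the final epoch's metrics copied from the training log above (epoch 175/175):

```python
# Final-epoch metrics taken from the training log above (epoch 175/175).
train_acc = 0.9857  # accuracy on the training images
val_acc = 0.6069    # val_accuracy on held-out validation images

# The generalization gap: a large gap means the model fits the training
# set far better than it predicts unseen images, i.e. it overfits.
gap = train_acc - val_acc
print(f"generalization gap: {gap:.4f}")  # about 0.38
```

Regularization, more aggressive dropout, or early stopping on `val_loss` are common ways to shrink this gap.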

SAVE THE MODEL

m_json = model.to_json()                    # serialize the model architecture to JSON
with open("m.json", "w") as json_file:
    json_file.write(m_json)
model.save_weights("Detection_Emotion.h5")  # save the trained weights separately

Now comes the interesting part: we'll use OpenCV to predict emotions in real time. First, download the saved model and weights, because from here on we'll work on our local machines. Why? Because `cv2.VideoCapture` doesn't work in Google Colab; accessing a webcam from Colab isn't straightforward.

NOTE: You need TensorFlow version '2.3.0' and Keras version '2.4.1' on your local machine.
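Since this tutorial is pinned to specific versions, a quick sanity check at the top of your local script can catch mismatches early. A minimal sketch (the required versions are the ones from the note above; in practice you would pass `tensorflow.__version__` and `keras.__version__` as the installed values):

```python
# Compare an installed package version against the one this tutorial was tested with.
def version_tuple(v):
    """Convert a dotted version string like '2.3.0' into a comparable tuple."""
    return tuple(int(part) for part in v.split(".")[:3])

REQUIRED = {"tensorflow": "2.3.0", "keras": "2.4.1"}

def check_version(installed, required):
    """Return True only if the installed version matches the required one exactly."""
    return version_tuple(installed) == version_tuple(required)

print(check_version("2.3.0", REQUIRED["tensorflow"]))  # True: exact match
print(check_version("2.4.0", REQUIRED["keras"]))       # False: 2.4.0 != 2.4.1
```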

#import libraries
import cv2
import numpy as np
from keras.models import model_from_json
from keras.preprocessing import image

#loading model  
model = model_from_json(open("m.json", "r").read())  
#loading weights  
model.load_weights('Detection_Emotion.h5')

Then, use the code given below to open your webcam and predict emotions.

face_haar_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + 'haarcascade_frontalface_default.xml')

cap=cv2.VideoCapture(0)  

while True:  
    ret,test_img=cap.read()  
    if not ret:  
        continue  
    gray_img= cv2.cvtColor(test_img, cv2.COLOR_BGR2GRAY)  

    faces_detected = face_haar_cascade.detectMultiScale(gray_img, 1.32, 5)  


    for (x,y,w,h) in faces_detected:  
        cv2.rectangle(test_img,(x,y),(x+w,y+h),(255,0,0),thickness=7)  
        roi_gray=gray_img[y:y+h,x:x+w]  #crop the face region: rows are y to y+h, columns x to x+w
        roi_gray=cv2.resize(roi_gray,(48,48))  
        img_pixels = image.img_to_array(roi_gray)  
        img_pixels = np.expand_dims(img_pixels, axis = 0)  
        img_pixels /= 255  

        predictions = model.predict(img_pixels)    
        max_index = np.argmax(predictions[0])  

        emotions = ('angry', 'disgust', 'fear', 'happy', 'sad', 'surprise', 'neutral')  
        predicted_emotion = emotions[max_index]  

        cv2.putText(test_img, predicted_emotion, (int(x), int(y)), cv2.FONT_HERSHEY_SIMPLEX, 1, (0,0,255), 2)  

    resized_img = cv2.resize(test_img, (1000, 700))  
    cv2.imshow('Facial emotion analysis ',resized_img)  



    if cv2.waitKey(10) == ord('q'):#press 'q' key to exit  
        break  

cap.release()  
cv2.destroyAllWindows()

OUTPUT:-

Here, you can observe that our model predicts reasonably well on webcam frames. We have successfully done real-time emotion detection.
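The loop above turns the model's 7-way softmax output into a label with `np.argmax`. As a standalone illustration of that step, with a made-up prediction vector (not real model output):

```python
import numpy as np

# Same label order as the FER2013 classes used throughout this article.
emotions = ('angry', 'disgust', 'fear', 'happy', 'sad', 'surprise', 'neutral')

# Hypothetical softmax output for one detected face; the 7 probabilities sum to 1.
predictions = np.array([[0.05, 0.02, 0.08, 0.60, 0.10, 0.05, 0.10]])

max_index = np.argmax(predictions[0])    # index of the most probable class
predicted_emotion = emotions[max_index]
print(predicted_emotion)  # happy (index 3 has the highest probability)
```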

Thanks for Reading.
