Multi-Layer Perceptron in Keras, with an Example
In this blog, we are going to understand the Multi-Layer Perceptron (MLP) through its implementation in Keras. Keras is a Python deep learning library that runs on top of TensorFlow and lets you build models as a sequence of layers.
It is important to learn about perceptrons because they are the building blocks of larger neural networks. This blog covers a brief introduction to the perceptron, the MLP, and an implementation in Keras.
What is Perceptron?
The perceptron is a simple neural network, proposed by Frank Rosenblatt, that performs binary classification: its output can be interpreted as 'true' or 'false'. For example, in a face detection system, the model would indicate whether an input image contains a human face, and if it does, whether it is the face of a specific person.
It is inspired by the neuron, the basic functional unit of the human brain. Although the perceptron is a simple learning machine, it is the basic building block of artificial neural networks. As we can see from the figure below, the input values (x) are multiplied by the weights (w), summed together in a linear combination, and the result is passed to the activation function.
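To make this concrete, here is a minimal sketch of a single perceptron's forward pass in NumPy; the input, weight, and bias values are illustrative assumptions, not taken from the figure.
import numpy as np

def perceptron(x, w, b):
    # Linear combination: sum of inputs times weights, plus the bias
    linear_combination = np.dot(w, x) + b
    # Step activation: output 'true' (1) or 'false' (0)
    return 1 if linear_combination > 0 else 0

x = np.array([0.5, -1.2, 3.0])   # inputs (illustrative)
w = np.array([0.4, 0.7, -0.2])   # weights (illustrative)
b = 0.1                          # bias (illustrative)
print(perceptron(x, w, b))       # prints 0 for these values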
What is a Multilayer Perceptron?
A multilayer perceptron stacks several layers of perceptrons, which gives it the ability to solve problems ranging from simple to complex. For example, the figure below shows two neurons in the input layer, four neurons in the hidden layer, and one neuron in the output layer. Depending on the number of layers, a multilayer perceptron (also simply called a neural network) can be classified as a shallow neural network or a deep neural network.
Such networks are not limited to binary decision functions. In the figure below, the perceptrons are arranged in layers, each serving a different function. MLPs are universal approximators: they are mathematically capable of learning any mapping function.
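As a rough illustration of how stacked layers compute an output, the sketch below runs a forward pass through a 2-4-1 network like the one in the figure; the weights are randomly initialized, and the choice of ReLU and sigmoid activations is an assumption made only for illustration.
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 2)), np.zeros(4)   # input layer (2) -> hidden layer (4)
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)   # hidden layer (4) -> output layer (1)

def relu(z):
    return np.maximum(0, z)

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def forward(x):
    h = relu(W1 @ x + b1)          # hidden layer: linear combination + non-linearity
    return sigmoid(W2 @ h + b2)    # output layer: a value between 0 and 1

print(forward(np.array([0.3, 0.8])))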
Implementation of Multi-layer Perceptron in Python using Keras
The basic components of the perceptron are inputs, weights and biases, the linear combination, and the activation function. The basic terminology for each component is as follows (a short code sketch after the list shows how these map onto a Keras layer).
Inputs are the real-valued features fed to the perceptron.
Weights are the parameters of the network that transform the input data.
Bias is an additional parameter that shifts the weighted sum, adjusting the output.
The linear combination is the weighted sum of the inputs plus the bias.
The activation function applies a non-linear transformation to the linear combination to produce the output.
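These same components are bundled inside a Keras Dense layer: it stores a weight matrix (the kernel) and a bias vector, computes the linear combination, and applies the activation. The layer sizes below are arbitrary and only meant to show where each component lives.
import numpy as np
from keras.layers import Dense

layer = Dense(4, activation='relu')     # 4 output neurons
x = np.ones((1, 3), dtype='float32')    # one sample with 3 input features
y = layer(x)                            # builds the layer and runs the forward pass

kernel, bias = layer.get_weights()
print(kernel.shape, bias.shape)         # (3, 4) weights and (4,) biases
print(y.shape)                          # (1, 4) activations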
To understand this further, we are going to implement a classification task on the MNIST dataset of handwritten digits using the Keras deep learning library.
Let us first load the MNIST dataset and create the training and test set variables. We will use keras.utils.to_categorical to one-hot encode the labels into 10 categories, and reshape each 28x28 image into a 784-dimensional vector so it can be fed to the Dense layers.
import tensorflow as tf
import numpy as np
import keras
from keras.datasets import mnist
from keras.models import Sequential
from keras.layers import Dense, Activation
from keras.utils import to_categorical
(x_train, y_train), (x_test, y_test) = mnist.load_data()
# Scale pixel values from [0, 255] to [0, 1]
x_train, x_test = x_train / 255.0, x_test / 255.0
# Flatten each 28x28 image into a 784-dimensional vector for the Dense layers
x_train = x_train.reshape(-1, 784)
x_test = x_test.reshape(-1, 784)
# One-hot encode the labels into 10 classes
y_train = to_categorical(y_train, 10)
y_test = to_categorical(y_test, 10)
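To sanity-check the preprocessing, you can print the resulting shapes; the expected values are shown in the comments.
print(x_train.shape, y_train.shape)   # (60000, 784) (60000, 10)
print(x_test.shape, y_test.shape)     # (10000, 784) (10000, 10)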
Next, we create a Sequential model and add Dense hidden layers with the ReLU activation function, plus a softmax output layer with 10 units (one per digit class).
model = Sequential()
model.add(Dense(512, input_shape=(784, ), activation='relu'))
model.add(Dense(768, activation='relu'))
model.add(Dense(10, activation='softmax'))
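Before training, it can be useful to inspect the architecture with model.summary(), which lists each layer and its parameter count (each Dense layer has inputs x outputs weights plus one bias per output).
model.summary()
# Dense(512): 784 * 512 + 512 = 401,920 parameters
# Dense(768): 512 * 768 + 768 = 393,984 parameters
# Dense(10):  768 * 10  + 10  =   7,690 parameters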
Finally, we compile the model with the Adam optimizer, categorical cross-entropy loss, and accuracy as the metric, then train it for 100 epochs on the training data.
model.compile(optimizer='adam',loss='categorical_crossentropy', metrics=['accuracy'])
model.fit(x_train, y_train, epochs=100)
This is what the training output looks like.
Epoch 1/100
60000/60000 [==============================] - 27s 451us/step - loss: 1.4552 - accuracy: 0.9036
Epoch 2/100
60000/60000 [==============================] - 27s 450us/step - loss: 0.1949 - accuracy: 0.9504
Epoch 3/100
60000/60000 [==============================] - 27s 453us/step - loss: 0.1866 - accuracy: 0.9524
Epoch 4/100
60000/60000 [==============================] - 27s 454us/step - loss: 0.1758 - accuracy: 0.9559
Epoch 5/100
60000/60000 [==============================] - 28s 462us/step - loss: 0.1548 - accuracy: 0.9619
Epoch 6/100
60000/60000 [==============================] - 27s 458us/step - loss: 0.1402 - accuracy: 0.9659
Epoch 7/100
60000/60000 [==============================] - 27s 455us/step - loss: 0.1092 - accuracy: 0.9728
Epoch 8/100
60000/60000 [==============================] - 28s 473us/step - loss: 0.1184 - accuracy: 0.9720
Epoch 9/100
60000/60000 [==============================] - 28s 472us/step - loss: 0.0960 - accuracy: 0.9763
Epoch 10/100
60000/60000 [==============================] - 28s 463us/step - loss: 0.0958 - accuracy: 0.9771
Epoch 11/100
60000/60000 [==============================] - 28s 459us/step - loss: 0.0895 - accuracy: 0.9791
Epoch 12/100
60000/60000 [==============================] - 27s 454us/step - loss: 0.0847 - accuracy: 0.9805
Epoch 13/100
60000/60000 [==============================] - 27s 458us/step - loss: 0.0851 - accuracy: 0.9813
Epoch 14/100
60000/60000 [==============================] - 27s 457us/step - loss: 0.0803 - accuracy: 0.9814
Epoch 15/100
60000/60000 [==============================] - 27s 455us/step - loss: 0.0766 - accuracy: 0.9832
Epoch 16/100
60000/60000 [==============================] - 28s 469us/step - loss: 0.0780 - accuracy: 0.9835
Epoch 17/100
60000/60000 [==============================] - 28s 465us/step - loss: 0.0718 - accuracy: 0.9852
Epoch 18/100
60000/60000 [==============================] - 28s 465us/step - loss: 0.0702 - accuracy: 0.9854
Epoch 19/100
60000/60000 [==============================] - 28s 469us/step - loss: 0.0740 - accuracy: 0.9848
Epoch 20/100
60000/60000 [==============================] - 29s 476us/step - loss: 0.0649 - accuracy: 0.9858
Epoch 21/100
60000/60000 [==============================] - 28s 469us/step - loss: 0.0620 - accuracy: 0.9865
Epoch 22/100
60000/60000 [==============================] - 28s 474us/step - loss: 0.0653 - accuracy: 0.9859
Epoch 23/100
60000/60000 [==============================] - 28s 466us/step - loss: 0.0743 - accuracy: 0.9860
Epoch 24/100
60000/60000 [==============================] - 28s 469us/step - loss: 0.0634 - accuracy: 0.9868
Epoch 25/100
60000/60000 [==============================] - 28s 465us/step - loss: 0.0679 - accuracy: 0.9867
Epoch 26/100
60000/60000 [==============================] - 28s 474us/step - loss: 0.0674 - accuracy: 0.9875
Epoch 27/100
60000/60000 [==============================] - 28s 469us/step - loss: 0.0587 - accuracy: 0.9886
Epoch 28/100
60000/60000 [==============================] - 28s 473us/step - loss: 0.0581 - accuracy: 0.9891
Epoch 29/100
60000/60000 [==============================] - 28s 469us/step - loss: 0.0600 - accuracy: 0.9885
Epoch 30/100
60000/60000 [==============================] - 28s 474us/step - loss: 0.0643 - accuracy: 0.9880
Epoch 31/100
60000/60000 [==============================] - 28s 473us/step - loss: 0.0602 - accuracy: 0.9879
Epoch 32/100
60000/60000 [==============================] - 28s 467us/step - loss: 0.0700 - accuracy: 0.9860
Epoch 33/100
60000/60000 [==============================] - 28s 469us/step - loss: 0.0655 - accuracy: 0.9869
Epoch 34/100
60000/60000 [==============================] - 28s 467us/step - loss: 0.0582 - accuracy: 0.9877
Epoch 35/100
60000/60000 [==============================] - 28s 467us/step - loss: 0.0781 - accuracy: 0.9864
Epoch 36/100
60000/60000 [==============================] - 28s 471us/step - loss: 0.0681 - accuracy: 0.9870
Epoch 37/100
60000/60000 [==============================] - 28s 465us/step - loss: 0.0653 - accuracy: 0.9864
Epoch 38/100
60000/60000 [==============================] - 28s 464us/step - loss: 0.0631 - accuracy: 0.9875
Epoch 39/100
60000/60000 [==============================] - 28s 469us/step - loss: 0.0673 - accuracy: 0.9869
Epoch 40/100
60000/60000 [==============================] - 27s 457us/step - loss: 0.0642 - accuracy: 0.9876
Epoch 41/100
60000/60000 [==============================] - 27s 458us/step - loss: 0.0647 - accuracy: 0.9879
Epoch 42/100
60000/60000 [==============================] - 28s 461us/step - loss: 0.0779 - accuracy: 0.9863
Epoch 43/100
60000/60000 [==============================] - 27s 453us/step - loss: 0.0668 - accuracy: 0.9865
Epoch 44/100
60000/60000 [==============================] - 32s 529us/step - loss: 0.0627 - accuracy: 0.9871
Epoch 45/100
60000/60000 [==============================] - 28s 468us/step - loss: 0.0586 - accuracy: 0.9881
Epoch 46/100
60000/60000 [==============================] - 28s 474us/step - loss: 0.0501 - accuracy: 0.9899
Epoch 47/100
60000/60000 [==============================] - 29s 483us/step - loss: 0.0656 - accuracy: 0.9879
Epoch 48/100
60000/60000 [==============================] - 29s 482us/step - loss: 0.0757 - accuracy: 0.9869
Epoch 49/100
60000/60000 [==============================] - 28s 470us/step - loss: 0.0589 - accuracy: 0.9883
Epoch 50/100
60000/60000 [==============================] - 28s 470us/step - loss: 0.0711 - accuracy: 0.9877
Epoch 51/100
60000/60000 [==============================] - 28s 467us/step - loss: 0.0613 - accuracy: 0.9880
Epoch 52/100
60000/60000 [==============================] - 29s 479us/step - loss: 0.0786 - accuracy: 0.9867
Epoch 53/100
60000/60000 [==============================] - 29s 487us/step - loss: 0.0664 - accuracy: 0.9870
Epoch 54/100
60000/60000 [==============================] - 28s 470us/step - loss: 0.0723 - accuracy: 0.9864
Epoch 55/100
60000/60000 [==============================] - 28s 470us/step - loss: 0.0835 - accuracy: 0.9865
Epoch 56/100
60000/60000 [==============================] - 28s 468us/step - loss: 0.0601 - accuracy: 0.9880
Epoch 57/100
60000/60000 [==============================] - 28s 465us/step - loss: 0.0605 - accuracy: 0.9882
Epoch 58/100
60000/60000 [==============================] - 29s 475us/step - loss: 0.0641 - accuracy: 0.9865
Epoch 59/100
60000/60000 [==============================] - 29s 478us/step - loss: 0.0638 - accuracy: 0.9877
Epoch 60/100
60000/60000 [==============================] - 29s 491us/step - loss: 0.0634 - accuracy: 0.9866
Epoch 61/100
60000/60000 [==============================] - 28s 475us/step - loss: 0.0765 - accuracy: 0.9859
Epoch 62/100
60000/60000 [==============================] - 28s 474us/step - loss: 0.0713 - accuracy: 0.9861
Epoch 63/100
60000/60000 [==============================] - 29s 486us/step - loss: 0.0636 - accuracy: 0.9885
Epoch 64/100
60000/60000 [==============================] - 30s 495us/step - loss: 0.0610 - accuracy: 0.9877
Epoch 65/100
60000/60000 [==============================] - 29s 485us/step - loss: 0.0764 - accuracy: 0.9854
Epoch 66/100
60000/60000 [==============================] - 29s 475us/step - loss: 0.0687 - accuracy: 0.9869
Epoch 67/100
60000/60000 [==============================] - 28s 473us/step - loss: 0.0748 - accuracy: 0.9868
Epoch 68/100
60000/60000 [==============================] - 28s 471us/step - loss: 0.0554 - accuracy: 0.9894
Epoch 69/100
60000/60000 [==============================] - 29s 477us/step - loss: 0.0905 - accuracy: 0.9848
Epoch 70/100
60000/60000 [==============================] - 28s 466us/step - loss: 0.0910 - accuracy: 0.9818
Epoch 71/100
60000/60000 [==============================] - 29s 480us/step - loss: 0.0646 - accuracy: 0.9882
Epoch 72/100
60000/60000 [==============================] - 28s 475us/step - loss: 0.0727 - accuracy: 0.9860
Epoch 73/100
60000/60000 [==============================] - 28s 464us/step - loss: 0.0626 - accuracy: 0.9879
Epoch 74/100
60000/60000 [==============================] - 29s 481us/step - loss: 0.0704 - accuracy: 0.9882
Epoch 75/100
60000/60000 [==============================] - 28s 464us/step - loss: 0.0776 - accuracy: 0.9847
Epoch 76/100
60000/60000 [==============================] - 28s 461us/step - loss: 0.0794 - accuracy: 0.9865
Epoch 77/100
60000/60000 [==============================] - 29s 489us/step - loss: 0.0707 - accuracy: 0.9861
Epoch 78/100
60000/60000 [==============================] - 29s 488us/step - loss: 0.0770 - accuracy: 0.9848
Epoch 79/100
60000/60000 [==============================] - 30s 498us/step - loss: 0.0920 - accuracy: 0.9840
Epoch 80/100
60000/60000 [==============================] - 30s 496us/step - loss: 0.0742 - accuracy: 0.9872
Epoch 81/100
60000/60000 [==============================] - 29s 486us/step - loss: 0.0811 - accuracy: 0.9867
Epoch 82/100
60000/60000 [==============================] - 28s 475us/step - loss: 0.0602 - accuracy: 0.9882
Epoch 83/100
60000/60000 [==============================] - 28s 466us/step - loss: 0.0731 - accuracy: 0.9869
Epoch 84/100
60000/60000 [==============================] - 28s 470us/step - loss: 0.0756 - accuracy: 0.9869
Epoch 85/100
60000/60000 [==============================] - 30s 493us/step - loss: 0.0646 - accuracy: 0.9866
Epoch 86/100
60000/60000 [==============================] - 28s 469us/step - loss: 0.0928 - accuracy: 0.9822
Epoch 87/100
60000/60000 [==============================] - 28s 467us/step - loss: 0.0832 - accuracy: 0.9869
Epoch 88/100
60000/60000 [==============================] - 28s 470us/step - loss: 0.1079 - accuracy: 0.9851
Epoch 89/100
60000/60000 [==============================] - 29s 479us/step - loss: 0.0873 - accuracy: 0.9845
Epoch 90/100
60000/60000 [==============================] - 29s 481us/step - loss: 0.0847 - accuracy: 0.9835
Epoch 91/100
60000/60000 [==============================] - 29s 481us/step - loss: 0.0849 - accuracy: 0.9858
Epoch 92/100
60000/60000 [==============================] - 29s 489us/step - loss: 0.0732 - accuracy: 0.9856
Epoch 93/100
60000/60000 [==============================] - 29s 491us/step - loss: 0.0959 - accuracy: 0.9843
Epoch 94/100
60000/60000 [==============================] - 29s 480us/step - loss: 0.0539 - accuracy: 0.9865
Epoch 95/100
60000/60000 [==============================] - 29s 481us/step - loss: 0.0751 - accuracy: 0.9865
Epoch 96/100
60000/60000 [==============================] - 29s 490us/step - loss: 0.0814 - accuracy: 0.9840
Epoch 97/100
60000/60000 [==============================] - 29s 477us/step - loss: 0.0667 - accuracy: 0.9874
Epoch 98/100
60000/60000 [==============================] - 28s 472us/step - loss: 0.0931 - accuracy: 0.9842
Epoch 99/100
60000/60000 [==============================] - 28s 467us/step - loss: 0.0775 - accuracy: 0.9858
Epoch 100/100
60000/60000 [==============================] - 29s 482us/step - loss: 0.0730 - accuracy: 0.9840
<keras.callbacks.callbacks.History at 0x7f07e0257ba8>
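After training, the model can be evaluated on the held-out test set; the exact numbers will vary between runs.
test_loss, test_accuracy = model.evaluate(x_test, y_test)
print('Test accuracy:', test_accuracy)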
In this tutorial, we learned to implement a multilayer perceptron on the MNIST dataset using Keras. As we can see from the training log, the model reaches a training accuracy of about 98.4% after 100 epochs.