Milestone 2
We have built two Convolutional Neural Network architectures in Milestone 1. Here, we will further try to achieve better performance by increasing the number of parameters/weights. Therefore, we will start this Milestone with three popular Transfer Learning architectures, namely, VGG16, ResNet v2, and Efficient Net. Please feel free to explore other pre-trained models as well. Link to Keras documentation for pre-trained models - https://keras.io/api/applications/
Note: We will mount our drive and import our dataset once again for Milestone 2.
Mounting the Drive
NOTE: Please use Google Colab from your browser for this notebook. Google.colab is not a library that can be downloaded locally on your device.
In [ ]: # Mounting the drive from google.colab import drive drive.mount('/content/drive')
Drive already mounted at /content/drive; to attempt to forcibly remount, cal l drive.mount("/content/drive", force_remount=True).
Importing the Libraries
In [ ]: import zipfile import matplotlib.pyplot as plt import numpy as np import pandas as pd import seaborn as sns import os
# Importing Deep Learning Libraries from tensorflow.keras.preprocessing.image import load_img, img_to_array from tensorflow.keras.preprocessing.image import ImageDataGenerator from tensorflow.keras.layers import Dense, Input, Dropout, GlobalAveragePooling2D from tensorflow.keras.models import Model, Sequential from tensorflow.keras.optimizers import Adam, SGD, RMSprop
Let us load the data
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 1 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…Reference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
Note:
You must download the dataset from the link provided on Olympus and upload the same on your Google drive before executing the code in the next cell. In case of any error, please make sure that the path of the file is correct as the path may be different for you.
In [ ]: # Storing the path of the data file from the Google drive path = '/content/drive/MyDrive/Colab_Notebooks/Applied_Data_Science/Deep_Learning/Facial_emotion_images.zip'
# The data is provided as a zip file so we need to extract the files from the zip with zipfile.ZipFile(path, 'r') as zip_ref: zip_ref.extractall()
In [ ]: picture_size = 48 folder_path = "Facial_emotion_images/"
Transfer Learning Architectures
In this section, we will create several Transfer Learning architectures. For the pre-trained models, we will select three popular architectures namely, VGG16, ResNet v2, and Efficient Net. The difference between these architectures and the previous architectures is that these will require 3 input channels while the earlier ones worked on 'grayscale' images. Therefore, we need to create new DataLoaders.
Creating our Data Loaders for Transfer Learning Architectures
In this section, we are creating data loaders that we will use as inputs to our Neural Network. Unlike in Milestone 1, we will have to go with color_mode = 'rgb' as this is the required format for the transfer learning architectures.
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 2 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…Reference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
In [ ]: batch_size = 32 img_size = 48
datagen_train = ImageDataGenerator(horizontal_flip = True, width_shift_range = 0.1, height_shift_range zoom_range =0.2, channel_shift_range = 10, brightness_range = (0.5, 2.), rescale = 1./255, shear_range = 0.3)
train_set = datagen_train.flow_from_directory(folder_path + "train", target_size = (img_size, img_size color_mode = 'rgb', batch_size = batch_size, class_mode = 'categorical', classes = ['happy', 'sad', 'neutral' shuffle = True)
datagen_validation = ImageDataGenerator(horizontal_flip = True, brightness_range=(0.,2.), rescale=1./255, shear_range=0.3)
validation_set = datagen_validation.flow_from_directory(folder_path + "validation" target_size = (img_size, img_size color_mode = 'rgb', batch_size = batch_size, class_mode = 'categorical', shuffle = True)
datagen_test = ImageDataGenerator(horizontal_flip = True, brightness_range=(0.,2.), rescale=1./255, shear_range=0.3)
test_set = datagen_test.flow_from_directory(folder_path + "test", target_size = (img_size, img_size color_mode = 'rgb', batch_size = batch_size, class_mode = 'categorical', shuffle = True)
Found 15109 images belonging to 4 classes. Found 4977 images belonging to 4 classes. Found 128 images belonging to 4 classes.
VGG16 Model
Importing the VGG16 Architecture
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 3 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…Reference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
In [ ]: from tensorflow.keras.applications.vgg16 import VGG16 from tensorflow.keras import Model
vgg = VGG16(include_top = False, weights = 'imagenet', input_shape = (48, 48 vgg.summary()
Model: "vgg16"
Layer (type) Output Shape Param # =================================================================
input_1 (InputLayer) [(None, 48, 48, 3)] 0
block1_conv1 (Conv2D) (None, 48, 48, 64) 1792
block1_conv2 (Conv2D) (None, 48, 48, 64) 36928
block1_pool (MaxPooling2D) (None, 24, 24, 64) 0
block2_conv1 (Conv2D) (None, 24, 24, 128) 73856
block2_conv2 (Conv2D) (None, 24, 24, 128) 147584
block2_pool (MaxPooling2D) (None, 12, 12, 128) 0
block3_conv1 (Conv2D) (None, 12, 12, 256) 295168
block3_conv2 (Conv2D) (None, 12, 12, 256) 590080
block3_conv3 (Conv2D) (None, 12, 12, 256) 590080
block3_pool (MaxPooling2D) (None, 6, 6, 256) 0
block4_conv1 (Conv2D) (None, 6, 6, 512) 1180160
block4_conv2 (Conv2D) (None, 6, 6, 512) 2359808
block4_conv3 (Conv2D) (None, 6, 6, 512) 2359808
block4_pool (MaxPooling2D) (None, 3, 3, 512) 0
block5_conv1 (Conv2D) (None, 3, 3, 512) 2359808
block5_conv2 (Conv2D) (None, 3, 3, 512) 2359808
block5_conv3 (Conv2D) (None, 3, 3, 512) 2359808
block5_pool (MaxPooling2D) (None, 1, 1, 512) 0 =================================================================
Total params: 14,714,688
Trainable params: 14,714,688
Non-trainable params: 0
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 4 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…Reference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
Model Building
In this model, we will import till the 'block5_pool' layer of the VGG16 model. You can scroll down in the model summary and look for 'block5_pool'. You can choose any other layer as well.
Then we will add a Flatten layer, which receives the output of the 'block5_pool' layer as its input.
We will add a few Dense layers and use 'relu' activation function on them.
You may use Dropout and BatchNormalization layers as well.
Then we will add our last dense layer, which must have 4 neurons and a 'softmax' activation function.
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 5 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…Reference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
In [ ]: transfer_layer = vgg.get_layer('block5_pool') vgg.trainable = False
# Add classification layers on top of it
= transfer_layer.output
= Conv2D(256, (1,1))(x)
= LeakyReLU(alpha = 0.2)(x)
= BatchNormalization()(x)
= MaxPooling2D(1,1)(x)
= Dropout(rate = 0.3)(x)
x = Conv2D(512, (1,1))(x)
x = LeakyReLU(alpha = 0.2)(x)
= BatchNormalization()(x) x = MaxPooling2D(1,1)(x)
= Dropout(rate = 0.3)(x)
x = Conv2D(64, (1,1))(x) x = LeakyReLU(alpha = 0.2)(x) x = BatchNormalization()(x) x = MaxPooling2D(1,1)(x) x = Dropout(rate = 0.3)(x)
# Flattenning the output from the 3rd block of the VGG16 model x = Flatten()(x)
# Adding a Dense layer with 256 neurons x = Dense(256)(x) x = LeakyReLU(alpha = 0.2)(x)
# Add a Dense Layer with 128 neurons x = Dense(128)(x) x = LeakyReLU(alpha = 0.2)(x)
# Add a DropOut layer with Drop out ratio of 0.3 x = Dropout(rate = 0.3)(x)
# Add a Dense Layer with 64 neurons x = Dense(64)(x) x = LeakyReLU(alpha = 0.2)(x)
# Add a Batch Normalization layer x = BatchNormalization()(x)
# Adding the final dense layer with 4 neurons and use 'softmax' activation pred = Dense(4, activation = 'softmax')(x)
vggmodel = Model(vgg.input, pred) # Initializing the model vggmodel.summary()
Model: "model" Layer (type) Output Shape Param # ================================================================= input_1 (InputLayer) [(None, 48, 48, 3)] 0 block1_conv1 (Conv2D) (None, 48, 48, 64) 1792
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 6 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…Reference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
x
x
x
x
x
x
x
x
block1_conv2 (Conv2D) (None, 48, 48, 64) 36928
block1_pool (MaxPooling2D) (None, 24, 24, 64) 0
block2_conv1 (Conv2D) (None, 24, 24, 128) 73856
block2_conv2 (Conv2D) (None, 24, 24, 128) 147584
block2_pool (MaxPooling2D) (None, 12, 12, 128) 0
block3_conv1 (Conv2D) (None, 12, 12, 256) 295168
block3_conv2 (Conv2D) (None, 12, 12, 256) 590080
block3_conv3 (Conv2D) (None, 12, 12, 256) 590080
block3_pool (MaxPooling2D) (None, 6, 6, 256) 0
block4_conv1 (Conv2D) (None, 6, 6, 512) 1180160
block4_conv2 (Conv2D) (None, 6, 6, 512) 2359808
block4_conv3 (Conv2D) (None, 6, 6, 512) 2359808
block4_pool (MaxPooling2D) (None, 3, 3, 512) 0
block5_conv1 (Conv2D) (None, 3, 3, 512) 2359808
block5_conv2 (Conv2D) (None, 3, 3, 512) 2359808
block5_conv3 (Conv2D) (None, 3, 3, 512) 2359808
block5_pool (MaxPooling2D) (None, 1, 1, 512) 0
conv2d (Conv2D) (None, 1, 1, 256) 131328
leaky_re_lu (LeakyReLU) (None, 1, 1, 256) 0
batch_normalization (BatchN (None, 1, 1, 256) 1024 ormalization)
max_pooling2d (MaxPooling2D (None, 1, 1, 256) 0 )
dropout (Dropout) (None, 1, 1, 256) 0
conv2d_1 (Conv2D) (None, 1, 1, 512) 131584
leaky_re_lu_1 (LeakyReLU) (None, 1, 1, 512) 0
batch_normalization_1 (Batc (None, 1, 1, 512) 2048 hNormalization)
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 7 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…Reference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
max_pooling2d_1 (MaxPooling (None, 1, 1, 512) 0 2D)
dropout_1 (Dropout) (None, 1, 1, 512) 0 conv2d_2 (Conv2D) (None, 1, 1, 64) 32832
leaky_re_lu_2 (LeakyReLU) (None, 1, 1, 64) 0
batch_normalization_2 (Batc (None, 1, 1, 64) 256 hNormalization)
max_pooling2d_2 (MaxPooling (None, 1, 1, 64) 0 2D)
dropout_2 (Dropout) (None, 1, 1, 64) 0 flatten (Flatten) (None, 64) 0 dense (Dense) (None, 256) 16640
leaky_re_lu_3 (LeakyReLU) (None, 256) 0 dense_1 (Dense) (None, 128) 32896
leaky_re_lu_4 (LeakyReLU) (None, 128) 0 dropout_3 (Dropout) (None, 128) 0 dense_2 (Dense) (None, 64) 8256 leaky_re_lu_5 (LeakyReLU) (None, 64) 0
batch_normalization_3 (Batc (None, 64) 256 hNormalization) dense_3 (Dense) (None, 4) 260 =================================================================
Compiling and Training the VGG16 Model
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 8 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…Reference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
Total params: 15,072,068 Trainable params: 355,588 Non-trainable params: 14,716,480
In [ ]: from keras.callbacks import ModelCheckpoint, EarlyStopping, ReduceLROnPlateau checkpoint = ModelCheckpoint("./vggmodel.h5", monitor = 'val_loss', verbose early_stopping = EarlyStopping(monitor = 'val_loss', min_delta = 0, patience = 3, verbose = 1, restore_best_weights = True )
reduce_learningrate = ReduceLROnPlateau(monitor = 'val_loss', factor = 0.2, patience = 3, verbose = 1, min_delta = 0.0001)
callbacks_list = [early_stopping, checkpoint, reduce_learningrate] epochs = 20
In [ ]: # Write your code to compile the vggmodel. Use categorical crossentropy as the loss vggmodel.compile(optimizer = Adam(learning_rate = 0.001), loss = 'categorical_crossentropy'
In [ ]: history = vggmodel.fit(x= train_set, validation_data = validation_set, epochs
Epoch 1/20
473/473 [==============================] - 37s 72ms/step - loss: 1.3384 - ac curacy: 0.3839 - val_loss: 1.2922 - val_accuracy: 0.3695
Epoch 2/20
473/473 [==============================] - 32s 69ms/step - loss: 1.2432 - ac curacy: 0.4342 - val_loss: 1.2609 - val_accuracy: 0.3703 Epoch 3/20
473/473 [==============================] - 32s 69ms/step - loss: 1.2176 - ac curacy: 0.4533 - val_loss: 1.3316 - val_accuracy: 0.3546
Epoch 4/20
473/473 [==============================] - 33s 69ms/step - loss: 1.2078 - ac curacy: 0.4580 - val_loss: 1.2323 - val_accuracy: 0.4330
Epoch 5/20
473/473 [==============================] - 33s 70ms/step - loss: 1.1965 - ac curacy: 0.4678 - val_loss: 1.2506 - val_accuracy: 0.4049
Epoch 6/20
473/473 [==============================] - 32s 68ms/step - loss: 1.1944 - ac curacy: 0.4623 - val_loss: 1.2775 - val_accuracy: 0.3808
Epoch 7/20
473/473 [==============================] - 33s 69ms/step - loss: 1.1888 - ac curacy: 0.4716 - val_loss: 1.2650 - val_accuracy: 0.3882 Epoch 8/20
473/473 [==============================] - 32s 69ms/step - loss: 1.1892 - ac curacy: 0.4689 - val_loss: 1.2802 - val_accuracy: 0.3755 Epoch 9/20
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 9 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…Reference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
473/473
[==============================] - 33s 69ms/step - loss: 1.1799 - ac curacy: 0.4747 - val_loss: 1.2507 - val_accuracy: 0.4272
Epoch 10/20
473/473 [==============================] - 33s 71ms/step - loss: 1.1783 - ac curacy: 0.4755 - val_loss: 1.2712 - val_accuracy: 0.4010
Epoch 11/20
473/473 [==============================] - 33s 69ms/step - loss: 1.1771 - ac curacy: 0.4793 - val_loss: 1.2426 - val_accuracy: 0.4073
Epoch 12/20
473/473 [==============================] - 33s 69ms/step - loss: 1.1751 - ac curacy: 0.4810 - val_loss: 1.2607 - val_accuracy: 0.3964
Epoch 13/20
473/473 [==============================] - 32s 69ms/step - loss: 1.1743 - ac curacy: 0.4856 - val_loss: 1.2593 - val_accuracy: 0.4101
Epoch 14/20
473/473 [==============================] - 32s 69ms/step - loss: 1.1719 - ac curacy: 0.4806 - val_loss: 1.2484 - val_accuracy: 0.4002
Epoch 15/20
473/473 [==============================] - 33s 69ms/step - loss: 1.1698 - ac curacy: 0.4843 - val_loss: 1.3131 - val_accuracy: 0.3564
Epoch 16/20
473/473 [==============================] - 33s 70ms/step - loss: 1.1704 - ac curacy: 0.4822 - val_loss: 1.2568 - val_accuracy: 0.4145
Epoch 17/20
473/473 [==============================] - 32s 68ms/step - loss: 1.1677 - ac curacy: 0.4822 - val_loss: 1.3002 - val_accuracy: 0.3781
Epoch 18/20
473/473 [==============================] - 33s 69ms/step - loss: 1.1669 - ac curacy: 0.4827 - val_loss: 1.2594 - val_accuracy: 0.3902
Epoch 19/20
473/473 [==============================] - 32s 69ms/step - loss: 1.1651 - ac curacy: 0.4825 - val_loss: 1.2785 - val_accuracy: 0.3735
Epoch 20/20
473/473 [==============================] - 32s 68ms/step - loss: 1.1617 - ac curacy: 0.4945 - val_loss: 1.2708 - val_accuracy: 0.3878
Evaluating the VGG16 model
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 10 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
In [ ]: # Write your code to evaluate model performance on the test set dict_hist = history.history list_ep = [i for i in range(1, 21)] plt.figure(figsize = (8, 8)) plt.plot(list_ep, dict_hist['accuracy'], ls = '--' , label = 'accuracy') plt.plot(list_ep, dict_hist['val_accuracy'], ls = '--' , label = 'val_accuracy' plt.ylabel('Accuracy') plt.xlabel('Epochs') plt.legend() plt.show()
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 11 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
Think About It:
What do you infer from the general trend in the training performance?
Is the training accuracy consistently improving?
Is the validation accuracy also improving similarly?
Observations and Insights:
The model is not learning well at all. The training data is maxing out at (), and is doing even worse at generalization (). There are several problemas with the model. First are learning parameters, as no training goes back to the transfer model, and the model is flattened at a 1X1 as it comes out of max pooling, this does not allow for weight changes and fine tuning the model.
I tried to use different parameters within the hidden layers, and got similiar results. I also experimented with a rapid rize and drop of neurons from layer to layer (for example from 128 to 512 to 128 to 64. I played with changes in the learning rate to see if the algorithem was stuck in a local minima; however, this had little effect on the outcome.
I did make some changes to the hyper parameters for data augmentation as well. I do think this had some effect.
Note: You can even go back and build your own architecture on top of the VGG16 Transfer layer and see if you can improve the performance
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 12 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
ResNet V2 Model Model: "resnet101" Layer (type) Output Shape Param # Connected t o ============================================================================ ====================== input_2 (InputLayer) [(None, 48, 48, 3)] 0 [] conv1_pad (ZeroPadding2D) (None, 54, 54, 3) 0 ['input_2[0 ][0]'] In [ ]: import tensorflow as tf import tensorflow.keras.applications as ap from tensorflow.keras import Model Resnet = ap.ResNet101(include_top = False, weights = "imagenet", input_shape Resnet.summary()
conv1_conv (Conv2D) (None, 24, 24, 64) 9472 ['conv1_pad [0][0]']
conv1_bn (BatchNormalization) (None, 24, 24, 64) 256 ['conv1_con v[0][0]']
conv1_relu (Activation) (None, 24, 24, 64) 0 ['conv1_bn[ 0][0]']
pool1_pad (ZeroPadding2D) (None, 26, 26, 64) 0 ['conv1_rel u[0][0]']
pool1_pool (MaxPooling2D) (None, 12, 12, 64) 0 ['pool1_pad [0][0]']
conv2_block1_1_conv (Conv2D) (None, 12, 12, 64) 4160 ['pool1_poo l[0][0]']
conv2_block1_1_bn (BatchNormal (None, 12, 12, 64) 256 ['conv2_blo ck1_1_conv[0][0]'] ization)
conv2_block1_1_relu (Activatio (None, 12, 12, 64) 0 ['conv2_blo ck1_1_bn[0][0]'] n)
conv2_block1_2_conv (Conv2D) (None, 12, 12, 64) 36928
['conv2_blo ck1_1_relu[0][0]']
conv2_block1_2_bn (BatchNormal (None, 12, 12, 64) 256 ['conv2_blo ck1_2_conv[0][0]'] ization)
conv2_block1_2_relu (Activatio (None, 12, 12, 64) 0 ['conv2_blo ck1_2_bn[0][0]'] n)
conv2_block1_0_conv (Conv2D) (None, 12, 12, 256) 16640 ['pool1_poo l[0][0]']
conv2_block1_3_conv (Conv2D) (None, 12, 12, 256) 16640 ['conv2_blo ck1_2_relu[0][0]']
conv2_block1_0_bn (BatchNormal (None, 12, 12, 256) 1024 ['conv2_blo ck1_0_conv[0][0]'] ization)
conv2_block1_3_bn (BatchNormal (None, 12, 12, 256) 1024
['conv2_blo ck1_3_conv[0][0]'] ization)
conv2_block1_add (Add) (None, 12, 12, 256) 0 ['conv2_blo
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 13 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
ck1_0_bn[0][0]',
'conv2_blo ck1_3_bn[0][0]']
conv2_block1_out (Activation) (None, 12, 12, 256) 0
['conv2_blo ck1_add[0][0]']
conv2_block2_1_conv (Conv2D) (None, 12, 12, 64) 16448 ['conv2_blo ck1_out[0][0]']
conv2_block2_1_bn (BatchNormal (None, 12, 12, 64) 256 ['conv2_blo ck2_1_conv[0][0]'] ization)
conv2_block2_1_relu (Activatio (None, 12, 12, 64) 0 ['conv2_blo ck2_1_bn[0][0]'] n)
conv2_block2_2_conv (Conv2D) (None, 12, 12, 64) 36928 ['conv2_blo ck2_1_relu[0][0]']
conv2_block2_2_bn (BatchNormal (None, 12, 12, 64) 256 ['conv2_blo ck2_2_conv[0][0]'] ization)
conv2_block2_2_relu (Activatio (None, 12, 12, 64) 0
['conv2_blo ck2_2_bn[0][0]'] n)
conv2_block2_3_conv (Conv2D) (None, 12, 12, 256) 16640
['conv2_blo ck2_2_relu[0][0]']
conv2_block2_3_bn (BatchNormal (None, 12, 12, 256) 1024 ['conv2_blo ck2_3_conv[0][0]'] ization)
conv2_block2_add (Add) (None, 12, 12, 256) 0 ['conv2_blo ck1_out[0][0]', 'conv2_blo ck2_3_bn[0][0]']
conv2_block2_out (Activation) (None, 12, 12, 256) 0
['conv2_blo ck2_add[0][0]']
conv2_block3_1_conv (Conv2D) (None, 12, 12, 64) 16448 ['conv2_blo ck2_out[0][0]']
conv2_block3_1_bn (BatchNormal (None, 12, 12, 64) 256 ['conv2_blo ck3_1_conv[0][0]'] ization)
['conv2_blo ck3_1_bn[0][0]']
conv2_block3_1_relu (Activatio (None, 12, 12, 64) 0
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 14 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
n)
conv2_block3_2_conv (Conv2D) (None, 12, 12, 64) 36928 ['conv2_blo ck3_1_relu[0][0]']
conv2_block3_2_bn (BatchNormal (None, 12, 12, 64) 256 ['conv2_blo ck3_2_conv[0][0]'] ization)
conv2_block3_2_relu (Activatio (None, 12, 12, 64) 0 ['conv2_blo ck3_2_bn[0][0]'] n)
conv2_block3_3_conv (Conv2D) (None, 12, 12, 256) 16640 ['conv2_blo ck3_2_relu[0][0]']
conv2_block3_3_bn (BatchNormal (None, 12, 12, 256) 1024 ['conv2_blo ck3_3_conv[0][0]'] ization)
conv2_block3_add (Add) (None, 12, 12, 256) 0 ['conv2_blo ck2_out[0][0]', 'conv2_blo ck3_3_bn[0][0]']
conv2_block3_out (Activation) (None, 12, 12, 256) 0 ['conv2_blo ck3_add[0][0]']
conv3_block1_1_conv (Conv2D) (None, 6, 6, 128) 32896 ['conv2_blo ck3_out[0][0]']
conv3_block1_1_bn (BatchNormal (None, 6, 6, 128) 512 ['conv3_blo ck1_1_conv[0][0]'] ization)
conv3_block1_1_relu (Activatio (None, 6, 6, 128) 0 ['conv3_blo ck1_1_bn[0][0]'] n)
conv3_block1_2_conv (Conv2D) (None, 6, 6, 128) 147584 ['conv3_blo ck1_1_relu[0][0]']
conv3_block1_2_bn (BatchNormal (None, 6, 6, 128) 512 ['conv3_blo ck1_2_conv[0][0]'] ization)
conv3_block1_2_relu (Activatio (None, 6, 6, 128) 0
['conv3_blo ck1_2_bn[0][0]'] n)
conv3_block1_0_conv (Conv2D) (None, 6, 6, 512) 131584 ['conv2_blo ck3_out[0][0]']
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 15 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
conv3_block1_3_conv (Conv2D) (None, 6, 6, 512) 66048 ['conv3_blo ck1_2_relu[0][0]']
conv3_block1_0_bn (BatchNormal (None, 6, 6, 512) 2048 ['conv3_blo ck1_0_conv[0][0]'] ization)
conv3_block1_3_bn (BatchNormal (None, 6, 6, 512) 2048 ['conv3_blo ck1_3_conv[0][0]'] ization)
conv3_block1_add (Add) (None, 6, 6, 512) 0 ['conv3_blo ck1_0_bn[0][0]', 'conv3_blo ck1_3_bn[0][0]']
conv3_block1_out (Activation) (None, 6, 6, 512) 0 ['conv3_blo ck1_add[0][0]']
conv3_block2_1_conv (Conv2D) (None, 6, 6, 128) 65664 ['conv3_blo ck1_out[0][0]']
conv3_block2_1_bn (BatchNormal (None, 6, 6, 128) 512 ['conv3_blo ck2_1_conv[0][0]'] ization)
conv3_block2_1_relu (Activatio (None, 6, 6, 128) 0
['conv3_blo ck2_1_bn[0][0]'] n)
conv3_block2_2_conv (Conv2D) (None, 6, 6, 128) 147584 ['conv3_blo ck2_1_relu[0][0]']
conv3_block2_2_bn (BatchNormal (None, 6, 6, 128) 512 ['conv3_blo ck2_2_conv[0][0]'] ization)
conv3_block2_2_relu (Activatio (None, 6, 6, 128) 0 ['conv3_blo ck2_2_bn[0][0]'] n)
conv3_block2_3_conv (Conv2D) (None, 6, 6, 512) 66048 ['conv3_blo ck2_2_relu[0][0]']
conv3_block2_3_bn (BatchNormal (None, 6, 6, 512) 2048 ['conv3_blo ck2_3_conv[0][0]'] ization)
conv3_block2_add (Add) (None, 6, 6, 512) 0 ['conv3_blo ck1_out[0][0]', 'conv3_blo ck2_3_bn[0][0]']
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 16 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
conv3_block2_out (Activation) (None, 6, 6, 512) 0 ['conv3_blo ck2_add[0][0]']
conv3_block3_1_conv (Conv2D) (None, 6, 6, 128) 65664 ['conv3_blo ck2_out[0][0]']
conv3_block3_1_bn (BatchNormal (None, 6, 6, 128) 512 ['conv3_blo ck3_1_conv[0][0]'] ization)
conv3_block3_1_relu (Activatio (None, 6, 6, 128) 0 ['conv3_blo ck3_1_bn[0][0]'] n)
conv3_block3_2_conv (Conv2D) (None, 6, 6, 128) 147584 ['conv3_blo ck3_1_relu[0][0]']
conv3_block3_2_bn (BatchNormal (None, 6, 6, 128) 512 ['conv3_blo ck3_2_conv[0][0]'] ization)
conv3_block3_2_relu (Activatio (None, 6, 6, 128) 0 ['conv3_blo ck3_2_bn[0][0]'] n)
conv3_block3_3_conv (Conv2D) (None, 6, 6, 512) 66048 ['conv3_blo ck3_2_relu[0][0]']
conv3_block3_3_bn (BatchNormal (None, 6, 6, 512) 2048 ['conv3_blo ck3_3_conv[0][0]'] ization)
conv3_block3_add (Add) (None, 6, 6, 512) 0 ['conv3_blo ck2_out[0][0]', 'conv3_blo ck3_3_bn[0][0]']
conv3_block3_out (Activation) (None, 6, 6, 512) 0 ['conv3_blo ck3_add[0][0]']
conv3_block4_1_conv (Conv2D) (None, 6, 6, 128) 65664 ['conv3_blo ck3_out[0][0]']
conv3_block4_1_bn (BatchNormal (None, 6, 6, 128) 512 ['conv3_blo ck4_1_conv[0][0]'] ization)
conv3_block4_1_relu (Activatio (None, 6, 6, 128) 0 ['conv3_blo ck4_1_bn[0][0]'] n)
conv3_block4_2_conv (Conv2D) (None, 6, 6, 128) 147584 ['conv3_blo ck4_1_relu[0][0]']
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 17 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
conv3_block4_2_bn (BatchNormal (None, 6, 6, 128) 512 ['conv3_blo ck4_2_conv[0][0]'] ization)
conv3_block4_2_relu (Activatio (None, 6, 6, 128) 0 ['conv3_blo ck4_2_bn[0][0]'] n)
conv3_block4_3_conv (Conv2D) (None, 6, 6, 512) 66048 ['conv3_blo ck4_2_relu[0][0]']
conv3_block4_3_bn (BatchNormal (None, 6, 6, 512) 2048 ['conv3_blo ck4_3_conv[0][0]'] ization)
conv3_block4_add (Add) (None, 6, 6, 512) 0 ['conv3_blo ck3_out[0][0]', 'conv3_blo ck4_3_bn[0][0]']
conv3_block4_out (Activation) (None, 6, 6, 512) 0 ['conv3_blo ck4_add[0][0]']
conv4_block1_1_conv (Conv2D) (None, 3, 3, 256) 131328 ['conv3_blo ck4_out[0][0]']
conv4_block1_1_bn (BatchNormal (None, 3, 3, 256) 1024 ['conv4_blo ck1_1_conv[0][0]'] ization)
conv4_block1_1_relu (Activatio (None, 3, 3, 256) 0 ['conv4_blo ck1_1_bn[0][0]'] n)
conv4_block1_2_conv (Conv2D) (None, 3, 3, 256) 590080 ['conv4_blo ck1_1_relu[0][0]']
conv4_block1_2_bn (BatchNormal (None, 3, 3, 256) 1024 ['conv4_blo ck1_2_conv[0][0]'] ization)
conv4_block1_2_relu (Activatio (None, 3, 3, 256) 0 ['conv4_blo ck1_2_bn[0][0]'] n)
conv4_block1_0_conv (Conv2D) (None, 3, 3, 1024) 525312 ['conv3_blo ck4_out[0][0]']
conv4_block1_3_conv (Conv2D) (None, 3, 3, 1024) 263168 ['conv4_blo ck1_2_relu[0][0]']
conv4_block1_0_bn (BatchNormal (None, 3, 3, 1024) 4096 ['conv4_blo
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 18 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
ck1_0_conv[0][0]'] ization)
conv4_block1_3_bn (BatchNormal (None, 3, 3, 1024) 4096 ['conv4_blo ck1_3_conv[0][0]'] ization)
conv4_block1_add (Add) (None, 3, 3, 1024) 0 ['conv4_blo ck1_0_bn[0][0]', 'conv4_blo ck1_3_bn[0][0]']
conv4_block1_out (Activation) (None, 3, 3, 1024) 0 ['conv4_blo ck1_add[0][0]']
conv4_block2_1_conv (Conv2D) (None, 3, 3, 256) 262400 ['conv4_blo ck1_out[0][0]']
conv4_block2_1_bn (BatchNormal (None, 3, 3, 256) 1024 ['conv4_blo ck2_1_conv[0][0]'] ization)
conv4_block2_1_relu (Activatio (None, 3, 3, 256) 0 ['conv4_blo ck2_1_bn[0][0]'] n)
conv4_block2_2_conv (Conv2D) (None, 3, 3, 256) 590080 ['conv4_blo ck2_1_relu[0][0]']
conv4_block2_2_bn (BatchNormal (None, 3, 3, 256) 1024 ['conv4_blo ck2_2_conv[0][0]'] ization)
conv4_block2_2_relu (Activatio (None, 3, 3, 256) 0 ['conv4_blo ck2_2_bn[0][0]'] n)
conv4_block2_3_conv (Conv2D) (None, 3, 3, 1024) 263168 ['conv4_blo ck2_2_relu[0][0]']
conv4_block2_3_bn (BatchNormal (None, 3, 3, 1024) 4096 ['conv4_blo ck2_3_conv[0][0]'] ization)
conv4_block2_add (Add) (None, 3, 3, 1024) 0 ['conv4_blo ck1_out[0][0]', 'conv4_blo ck2_3_bn[0][0]']
conv4_block2_out (Activation) (None, 3, 3, 1024) 0 ['conv4_blo ck2_add[0][0]']
conv4_block3_1_conv (Conv2D) (None, 3, 3, 256) 262400 ['conv4_blo
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 19 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
ck2_out[0][0]']
conv4_block3_1_bn (BatchNormal (None, 3, 3, 256) 1024 ['conv4_blo ck3_1_conv[0][0]'] ization)
conv4_block3_1_relu (Activatio (None, 3, 3, 256) 0 ['conv4_blo ck3_1_bn[0][0]'] n)
conv4_block3_2_conv (Conv2D) (None, 3, 3, 256) 590080 ['conv4_blo ck3_1_relu[0][0]']
conv4_block3_2_bn (BatchNormal (None, 3, 3, 256) 1024 ['conv4_blo ck3_2_conv[0][0]'] ization)
conv4_block3_2_relu (Activatio (None, 3, 3, 256) 0 ['conv4_blo ck3_2_bn[0][0]'] n)
conv4_block3_3_conv (Conv2D) (None, 3, 3, 1024) 263168 ['conv4_blo ck3_2_relu[0][0]']
conv4_block3_3_bn (BatchNormal (None, 3, 3, 1024) 4096 ['conv4_blo ck3_3_conv[0][0]'] ization)
conv4_block3_add (Add) (None, 3, 3, 1024) 0 ['conv4_blo ck2_out[0][0]', 'conv4_blo ck3_3_bn[0][0]']
conv4_block3_out (Activation) (None, 3, 3, 1024) 0 ['conv4_blo ck3_add[0][0]']
conv4_block4_1_conv (Conv2D) (None, 3, 3, 256) 262400 ['conv4_blo ck3_out[0][0]']
conv4_block4_1_bn (BatchNormal (None, 3, 3, 256) 1024 ['conv4_blo ck4_1_conv[0][0]'] ization)
conv4_block4_1_relu (Activatio (None, 3, 3, 256) 0 ['conv4_blo ck4_1_bn[0][0]'] n)
conv4_block4_2_conv (Conv2D) (None, 3, 3, 256) 590080 ['conv4_blo ck4_1_relu[0][0]']
conv4_block4_2_bn (BatchNormal (None, 3, 3, 256) 1024 ['conv4_blo ck4_2_conv[0][0]'] ization)
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 20 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
conv4_block4_2_relu (Activatio (None, 3, 3, 256) 0 ['conv4_blo ck4_2_bn[0][0]'] n)
conv4_block4_3_conv (Conv2D) (None, 3, 3, 1024) 263168 ['conv4_blo ck4_2_relu[0][0]']
conv4_block4_3_bn (BatchNormal (None, 3, 3, 1024) 4096 ['conv4_blo ck4_3_conv[0][0]'] ization)
conv4_block4_add (Add) (None, 3, 3, 1024) 0 ['conv4_blo ck3_out[0][0]', 'conv4_blo ck4_3_bn[0][0]']
conv4_block4_out (Activation) (None, 3, 3, 1024) 0 ['conv4_blo ck4_add[0][0]']
conv4_block5_1_conv (Conv2D) (None, 3, 3, 256) 262400 ['conv4_blo ck4_out[0][0]']
conv4_block5_1_bn (BatchNormal (None, 3, 3, 256) 1024 ['conv4_blo ck5_1_conv[0][0]'] ization)
conv4_block5_1_relu (Activatio (None, 3, 3, 256) 0 ['conv4_blo ck5_1_bn[0][0]'] n)
conv4_block5_2_conv (Conv2D) (None, 3, 3, 256) 590080 ['conv4_blo ck5_1_relu[0][0]']
conv4_block5_2_bn (BatchNormal (None, 3, 3, 256) 1024 ['conv4_blo ck5_2_conv[0][0]'] ization)
conv4_block5_2_relu (Activatio (None, 3, 3, 256) 0 ['conv4_blo ck5_2_bn[0][0]'] n)
conv4_block5_3_conv (Conv2D) (None, 3, 3, 1024) 263168 ['conv4_blo ck5_2_relu[0][0]']
conv4_block5_3_bn (BatchNormal (None, 3, 3, 1024) 4096 ['conv4_blo ck5_3_conv[0][0]'] ization)
conv4_block5_add (Add) (None, 3, 3, 1024) 0 ['conv4_blo ck4_out[0][0]', 'conv4_blo ck5_3_bn[0][0]']
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 21 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
conv4_block5_out (Activation) (None, 3, 3, 1024) 0 ['conv4_blo ck5_add[0][0]']
conv4_block6_1_conv (Conv2D) (None, 3, 3, 256) 262400 ['conv4_blo ck5_out[0][0]']
conv4_block6_1_bn (BatchNormal (None, 3, 3, 256) 1024 ['conv4_blo ck6_1_conv[0][0]'] ization)
conv4_block6_1_relu (Activatio (None, 3, 3, 256) 0 ['conv4_blo ck6_1_bn[0][0]'] n)
conv4_block6_2_conv (Conv2D) (None, 3, 3, 256) 590080 ['conv4_blo ck6_1_relu[0][0]']
conv4_block6_2_bn (BatchNormal (None, 3, 3, 256) 1024 ['conv4_blo ck6_2_conv[0][0]'] ization)
conv4_block6_2_relu (Activatio (None, 3, 3, 256) 0
['conv4_blo ck6_2_bn[0][0]'] n)
conv4_block6_3_conv (Conv2D) (None, 3, 3, 1024) 263168 ['conv4_blo ck6_2_relu[0][0]']
conv4_block6_3_bn (BatchNormal (None, 3, 3, 1024) 4096 ['conv4_blo ck6_3_conv[0][0]'] ization)
conv4_block6_add (Add) (None, 3, 3, 1024) 0 ['conv4_blo ck5_out[0][0]', 'conv4_blo ck6_3_bn[0][0]']
conv4_block6_out (Activation) (None, 3, 3, 1024) 0 ['conv4_blo ck6_add[0][0]']
conv4_block7_1_conv (Conv2D) (None, 3, 3, 256) 262400 ['conv4_blo ck6_out[0][0]']
conv4_block7_1_bn (BatchNormal (None, 3, 3, 256) 1024 ['conv4_blo ck7_1_conv[0][0]'] ization)
conv4_block7_1_relu (Activatio (None, 3, 3, 256) 0 ['conv4_blo ck7_1_bn[0][0]'] n)
conv4_block7_2_conv (Conv2D) (None, 3, 3, 256) 590080 ['conv4_blo
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 22 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
ck7_1_relu[0][0]']
conv4_block7_2_bn (BatchNormal (None, 3, 3, 256) 1024 ['conv4_blo ck7_2_conv[0][0]'] ization)
conv4_block7_2_relu (Activatio (None, 3, 3, 256) 0 ['conv4_blo ck7_2_bn[0][0]'] n)
conv4_block7_3_conv (Conv2D) (None, 3, 3, 1024) 263168 ['conv4_blo ck7_2_relu[0][0]']
conv4_block7_3_bn (BatchNormal (None, 3, 3, 1024) 4096 ['conv4_blo ck7_3_conv[0][0]'] ization)
conv4_block7_add (Add) (None, 3, 3, 1024) 0 ['conv4_blo ck6_out[0][0]', 'conv4_blo ck7_3_bn[0][0]']
conv4_block7_out (Activation) (None, 3, 3, 1024) 0 ['conv4_blo ck7_add[0][0]']
conv4_block8_1_conv (Conv2D) (None, 3, 3, 256) 262400 ['conv4_blo ck7_out[0][0]']
conv4_block8_1_bn (BatchNormal (None, 3, 3, 256) 1024 ['conv4_blo ck8_1_conv[0][0]'] ization)
conv4_block8_1_relu (Activatio (None, 3, 3, 256) 0 ['conv4_blo ck8_1_bn[0][0]'] n)
conv4_block8_2_conv (Conv2D) (None, 3, 3, 256) 590080 ['conv4_blo ck8_1_relu[0][0]']
conv4_block8_2_bn (BatchNormal (None, 3, 3, 256) 1024 ['conv4_blo ck8_2_conv[0][0]'] ization)
conv4_block8_2_relu (Activatio (None, 3, 3, 256) 0 ['conv4_blo ck8_2_bn[0][0]'] n)
conv4_block8_3_conv (Conv2D) (None, 3, 3, 1024) 263168 ['conv4_blo ck8_2_relu[0][0]']
conv4_block8_3_bn (BatchNormal (None, 3, 3, 1024) 4096 ['conv4_blo ck8_3_conv[0][0]'] ization)
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 23 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
conv4_block8_add (Add) (None, 3, 3, 1024) 0 ['conv4_blo ck7_out[0][0]', 'conv4_blo ck8_3_bn[0][0]']
conv4_block8_out (Activation) (None, 3, 3, 1024) 0 ['conv4_blo ck8_add[0][0]']
conv4_block9_1_conv (Conv2D) (None, 3, 3, 256) 262400 ['conv4_blo ck8_out[0][0]']
conv4_block9_1_bn (BatchNormal (None, 3, 3, 256) 1024 ['conv4_blo ck9_1_conv[0][0]'] ization)
conv4_block9_1_relu (Activatio (None, 3, 3, 256) 0 ['conv4_blo ck9_1_bn[0][0]'] n)
conv4_block9_2_conv (Conv2D) (None, 3, 3, 256) 590080 ['conv4_blo ck9_1_relu[0][0]']
conv4_block9_2_bn (BatchNormal (None, 3, 3, 256) 1024 ['conv4_blo ck9_2_conv[0][0]'] ization)
conv4_block9_2_relu (Activatio (None, 3, 3, 256) 0 ['conv4_blo ck9_2_bn[0][0]'] n)
conv4_block9_3_conv (Conv2D) (None, 3, 3, 1024) 263168 ['conv4_blo ck9_2_relu[0][0]']
conv4_block9_3_bn (BatchNormal (None, 3, 3, 1024) 4096 ['conv4_blo ck9_3_conv[0][0]'] ization)
conv4_block9_add (Add) (None, 3, 3, 1024) 0 ['conv4_blo ck8_out[0][0]', 'conv4_blo ck9_3_bn[0][0]']
conv4_block9_out (Activation) (None, 3, 3, 1024) 0 ['conv4_blo ck9_add[0][0]']
conv4_block10_1_conv (Conv2D) (None, 3, 3, 256) 262400 ['conv4_blo ck9_out[0][0]']
conv4_block10_1_bn (BatchNorma (None, 3, 3, 256) 1024 ['conv4_blo ck10_1_conv[0][0]'] lization)
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 24 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
conv4_block10_1_relu (Activati (None, 3, 3, 256) 0 ['conv4_blo ck10_1_bn[0][0]'] on)
conv4_block10_2_conv (Conv2D) (None, 3, 3, 256) 590080 ['conv4_blo ck10_1_relu[0][0]']
conv4_block10_2_bn (BatchNorma (None, 3, 3, 256) 1024 ['conv4_blo ck10_2_conv[0][0]'] lization)
conv4_block10_2_relu (Activati (None, 3, 3, 256) 0 ['conv4_blo ck10_2_bn[0][0]'] on)
conv4_block10_3_conv (Conv2D) (None, 3, 3, 1024) 263168 ['conv4_blo ck10_2_relu[0][0]']
conv4_block10_3_bn (BatchNorma (None, 3, 3, 1024) 4096
['conv4_blo ck10_3_conv[0][0]'] lization)
conv4_block10_add (Add) (None, 3, 3, 1024) 0 ['conv4_blo ck9_out[0][0]', 'conv4_blo
ck10_3_bn[0][0]']
conv4_block10_out (Activation) (None, 3, 3, 1024) 0 ['conv4_blo ck10_add[0][0]']
conv4_block11_1_conv (Conv2D) (None, 3, 3, 256) 262400 ['conv4_blo ck10_out[0][0]']
conv4_block11_1_bn (BatchNorma (None, 3, 3, 256) 1024
['conv4_blo ck11_1_conv[0][0]'] lization)
conv4_block11_1_relu (Activati (None, 3, 3, 256) 0 ['conv4_blo ck11_1_bn[0][0]'] on)
conv4_block11_2_conv (Conv2D) (None, 3, 3, 256) 590080 ['conv4_blo ck11_1_relu[0][0]']
conv4_block11_2_bn (BatchNorma (None, 3, 3, 256) 1024
['conv4_blo ck11_2_conv[0][0]'] lization)
conv4_block11_2_relu (Activati (None, 3, 3, 256) 0
['conv4_blo ck11_2_bn[0][0]'] on)
conv4_block11_3_conv (Conv2D) (None, 3, 3, 1024) 263168 ['conv4_blo
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 25 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
ck11_2_relu[0][0]']
conv4_block11_3_bn (BatchNorma (None, 3, 3, 1024) 4096 ['conv4_blo ck11_3_conv[0][0]'] lization)
conv4_block11_add (Add) (None, 3, 3, 1024) 0 ['conv4_blo ck10_out[0][0]', 'conv4_blo ck11_3_bn[0][0]']
conv4_block11_out (Activation) (None, 3, 3, 1024) 0 ['conv4_blo ck11_add[0][0]']
conv4_block12_1_conv (Conv2D) (None, 3, 3, 256) 262400 ['conv4_blo ck11_out[0][0]']
conv4_block12_1_bn (BatchNorma (None, 3, 3, 256) 1024 ['conv4_blo ck12_1_conv[0][0]'] lization)
conv4_block12_1_relu (Activati (None, 3, 3, 256) 0 ['conv4_blo ck12_1_bn[0][0]'] on)
conv4_block12_2_conv (Conv2D) (None, 3, 3, 256) 590080 ['conv4_blo ck12_1_relu[0][0]']
conv4_block12_2_bn (BatchNorma (None, 3, 3, 256) 1024 ['conv4_blo ck12_2_conv[0][0]'] lization)
conv4_block12_2_relu (Activati (None, 3, 3, 256) 0 ['conv4_blo ck12_2_bn[0][0]'] on)
conv4_block12_3_conv (Conv2D) (None, 3, 3, 1024) 263168 ['conv4_blo ck12_2_relu[0][0]']
conv4_block12_3_bn (BatchNorma (None, 3, 3, 1024) 4096 ['conv4_blo ck12_3_conv[0][0]'] lization)
conv4_block12_add (Add) (None, 3, 3, 1024) 0 ['conv4_blo ck11_out[0][0]', 'conv4_blo ck12_3_bn[0][0]']
conv4_block12_out (Activation) (None, 3, 3, 1024) 0 ['conv4_blo ck12_add[0][0]']
conv4_block13_1_conv (Conv2D) (None, 3, 3, 256) 262400 ['conv4_blo ck12_out[0][0]']
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 26 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
conv4_block13_1_bn (BatchNorma (None, 3, 3, 256) 1024 ['conv4_blo ck13_1_conv[0][0]'] lization)
conv4_block13_1_relu (Activati (None, 3, 3, 256) 0 ['conv4_blo ck13_1_bn[0][0]'] on)
conv4_block13_2_conv (Conv2D) (None, 3, 3, 256) 590080 ['conv4_blo ck13_1_relu[0][0]']
conv4_block13_2_bn (BatchNorma (None, 3, 3, 256) 1024 ['conv4_blo ck13_2_conv[0][0]'] lization)
conv4_block13_2_relu (Activati (None, 3, 3, 256) 0 ['conv4_blo ck13_2_bn[0][0]'] on)
conv4_block13_3_conv (Conv2D) (None, 3, 3, 1024) 263168 ['conv4_blo ck13_2_relu[0][0]']
conv4_block13_3_bn (BatchNorma (None, 3, 3, 1024) 4096 ['conv4_blo ck13_3_conv[0][0]'] lization)
conv4_block13_add (Add) (None, 3, 3, 1024) 0 ['conv4_blo ck12_out[0][0]', 'conv4_blo ck13_3_bn[0][0]']
conv4_block13_out (Activation) (None, 3, 3, 1024) 0 ['conv4_blo ck13_add[0][0]']
conv4_block14_1_conv (Conv2D) (None, 3, 3, 256) 262400 ['conv4_blo ck13_out[0][0]']
conv4_block14_1_bn (BatchNorma (None, 3, 3, 256) 1024 ['conv4_blo ck14_1_conv[0][0]'] lization)
conv4_block14_1_relu (Activati (None, 3, 3, 256) 0
['conv4_blo ck14_1_bn[0][0]'] on)
conv4_block14_2_conv (Conv2D) (None, 3, 3, 256) 590080 ['conv4_blo ck14_1_relu[0][0]']
conv4_block14_2_bn (BatchNorma (None, 3, 3, 256) 1024 ['conv4_blo ck14_2_conv[0][0]'] lization)
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 27 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
conv4_block14_2_relu (Activati (None, 3, 3, 256) 0 ['conv4_blo ck14_2_bn[0][0]'] on)
conv4_block14_3_conv (Conv2D) (None, 3, 3, 1024) 263168 ['conv4_blo ck14_2_relu[0][0]']
conv4_block14_3_bn (BatchNorma (None, 3, 3, 1024) 4096 ['conv4_blo ck14_3_conv[0][0]'] lization)
conv4_block14_add (Add) (None, 3, 3, 1024) 0 ['conv4_blo ck13_out[0][0]', 'conv4_blo ck14_3_bn[0][0]']
conv4_block14_out (Activation) (None, 3, 3, 1024) 0 ['conv4_blo ck14_add[0][0]']
conv4_block15_1_conv (Conv2D) (None, 3, 3, 256) 262400 ['conv4_blo ck14_out[0][0]']
conv4_block15_1_bn (BatchNorma (None, 3, 3, 256) 1024 ['conv4_blo ck15_1_conv[0][0]'] lization)
conv4_block15_1_relu (Activati (None, 3, 3, 256) 0 ['conv4_blo ck15_1_bn[0][0]'] on)
conv4_block15_2_conv (Conv2D) (None, 3, 3, 256) 590080 ['conv4_blo ck15_1_relu[0][0]']
conv4_block15_2_bn (BatchNorma (None, 3, 3, 256) 1024 ['conv4_blo ck15_2_conv[0][0]'] lization)
conv4_block15_2_relu (Activati (None, 3, 3, 256) 0 ['conv4_blo ck15_2_bn[0][0]'] on)
conv4_block15_3_conv (Conv2D) (None, 3, 3, 1024) 263168 ['conv4_blo ck15_2_relu[0][0]']
conv4_block15_3_bn (BatchNorma (None, 3, 3, 1024) 4096 ['conv4_blo ck15_3_conv[0][0]'] lization)
conv4_block15_add (Add) (None, 3, 3, 1024) 0 ['conv4_blo ck14_out[0][0]', 'conv4_blo ck15_3_bn[0][0]']
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 28 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
conv4_block15_out (Activation) (None, 3, 3, 1024) 0 ['conv4_blo ck15_add[0][0]']
conv4_block16_1_conv (Conv2D) (None, 3, 3, 256) 262400 ['conv4_blo ck15_out[0][0]']
conv4_block16_1_bn (BatchNorma (None, 3, 3, 256) 1024 ['conv4_blo ck16_1_conv[0][0]'] lization)
conv4_block16_1_relu (Activati (None, 3, 3, 256) 0 ['conv4_blo ck16_1_bn[0][0]'] on)
conv4_block16_2_conv (Conv2D) (None, 3, 3, 256) 590080 ['conv4_blo ck16_1_relu[0][0]']
conv4_block16_2_bn (BatchNorma (None, 3, 3, 256) 1024 ['conv4_blo ck16_2_conv[0][0]'] lization)
conv4_block16_2_relu (Activati (None, 3, 3, 256) 0 ['conv4_blo ck16_2_bn[0][0]'] on)
conv4_block16_3_conv (Conv2D) (None, 3, 3, 1024) 263168 ['conv4_blo ck16_2_relu[0][0]']
conv4_block16_3_bn (BatchNorma (None, 3, 3, 1024) 4096 ['conv4_blo ck16_3_conv[0][0]'] lization)
conv4_block16_add (Add) (None, 3, 3, 1024) 0 ['conv4_blo ck15_out[0][0]', 'conv4_blo ck16_3_bn[0][0]']
conv4_block16_out (Activation) (None, 3, 3, 1024) 0 ['conv4_blo ck16_add[0][0]']
conv4_block17_1_conv (Conv2D) (None, 3, 3, 256) 262400 ['conv4_blo ck16_out[0][0]']
conv4_block17_1_bn (BatchNorma (None, 3, 3, 256) 1024 ['conv4_blo ck17_1_conv[0][0]'] lization)
conv4_block17_1_relu (Activati (None, 3, 3, 256) 0 ['conv4_blo ck17_1_bn[0][0]'] on)
conv4_block17_2_conv (Conv2D) (None, 3, 3, 256) 590080 ['conv4_blo ck17_1_relu[0][0]']
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 29 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
conv4_block17_2_bn (BatchNorma (None, 3, 3, 256) 1024 ['conv4_blo ck17_2_conv[0][0]'] lization)
conv4_block17_2_relu (Activati (None, 3, 3, 256) 0 ['conv4_blo ck17_2_bn[0][0]'] on)
conv4_block17_3_conv (Conv2D) (None, 3, 3, 1024) 263168 ['conv4_blo ck17_2_relu[0][0]']
conv4_block17_3_bn (BatchNorma (None, 3, 3, 1024) 4096 ['conv4_blo ck17_3_conv[0][0]'] lization)
conv4_block17_add (Add) (None, 3, 3, 1024) 0 ['conv4_blo ck16_out[0][0]', 'conv4_blo ck17_3_bn[0][0]']
conv4_block17_out (Activation) (None, 3, 3, 1024) 0 ['conv4_blo ck17_add[0][0]']
conv4_block18_1_conv (Conv2D) (None, 3, 3, 256) 262400 ['conv4_blo ck17_out[0][0]']
conv4_block18_1_bn (BatchNorma (None, 3, 3, 256) 1024 ['conv4_blo ck18_1_conv[0][0]'] lization)
conv4_block18_1_relu (Activati (None, 3, 3, 256) 0
['conv4_blo ck18_1_bn[0][0]'] on)
conv4_block18_2_conv (Conv2D) (None, 3, 3, 256) 590080 ['conv4_blo ck18_1_relu[0][0]']
conv4_block18_2_bn (BatchNorma (None, 3, 3, 256) 1024 ['conv4_blo ck18_2_conv[0][0]'] lization)
conv4_block18_2_relu (Activati (None, 3, 3, 256) 0
['conv4_blo ck18_2_bn[0][0]'] on)
conv4_block18_3_conv (Conv2D) (None, 3, 3, 1024) 263168
['conv4_blo ck18_2_relu[0][0]']
['conv4_blo ck18_3_conv[0][0]'] lization)
conv4_block18_3_bn (BatchNorma (None, 3, 3, 1024) 4096
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 30 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
conv4_block18_add (Add) (None, 3, 3, 1024) 0 ['conv4_blo ck17_out[0][0]', 'conv4_blo ck18_3_bn[0][0]']
conv4_block18_out (Activation) (None, 3, 3, 1024) 0 ['conv4_blo ck18_add[0][0]']
conv4_block19_1_conv (Conv2D) (None, 3, 3, 256) 262400 ['conv4_blo ck18_out[0][0]']
conv4_block19_1_bn (BatchNorma (None, 3, 3, 256) 1024 ['conv4_blo ck19_1_conv[0][0]'] lization)
conv4_block19_1_relu (Activati (None, 3, 3, 256) 0 ['conv4_blo ck19_1_bn[0][0]'] on)
conv4_block19_2_conv (Conv2D) (None, 3, 3, 256) 590080 ['conv4_blo ck19_1_relu[0][0]']
conv4_block19_2_bn (BatchNorma (None, 3, 3, 256) 1024 ['conv4_blo ck19_2_conv[0][0]'] lization)
conv4_block19_2_relu (Activati (None, 3, 3, 256) 0 ['conv4_blo ck19_2_bn[0][0]'] on)
conv4_block19_3_conv (Conv2D) (None, 3, 3, 1024) 263168 ['conv4_blo ck19_2_relu[0][0]']
conv4_block19_3_bn (BatchNorma (None, 3, 3, 1024) 4096 ['conv4_blo ck19_3_conv[0][0]'] lization)
conv4_block19_add (Add) (None, 3, 3, 1024) 0 ['conv4_blo ck18_out[0][0]', 'conv4_blo ck19_3_bn[0][0]']
conv4_block19_out (Activation) (None, 3, 3, 1024) 0 ['conv4_blo ck19_add[0][0]']
conv4_block20_1_conv (Conv2D) (None, 3, 3, 256) 262400 ['conv4_blo ck19_out[0][0]']
conv4_block20_1_bn (BatchNorma (None, 3, 3, 256) 1024 ['conv4_blo ck20_1_conv[0][0]'] lization)
conv4_block20_1_relu (Activati (None, 3, 3, 256) 0 ['conv4_blo
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 31 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
ck20_1_bn[0][0]'] on)
conv4_block20_2_conv (Conv2D) (None, 3, 3, 256) 590080 ['conv4_blo ck20_1_relu[0][0]']
conv4_block20_2_bn (BatchNorma (None, 3, 3, 256) 1024 ['conv4_blo ck20_2_conv[0][0]'] lization)
conv4_block20_2_relu (Activati (None, 3, 3, 256) 0 ['conv4_blo ck20_2_bn[0][0]'] on)
conv4_block20_3_conv (Conv2D) (None, 3, 3, 1024) 263168 ['conv4_blo ck20_2_relu[0][0]']
conv4_block20_3_bn (BatchNorma (None, 3, 3, 1024) 4096 ['conv4_blo ck20_3_conv[0][0]'] lization)
conv4_block20_add (Add) (None, 3, 3, 1024) 0 ['conv4_blo ck19_out[0][0]', 'conv4_blo
ck20_3_bn[0][0]']
conv4_block20_out (Activation) (None, 3, 3, 1024) 0 ['conv4_blo ck20_add[0][0]']
conv4_block21_1_conv (Conv2D) (None, 3, 3, 256) 262400 ['conv4_blo ck20_out[0][0]']
conv4_block21_1_bn (BatchNorma (None, 3, 3, 256) 1024 ['conv4_blo ck21_1_conv[0][0]'] lization)
conv4_block21_1_relu (Activati (None, 3, 3, 256) 0 ['conv4_blo ck21_1_bn[0][0]'] on)
conv4_block21_2_conv (Conv2D) (None, 3, 3, 256) 590080 ['conv4_blo ck21_1_relu[0][0]']
conv4_block21_2_bn (BatchNorma (None, 3, 3, 256) 1024 ['conv4_blo ck21_2_conv[0][0]'] lization)
conv4_block21_2_relu (Activati (None, 3, 3, 256) 0 ['conv4_blo ck21_2_bn[0][0]'] on)
conv4_block21_3_conv (Conv2D) (None, 3, 3, 1024) 263168 ['conv4_blo ck21_2_relu[0][0]']
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 32 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
conv4_block21_3_bn (BatchNorma (None, 3, 3, 1024) 4096 ['conv4_blo ck21_3_conv[0][0]'] lization)
conv4_block21_add (Add) (None, 3, 3, 1024) 0 ['conv4_blo ck20_out[0][0]', 'conv4_blo ck21_3_bn[0][0]']
conv4_block21_out (Activation) (None, 3, 3, 1024) 0 ['conv4_blo ck21_add[0][0]']
conv4_block22_1_conv (Conv2D) (None, 3, 3, 256) 262400 ['conv4_blo ck21_out[0][0]']
conv4_block22_1_bn (BatchNorma (None, 3, 3, 256) 1024 ['conv4_blo ck22_1_conv[0][0]'] lization)
conv4_block22_1_relu (Activati (None, 3, 3, 256) 0 ['conv4_blo ck22_1_bn[0][0]'] on)
conv4_block22_2_conv (Conv2D) (None, 3, 3, 256) 590080 ['conv4_blo ck22_1_relu[0][0]']
conv4_block22_2_bn (BatchNorma (None, 3, 3, 256) 1024 ['conv4_blo ck22_2_conv[0][0]'] lization)
conv4_block22_2_relu (Activati (None, 3, 3, 256) 0 ['conv4_blo ck22_2_bn[0][0]'] on)
conv4_block22_3_conv (Conv2D) (None, 3, 3, 1024) 263168 ['conv4_blo ck22_2_relu[0][0]']
conv4_block22_3_bn (BatchNorma (None, 3, 3, 1024) 4096 ['conv4_blo ck22_3_conv[0][0]'] lization)
conv4_block22_add (Add) (None, 3, 3, 1024) 0
['conv4_blo ck21_out[0][0]', 'conv4_blo ck22_3_bn[0][0]']
conv4_block22_out (Activation) (None, 3, 3, 1024) 0
['conv4_blo ck22_add[0][0]']
conv4_block23_1_conv (Conv2D) (None, 3, 3, 256) 262400 ['conv4_blo ck22_out[0][0]']
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 33 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
conv4_block23_1_bn (BatchNorma (None, 3, 3, 256) 1024 ['conv4_blo ck23_1_conv[0][0]'] lization)
conv4_block23_1_relu (Activati (None, 3, 3, 256) 0 ['conv4_blo ck23_1_bn[0][0]'] on)
conv4_block23_2_conv (Conv2D) (None, 3, 3, 256) 590080 ['conv4_blo ck23_1_relu[0][0]']
conv4_block23_2_bn (BatchNorma (None, 3, 3, 256) 1024 ['conv4_blo ck23_2_conv[0][0]'] lization)
conv4_block23_2_relu (Activati (None, 3, 3, 256) 0 ['conv4_blo ck23_2_bn[0][0]'] on)
conv4_block23_3_conv (Conv2D) (None, 3, 3, 1024) 263168 ['conv4_blo ck23_2_relu[0][0]']
conv4_block23_3_bn (BatchNorma (None, 3, 3, 1024) 4096 ['conv4_blo ck23_3_conv[0][0]'] lization)
conv4_block23_add (Add) (None, 3, 3, 1024) 0 ['conv4_blo ck22_out[0][0]', 'conv4_blo ck23_3_bn[0][0]']
conv4_block23_out (Activation) (None, 3, 3, 1024) 0 ['conv4_blo ck23_add[0][0]']
conv5_block1_1_conv (Conv2D) (None, 2, 2, 512) 524800 ['conv4_blo ck23_out[0][0]']
conv5_block1_1_bn (BatchNormal (None, 2, 2, 512) 2048 ['conv5_blo ck1_1_conv[0][0]'] ization)
conv5_block1_1_relu (Activatio (None, 2, 2, 512) 0 ['conv5_blo ck1_1_bn[0][0]'] n)
conv5_block1_2_conv (Conv2D) (None, 2, 2, 512) 2359808 ['conv5_blo ck1_1_relu[0][0]']
conv5_block1_2_bn (BatchNormal (None, 2, 2, 512) 2048 ['conv5_blo ck1_2_conv[0][0]'] ization)
conv5_block1_2_relu (Activatio (None, 2, 2, 512) 0 ['conv5_blo
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 34 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
ck1_2_bn[0][0]'] n)
conv5_block1_0_conv (Conv2D) (None, 2, 2, 2048) 2099200 ['conv4_blo ck23_out[0][0]']
conv5_block1_3_conv (Conv2D) (None, 2, 2, 2048) 1050624 ['conv5_blo ck1_2_relu[0][0]']
conv5_block1_0_bn (BatchNormal (None, 2, 2, 2048) 8192 ['conv5_blo ck1_0_conv[0][0]'] ization)
conv5_block1_3_bn (BatchNormal (None, 2, 2, 2048) 8192 ['conv5_blo ck1_3_conv[0][0]'] ization)
conv5_block1_add (Add) (None, 2, 2, 2048) 0 ['conv5_blo ck1_0_bn[0][0]', 'conv5_blo ck1_3_bn[0][0]']
conv5_block1_out (Activation) (None, 2, 2, 2048) 0 ['conv5_blo ck1_add[0][0]']
conv5_block2_1_conv (Conv2D) (None, 2, 2, 512) 1049088 ['conv5_blo ck1_out[0][0]']
conv5_block2_1_bn (BatchNormal (None, 2, 2, 512) 2048 ['conv5_blo ck2_1_conv[0][0]'] ization)
conv5_block2_1_relu (Activatio (None, 2, 2, 512) 0 ['conv5_blo ck2_1_bn[0][0]'] n)
conv5_block2_2_conv (Conv2D) (None, 2, 2, 512) 2359808 ['conv5_blo ck2_1_relu[0][0]']
conv5_block2_2_bn (BatchNormal (None, 2, 2, 512) 2048 ['conv5_blo ck2_2_conv[0][0]'] ization)
conv5_block2_2_relu (Activatio (None, 2, 2, 512) 0 ['conv5_blo ck2_2_bn[0][0]'] n)
conv5_block2_3_conv (Conv2D) (None, 2, 2, 2048) 1050624 ['conv5_blo ck2_2_relu[0][0]']
conv5_block2_3_bn (BatchNormal (None, 2, 2, 2048) 8192 ['conv5_blo ck2_3_conv[0][0]'] ization)
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 35 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
conv5_block2_add (Add) (None, 2, 2, 2048) 0 ['conv5_blo ck1_out[0][0]', 'conv5_blo ck2_3_bn[0][0]']
conv5_block2_out (Activation) (None, 2, 2, 2048) 0 ['conv5_blo ck2_add[0][0]']
conv5_block3_1_conv (Conv2D) (None, 2, 2, 512) 1049088 ['conv5_blo ck2_out[0][0]']
conv5_block3_1_bn (BatchNormal (None, 2, 2, 512) 2048 ['conv5_blo ck3_1_conv[0][0]'] ization)
conv5_block3_1_relu (Activatio (None, 2, 2, 512) 0 ['conv5_blo ck3_1_bn[0][0]'] n)
conv5_block3_2_conv (Conv2D) (None, 2, 2, 512) 2359808 ['conv5_blo ck3_1_relu[0][0]']
conv5_block3_2_bn (BatchNormal (None, 2, 2, 512) 2048 ['conv5_blo ck3_2_conv[0][0]'] ization)
conv5_block3_2_relu (Activatio (None, 2, 2, 512) 0 ['conv5_blo ck3_2_bn[0][0]'] n)
conv5_block3_3_conv (Conv2D) (None, 2, 2, 2048) 1050624 ['conv5_blo ck3_2_relu[0][0]']
conv5_block3_3_bn (BatchNormal (None, 2, 2, 2048) 8192 ['conv5_blo ck3_3_conv[0][0]'] ization)
conv5_block3_add (Add) (None, 2, 2, 2048) 0 ['conv5_blo ck2_out[0][0]', 'conv5_blo ck3_3_bn[0][0]']
conv5_block3_out (Activation) (None, 2, 2, 2048) 0 ['conv5_blo ck3_add[0][0]']
Total params: 42,658,176
Trainable params: 42,552,832 Non-trainable params: 105,344
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 36 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
============================================================================ ======================
Model Building
In this model, we will import till the 'conv5_block3_add' layer of the ResNet model. You can scroll down in the model summary and look for 'conv5_block3_add'. You can choose any other layer as well. Then we will add a Flatten layer, which receives the output of the 'conv5_block3_add' layer as its input. We will add a few Dense layers and use 'relu' activation function on them. You may use Dropout and BatchNormalization layers as well. Then we will add our last dense layer, which must have 4 neurons and a 'softmax' activation function.
In [ ]: from tensorflow.python.compiler import xla transfer_layer_Resnet = Resnet.get_layer('conv5_block3_add') Resnet.trainable=False
# Add classification layers on top of it x = MaxPooling2D(2,2)(transfer_layer_Resnet.output)
# Flattenning the output from the 3rd block of the VGG16 model x = Flatten()(x)
# Add a Dense layer with 256 neurons x = Dense(256)(x) x = LeakyReLU(alpha = 0.2)(x) x = BatchNormalization()(x) x = Dropout(rate = 0.3)(x)
# Add a Dense Layer with 128 neurons x = Dense(128)(x) x = LeakyReLU(alpha = 0.2)(x)
# Add a DropOut layer with Drop out ratio of 0.3 x = Dropout(rate = 0.3)(x)
# Add a Dense Layer with 64 neurons x = Dense(64)(x) x = LeakyReLU(alpha = 0.2)(x)
# Add a Batch Normalization layer x = BatchNormalization()(x)
# Add the final dense layer with 4 neurons and use a 'softmax' activation pred = Dense(4, activation = 'softmax')(x)
resnetmodel = Model(Resnet.input, pred) # Initializing the model
Compiling and Training the Model
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 37 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
In [ ]: from keras.callbacks import ModelCheckpoint, EarlyStopping, ReduceLROnPlateau checkpoint = ModelCheckpoint("./Resnetmodel.h5", monitor = 'val_acc', verbose early_stopping = EarlyStopping(monitor = 'val_loss', min_delta = 0, patience = 3, verbose = 1, restore_best_weights = True) # Write your code here. You reduce_learningrate = ReduceLROnPlateau(monitor = 'val_loss', factor = 0.2, patience = 3, verbose = 1, min_delta = 0.0001)
# Write your code here. You may play around with the callbacks_list = [early_stopping, checkpoint, reduce_learningrate] epochs = 20
In [ ]: resnetmodel.compile(optimizer = Adam(learning_rate = 0.001), loss = 'categorical_crossentropy'
# Write your code to compile your resnetmodel. Use categorical crossentropy as your
In [ ]: history = resnetmodel.fit(x= train_set, validation_data = validation_set, epochs
Epoch 1/20
473/473 [==============================] - 47s 82ms/step - loss: 1.3987 - ac curacy: 0.3179 - val_loss: 1.4586 - val_accuracy: 0.2512
Epoch 2/20
473/473 [==============================] - 35s 75ms/step - loss: 1.3306 - ac curacy: 0.3501 - val_loss: 2.6751 - val_accuracy: 0.1830
Epoch 3/20
473/473 [==============================] - 36s 75ms/step - loss: 1.3054 - ac curacy: 0.3764 - val_loss: 1.4824 - val_accuracy: 0.3227
Epoch 4/20
473/473 [==============================] - 36s 75ms/step - loss: 1.2984 - ac curacy: 0.3796 - val_loss: 1.7348 - val_accuracy: 0.1981
Epoch 5/20
473/473 [==============================] - 36s 75ms/step - loss: 1.2879 - ac curacy: 0.3893 - val_loss: 1.8835 - val_accuracy: 0.2311
Epoch 6/20
473/473 [==============================] - 36s 77ms/step - loss: 1.2822 - ac curacy: 0.3926 - val_loss: 2.1088 - val_accuracy: 0.1861
Epoch 7/20
473/473 [==============================] - 35s 75ms/step - loss: 1.2793 - ac curacy: 0.3971 - val_loss: 2.3964 - val_accuracy: 0.1772 Epoch 8/20
473/473 [==============================] - 35s 75ms/step - loss: 1.2741 - ac
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 38 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
curacy: 0.4015 - val_loss: 1.4315 - val_accuracy: 0.3689
Epoch 9/20
473/473 [==============================] - 36s 75ms/step - loss: 1.2702 - ac curacy: 0.4026 - val_loss: 1.3646 - val_accuracy: 0.3502
Epoch 10/20
473/473
[==============================] - 36s 75ms/step - loss: 1.2646 - ac curacy: 0.4062 - val_loss: 1.3177 - val_accuracy: 0.3874
Epoch 11/20
473/473 [==============================] - 36s 77ms/step - loss: 1.2663 - ac curacy: 0.4091 - val_loss: 1.3753 - val_accuracy: 0.3755
Epoch 12/20
473/473 [==============================] - 36s 75ms/step - loss: 1.2694 - ac curacy: 0.4022 - val_loss: 1.8130 - val_accuracy: 0.3717
Epoch 13/20
473/473 [==============================] - 35s 75ms/step - loss: 1.2600 - ac curacy: 0.4141 - val_loss: 1.7202 - val_accuracy: 0.3771
Epoch 14/20
473/473
[==============================] - 36s 75ms/step - loss: 1.2579 - ac curacy: 0.4182 - val_loss: 1.6971 - val_accuracy: 0.2712
Epoch 15/20
473/473 [==============================] - 35s 75ms/step - loss: 1.2583 - ac curacy: 0.4102 - val_loss: 1.7715 - val_accuracy: 0.1897
Epoch 16/20
473/473 [==============================] - 36s 76ms/step - loss: 1.2498 - ac curacy: 0.4211 - val_loss: 1.7228 - val_accuracy: 0.2351
Epoch 17/20
473/473
[==============================] - 36s 75ms/step - loss: 1.2524 - ac curacy: 0.4182 - val_loss: 1.4560 - val_accuracy: 0.3767
Epoch 18/20
473/473 [==============================] - 36s 76ms/step - loss: 1.2499 - ac curacy: 0.4222 - val_loss: 1.3444 - val_accuracy: 0.3010
Epoch 19/20
473/473 [==============================] - 35s 75ms/step - loss: 1.2535 - ac curacy: 0.4211 - val_loss: 1.5870 - val_accuracy: 0.2429
Epoch 20/20
473/473 [==============================] - 35s 75ms/step - loss: 1.2479 - ac curacy: 0.4244 - val_loss: 1.8032 - val_accuracy: 0.2684
Evaluating the ResNet Model
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 39 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
In [ ]: # Write your code to evaluate model performance on the test set dict_hist = history.history list_ep = [i for i in range(1, 21)] plt.figure(figsize = (8, 8)) plt.plot(list_ep, dict_hist['accuracy'], ls = '--' , label = 'accuracy') plt.plot(list_ep, dict_hist['val_accuracy'], ls = '--' , label = 'val_accuracy' plt.ylabel('Accuracy') plt.xlabel('Epochs') plt.legend() plt.show()
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 40 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
Observations and Insights:
I have gone back and forth with several options on Resnet, however, there was only marginal improvement. There are challenges in adding additional convolutional layers for refinement, and it seems to very sensitive to overfitting. The more I build on hidden layers, seems to hurt generalization.
Note: You can even go back and build your own architecture on top of the ResNet Transfer layer and see if you can improve the performance.
EfficientNet Model
In [ ]: import tensorflow as tf import tensorflow.keras.applications as ap from tensorflow.keras import Model
EfficientNet = ap.EfficientNetV2B2(include_top=False,weights="imagenet", input_shape
EfficientNet.summary()
Model: "efficientnetv2-b2"
Layer (type)
Output Shape Param # Connected t o
input_3 (InputLayer) [(None, 48, 48, 3)] 0 [] rescaling (Rescaling) (None, 48, 48, 3) 0 ['input_3[0 ][0]'] normalization (Normalization) (None, 48, 48, 3) 0 ['rescaling [0][0]']
stem_conv (Conv2D) (None, 24, 24, 32) 864 ['normaliza tion[0][0]']
stem_bn (BatchNormalization) (None, 24, 24, 32) 128 ['stem_conv [0][0]']
stem_activation (Activation) (None, 24, 24, 32) 0 ['stem_bn[0 ][0]']
block1a_project_conv (Conv2D) (None, 24, 24, 16) 4608 ['stem_acti vation[0][0]']
block1a_project_bn (BatchNorma (None, 24, 24, 16) 64 ['block1a_p roject_conv[0][0]'] lization)
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 41 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
============================================================================ ======================
block1a_project_activation (Ac (None, 24, 24, 16) 0 ['block1a_p roject_bn[0][0]'] tivation)
block1b_project_conv (Conv2D) (None, 24, 24, 16) 2304 ['block1a_p roject_activation[0][0 ]']
block1b_project_bn (BatchNorma (None, 24, 24, 16) 64 ['block1b_p roject_conv[0][0]'] lization)
block1b_project_activation (Ac (None, 24, 24, 16) 0 ['block1b_p roject_bn[0][0]'] tivation)
block1b_add (Add) (None, 24, 24, 16) 0 ['block1b_p roject_activation[0][0 ]', 'block1a_p
roject_activation[0][0 ]']
block2a_expand_conv (Conv2D) (None, 12, 12, 64) 9216 ['block1b_a dd[0][0]']
block2a_expand_bn (BatchNormal (None, 12, 12, 64) 256 ['block2a_e xpand_conv[0][0]'] ization)
block2a_expand_activation (Act (None, 12, 12, 64) 0 ['block2a_e xpand_bn[0][0]'] ivation)
block2a_project_conv (Conv2D) (None, 12, 12, 32) 2048 ['block2a_e xpand_activation[0][0] ']
block2a_project_bn (BatchNorma (None, 12, 12, 32) 128 ['block2a_p roject_conv[0][0]'] lization)
block2b_expand_conv (Conv2D) (None, 12, 12, 128) 36864 ['block2a_p roject_bn[0][0]']
block2b_expand_bn (BatchNormal (None, 12, 12, 128) 512 ['block2b_e xpand_conv[0][0]'] ization)
block2b_expand_activation (Act (None, 12, 12, 128) 0 ['block2b_e xpand_bn[0][0]'] ivation)
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 42 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
block2b_project_conv (Conv2D) (None, 12, 12, 32) 4096
['block2b_e xpand_activation[0][0] ']
block2b_project_bn (BatchNorma (None, 12, 12, 32) 128
['block2b_p roject_conv[0][0]'] lization)
block2b_add (Add) (None, 12, 12, 32) 0
['block2b_p roject_bn[0][0]', 'block2a_p roject_bn[0][0]']
block2c_expand_conv (Conv2D) (None, 12, 12, 128) 36864 ['block2b_a dd[0][0]']
block2c_expand_bn (BatchNormal (None, 12, 12, 128) 512 ['block2c_e xpand_conv[0][0]'] ization)
block2c_expand_activation (Act (None, 12, 12, 128) 0
['block2c_e xpand_bn[0][0]'] ivation)
block2c_project_conv (Conv2D) (None, 12, 12, 32) 4096 ['block2c_e xpand_activation[0][0]
block2c_project_bn (BatchNorma (None, 12, 12, 32) 128
['block2c_p roject_conv[0][0]'] lization)
block2c_add (Add) (None, 12, 12, 32) 0
['block2c_p roject_bn[0][0]', 'block2b_a dd[0][0]']
block3a_expand_conv (Conv2D) (None, 6, 6, 128) 36864 ['block2c_a dd[0][0]']
block3a_expand_bn (BatchNormal (None, 6, 6, 128) 512 ['block3a_e xpand_conv[0][0]'] ization)
block3a_expand_activation (Act (None, 6, 6, 128) 0
['block3a_e xpand_bn[0][0]'] ivation)
block3a_project_conv (Conv2D) (None, 6, 6, 56) 7168 ['block3a_e xpand_activation[0][0]
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 43 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
']
']
block3a_project_bn (BatchNorma (None, 6, 6, 56) 224 ['block3a_p roject_conv[0][0]'] lization)
block3b_expand_conv (Conv2D) (None, 6, 6, 224) 112896 ['block3a_p roject_bn[0][0]']
block3b_expand_bn (BatchNormal (None, 6, 6, 224) 896 ['block3b_e xpand_conv[0][0]'] ization)
block3b_expand_activation (Act (None, 6, 6, 224) 0 ['block3b_e xpand_bn[0][0]'] ivation)
block3b_project_conv (Conv2D) (None, 6, 6, 56) 12544 ['block3b_e xpand_activation[0][0] ']
block3b_project_bn (BatchNorma (None, 6, 6, 56) 224 ['block3b_p roject_conv[0][0]'] lization)
block3b_add (Add) (None, 6, 6, 56) 0 ['block3b_p roject_bn[0][0]', 'block3a_p roject_bn[0][0]']
block3c_expand_conv (Conv2D) (None, 6, 6, 224) 112896 ['block3b_a dd[0][0]']
block3c_expand_bn (BatchNormal (None, 6, 6, 224) 896 ['block3c_e xpand_conv[0][0]'] ization)
block3c_expand_activation (Act (None, 6, 6, 224) 0 ['block3c_e xpand_bn[0][0]'] ivation)
block3c_project_conv (Conv2D) (None, 6, 6, 56) 12544 ['block3c_e xpand_activation[0][0] ']
block3c_project_bn (BatchNorma (None, 6, 6, 56) 224 ['block3c_p roject_conv[0][0]'] lization)
block3c_add (Add) (None, 6, 6, 56) 0 ['block3c_p roject_bn[0][0]', 'block3b_a dd[0][0]']
block4a_expand_conv (Conv2D) (None, 6, 6, 224) 12544 ['block3c_a
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 44 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
dd[0][0]']
block4a_expand_bn (BatchNormal (None, 6, 6, 224) 896 ['block4a_e xpand_conv[0][0]'] ization)
block4a_expand_activation (Act (None, 6, 6, 224) 0 ['block4a_e xpand_bn[0][0]'] ivation)
block4a_dwconv2 (DepthwiseConv (None, 3, 3, 224) 2016 ['block4a_e xpand_activation[0][0] 2D) ']
block4a_bn (BatchNormalization (None, 3, 3, 224) 896 ['block4a_d wconv2[0][0]'] )
block4a_activation (Activation (None, 3, 3, 224) 0 ['block4a_b n[0][0]'] )
block4a_se_squeeze (GlobalAver (None, 224) 0 ['block4a_a ctivation[0][0]'] agePooling2D)
block4a_se_reshape (Reshape) (None, 1, 1, 224) 0 ['block4a_s e_squeeze[0][0]']
block4a_se_reduce (Conv2D) (None, 1, 1, 14) 3150 ['block4a_s e_reshape[0][0]']
block4a_se_expand (Conv2D) (None, 1, 1, 224) 3360 ['block4a_s e_reduce[0][0]']
block4a_se_excite (Multiply) (None, 3, 3, 224) 0 ['block4a_a ctivation[0][0]', 'block4a_s e_expand[0][0]']
block4a_project_conv (Conv2D) (None, 3, 3, 104) 23296 ['block4a_s e_excite[0][0]']
block4a_project_bn (BatchNorma (None, 3, 3, 104) 416 ['block4a_p roject_conv[0][0]'] lization)
block4b_expand_conv (Conv2D) (None, 3, 3, 416) 43264 ['block4a_p roject_bn[0][0]']
block4b_expand_bn (BatchNormal (None, 3, 3, 416) 1664 ['block4b_e xpand_conv[0][0]'] ization)
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 45 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
block4b_expand_activation (Act (None, 3, 3, 416) 0 ['block4b_e xpand_bn[0][0]'] ivation)
block4b_dwconv2 (DepthwiseConv (None, 3, 3, 416) 3744 ['block4b_e xpand_activation[0][0] 2D) ']
block4b_bn (BatchNormalization (None, 3, 3, 416) 1664 ['block4b_d wconv2[0][0]'] )
block4b_activation (Activation (None, 3, 3, 416) 0 ['block4b_b n[0][0]'] )
block4b_se_squeeze (GlobalAver (None, 416) 0 ['block4b_a ctivation[0][0]'] agePooling2D)
block4b_se_reshape (Reshape) (None, 1, 1, 416) 0 ['block4b_s e_squeeze[0][0]']
block4b_se_reduce (Conv2D) (None, 1, 1, 26) 10842 ['block4b_s e_reshape[0][0]']
block4b_se_expand (Conv2D) (None, 1, 1, 416) 11232 ['block4b_s e_reduce[0][0]']
block4b_se_excite (Multiply) (None, 3, 3, 416) 0 ['block4b_a ctivation[0][0]', 'block4b_s e_expand[0][0]']
block4b_project_conv (Conv2D) (None, 3, 3, 104) 43264 ['block4b_s e_excite[0][0]']
block4b_project_bn (BatchNorma (None, 3, 3, 104) 416 ['block4b_p roject_conv[0][0]'] lization)
block4b_add (Add) (None, 3, 3, 104) 0 ['block4b_p roject_bn[0][0]', 'block4a_p roject_bn[0][0]']
block4c_expand_conv (Conv2D) (None, 3, 3, 416) 43264 ['block4b_a dd[0][0]']
block4c_expand_bn (BatchNormal (None, 3, 3, 416) 1664 ['block4c_e xpand_conv[0][0]'] ization)
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 46 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
block4c_expand_activation (Act (None, 3, 3, 416) 0 ['block4c_e xpand_bn[0][0]'] ivation)
block4c_dwconv2 (DepthwiseConv (None, 3, 3, 416) 3744 ['block4c_e xpand_activation[0][0] 2D) ']
block4c_bn (BatchNormalization (None, 3, 3, 416) 1664 ['block4c_d wconv2[0][0]'] )
block4c_activation (Activation (None, 3, 3, 416) 0 ['block4c_b n[0][0]'] )
block4c_se_squeeze (GlobalAver (None, 416) 0 ['block4c_a ctivation[0][0]'] agePooling2D)
block4c_se_reshape (Reshape) (None, 1, 1, 416) 0 ['block4c_s e_squeeze[0][0]']
block4c_se_reduce (Conv2D) (None, 1, 1, 26) 10842 ['block4c_s e_reshape[0][0]']
block4c_se_expand (Conv2D) (None, 1, 1, 416) 11232 ['block4c_s e_reduce[0][0]']
block4c_se_excite (Multiply) (None, 3, 3, 416) 0 ['block4c_a ctivation[0][0]', 'block4c_s e_expand[0][0]']
block4c_project_conv (Conv2D) (None, 3, 3, 104) 43264 ['block4c_s e_excite[0][0]']
block4c_project_bn (BatchNorma (None, 3, 3, 104) 416 ['block4c_p roject_conv[0][0]'] lization)
block4c_add (Add) (None, 3, 3, 104) 0 ['block4c_p roject_bn[0][0]', 'block4b_a dd[0][0]']
block4d_expand_conv (Conv2D) (None, 3, 3, 416) 43264 ['block4c_a dd[0][0]']
block4d_expand_bn (BatchNormal (None, 3, 3, 416) 1664 ['block4d_e xpand_conv[0][0]'] ization)
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 47 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
block4d_expand_activation (Act (None, 3, 3, 416) 0 ['block4d_e xpand_bn[0][0]'] ivation)
block4d_dwconv2 (DepthwiseConv (None, 3, 3, 416) 3744 ['block4d_e xpand_activation[0][0] 2D) ']
block4d_bn (BatchNormalization (None, 3, 3, 416) 1664 ['block4d_d wconv2[0][0]'] )
block4d_activation (Activation (None, 3, 3, 416) 0 ['block4d_b n[0][0]'] )
block4d_se_squeeze (GlobalAver (None, 416) 0 ['block4d_a ctivation[0][0]'] agePooling2D)
block4d_se_reshape (Reshape) (None, 1, 1, 416) 0 ['block4d_s e_squeeze[0][0]']
block4d_se_reduce (Conv2D) (None, 1, 1, 26) 10842 ['block4d_s e_reshape[0][0]']
block4d_se_expand (Conv2D) (None, 1, 1, 416) 11232 ['block4d_s e_reduce[0][0]']
block4d_se_excite (Multiply) (None, 3, 3, 416) 0 ['block4d_a ctivation[0][0]', 'block4d_s e_expand[0][0]']
block4d_project_conv (Conv2D) (None, 3, 3, 104) 43264 ['block4d_s e_excite[0][0]']
block4d_project_bn (BatchNorma (None, 3, 3, 104) 416 ['block4d_p roject_conv[0][0]'] lization)
block4d_add (Add) (None, 3, 3, 104) 0 ['block4d_p roject_bn[0][0]', 'block4c_a dd[0][0]']
block5a_expand_conv (Conv2D) (None, 3, 3, 624) 64896 ['block4d_a dd[0][0]']
block5a_expand_bn (BatchNormal (None, 3, 3, 624) 2496 ['block5a_e xpand_conv[0][0]'] ization)
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 48 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
block5a_expand_activation (Act (None, 3, 3, 624) 0 ['block5a_e xpand_bn[0][0]'] ivation)
block5a_dwconv2 (DepthwiseConv (None, 3, 3, 624) 5616 ['block5a_e xpand_activation[0][0] 2D) ']
block5a_bn (BatchNormalization (None, 3, 3, 624) 2496 ['block5a_d wconv2[0][0]'] )
block5a_activation (Activation (None, 3, 3, 624) 0 ['block5a_b n[0][0]'] )
block5a_se_squeeze (GlobalAver (None, 624) 0 ['block5a_a ctivation[0][0]'] agePooling2D)
block5a_se_reshape (Reshape) (None, 1, 1, 624) 0 ['block5a_s e_squeeze[0][0]']
block5a_se_reduce (Conv2D) (None, 1, 1, 26) 16250 ['block5a_s e_reshape[0][0]']
block5a_se_expand (Conv2D) (None, 1, 1, 624) 16848 ['block5a_s e_reduce[0][0]']
block5a_se_excite (Multiply) (None, 3, 3, 624) 0 ['block5a_a ctivation[0][0]', 'block5a_s e_expand[0][0]']
block5a_project_conv (Conv2D) (None, 3, 3, 120) 74880 ['block5a_s e_excite[0][0]']
block5a_project_bn (BatchNorma (None, 3, 3, 120) 480 ['block5a_p roject_conv[0][0]'] lization)
block5b_expand_conv (Conv2D) (None, 3, 3, 720) 86400 ['block5a_p roject_bn[0][0]']
block5b_expand_bn (BatchNormal (None, 3, 3, 720) 2880 ['block5b_e xpand_conv[0][0]'] ization)
block5b_expand_activation (Act (None, 3, 3, 720) 0 ['block5b_e xpand_bn[0][0]'] ivation)
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 49 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
block5b_dwconv2 (DepthwiseConv (None, 3, 3, 720) 6480 ['block5b_e xpand_activation[0][0] 2D) ']
block5b_bn (BatchNormalization (None, 3, 3, 720) 2880 ['block5b_d wconv2[0][0]'] )
block5b_activation (Activation (None, 3, 3, 720) 0 ['block5b_b n[0][0]'] )
block5b_se_squeeze (GlobalAver (None, 720) 0 ['block5b_a ctivation[0][0]'] agePooling2D)
block5b_se_reshape (Reshape) (None, 1, 1, 720) 0 ['block5b_s e_squeeze[0][0]']
block5b_se_reduce (Conv2D) (None, 1, 1, 30) 21630 ['block5b_s e_reshape[0][0]']
block5b_se_expand (Conv2D) (None, 1, 1, 720) 22320 ['block5b_s e_reduce[0][0]']
block5b_se_excite (Multiply) (None, 3, 3, 720) 0
['block5b_a ctivation[0][0]', 'block5b_s e_expand[0][0]']
block5b_project_conv (Conv2D) (None, 3, 3, 120) 86400 ['block5b_s e_excite[0][0]']
block5b_project_bn (BatchNorma (None, 3, 3, 120) 480 ['block5b_p roject_conv[0][0]'] lization)
block5b_add (Add) (None, 3, 3, 120) 0 ['block5b_p roject_bn[0][0]', 'block5a_p roject_bn[0][0]']
block5c_expand_conv (Conv2D) (None, 3, 3, 720) 86400 ['block5b_a dd[0][0]']
block5c_expand_bn (BatchNormal (None, 3, 3, 720) 2880 ['block5c_e xpand_conv[0][0]'] ization)
['block5c_e xpand_bn[0][0]'] ivation)
block5c_expand_activation (Act (None, 3, 3, 720) 0
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 50 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
block5c_dwconv2 (DepthwiseConv (None, 3, 3, 720) 6480 ['block5c_e xpand_activation[0][0] 2D) ']
block5c_bn (BatchNormalization (None, 3, 3, 720) 2880 ['block5c_d wconv2[0][0]'] )
block5c_activation (Activation (None, 3, 3, 720) 0 ['block5c_b n[0][0]'] )
block5c_se_squeeze (GlobalAver (None, 720) 0 ['block5c_a ctivation[0][0]'] agePooling2D)
block5c_se_reshape (Reshape) (None, 1, 1, 720) 0 ['block5c_s e_squeeze[0][0]']
block5c_se_reduce (Conv2D) (None, 1, 1, 30) 21630 ['block5c_s e_reshape[0][0]']
block5c_se_expand (Conv2D) (None, 1, 1, 720) 22320 ['block5c_s e_reduce[0][0]']
block5c_se_excite (Multiply) (None, 3, 3, 720) 0
['block5c_a ctivation[0][0]', 'block5c_s e_expand[0][0]']
block5c_project_conv (Conv2D) (None, 3, 3, 120) 86400 ['block5c_s e_excite[0][0]']
block5c_project_bn (BatchNorma (None, 3, 3, 120) 480 ['block5c_p roject_conv[0][0]'] lization)
block5c_add (Add) (None, 3, 3, 120) 0 ['block5c_p roject_bn[0][0]', 'block5b_a dd[0][0]']
block5d_expand_conv (Conv2D) (None, 3, 3, 720) 86400 ['block5c_a dd[0][0]']
block5d_expand_bn (BatchNormal (None, 3, 3, 720) 2880 ['block5d_e xpand_conv[0][0]'] ization)
block5d_expand_activation (Act (None, 3, 3, 720) 0 ['block5d_e xpand_bn[0][0]'] ivation)
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 51 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
block5d_dwconv2 (DepthwiseConv (None, 3, 3, 720) 6480 ['block5d_e xpand_activation[0][0] 2D) ']
block5d_bn (BatchNormalization (None, 3, 3, 720) 2880 ['block5d_d wconv2[0][0]'] )
block5d_activation (Activation (None, 3, 3, 720) 0 ['block5d_b n[0][0]'] )
block5d_se_squeeze (GlobalAver (None, 720) 0 ['block5d_a ctivation[0][0]'] agePooling2D)
block5d_se_reshape (Reshape) (None, 1, 1, 720) 0 ['block5d_s e_squeeze[0][0]']
block5d_se_reduce (Conv2D) (None, 1, 1, 30) 21630 ['block5d_s e_reshape[0][0]']
block5d_se_expand (Conv2D) (None, 1, 1, 720) 22320 ['block5d_s e_reduce[0][0]']
block5d_se_excite (Multiply) (None, 3, 3, 720) 0
['block5d_a ctivation[0][0]', 'block5d_s e_expand[0][0]']
block5d_project_conv (Conv2D) (None, 3, 3, 120) 86400 ['block5d_s e_excite[0][0]']
block5d_project_bn (BatchNorma (None, 3, 3, 120) 480 ['block5d_p roject_conv[0][0]'] lization)
block5d_add (Add) (None, 3, 3, 120) 0 ['block5d_p roject_bn[0][0]', 'block5c_a dd[0][0]']
block5e_expand_conv (Conv2D) (None, 3, 3, 720) 86400 ['block5d_a dd[0][0]']
block5e_expand_bn (BatchNormal (None, 3, 3, 720) 2880 ['block5e_e xpand_conv[0][0]'] ization)
block5e_expand_activation (Act (None, 3, 3, 720) 0 ['block5e_e xpand_bn[0][0]'] ivation)
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 52 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
block5e_dwconv2 (DepthwiseConv (None, 3, 3, 720) 6480 ['block5e_e xpand_activation[0][0] 2D) ']
block5e_bn (BatchNormalization (None, 3, 3, 720) 2880 ['block5e_d wconv2[0][0]'] )
block5e_activation (Activation (None, 3, 3, 720) 0 ['block5e_b n[0][0]'] )
block5e_se_squeeze (GlobalAver (None, 720) 0 ['block5e_a ctivation[0][0]'] agePooling2D)
block5e_se_reshape (Reshape) (None, 1, 1, 720) 0 ['block5e_s e_squeeze[0][0]']
block5e_se_reduce (Conv2D) (None, 1, 1, 30) 21630 ['block5e_s e_reshape[0][0]']
block5e_se_expand (Conv2D) (None, 1, 1, 720) 22320 ['block5e_s e_reduce[0][0]']
block5e_se_excite (Multiply) (None, 3, 3, 720) 0
['block5e_a ctivation[0][0]', 'block5e_s e_expand[0][0]']
block5e_project_conv (Conv2D) (None, 3, 3, 120) 86400 ['block5e_s e_excite[0][0]']
block5e_project_bn (BatchNorma (None, 3, 3, 120) 480 ['block5e_p roject_conv[0][0]'] lization)
block5e_add (Add) (None, 3, 3, 120) 0 ['block5e_p roject_bn[0][0]', 'block5d_a dd[0][0]']
block5f_expand_conv (Conv2D) (None, 3, 3, 720) 86400 ['block5e_a dd[0][0]']
block5f_expand_bn (BatchNormal (None, 3, 3, 720) 2880 ['block5f_e xpand_conv[0][0]'] ization)
block5f_expand_activation (Act (None, 3, 3, 720) 0 ['block5f_e xpand_bn[0][0]'] ivation)
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 53 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
block5f_dwconv2 (DepthwiseConv (None, 3, 3, 720) 6480 ['block5f_e xpand_activation[0][0] 2D) ']
block5f_bn (BatchNormalization (None, 3, 3, 720) 2880 ['block5f_d wconv2[0][0]'] )
block5f_activation (Activation (None, 3, 3, 720) 0 ['block5f_b n[0][0]'] )
block5f_se_squeeze (GlobalAver (None, 720) 0 ['block5f_a ctivation[0][0]'] agePooling2D)
block5f_se_reshape (Reshape) (None, 1, 1, 720) 0 ['block5f_s e_squeeze[0][0]']
block5f_se_reduce (Conv2D) (None, 1, 1, 30) 21630 ['block5f_s e_reshape[0][0]']
block5f_se_expand (Conv2D) (None, 1, 1, 720) 22320 ['block5f_s e_reduce[0][0]']
block5f_se_excite (Multiply) (None, 3, 3, 720) 0
['block5f_a ctivation[0][0]', 'block5f_s e_expand[0][0]']
block5f_project_conv (Conv2D) (None, 3, 3, 120) 86400 ['block5f_s e_excite[0][0]']
block5f_project_bn (BatchNorma (None, 3, 3, 120) 480 ['block5f_p roject_conv[0][0]'] lization)
block5f_add (Add) (None, 3, 3, 120) 0 ['block5f_p roject_bn[0][0]', 'block5e_a dd[0][0]']
block6a_expand_conv (Conv2D) (None, 3, 3, 720) 86400 ['block5f_a dd[0][0]']
block6a_expand_bn (BatchNormal (None, 3, 3, 720) 2880 ['block6a_e xpand_conv[0][0]'] ization)
block6a_expand_activation (Act (None, 3, 3, 720) 0 ['block6a_e xpand_bn[0][0]'] ivation)
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 54 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
block6a_dwconv2 (DepthwiseConv (None, 2, 2, 720) 6480 ['block6a_e xpand_activation[0][0] 2D) ']
block6a_bn (BatchNormalization (None, 2, 2, 720) 2880 ['block6a_d wconv2[0][0]'] )
block6a_activation (Activation (None, 2, 2, 720) 0 ['block6a_b n[0][0]'] )
block6a_se_squeeze (GlobalAver (None, 720) 0 ['block6a_a ctivation[0][0]'] agePooling2D)
block6a_se_reshape (Reshape) (None, 1, 1, 720) 0 ['block6a_s e_squeeze[0][0]']
block6a_se_reduce (Conv2D) (None, 1, 1, 30) 21630 ['block6a_s e_reshape[0][0]']
block6a_se_expand (Conv2D) (None, 1, 1, 720) 22320 ['block6a_s e_reduce[0][0]']
block6a_se_excite (Multiply) (None, 2, 2, 720) 0
['block6a_a ctivation[0][0]', 'block6a_s e_expand[0][0]']
block6a_project_conv (Conv2D) (None, 2, 2, 208) 149760 ['block6a_s e_excite[0][0]']
block6a_project_bn (BatchNorma (None, 2, 2, 208) 832 ['block6a_p roject_conv[0][0]'] lization)
block6b_expand_conv (Conv2D) (None, 2, 2, 1248) 259584 ['block6a_p roject_bn[0][0]']
block6b_expand_bn (BatchNormal (None, 2, 2, 1248) 4992 ['block6b_e xpand_conv[0][0]'] ization)
block6b_expand_activation (Act (None, 2, 2, 1248) 0 ['block6b_e xpand_bn[0][0]'] ivation)
block6b_dwconv2 (DepthwiseConv (None, 2, 2, 1248) 11232 ['block6b_e xpand_activation[0][0] 2D) ']
block6b_bn (BatchNormalization (None, 2, 2, 1248) 4992 ['block6b_d
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 55 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
wconv2[0][0]'] )
block6b_activation (Activation (None, 2, 2, 1248) 0 ['block6b_b n[0][0]'] )
block6b_se_squeeze (GlobalAver (None, 1248) 0 ['block6b_a ctivation[0][0]'] agePooling2D)
block6b_se_reshape (Reshape) (None, 1, 1, 1248) 0 ['block6b_s e_squeeze[0][0]']
block6b_se_reduce (Conv2D) (None, 1, 1, 52) 64948 ['block6b_s e_reshape[0][0]']
block6b_se_expand (Conv2D) (None, 1, 1, 1248) 66144 ['block6b_s e_reduce[0][0]']
block6b_se_excite (Multiply) (None, 2, 2, 1248) 0 ['block6b_a ctivation[0][0]', 'block6b_s e_expand[0][0]']
block6b_project_conv (Conv2D) (None, 2, 2, 208) 259584 ['block6b_s e_excite[0][0]']
block6b_project_bn (BatchNorma (None, 2, 2, 208) 832 ['block6b_p roject_conv[0][0]'] lization)
block6b_add (Add) (None, 2, 2, 208) 0 ['block6b_p roject_bn[0][0]', 'block6a_p roject_bn[0][0]']
block6c_expand_conv (Conv2D) (None, 2, 2, 1248) 259584 ['block6b_a dd[0][0]']
block6c_expand_bn (BatchNormal (None, 2, 2, 1248) 4992 ['block6c_e xpand_conv[0][0]'] ization)
block6c_expand_activation (Act (None, 2, 2, 1248) 0 ['block6c_e xpand_bn[0][0]'] ivation)
block6c_dwconv2 (DepthwiseConv (None, 2, 2, 1248) 11232 ['block6c_e xpand_activation[0][0] 2D) ']
block6c_bn (BatchNormalization (None, 2, 2, 1248) 4992 ['block6c_d
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 56 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
wconv2[0][0]'] )
block6c_activation (Activation (None, 2, 2, 1248) 0 ['block6c_b n[0][0]'] )
block6c_se_squeeze (GlobalAver (None, 1248) 0 ['block6c_a ctivation[0][0]'] agePooling2D)
block6c_se_reshape (Reshape) (None, 1, 1, 1248) 0 ['block6c_s e_squeeze[0][0]']
block6c_se_reduce (Conv2D) (None, 1, 1, 52) 64948 ['block6c_s e_reshape[0][0]']
block6c_se_expand (Conv2D) (None, 1, 1, 1248) 66144 ['block6c_s e_reduce[0][0]']
block6c_se_excite (Multiply) (None, 2, 2, 1248) 0 ['block6c_a ctivation[0][0]', 'block6c_s e_expand[0][0]']
block6c_project_conv (Conv2D) (None, 2, 2, 208) 259584 ['block6c_s e_excite[0][0]']
block6c_project_bn (BatchNorma (None, 2, 2, 208) 832 ['block6c_p roject_conv[0][0]'] lization)
block6c_add (Add) (None, 2, 2, 208) 0 ['block6c_p roject_bn[0][0]', 'block6b_a dd[0][0]']
block6d_expand_conv (Conv2D) (None, 2, 2, 1248) 259584 ['block6c_a dd[0][0]']
block6d_expand_bn (BatchNormal (None, 2, 2, 1248) 4992 ['block6d_e xpand_conv[0][0]'] ization)
block6d_expand_activation (Act (None, 2, 2, 1248) 0 ['block6d_e xpand_bn[0][0]'] ivation)
block6d_dwconv2 (DepthwiseConv (None, 2, 2, 1248) 11232 ['block6d_e xpand_activation[0][0] 2D) ']
block6d_bn (BatchNormalization (None, 2, 2, 1248) 4992 ['block6d_d
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 57 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
wconv2[0][0]'] )
block6d_activation (Activation (None, 2, 2, 1248) 0 ['block6d_b n[0][0]'] )
block6d_se_squeeze (GlobalAver (None, 1248) 0 ['block6d_a ctivation[0][0]'] agePooling2D)
block6d_se_reshape (Reshape) (None, 1, 1, 1248) 0 ['block6d_s e_squeeze[0][0]']
block6d_se_reduce (Conv2D) (None, 1, 1, 52) 64948 ['block6d_s e_reshape[0][0]']
block6d_se_expand (Conv2D) (None, 1, 1, 1248) 66144 ['block6d_s e_reduce[0][0]']
block6d_se_excite (Multiply) (None, 2, 2, 1248) 0 ['block6d_a ctivation[0][0]', 'block6d_s e_expand[0][0]']
block6d_project_conv (Conv2D) (None, 2, 2, 208) 259584 ['block6d_s e_excite[0][0]']
block6d_project_bn (BatchNorma (None, 2, 2, 208) 832 ['block6d_p roject_conv[0][0]'] lization)
block6d_add (Add) (None, 2, 2, 208) 0 ['block6d_p roject_bn[0][0]', 'block6c_a dd[0][0]']
block6e_expand_conv (Conv2D) (None, 2, 2, 1248) 259584 ['block6d_a dd[0][0]']
block6e_expand_bn (BatchNormal (None, 2, 2, 1248) 4992 ['block6e_e xpand_conv[0][0]'] ization)
block6e_expand_activation (Act (None, 2, 2, 1248) 0 ['block6e_e xpand_bn[0][0]'] ivation)
block6e_dwconv2 (DepthwiseConv (None, 2, 2, 1248) 11232 ['block6e_e xpand_activation[0][0] 2D) ']
block6e_bn (BatchNormalization (None, 2, 2, 1248) 4992 ['block6e_d
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 58 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
wconv2[0][0]'] )
block6e_activation (Activation (None, 2, 2, 1248) 0 ['block6e_b n[0][0]'] )
block6e_se_squeeze (GlobalAver (None, 1248) 0 ['block6e_a ctivation[0][0]'] agePooling2D)
block6e_se_reshape (Reshape) (None, 1, 1, 1248) 0 ['block6e_s e_squeeze[0][0]']
block6e_se_reduce (Conv2D) (None, 1, 1, 52) 64948 ['block6e_s e_reshape[0][0]']
block6e_se_expand (Conv2D) (None, 1, 1, 1248) 66144 ['block6e_s e_reduce[0][0]']
block6e_se_excite (Multiply) (None, 2, 2, 1248) 0 ['block6e_a ctivation[0][0]', 'block6e_s e_expand[0][0]']
block6e_project_conv (Conv2D) (None, 2, 2, 208) 259584 ['block6e_s e_excite[0][0]']
block6e_project_bn (BatchNorma (None, 2, 2, 208) 832 ['block6e_p roject_conv[0][0]'] lization)
block6e_add (Add) (None, 2, 2, 208) 0 ['block6e_p roject_bn[0][0]', 'block6d_a dd[0][0]']
block6f_expand_conv (Conv2D) (None, 2, 2, 1248) 259584 ['block6e_a dd[0][0]']
block6f_expand_bn (BatchNormal (None, 2, 2, 1248) 4992 ['block6f_e xpand_conv[0][0]'] ization)
block6f_expand_activation (Act (None, 2, 2, 1248) 0 ['block6f_e xpand_bn[0][0]'] ivation)
block6f_dwconv2 (DepthwiseConv (None, 2, 2, 1248) 11232 ['block6f_e xpand_activation[0][0] 2D) ']
block6f_bn (BatchNormalization (None, 2, 2, 1248) 4992 ['block6f_d
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 59 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
wconv2[0][0]'] )
block6f_activation (Activation (None, 2, 2, 1248) 0 ['block6f_b n[0][0]'] )
block6f_se_squeeze (GlobalAver (None, 1248) 0 ['block6f_a ctivation[0][0]'] agePooling2D)
block6f_se_reshape (Reshape) (None, 1, 1, 1248) 0 ['block6f_s e_squeeze[0][0]']
block6f_se_reduce (Conv2D) (None, 1, 1, 52) 64948 ['block6f_s e_reshape[0][0]']
block6f_se_expand (Conv2D) (None, 1, 1, 1248) 66144 ['block6f_s e_reduce[0][0]']
block6f_se_excite (Multiply) (None, 2, 2, 1248) 0 ['block6f_a ctivation[0][0]', 'block6f_s e_expand[0][0]']
block6f_project_conv (Conv2D) (None, 2, 2, 208) 259584 ['block6f_s e_excite[0][0]']
block6f_project_bn (BatchNorma (None, 2, 2, 208) 832 ['block6f_p roject_conv[0][0]'] lization)
block6f_add (Add) (None, 2, 2, 208) 0 ['block6f_p roject_bn[0][0]', 'block6e_a dd[0][0]']
block6g_expand_conv (Conv2D) (None, 2, 2, 1248) 259584 ['block6f_a dd[0][0]']
block6g_expand_bn (BatchNormal (None, 2, 2, 1248) 4992 ['block6g_e xpand_conv[0][0]'] ization)
block6g_expand_activation (Act (None, 2, 2, 1248) 0 ['block6g_e xpand_bn[0][0]'] ivation)
block6g_dwconv2 (DepthwiseConv (None, 2, 2, 1248) 11232 ['block6g_e xpand_activation[0][0] 2D) ']
block6g_bn (BatchNormalization (None, 2, 2, 1248) 4992 ['block6g_d
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 60 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
wconv2[0][0]'] )
block6g_activation (Activation (None, 2, 2, 1248) 0 ['block6g_b n[0][0]'] )
block6g_se_squeeze (GlobalAver (None, 1248) 0 ['block6g_a ctivation[0][0]'] agePooling2D)
block6g_se_reshape (Reshape) (None, 1, 1, 1248) 0 ['block6g_s e_squeeze[0][0]']
block6g_se_reduce (Conv2D) (None, 1, 1, 52) 64948 ['block6g_s e_reshape[0][0]']
block6g_se_expand (Conv2D) (None, 1, 1, 1248) 66144 ['block6g_s e_reduce[0][0]']
block6g_se_excite (Multiply) (None, 2, 2, 1248) 0 ['block6g_a ctivation[0][0]', 'block6g_s e_expand[0][0]']
block6g_project_conv (Conv2D) (None, 2, 2, 208) 259584 ['block6g_s e_excite[0][0]']
block6g_project_bn (BatchNorma (None, 2, 2, 208) 832 ['block6g_p roject_conv[0][0]'] lization)
block6g_add (Add) (None, 2, 2, 208) 0 ['block6g_p roject_bn[0][0]', 'block6f_a dd[0][0]']
block6h_expand_conv (Conv2D) (None, 2, 2, 1248) 259584 ['block6g_a dd[0][0]']
block6h_expand_bn (BatchNormal (None, 2, 2, 1248) 4992 ['block6h_e xpand_conv[0][0]'] ization)
block6h_expand_activation (Act (None, 2, 2, 1248) 0 ['block6h_e xpand_bn[0][0]'] ivation)
block6h_dwconv2 (DepthwiseConv (None, 2, 2, 1248) 11232 ['block6h_e xpand_activation[0][0] 2D) ']
block6h_bn (BatchNormalization (None, 2, 2, 1248) 4992 ['block6h_d
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 61 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
wconv2[0][0]'] )
block6h_activation (Activation (None, 2, 2, 1248) 0 ['block6h_b n[0][0]'] )
block6h_se_squeeze (GlobalAver (None, 1248) 0 ['block6h_a ctivation[0][0]'] agePooling2D)
block6h_se_reshape (Reshape) (None, 1, 1, 1248) 0 ['block6h_s e_squeeze[0][0]']
block6h_se_reduce (Conv2D) (None, 1, 1, 52) 64948 ['block6h_s e_reshape[0][0]']
block6h_se_expand (Conv2D) (None, 1, 1, 1248) 66144 ['block6h_s e_reduce[0][0]']
block6h_se_excite (Multiply) (None, 2, 2, 1248) 0 ['block6h_a ctivation[0][0]', 'block6h_s e_expand[0][0]']
block6h_project_conv (Conv2D) (None, 2, 2, 208) 259584 ['block6h_s e_excite[0][0]']
block6h_project_bn (BatchNorma (None, 2, 2, 208) 832 ['block6h_p roject_conv[0][0]'] lization)
block6h_add (Add) (None, 2, 2, 208) 0 ['block6h_p roject_bn[0][0]', 'block6g_a dd[0][0]']
block6i_expand_conv (Conv2D) (None, 2, 2, 1248) 259584 ['block6h_a dd[0][0]']
block6i_expand_bn (BatchNormal (None, 2, 2, 1248) 4992 ['block6i_e xpand_conv[0][0]'] ization)
block6i_expand_activation (Act (None, 2, 2, 1248) 0 ['block6i_e xpand_bn[0][0]'] ivation)
block6i_dwconv2 (DepthwiseConv (None, 2, 2, 1248) 11232 ['block6i_e xpand_activation[0][0] 2D) ']
block6i_bn (BatchNormalization (None, 2, 2, 1248) 4992 ['block6i_d
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 62 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
wconv2[0][0]'] )
block6i_activation (Activation (None, 2, 2, 1248) 0 ['block6i_b n[0][0]'] )
block6i_se_squeeze (GlobalAver (None, 1248) 0 ['block6i_a ctivation[0][0]'] agePooling2D)
block6i_se_reshape (Reshape) (None, 1, 1, 1248) 0 ['block6i_s e_squeeze[0][0]']
block6i_se_reduce (Conv2D) (None, 1, 1, 52) 64948 ['block6i_s e_reshape[0][0]']
block6i_se_expand (Conv2D) (None, 1, 1, 1248) 66144 ['block6i_s e_reduce[0][0]']
block6i_se_excite (Multiply) (None, 2, 2, 1248) 0 ['block6i_a ctivation[0][0]', 'block6i_s e_expand[0][0]']
block6i_project_conv (Conv2D) (None, 2, 2, 208) 259584 ['block6i_s e_excite[0][0]']
block6i_project_bn (BatchNorma (None, 2, 2, 208) 832 ['block6i_p roject_conv[0][0]'] lization)
block6i_add (Add) (None, 2, 2, 208) 0 ['block6i_p roject_bn[0][0]', 'block6h_a dd[0][0]']
block6j_expand_conv (Conv2D) (None, 2, 2, 1248) 259584 ['block6i_a dd[0][0]']
block6j_expand_bn (BatchNormal (None, 2, 2, 1248) 4992 ['block6j_e xpand_conv[0][0]'] ization)
block6j_expand_activation (Act (None, 2, 2, 1248) 0 ['block6j_e xpand_bn[0][0]'] ivation)
block6j_dwconv2 (DepthwiseConv (None, 2, 2, 1248) 11232 ['block6j_e xpand_activation[0][0] 2D) ']
block6j_bn (BatchNormalization (None, 2, 2, 1248) 4992 ['block6j_d
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 63 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
wconv2[0][0]'] )
block6j_activation (Activation (None, 2, 2, 1248) 0 ['block6j_b n[0][0]'] )
block6j_se_squeeze (GlobalAver (None, 1248) 0 ['block6j_a ctivation[0][0]'] agePooling2D)
block6j_se_reshape (Reshape) (None, 1, 1, 1248) 0 ['block6j_s e_squeeze[0][0]']
block6j_se_reduce (Conv2D) (None, 1, 1, 52) 64948 ['block6j_s e_reshape[0][0]']
block6j_se_expand (Conv2D) (None, 1, 1, 1248) 66144 ['block6j_s e_reduce[0][0]']
block6j_se_excite (Multiply) (None, 2, 2, 1248) 0 ['block6j_a ctivation[0][0]', 'block6j_s e_expand[0][0]']
block6j_project_conv (Conv2D) (None, 2, 2, 208) 259584 ['block6j_s e_excite[0][0]']
block6j_project_bn (BatchNorma (None, 2, 2, 208) 832 ['block6j_p roject_conv[0][0]'] lization)
block6j_add (Add) (None, 2, 2, 208) 0 ['block6j_p roject_bn[0][0]', 'block6i_a dd[0][0]']
top_conv (Conv2D) (None, 2, 2, 1408) 292864 ['block6j_a dd[0][0]']
top_bn (BatchNormalization) (None, 2, 2, 1408) 5632 ['top_conv[ 0][0]']
top_activation (Activation) (None, 2, 2, 1408) 0 ['top_bn[0] [0]']
Total params: 8,769,374
Trainable params: 8,687,086 Non-trainable params: 82,288
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 64 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
============================================================================ ======================
Model Building
Build your own Architecture on top of the transfer layer. Be sure to have a Flatten layer after your transfer layer and also make sure you have 4 neurons and softmax activation function in your last dense layer
block6e_expand_activation
In [ ]: transfer_layer_EfficientNet = EfficientNet.get_layer('block6e_expand_activation' EfficientNet.trainable = False
x = transfer_layer_EfficientNet.output x = BatchNormalization()(x) x = MaxPooling2D(1,1)(x) x = Dropout(rate = 0.3)(x)
# Add your Flatten layer. x = Flatten()(x)
# Add your Dense layers and/or BatchNormalization and Dropout layers x = Dense(64, activation = LeakyReLU(alpha = 0.2))(x) x = BatchNormalization()(x) x = Dropout(rate = 0.3)(x) x = Dense(512, activation = LeakyReLU(alpha = 0.2))(x) x = BatchNormalization()(x) x = Dropout(rate = 0.3)(x) x = Dense(128, activation = LeakyReLU(alpha = 0.2))(x) x = BatchNormalization()(x) x = Dropout(rate = 0.3)(x)
# Add your final Dense layer with 4 neurons and softmax activation function. pred = Dense(4, activation = 'softmax')(x)
Efficientnetmodel = Model(EfficientNet.input, pred)
Compiling and Training the Model
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 65 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
In [ ]: from keras.callbacks import ModelCheckpoint, EarlyStopping, ReduceLROnPlateau checkpoint = ModelCheckpoint("./Efficientnetmodel.h5", monitor='val_acc', verbose early_stopping = EarlyStopping(monitor = 'val_loss', min_delta = 0, patience = 3, verbose = 1, restore_best_weights = True) # Write your code here. You reduce_learningrate = ReduceLROnPlateau(monitor = 'val_loss', factor = 0.3, patience = 5, verbose = 1, min_delta = 0.01) # Write your code here. You may play callbacks_list = [early_stopping,checkpoint,reduce_learningrate] epochs = 20
In [ ]: Efficientnetmodel.compile(optimizer = Adam(learning_rate = 0.01), loss = 'categorical_crossentropy' # Write your code to compile your Efficientnetmodel. Use categorical crossentropy
In [ ]: history = Efficientnetmodel.fit(x= train_set, validation_data = validation_set
Epoch 1/20
473/473 [==============================] - 44s 78ms/step - loss: 1.3420 - ac curacy: 0.3303 - val_loss: 1.4037 - val_accuracy: 0.2508
Epoch 2/20
473/473 [==============================] - 38s 79ms/step - loss: 1.3449 - ac curacy: 0.3337 - val_loss: 1.3611 - val_accuracy: 0.2608
Epoch 3/20
473/473 [==============================] - 38s 80ms/step - loss: 1.3422 - ac curacy: 0.3342 - val_loss: 1.3381 - val_accuracy: 0.2700
Epoch 4/20
473/473 [==============================] - 36s 75ms/step - loss: 1.3453 - ac curacy: 0.3357 - val_loss: 1.3375 - val_accuracy: 0.3349
Epoch 5/20
473/473 [==============================] - 35s 74ms/step - loss: 1.3380 - ac curacy: 0.3383 - val_loss: 1.3234 - val_accuracy: 0.3693
Epoch 6/20
473/473 [==============================] - 35s 74ms/step - loss: 1.3378 - ac curacy: 0.3463 - val_loss: 1.3747 - val_accuracy: 0.2743
Epoch 7/20
473/473 [==============================] - 37s 78ms/step - loss: 1.3324 - ac curacy: 0.3530 - val_loss: 1.3431 - val_accuracy: 0.3088 Epoch 8/20
473/473 [==============================] - 37s 78ms/step - loss: 1.3298 - ac curacy: 0.3497 - val_loss: 1.3932 - val_accuracy: 0.2532
Epoch 9/20
473/473 [==============================] - 37s 77ms/step - loss: 1.3333 - ac
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 66 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
curacy: 0.3493 - val_loss: 1.3382 - val_accuracy: 0.3066
Epoch 10/20
473/473 [==============================] - 35s 74ms/step - loss: 1.3326 - ac curacy: 0.3521 - val_loss: 1.3304 - val_accuracy: 0.3092
Epoch 11/20
473/473
[==============================] - 35s 73ms/step - loss: 1.3298 - ac curacy: 0.3453 - val_loss: 1.3291 - val_accuracy: 0.3241
Epoch 12/20
473/473 [==============================] - 35s 73ms/step - loss: 1.3329 - ac curacy: 0.3544 - val_loss: 1.3151 - val_accuracy: 0.3502
Epoch 13/20
473/473 [==============================] - 34s 73ms/step - loss: 1.3321 - ac curacy: 0.3461 - val_loss: 1.3225 - val_accuracy: 0.3737
Epoch 14/20
473/473 [==============================] - 34s 72ms/step - loss: 1.3277 - ac curacy: 0.3567 - val_loss: 1.3271 - val_accuracy: 0.3179
Epoch 15/20
473/473
[==============================] - 34s 72ms/step - loss: 1.3271 - ac curacy: 0.3576 - val_loss: 1.3570 - val_accuracy: 0.3349
Epoch 16/20
473/473 [==============================] - 35s 73ms/step - loss: 1.3264 - ac curacy: 0.3512 - val_loss: 1.3571 - val_accuracy: 0.2668
Epoch 17/20
473/473 [==============================] - 35s 73ms/step - loss: 1.3215 - ac curacy: 0.3585 - val_loss: 1.3309 - val_accuracy: 0.3369
Epoch 18/20
473/473 [==============================] - 34s 72ms/step - loss: 1.3209 - ac curacy: 0.3567 - val_loss: 1.3485 - val_accuracy: 0.2526
Epoch 19/20
473/473 [==============================] - 34s 72ms/step - loss: 1.3248 - ac curacy: 0.3601 - val_loss: 1.3418 - val_accuracy: 0.3372 Epoch 20/20
473/473 [==============================] - 34s 72ms/step - loss: 1.3272 - ac curacy: 0.3560 - val_loss: 1.3309 - val_accuracy: 0.3026
Evaluating the EfficientnetNet Model
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 67 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
In [ ]: # Write your code to evaluate the model performance on the test set dict_hist = history.history list_ep = [i for i in range(1, 21)] plt.figure(figsize = (8, 8)) plt.plot(list_ep, dict_hist['accuracy'], ls = '--' , label = 'accuracy') plt.plot(list_ep, dict_hist['val_accuracy'], ls = '--' , label = 'val_accuracy' plt.ylabel('Accuracy') plt.xlabel('Epochs') plt.legend() plt.show()
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 68 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
Observations and Insights:__
This model reacted well to adjustments and seemed to continue learning; however, it became sensitive to overfitting as well, I applied several convo and hidden layers; however, systematically took them out after observing results.
It is learning steady, but slow.
Note: You can even go back and build your own architecture on top of the VGG16 Transfer layer and see if you can improve the performance.
I have worked several models off all these, and still get poor results. I have tried to use the final model, mix it with the earlier models, change hyper parameters, and work through the other elements, the results with the transfer training are continually poor. I was able to get a steady learning rate; however, there is still no improvement in generalization.
Think About It:
What is your overall performance of these Transfer Learning Architectures? Can we draw a comparison of these models' performances. Are we satisfied with the accuracies that we have received?
Do you think our issue lies with 'rgb' color_mode?
Now that we have tried multiple pre-trained models, let's build a complex CNN architecture and see if we can get better performance.
Building a Complex Neural Network Architecture
In this section, we will build a more complex Convolutional Neural Network Model that has close to as many parameters as we had in our Transfer Learning Models. However, we will have only 1 input channel for our input images.
Creating our Data Loaders
In this section, we are creating data loaders which we will use as inputs to the more Complicated Convolutional Neural Network. We will go ahead with color_mode = 'grayscale'.
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 69 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
In [ ]: batch_size = 32 img_size = 48
datagen_train = ImageDataGenerator(horizontal_flip = True, brightness_range = (0., 2.), rescale = 1./255, shear_range = 0.3)
train_set = datagen_train.flow_from_directory(folder_path + "train", target_size = (img_size, img_size color_mode = 'grayscale', batch_size = batch_size, class_mode = 'categorical', classes = ['happy', 'sad', 'neutral' shuffle = True)
datagen_validation = ImageDataGenerator(horizontal_flip = True, brightness_range=(0.,2.), rescale=1./255, shear_range=0.3) # Write your code here
validation_set = datagen_validation.flow_from_directory(folder_path + "validation" target_size = (img_size, img_size color_mode = 'grayscale', batch_size = batch_size, class_mode = 'categorical', shuffle = True) # Write your code here
datagen_test = ImageDataGenerator(horizontal_flip = True, brightness_range=(0.,2.), rescale=1./255, shear_range=0.3) # Write your code here
test_set = datagen_test.flow_from_directory(folder_path + "test", target_size = (img_size, img_size color_mode = 'grayscale', batch_size = batch_size, class_mode = 'categorical', shuffle = True) # Write your code here
Found 15109 images belonging to 4 classes. Found 4977 images belonging to 4 classes. Found 128 images belonging to 4 classes.
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 70 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
Model Building
In this network, we plan to have 5 Convolutional Blocks
Add first Conv2D layer with 64 filters and a kernel size of 2. Use the 'same' padding and provide the input shape = (48, 48, 1). Use 'relu' activation.
Add your BatchNormalization layer followed by a LeakyRelU layer with Leaky ReLU parameter of 0.1
Add MaxPooling2D layer with pool size = 2.
Add a Dropout layer with a Dropout Ratio of 0.2. This completes the first Convolutional block.
Add a second Conv2D layer with 128 filters and a kernel size of 2. Use the 'same' padding and 'relu' activation.
Follow this up with a similar BatchNormalization, LeakyRelU, Maxpooling2D, and Dropout layer like above to complete your second Convolutional Block.
Add a third Conv2D layer with 512 filters and a kernel size of 2. Use the 'same' padding and 'relu' activation. Once again, follow it up with a BatchNormalization, LeakyRelU, Maxpooling2D, and Dropout layer to complete your third Convolutional block.
Add a fourth block, with the Conv2D layer having 512 filters.
Add the fifth block, having 128 filters
Then add your Flatten layer, followed by your Dense layers.
Add your first Dense layer with 256 neurons followed by a BatchNormalization layer, a 'relu' Activation, and a Dropout layer. This forms your first Fully Connected block
Add your second Dense layer with 512 neurons, again followed by a BatchNormalization layer, relu activation, and a Dropout layer.
Add your final Dense layer with 4 neurons.
Compile your model with the optimizer of your choice.
In [ ]: #In this network, we plan to have 5 Convolutional Blocks
#Add first Conv2D layer with 64 filters and a kernel size of 2. Use the 'same' padding
#Add your BatchNormalization layer followed by a LeakyRelU layer with Leaky ReLU parameter #Add MaxPooling2D layer with pool size = 2.
#Add a Dropout layer with a Dropout Ratio of 0.2. This completes the first Convolutional #Add a second Conv2D layer with 128 filters and a kernel size of 2. Use the 'same' #Follow this up with a similar BatchNormalization, LeakyRelU, Maxpooling2D, and Dropout
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 71 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
no_of_classes = 4 model3 = Sequential() # Add 1st CNN Block
model3.add(Conv2D(64, (2,2), padding = 'same', activation = 'relu', input_shape model3.add(BatchNormalization()) model3.add(LeakyReLU(alpha = 0.2)) model3.add(MaxPooling2D(2,2)) model3.add(Dropout(rate = 0.2))
# Add 2nd CNN Block model3.add(Conv2D(128, (2,2), padding = 'same', activation = 'relu')) model3.add(BatchNormalization()) model3.add(LeakyReLU(alpha = 0.2)) model3.add(MaxPooling2D(2,2)) model3.add(Dropout(rate = 0.2))
# Add 3rd CNN Block model3.add(Conv2D(512, (2,2), padding = 'same', activation = 'relu')) model3.add(BatchNormalization()) model3.add(LeakyReLU(alpha = 0.2)) model3.add(MaxPooling2D(2,2)) model3.add(Dropout(rate = 0.2))
# Add 4th CNN Block model3.add(Conv2D(512, (2,2), padding = 'same', activation = 'relu')) model3.add(BatchNormalization()) model3.add(LeakyReLU(alpha = 0.2)) model3.add(MaxPooling2D(2,2)) model3.add(Dropout(rate = 0.2))
# Add 5th CNN Block model3.add(Conv2D(256, (2,2), padding = 'same', activation = 'relu')) model3.add(BatchNormalization()) model3.add(LeakyReLU(alpha = 0.2)) model3.add(MaxPooling2D(2,2)) model3.add(Dropout(rate = 0.2))
model3.add(Conv2D(512, (2,2), padding = 'same', activation = 'relu')) model3.add(BatchNormalization()) model3.add(LeakyReLU(alpha = 0.2)) model3.add(MaxPooling2D(1,1)) model3.add(Dropout(rate = 0.2))
#Add your first Dense layer with 256 neurons followed by a BatchNormalization layer, #Add your second Dense layer with 512 neurons, again followed by a BatchNormalization #Add your final Dense layer with 4 neurons. #Compile your model with the optimizer of your choice.
model3.add(Flatten())
# First fully connected layer model3.add(Dense(256)) model3.add(LeakyReLU(alpha = 0.2)) model3.add(BatchNormalization()) model3.add(Dropout(rate = 0.2))
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 72 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
# Second fully connected layer model3.add(Dense(512)) model3.add(LeakyReLU(alpha = 0.2)) model3.add(BatchNormalization()) model3.add(Dropout(rate = 0.2)) model3.add(Dense(64)) model3.add(LeakyReLU(alpha = 0.2)) model3.add(BatchNormalization()) model3.add(Dropout(rate = 0.2)) model3.add(Dense(no_of_classes, activation = 'softmax')) model3.summary()
Model: "sequential_2"
Layer (type) Output Shape Param # =================================================================
conv2d_14 (Conv2D) (None, 48, 48, 64) 320
batch_normalization_23 (Bat (None, 48, 48, 64) 256 chNormalization)
leaky_re_lu_25 (LeakyReLU) (None, 48, 48, 64) 0
max_pooling2d_16 (MaxPoolin (None, 24, 24, 64) 0 g2D)
dropout_22 (Dropout) (None, 24, 24, 64) 0 conv2d_15 (Conv2D) (None, 24, 24, 128) 32896
batch_normalization_24 (Bat (None, 24, 24, 128) 512 chNormalization)
leaky_re_lu_26 (LeakyReLU) (None, 24, 24, 128) 0
max_pooling2d_17 (MaxPoolin (None, 12, 12, 128) 0 g2D)
dropout_23 (Dropout) (None, 12, 12, 128) 0 conv2d_16 (Conv2D) (None, 12, 12, 512) 262656
batch_normalization_25 (Bat (None, 12, 12, 512) 2048 chNormalization)
leaky_re_lu_27 (LeakyReLU) (None, 12, 12, 512) 0
max_pooling2d_18 (MaxPoolin (None, 6, 6, 512) 0 g2D)
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 73 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
dropout_24 (Dropout) (None, 6, 6, 512) 0
conv2d_17 (Conv2D) (None, 6, 6, 512) 1049088
batch_normalization_26 (Bat (None, 6, 6, 512) 2048 chNormalization)
leaky_re_lu_28 (LeakyReLU) (None, 6, 6, 512) 0
max_pooling2d_19 (MaxPoolin (None, 3, 3, 512) 0 g2D)
dropout_25 (Dropout) (None, 3, 3, 512) 0 conv2d_18 (Conv2D) (None, 3, 3, 256) 524544
batch_normalization_27 (Bat (None, 3, 3, 256) 1024 chNormalization)
leaky_re_lu_29 (LeakyReLU) (None, 3, 3, 256) 0
max_pooling2d_20 (MaxPoolin (None, 1, 1, 256) 0 g2D)
dropout_26 (Dropout) (None, 1, 1, 256) 0 conv2d_19 (Conv2D) (None, 1, 1, 512) 524800
batch_normalization_28 (Bat (None, 1, 1, 512) 2048 chNormalization)
leaky_re_lu_30 (LeakyReLU) (None, 1, 1, 512) 0
max_pooling2d_21 (MaxPoolin (None, 1, 1, 512) 0 g2D)
dropout_27 (Dropout) (None, 1, 1, 512) 0
flatten_4 (Flatten) (None, 512) 0 dense_15 (Dense) (None, 256) 131328
leaky_re_lu_31 (LeakyReLU) (None, 256) 0
batch_normalization_29 (Bat (None, 256) 1024 chNormalization)
dropout_28 (Dropout) (None, 256) 0 dense_16 (Dense) (None, 512) 131584
leaky_re_lu_32 (LeakyReLU) (None, 512) 0
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 74 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
batch_normalization_30 (Bat (None, 512)
chNormalization)
dropout_29 (Dropout) (None, 512)
dense_17 (Dense) (None, 64) 32832
leaky_re_lu_33 (LeakyReLU) (None, 64)
batch_normalization_31 (Bat (None, 64) 256 chNormalization)
dropout_30 (Dropout) (None, 64)
dense_18 (Dense) (None, 4) 260
Total params: 2,701,572
Trainable params: 2,695,940 Non-trainable params: 5,632
Compiling and Training the Model
In [ ]: from keras.callbacks import ModelCheckpoint, ReduceLROnPlateau, CSVLogger epochs = 35 steps_per_epoch = train_set.n//train_set.batch_size validation_steps = validation_set.n//validation_set.batch_size checkpoint = ModelCheckpoint("model3.h5", monitor = 'val_accuracy', save_weights_only = True, model = 'max', verbose reduce_lr = ReduceLROnPlateau(monitor = 'val_loss', factor = 0.1, patience = callbacks = [checkpoint, reduce_lr]
In [ ]: model3.compile(optimizer = Adam(learning_rate = 0.001), loss = 'categorical_crossentropy'
# Write your code to compile your model3. Use categorical crossentropy as the loss
In [ ]: history = model3.fit(x= train_set, validation_data = validation_set, epochs
# Write your code to fit your model. Use train_set as the training data and validation_set
Epoch 1/35 473/473 [==============================] - 23s 43ms/step - loss: 1.5564 - ac curacy: 0.2636 - val_loss: 1.4093 - val_accuracy: 0.2397 Epoch 2/35 473/473 [==============================] - 19s 41ms/step - loss: 1.4251 - ac
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 75 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
2048
0
0
0
=================================================================
curacy: 0.2825 - val_loss: 1.3349 - val_accuracy: 0.3420
Epoch 3/35
473/473 [==============================] - 19s 41ms/step - loss: 1.3328 - ac curacy: 0.3436 - val_loss: 1.2928 - val_accuracy: 0.3112
Epoch 4/35
473/473
[==============================] - 19s 41ms/step - loss: 1.2221 - ac curacy: 0.4174 - val_loss: 1.1964 - val_accuracy: 0.4217
Epoch 5/35
473/473 [==============================] - 19s 41ms/step - loss: 1.1133 - ac curacy: 0.5018 - val_loss: 1.2118 - val_accuracy: 0.4400
Epoch 6/35
473/473 [==============================] - 19s 41ms/step - loss: 1.0361 - ac curacy: 0.5467 - val_loss: 1.0406 - val_accuracy: 0.5144
Epoch 7/35
473/473 [==============================] - 19s 41ms/step - loss: 0.9778 - ac curacy: 0.5785 - val_loss: 1.0650 - val_accuracy: 0.4748
Epoch 8/35
473/473 [==============================] - 19s 41ms/step - loss: 0.9386 - ac curacy: 0.6020 - val_loss: 0.9815 - val_accuracy: 0.5031
Epoch 9/35
473/473 [==============================] - 20s 42ms/step - loss: 0.9186 - ac curacy: 0.6101 - val_loss: 1.0396 - val_accuracy: 0.4890
Epoch 10/35
473/473 [==============================] - 19s 41ms/step - loss: 0.8881 - ac curacy: 0.6290 - val_loss: 0.9885 - val_accuracy: 0.5246
Epoch 11/35
473/473
[==============================] - 19s 41ms/step - loss: 0.8647 - ac curacy: 0.6401 - val_loss: 0.9710 - val_accuracy: 0.5260
Epoch 12/35
473/473 [==============================] - 20s 43ms/step - loss: 0.8372 - ac curacy: 0.6542 - val_loss: 1.0176 - val_accuracy: 0.5208
Epoch 13/35
473/473 [==============================] - 19s 41ms/step - loss: 0.8270 - ac curacy: 0.6566 - val_loss: 0.9610 - val_accuracy: 0.5282 Epoch 14/35
473/473 [==============================] - 19s 40ms/step - loss: 0.8161 - ac curacy: 0.6616 - val_loss: 1.0406 - val_accuracy: 0.5166
Epoch 15/35
473/473 [==============================] - 19s 40ms/step - loss: 0.8004 - ac curacy: 0.6707 - val_loss: 1.0320 - val_accuracy: 0.5101 Epoch 16/35
473/473 [==============================] - 19s 40ms/step - loss: 0.7860 - ac curacy: 0.6753 - val_loss: 0.9720 - val_accuracy: 0.5389
Epoch 17/35
473/473 [==============================] - 19s 41ms/step - loss: 0.7787 - ac curacy: 0.6790 - val_loss: 1.0585 - val_accuracy: 0.5071
Epoch 18/35
473/473
[==============================] - 19s 40ms/step - loss: 0.7700 - ac curacy: 0.6773 - val_loss: 0.9777 - val_accuracy: 0.5385
Epoch 19/35
473/473 [==============================] - 19s 40ms/step - loss: 0.7626 - ac curacy: 0.6879 - val_loss: 0.9908 - val_accuracy: 0.5314 Epoch 20/35
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 76 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
473/473
[==============================] - 19s 40ms/step - loss: 0.7507 - ac curacy: 0.6899 - val_loss: 1.0231 - val_accuracy: 0.5278
Epoch 21/35
473/473 [==============================] - 19s 40ms/step - loss: 0.7338 - ac curacy: 0.6965 - val_loss: 1.0977 - val_accuracy: 0.5073
Epoch 22/35
473/473 [==============================] - 19s 40ms/step - loss: 0.7264 - ac curacy: 0.7035 - val_loss: 0.9980 - val_accuracy: 0.5256
Epoch 23/35
473/473 [==============================] - 19s 40ms/step - loss: 0.7097 - ac curacy: 0.7151 - val_loss: 0.9756 - val_accuracy: 0.5306
Epoch 24/35
473/473 [==============================] - 19s 40ms/step - loss: 0.7057 - ac curacy: 0.7083 - val_loss: 1.0245 - val_accuracy: 0.4983
Epoch 25/35
473/473 [==============================] - 19s 40ms/step - loss: 0.6975 - ac curacy: 0.7166 - val_loss: 1.0362 - val_accuracy: 0.5132
Epoch 26/35
473/473 [==============================] - 19s 40ms/step - loss: 0.7001 - ac curacy: 0.7114 - val_loss: 1.0296 - val_accuracy: 0.5351
Epoch 27/35
473/473 [==============================] - 20s 42ms/step - loss: 0.6755 - ac curacy: 0.7245 - val_loss: 1.0140 - val_accuracy: 0.5324
Epoch 28/35
473/473 [==============================] - 20s 42ms/step - loss: 0.6710 - ac curacy: 0.7268 - val_loss: 1.0479 - val_accuracy: 0.5308
Epoch 29/35
473/473 [==============================] - 19s 41ms/step - loss: 0.6531 - ac curacy: 0.7359 - val_loss: 1.1339 - val_accuracy: 0.5409
Epoch 30/35
473/473 [==============================] - 19s 40ms/step - loss: 0.6520 - ac curacy: 0.7371 - val_loss: 1.0061 - val_accuracy: 0.5489
Epoch 31/35
473/473 [==============================] - 19s 40ms/step - loss: 0.6353 - ac curacy: 0.7455 - val_loss: 1.1552 - val_accuracy: 0.5379 Epoch 32/35
473/473 [==============================] - 19s 40ms/step - loss: 0.6221 - ac curacy: 0.7476 - val_loss: 1.0517 - val_accuracy: 0.5405
Epoch 33/35
473/473 [==============================] - 19s 40ms/step - loss: 0.6237 - ac curacy: 0.7503 - val_loss: 1.1391 - val_accuracy: 0.5314
Epoch 34/35
473/473 [==============================] - 19s 40ms/step - loss: 0.6086 - ac curacy: 0.7565 - val_loss: 1.1430 - val_accuracy: 0.5290
Epoch 35/35
473/473
[==============================] - 19s 40ms/step - loss: 0.5963 - ac curacy: 0.7620 - val_loss: 1.1581 - val_accuracy: 0.5268
Evaluating the Model on Test Set
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 77 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
In [ ]: # Write your code to evaluate the model performance on the test set dict_hist = history.history list_ep = [i for i in range(1, 36)] plt.figure(figsize = (8, 8)) plt.plot(list_ep, dict_hist['accuracy'], ls = '--' , label = 'accuracy') plt.plot(list_ep, dict_hist['val_accuracy'], ls = '--' , label = 'val_accuracy' plt.ylabel('Accuracy') plt.xlabel('Epochs') plt.legend() plt.show()
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 78 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
Observations and Insights:
Although this model is also not generalizing well, it is very responsive to adjustments in the parmeters and hyper-parameters. It seems to have the best results, and the most flexibility. I belive that MODEL3, will likely be the final model for refinement.
Plotting the Confusion Matrix for the chosen final model
In [ ]: # Plot the confusion matrix and generate a classification report for the model from sklearn.metrics import classification_report from sklearn.metrics import confusion_matrix test_set = datagen_test.flow_from_directory(folder_path + "test", target_size = color_mode = 'grayscale' batch_size = 128 class_mode = 'categorical' classes = ['happy' shuffle = True test_images, test_labels = next(test_set)
# Write the name of your chosen model in the blank pred = model3.predict(test_images) pred = np.argmax(pred, axis = 1) y_true = np.argmax(test_labels, axis = 1)
# Printing the classification report print(classification_report(y_true, pred))
# Plotting the heatmap using confusion matrix cm = confusion_matrix(y_true, pred) plt.figure(figsize = (8, 5)) sns.heatmap(cm, annot = True, fmt = '.0f', xticklabels = ['happy', 'sad', 'neutral' plt.ylabel('Actual') plt.xlabel('Predicted') plt.show()
Found 128 images belonging to 4 classes. precision recall f1-score support 0 0.75 0.75
1 0.62 0.66 0.64
2 0.59 0.62 0.61
3 0.96 0.84 0.90
accuracy 0.72 128 macro avg 0.73 0.72 0.72
weighted avg 0.73 0.72 0.72
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 79 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
0.75 32
32
32
32
128
128
Observations and Insights:
As I predicted, Neutral and sad seem to have the toghest diferentiation issues, followed closely by neutral and happy. It is surprising, and speaks well to the model, that happy and surprise are easily differentiated.
Conclusion:
It seems obvious that "Model 3" offers both the best results for this data, and offers the most flexibility in refinement. Although, the results are still not at their final point, I belive that with some more adjustments, and additional training, model 3 can come close to the 90% mark.
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 80 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html
Insights
Refined insights:
What are the most meaningful insights from the data relevant to the problem?
Comparison of various techniques and their relative performance:
How do different techniques perform? Which one is performing relatively better? Is there scope to improve the performance further?
Proposal for the final solution design:
What model do you propose to be adopted? Why is this the best solution to adopt?
10/19/22, 9: 52 AMReference_Notebook_Facial_Emotion_Detection_Milestone+2 (2) Page 81 of 81file:///Users/jeffreydavis/Desktop/Applied%20Data%20Science/Projec…eference_Notebook_Facial_Emotion_Detection_Milestone+2%20(2).html