Facial Expression Recognition -- JAFFE Dataset (3): Training a CNN with Keras

In Facial Expression Recognition -- JAFFE Dataset (1), two posts back, the face regions were extracted from the JAFFE images and stored in a .csv file, face.csv.
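The loading code below expects face.csv to contain an emotion column (an integer label from 0 to 6) and a pixels column holding the 48×48 grayscale values as one space-separated string. For reference, a minimal sketch of writing a single such row, with a purely illustrative random image and label:

# Hypothetical example of the face.csv format assumed by the training script
import numpy as np
import pandas as pd

face = np.random.randint(0, 256, size=(48, 48))                     # placeholder 48x48 image
row = {'emotion': 3, 'pixels': ' '.join(map(str, face.flatten()))}  # label 3 = 'Happy'
pd.DataFrame([row]).to_csv('face.csv', index=False)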

A multi-layer CNN is built with Keras and trained on the expression data.
The network structure is:
  [48×48] → conv2d 5×5×32 → pooling → conv2d 5×5×64 → pooling → fc [9×9×64]×1024 → dropout → softmax [1024×7]
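With valid 5×5 convolutions and 2×2 poolings, the feature-map sizes work out as

48 − 5 + 1 = 44, 44 / 2 = 22, 22 − 5 + 1 = 18, 18 / 2 = 9,

so the flattened input to the fully connected layer has 9 × 9 × 64 = 5184 units, which matches the model summary at the end.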

#!/usr/bin/python
# coding:utf8
import numpy as np
import pandas as pd
from keras.layers import Dense, Conv2D, MaxPooling2D, Dropout, Flatten
from keras.models import Sequential
from keras.preprocessing.image import ImageDataGenerator

# Mapping from label index to expression name
emotion = {0: 'Angry', 1: 'Disgust', 2: 'Fear', 3: 'Happy', 4: 'Sad', 5: 'Surprise', 6: 'Neutral'}

# Load the face data exported in part 1
data = pd.read_csv(r'/home/w/mycode/jaffe/face.csv', dtype='a')
label = np.array(data['emotion'])
img_data = np.array(data['pixels'])
N_sample = label.size
Face_data = np.zeros((N_sample, 48*48))
Face_label = np.zeros((N_sample, 7), dtype=float)

# Parse each pixel string into a 48*48 vector, normalize it by its maximum
# pixel value, and build the one-hot label vectors
for i in range(N_sample):
    x = img_data[i]
    x = np.fromstring(x, dtype=float, sep=' ')
    x = x / x.max()
    Face_data[i] = x
    Face_label[i, int(label[i])] = 1.0

# JAFFE contains 213 images in total; the first 200 are used for training
# and the last 13 for testing (the split follows file order, no shuffling)
train_num = 200
test_num = 13

train_x = Face_data[0:train_num, :]
train_y = Face_label[0:train_num, :]
train_x = train_x.reshape(-1, 48, 48, 1)

test_x = Face_data[train_num:train_num + test_num, :]
test_y = Face_label[train_num:train_num + test_num, :]
test_x = test_x.reshape(-1, 48, 48, 1)

model = Sequential()

model.add(Conv2D(32, (5, 5), activation='relu', input_shape=(48,48,1)))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Conv2D(64, (5, 5), activation='relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Flatten())
model.add(Dense(1024, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(7, activation='softmax'))

model.compile(loss='categorical_crossentropy', optimizer='rmsprop', metrics=['accuracy'])

# Data augmentation: fit the featurewise statistics on the training set,
# then train on randomly transformed batches for 20 epochs
datagen = ImageDataGenerator(
    featurewise_center=True,
    featurewise_std_normalization=True,
    rotation_range=20,
    width_shift_range=0.2,
    height_shift_range=0.2,
    horizontal_flip=True)
datagen.fit(train_x)
# steps_per_epoch=len(train_x) yields 200 augmented batches of 10 images per epoch
model.fit_generator(datagen.flow(train_x, train_y, batch_size=10), steps_per_epoch=len(train_x), epochs=20)

# Continue training on the raw (unaugmented) images, then evaluate on the
# 13 held-out faces; score holds [test loss, test accuracy]
model.fit(train_x, train_y, batch_size=10, epochs=50)
score = model.evaluate(test_x, test_y, batch_size=10)
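# Note: the featurewise mean/std fitted by datagen are only applied inside the
# augmented training phase above; the final fit() and evaluate() use the raw
# [0, 1] pixels. If a normalization-consistent evaluation is wanted, one option
# (not part of the original run) is ImageDataGenerator.standardize, e.g.:
#   test_x_std = datagen.standardize(test_x.copy())
#   score = model.evaluate(test_x_std, test_y, batch_size=10)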

model.summary()
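After training, the model can be saved and reused to predict individual faces. A minimal sketch using the variables defined above (the file name jaffe_cnn.h5 is an assumption, and this step is not part of the logged run below):

# Save the trained model (requires h5py; the file name is illustrative)
model.save('jaffe_cnn.h5')

# Predict the expression of the first test face and map it back to a name
probs = model.predict(test_x[0:1])        # shape (1, 7), softmax probabilities
print(emotion[int(np.argmax(probs[0]))])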

Output:

Epoch 1/20
200/200 [==============================] - 25s 125ms/step - loss: 2.0056 - acc: 0.1600
Epoch 2/20
200/200 [==============================] - 24s 120ms/step - loss: 1.8135 - acc: 0.2535
Epoch 3/20
200/200 [==============================] - 27s 136ms/step - loss: 1.5917 - acc: 0.3740
Epoch 4/20
200/200 [==============================] - 26s 132ms/step - loss: 1.4016 - acc: 0.4540
Epoch 5/20
200/200 [==============================] - 29s 145ms/step - loss: 1.2972 - acc: 0.4885
Epoch 6/20
200/200 [==============================] - 28s 141ms/step - loss: 1.1594 - acc: 0.5535
Epoch 7/20
200/200 [==============================] - 29s 147ms/step - loss: 1.0992 - acc: 0.5755
Epoch 8/20
200/200 [==============================] - 29s 145ms/step - loss: 0.9953 - acc: 0.6150
Epoch 9/20
200/200 [==============================] - 32s 159ms/step - loss: 0.9765 - acc: 0.6325
Epoch 10/20
200/200 [==============================] - 29s 144ms/step - loss: 0.9110 - acc: 0.6645
Epoch 11/20
200/200 [==============================] - 34s 168ms/step - loss: 0.8499 - acc: 0.6770
Epoch 12/20
200/200 [==============================] - 35s 176ms/step - loss: 0.8511 - acc: 0.6755
Epoch 13/20
200/200 [==============================] - 34s 169ms/step - loss: 0.8100 - acc: 0.6970
Epoch 14/20
200/200 [==============================] - 44s 221ms/step - loss: 0.7906 - acc: 0.7025
Epoch 15/20
200/200 [==============================] - 40s 199ms/step - loss: 0.7669 - acc: 0.7145
Epoch 16/20
200/200 [==============================] - 37s 183ms/step - loss: 0.7494 - acc: 0.7310
Epoch 17/20
200/200 [==============================] - 37s 184ms/step - loss: 0.7529 - acc: 0.7175
Epoch 18/20
200/200 [==============================] - 37s 184ms/step - loss: 0.7619 - acc: 0.7245
Epoch 19/20
200/200 [==============================] - 39s 194ms/step - loss: 0.6724 - acc: 0.7590
Epoch 20/20
200/200 [==============================] - 40s 199ms/step - loss: 0.7110 - acc: 0.7475
Epoch 1/50
200/200 [==============================] - 4s 20ms/step - loss: 1.7129 - acc: 0.3400
Epoch 2/50
200/200 [==============================] - 3s 16ms/step - loss: 1.1472 - acc: 0.5900
Epoch 3/50
200/200 [==============================] - 4s 19ms/step - loss: 0.7434 - acc: 0.7400
Epoch 4/50
200/200 [==============================] - 3s 17ms/step - loss: 0.6933 - acc: 0.7250
Epoch 5/50
200/200 [==============================] - 4s 18ms/step - loss: 0.6491 - acc: 0.7400
Epoch 6/50
200/200 [==============================] - 4s 20ms/step - loss: 0.4567 - acc: 0.8450
Epoch 7/50
200/200 [==============================] - 4s 20ms/step - loss: 0.5478 - acc: 0.8400
Epoch 8/50
200/200 [==============================] - 4s 20ms/step - loss: 0.4604 - acc: 0.8200
Epoch 9/50
200/200 [==============================] - 4s 18ms/step - loss: 0.3019 - acc: 0.8850
Epoch 10/50
200/200 [==============================] - 3s 16ms/step - loss: 0.3422 - acc: 0.9000
Epoch 11/50
200/200 [==============================] - 3s 15ms/step - loss: 0.2365 - acc: 0.9450
Epoch 12/50
200/200 [==============================] - 3s 17ms/step - loss: 0.2040 - acc: 0.9250
Epoch 13/50
200/200 [==============================] - 4s 20ms/step - loss: 0.2882 - acc: 0.9450
Epoch 14/50
200/200 [==============================] - 4s 20ms/step - loss: 0.1968 - acc: 0.9450
Epoch 15/50
200/200 [==============================] - 4s 20ms/step - loss: 0.1694 - acc: 0.9600
Epoch 16/50
200/200 [==============================] - 4s 20ms/step - loss: 0.1233 - acc: 0.9550
Epoch 17/50
200/200 [==============================] - 4s 20ms/step - loss: 0.2418 - acc: 0.9350
Epoch 18/50
200/200 [==============================] - 4s 21ms/step - loss: 0.0470 - acc: 0.9950
Epoch 19/50
200/200 [==============================] - 4s 20ms/step - loss: 0.1744 - acc: 0.9500
Epoch 20/50
200/200 [==============================] - 4s 20ms/step - loss: 0.0339 - acc: 0.9900
Epoch 21/50
200/200 [==============================] - 4s 20ms/step - loss: 0.0985 - acc: 0.9650
Epoch 22/50
200/200 [==============================] - 4s 20ms/step - loss: 0.0164 - acc: 1.0000
Epoch 23/50
200/200 [==============================] - 4s 20ms/step - loss: 0.2436 - acc: 0.9500
Epoch 24/50
200/200 [==============================] - 4s 20ms/step - loss: 0.0614 - acc: 0.9900
Epoch 25/50
200/200 [==============================] - 3s 17ms/step - loss: 0.0061 - acc: 1.0000
Epoch 26/50
200/200 [==============================] - 3s 16ms/step - loss: 0.3207 - acc: 0.9350
Epoch 27/50
200/200 [==============================] - 3s 16ms/step - loss: 0.0060 - acc: 1.0000
Epoch 28/50
200/200 [==============================] - 4s 18ms/step - loss: 0.2753 - acc: 0.9500
Epoch 29/50
200/200 [==============================] - 5s 25ms/step - loss: 0.0312 - acc: 0.9900
Epoch 30/50
200/200 [==============================] - 5s 24ms/step - loss: 0.0020 - acc: 1.0000
Epoch 31/50
200/200 [==============================] - 4s 20ms/step - loss: 0.0014 - acc: 1.0000
Epoch 32/50
200/200 [==============================] - 4s 20ms/step - loss: 0.5781 - acc: 0.9450
Epoch 33/50
200/200 [==============================] - 4s 20ms/step - loss: 0.0204 - acc: 0.9900
Epoch 34/50
200/200 [==============================] - 4s 20ms/step - loss: 0.0685 - acc: 0.9850
Epoch 35/50
200/200 [==============================] - 4s 20ms/step - loss: 0.0021 - acc: 1.0000
Epoch 36/50
200/200 [==============================] - 4s 20ms/step - loss: 0.0156 - acc: 0.9950
Epoch 37/50
200/200 [==============================] - 4s 20ms/step - loss: 0.2338 - acc: 0.9400
Epoch 38/50
200/200 [==============================] - 4s 20ms/step - loss: 0.0027 - acc: 1.0000
Epoch 39/50
200/200 [==============================] - 4s 20ms/step - loss: 7.6976e-04 - acc: 1.0000
Epoch 40/50
200/200 [==============================] - 4s 20ms/step - loss: 0.1586 - acc: 0.9700
Epoch 41/50
200/200 [==============================] - 4s 20ms/step - loss: 0.0098 - acc: 1.0000
Epoch 42/50
200/200 [==============================] - 4s 20ms/step - loss: 0.2164 - acc: 0.9600
Epoch 43/50
200/200 [==============================] - 4s 20ms/step - loss: 0.0089 - acc: 0.9950
Epoch 44/50
200/200 [==============================] - 4s 20ms/step - loss: 0.0032 - acc: 1.0000
Epoch 45/50
200/200 [==============================] - 4s 20ms/step - loss: 4.5264e-04 - acc: 1.0000
Epoch 46/50
200/200 [==============================] - 4s 20ms/step - loss: 0.1382 - acc: 0.9750
Epoch 47/50
200/200 [==============================] - 4s 20ms/step - loss: 0.0025 - acc: 1.0000
Epoch 48/50
200/200 [==============================] - 4s 20ms/step - loss: 7.0384e-04 - acc: 1.0000
Epoch 49/50
200/200 [==============================] - 4s 20ms/step - loss: 0.1466 - acc: 0.9800
Epoch 50/50
200/200 [==============================] - 4s 20ms/step - loss: 0.0090 - acc: 1.0000
13/13 [==============================] - 0s 11ms/step

Finally, model.summary() prints the network structure:

_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
conv2d_1 (Conv2D)            (None, 44, 44, 32)        832       
_________________________________________________________________
max_pooling2d_1 (MaxPooling2 (None, 22, 22, 32)        0         
_________________________________________________________________
conv2d_2 (Conv2D)            (None, 18, 18, 64)        51264     
_________________________________________________________________
max_pooling2d_2 (MaxPooling2 (None, 9, 9, 64)          0         
_________________________________________________________________
flatten_1 (Flatten)          (None, 5184)              0         
_________________________________________________________________
dense_1 (Dense)              (None, 1024)              5309440   
_________________________________________________________________
dropout_1 (Dropout)          (None, 1024)              0         
_________________________________________________________________
dense_2 (Dense)              (None, 7)                 7175      
=================================================================
Total params: 5,368,711
Trainable params: 5,368,711
Non-trainable params: 0
_________________________________________________________________
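The parameter counts can be checked by hand:

conv2d_1: 5×5×1×32 + 32 = 832
conv2d_2: 5×5×32×64 + 64 = 51,264
dense_1: 5184×1024 + 1024 = 5,309,440
dense_2: 1024×7 + 7 = 7,175
Total: 832 + 51,264 + 5,309,440 + 7,175 = 5,368,711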

Reposted from blog.csdn.net/akadiao/article/details/80027675