Acta Informatica Malaysia (AIM)

A DEEP LEARNING MODEL FOR FACE RECOGNITION IN PRESENCE OF MASK

ABSTRACT


Journal: Acta Informatica Malaysia (AIM)
Author: Kalembo Vikalwe Shakrani, Ngonidzashe Mathew Kanyangarara, Prince Tinashe Parowa, Vibhor Gupta, Rajendra Kumar

This is an open access article distributed under the Creative Commons Attribution License CC BY 4.0, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

DOI: 10.26480/aim.02.2022.38.41

Image classification and object detection are common research topics amid rapidly expanding technological advancement, used to identify and detect real-time problems in major federal settings such as public places, airports, and army bases using webcams, surveillance cameras, and open-source platforms. The goal of this study is to apply Open Source Computer Vision (OpenCV) and Convolutional Neural Network (CNN) techniques to identify a person in the presence of a face mask, both from image datasets and from real-time (live streaming) video. For experimental purposes, a parent directory containing three main directories (training, testing, and validation sets) is used, each with two subdirectories, Mask (M) and No Mask (N). The Mask subdirectories contain images of people wearing masks, and the No Mask subdirectories contain images of people without masks. A total of 1006 images are used: 503 Mask and 503 No Mask. A data augmentation pre-processing step is applied to enlarge the dataset and improve the accuracy of the proposed model. The proposed system uses a camera built into a drone to capture real-time images for recognition by the CNN. The proposed model is constructed, compiled, and trained using TensorFlow and Keras. The final training accuracy recorded is 0.93, the validation accuracy is 0.94, the training loss is 0.17, the validation loss is 0.1672, and the test loss is 0.15. The classification accuracy of the proposed system is 0.95.
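The pipeline described in the abstract (a Mask/No Mask directory split, on-the-fly data augmentation, and a binary CNN compiled and trained with TensorFlow and Keras) could be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the layer sizes, input resolution, augmentation parameters, and directory-loading helper are all assumptions, since the abstract does not specify them.

```python
# Hypothetical sketch of the mask / no-mask classifier described in the
# abstract. Layer sizes and hyperparameters are illustrative assumptions,
# not values taken from the paper.
import tensorflow as tf
from tensorflow.keras import layers, models

IMG_SIZE = (150, 150)  # assumed input resolution


def build_model():
    """Small binary CNN: Mask (M) vs No Mask (N)."""
    model = models.Sequential([
        layers.Input(shape=IMG_SIZE + (3,)),
        layers.Rescaling(1.0 / 255),             # normalise pixel values
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(1, activation="sigmoid"),   # binary Mask/No-Mask output
    ])
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model


def load_split(split_dir):
    """Load one of the training/testing/validation directories; the
    Mask (M) / No Mask (N) subfolders supply the class labels."""
    return tf.keras.utils.image_dataset_from_directory(
        split_dir, image_size=IMG_SIZE, batch_size=32,
        label_mode="binary")


# Data augmentation (mentioned in the abstract) applied on the fly;
# the specific transforms here are assumptions.
augment = models.Sequential([
    layers.RandomFlip("horizontal"),
    layers.RandomRotation(0.1),
    layers.RandomZoom(0.1),
])
```

In use, one would call `load_split()` on each of the three directories, map `augment` over the training set, and fit the model with `model.fit(train_ds, validation_data=val_ds)`.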

Pages 38-41
Year 2022
Issue 2
Volume 6
