Publication: Emotion, Age and Gender Prediction Through Masked Face Inpainting
| dc.contributor.author | Islam, Md Baharul | |
| dc.contributor.author | Hosen, Md Imran | |
| dc.contributor.editor | Rousseau, J.-J. | |
| dc.contributor.editor | Kapralos, B. | |
| dc.contributor.institution | Islam, Md Baharul, Department of Computer Engineering, Bahçeşehir Üniversitesi, Istanbul, Turkey, College of Data Science and Engineering, American University of Malta, Cospicua, Malta | |
| dc.contributor.institution | Hosen, Md Imran, Department of Computer Engineering, Bahçeşehir Üniversitesi, Istanbul, Turkey | |
| dc.date.accessioned | 2025-10-05T15:07:57Z | |
| dc.date.issued | 2023 | |
| dc.description.abstract | Prediction of gestures and demographic information from the face is complex and challenging, particularly for masked faces. This paper proposes a deep learning-based integrated approach to predict emotion and demographic information for unmasked and masked faces, consisting of four sub-tasks: masked face detection, masked face inpainting, and emotion, age, and gender prediction. The masked face detection module applies a pre-trained MobileNetV3 to make a binary decision on whether a face mask is present. The inpainting module, based on a U-Net with ImageNet weights, removes the face mask and restores the face. Convolutional neural networks predict emotion (e.g., happy, angry), and VGGFace-based transfer learning predicts demographic information (i.e., age and gender). Extensive experiments on five publicly available datasets (AffectNet, UTKFace, FER-2013, CelebA, and MAFA) show the effectiveness of the proposed method in predicting emotion and demographic information through masked face reconstruction. © 2023 Elsevier B.V. All rights reserved. | |
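
The abstract describes a four-stage pipeline (mask detection → inpainting → emotion prediction → age/gender prediction). The control flow can be sketched as below; this is a minimal illustrative sketch with stubbed stages, not the authors' implementation — all names (`detect_mask`, `inpaint_face`, `Prediction`, etc.) are hypothetical placeholders, and the real system would use the trained MobileNetV3, U-Net, CNN, and VGGFace models named in the abstract.

```python
# Hypothetical sketch of the four-stage pipeline from the abstract.
# Every stage is stubbed; only the composition logic is illustrated.
from dataclasses import dataclass


@dataclass
class Prediction:
    emotion: str
    age: int
    gender: str


def detect_mask(image: dict) -> bool:
    """Stage 1: binary mask detection (MobileNetV3 in the paper).
    Stubbed: a 'masked' flag on the input stands in for the classifier."""
    return bool(image.get("masked", False))


def inpaint_face(image: dict) -> dict:
    """Stage 2: remove the mask and restore the face (U-Net in the paper).
    Stubbed: return a copy marked as restored."""
    restored = dict(image)
    restored["masked"] = False
    restored["restored"] = True
    return restored


def predict_emotion(image: dict) -> str:
    """Stage 3: CNN-based emotion prediction (stubbed)."""
    return "happy"


def predict_demographics(image: dict) -> tuple[int, str]:
    """Stage 4: VGGFace transfer learning for age/gender (stubbed)."""
    return 30, "female"


def predict(image: dict) -> Prediction:
    # Only masked faces are routed through the inpainting module;
    # unmasked faces go straight to the prediction heads.
    if detect_mask(image):
        image = inpaint_face(image)
    emotion = predict_emotion(image)
    age, gender = predict_demographics(image)
    return Prediction(emotion, age, gender)
```

The key design point the abstract implies is the conditional branch: inpainting runs only when the detector reports a mask, so unmasked faces skip the restoration step entirely.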
| dc.identifier.conferenceName | 26th International Conference on Pattern Recognition, ICPR 2022 | |
| dc.identifier.conferencePlace | Montréal, QC | |
| dc.identifier.doi | 10.1007/978-3-031-37660-3_3 | |
| dc.identifier.endpage | 48 | |
| dc.identifier.issn | 1611-3349 | |
| dc.identifier.issn | 0302-9743 | |
| dc.identifier.scopus | 2-s2.0-85171577831 | |
| dc.identifier.startpage | 37 | |
| dc.identifier.uri | https://doi.org/10.1007/978-3-031-37660-3_3 | |
| dc.identifier.uri | https://hdl.handle.net/20.500.14719/8244 | |
| dc.identifier.volume | 13643 LNCS | |
| dc.language.iso | en | |
| dc.publisher | Springer Science and Business Media Deutschland GmbH | |
| dc.relation.source | Lecture Notes in Computer Science | |
| dc.subject.authorkeywords | Emotion Prediction | |
| dc.subject.authorkeywords | Face Detection | |
| dc.subject.authorkeywords | Face Inpainting | |
| dc.subject.authorkeywords | Deep Learning | |
| dc.subject.authorkeywords | Emotion Recognition | |
| dc.subject.authorkeywords | Forecasting | |
| dc.subject.authorkeywords | Neural Networks | |
| dc.subject.authorkeywords | Population Statistics | |
| dc.subject.authorkeywords | Age Predictions | |
| dc.subject.authorkeywords | Demographic Information | |
| dc.subject.authorkeywords | Emotion Predictions | |
| dc.subject.authorkeywords | Face Masks | |
| dc.subject.authorkeywords | Faces Detection | |
| dc.subject.authorkeywords | Gender Predictions | |
| dc.subject.authorkeywords | Inpainting | |
| dc.subject.authorkeywords | Integrated Approach | |
| dc.subject.authorkeywords | Subtask | |
| dc.subject.authorkeywords | Face Recognition | |
| dc.subject.indexkeywords | Deep learning | |
| dc.subject.indexkeywords | Emotion Recognition | |
| dc.subject.indexkeywords | Forecasting | |
| dc.subject.indexkeywords | Neural networks | |
| dc.subject.indexkeywords | Population statistics | |
| dc.subject.indexkeywords | Age predictions | |
| dc.subject.indexkeywords | Demographic information | |
| dc.subject.indexkeywords | Emotion predictions | |
| dc.subject.indexkeywords | Face inpainting | |
| dc.subject.indexkeywords | Face masks | |
| dc.subject.indexkeywords | Faces detection | |
| dc.subject.indexkeywords | Gender predictions | |
| dc.subject.indexkeywords | Inpainting | |
| dc.subject.indexkeywords | Integrated approach | |
| dc.subject.indexkeywords | Subtask | |
| dc.subject.indexkeywords | Face recognition | |
| dc.title | Emotion, Age and Gender Prediction Through Masked Face Inpainting | |
| dc.type | Conference Paper | |
| dspace.entity.type | Publication | |
| local.indexed.at | Scopus | |
| person.identifier.scopus-author-id | 57204631897 | |
| person.identifier.scopus-author-id | 57904892500 | |
