Research Outputs | WoS | Scopus | TR-Dizin | PubMed
Permanent URI for this community: https://hdl.handle.net/20.500.14719/1741
Browse
2 results
Search Results
Publication (Metadata only): A comparison of geometrical facial features for affect recognition [Duygu tanıma için geometrik yüz özniteliklerinin karşılaştırılması] (2011)
Ulukaya, Sezer (Bahçeşehir Üniversitesi, Istanbul, Turkey); Erdem, Cigdem Eroglu (Bahçeşehir Üniversitesi, Istanbul, Turkey)
In this work, we compare two geometric feature extraction methods derived from the coordinates of facial points tracked by Active Appearance Models. The compared methods differ in whether they use point coordinates or distances between facial points, and in whether they use information from a neutral facial expression. Experiments on the extended Cohn-Kanade database show that coordinate-based features using neutral-frame information give the best emotion recognition results (94%) with an SVC classifier using a polynomial kernel. © 2011 IEEE.

Publication (Metadata only): A hybrid facial expression recognition method based on neutral face shape estimation [Yüz ifadesi tanıma için nötr yüz şeklinin kestirilmesine dayalı hibrit bir yöntem] (2012)
Ulukaya, Sezer (Boğaziçi Üniversitesi, Bebek, Turkey; Bahçeşehir Üniversitesi, Istanbul, Turkey); Erdem, Cigdem Eroglu (Bahçeşehir Üniversitesi, Istanbul, Turkey)
Knowledge of a person's neutral facial expression is useful for recognizing their facial expressions, but it is not always available. We present a method based on Gaussian mixture models (GMM) to estimate the unknown neutral facial expression of an expressive face. The estimated neutral face is subtracted from the features of the expressive image, and the result is classified using support vector classifiers (SVC). Experimental results on the extended Cohn-Kanade (CK+) database give an emotion recognition rate of 88% using geometric features only, and 92% when appearance-based features are also included. © 2012 IEEE.
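The best-performing setup in the first abstract (coordinate features with the neutral frame subtracted, classified by a polynomial-kernel SVC) can be sketched roughly as below. This is an illustrative assumption-laden sketch, not the authors' implementation: the data is synthetic rather than CK+, and the point count, label count, and scikit-learn usage are all stand-ins.

```python
# Hypothetical sketch: neutral-subtracted facial-point coordinates fed to an
# SVC with a polynomial kernel. All data here is synthetic, not CK+.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

N_POINTS = 68      # assumed number of AAM-tracked facial points
N_SAMPLES = 300
N_EMOTIONS = 7     # assumed CK+ emotion label count

# Synthetic stand-ins for (x, y) coordinates of tracked points per frame.
neutral = rng.normal(size=(N_SAMPLES, N_POINTS * 2))
expressive = neutral + rng.normal(scale=0.5, size=neutral.shape)
labels = rng.integers(0, N_EMOTIONS, size=N_SAMPLES)

# Coordinate-based features using neutral-frame information:
# displacement of each point relative to the neutral expression.
features = expressive - neutral

# Polynomial-kernel SVC, as named in the abstract.
clf = SVC(kernel="poly", degree=3)
clf.fit(features, labels)
predicted = clf.predict(features)
```

The 94% figure in the abstract refers to the authors' experiments on CK+; this toy data says nothing about accuracy and only illustrates the pipeline shape.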
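The pipeline in the second abstract (estimate the unknown neutral face with a GMM, subtract it from the expressive features, classify with an SVC) could look something like the following. This is a hedged sketch under stated assumptions: the abstract does not specify how the GMM yields the neutral estimate, so using the mean of the most responsible mixture component is a guess, and the component count, dimensionality, and data are synthetic.

```python
# Hedged sketch of the GMM-based neutral-shape estimation idea: model neutral
# face shapes with a Gaussian mixture, estimate an unknown neutral for an
# expressive face as the mean of its most likely component (an assumption,
# not the paper's stated estimator), then classify difference features.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.svm import SVC

rng = np.random.default_rng(1)
DIM = 136  # assumed 68 points x 2 coordinates

# Fit a GMM on neutral face shapes (synthetic stand-ins here).
neutral_shapes = rng.normal(size=(200, DIM))
gmm = GaussianMixture(n_components=3, covariance_type="diag",
                      random_state=1).fit(neutral_shapes)

def estimate_neutral(expressive_shape):
    """Use the mean of the mixture component most responsible for the
    expressive shape as the estimated neutral shape (illustrative choice)."""
    resp = gmm.predict_proba(expressive_shape.reshape(1, -1))
    return gmm.means_[resp.argmax()]

expressive = rng.normal(size=(100, DIM))
labels = rng.integers(0, 7, size=100)

# Subtract the estimated neutral from each expressive shape, then classify
# with a support vector classifier, as the abstract describes.
features = np.array([e - estimate_neutral(e) for e in expressive])
clf = SVC(kernel="poly").fit(features, labels)
pred = clf.predict(features)
```

The 88%/92% rates quoted in the abstract come from the authors' CK+ experiments with geometric (and optionally appearance) features; this synthetic sketch only illustrates the estimate-subtract-classify structure.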
