Towards robust automatic affective classification of images using facial expressions for practical applications

Zhang, Ligang, Tjondronegoro, Dian W., Chandran, Vinod, & Eggink, Jana (2016) Towards robust automatic affective classification of images using facial expressions for practical applications. Multimedia Tools and Applications, 75(8), pp. 4669-4695.

Affect is an important feature of multimedia content and conveys valuable information for multimedia indexing and retrieval. Most existing studies of affective content analysis are limited to low-level features or mid-level representations, and are generally criticized for their inability to bridge the gap between low-level features and high-level human affective perception. The facial expressions of subjects in images carry important semantic information that can substantially influence human affective perception, but they have seldom been investigated for affective classification of facial images in practical applications. This paper presents an automatic image emotion detector (IED) for affective classification of practical (non-laboratory) data using facial expressions, where many real-world challenges are present, including pose, illumination, and size variations. The proposed method is novel in that its framework is designed specifically to overcome these challenges, using multi-view versions of face and fiducial point detectors and a combination of point-based texture and geometry features. Performance comparisons across several key parameters of the relevant algorithms are conducted to identify the optimal settings for high accuracy and fast computation. A comprehensive set of experiments on existing and new datasets shows that the method is effective despite pose variations, fast and suitable for large-scale data, and as accurate as the state-of-the-art method on laboratory-based data. The proposed method was also applied to affective classification of images from the British Broadcasting Corporation (BBC) in a task typical of a practical application, providing some valuable insights.
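
The abstract outlines a pipeline of face detection, fiducial point detection, point-based texture and geometry features, and expression classification. The sketch below is purely illustrative and is not the authors' implementation: it assumes off-the-shelf single-view detectors from dlib (the paper uses multi-view versions), local binary pattern patches around each landmark as the point-based texture features, normalised pairwise landmark distances as the geometry features, and a linear SVM classifier. The library choices, the shape_predictor_68_face_landmarks.dat model path, and the extract_features helper are assumptions introduced here, not details from the paper.

```python
# Purely illustrative sketch (not the authors' implementation) of the stages
# named in the abstract: face detection, fiducial (landmark) point detection,
# point-based texture + geometry features, and expression classification.
import itertools
import numpy as np
import cv2          # only used for image loading in the commented usage below
import dlib
from skimage.feature import local_binary_pattern
from sklearn.svm import SVC

# Assumed landmark model file (dlib's 68-point predictor), not from the paper.
PREDICTOR_PATH = "shape_predictor_68_face_landmarks.dat"

face_detector = dlib.get_frontal_face_detector()   # the paper uses multi-view detectors
landmark_model = dlib.shape_predictor(PREDICTOR_PATH)


def extract_features(gray):
    """Return a texture+geometry feature vector for the first detected face,
    or None if no face is found in the grayscale image."""
    faces = face_detector(gray, 1)
    if not faces:
        return None
    shape = landmark_model(gray, faces[0])
    points = np.array([(p.x, p.y) for p in shape.parts()], dtype=np.float32)

    # Point-based texture: uniform-LBP histogram of a patch around each landmark.
    half, texture = 8, []
    for x, y in points.astype(int):
        patch = gray[max(y - half, 0):y + half, max(x - half, 0):x + half]
        if patch.size == 0:
            texture.append(np.zeros(10))
            continue
        lbp = local_binary_pattern(patch, P=8, R=1, method="uniform")
        hist, _ = np.histogram(lbp, bins=10, range=(0, 10), density=True)
        texture.append(hist)

    # Geometry: pairwise landmark distances, normalised by the largest distance.
    dists = np.array([np.linalg.norm(points[i] - points[j])
                      for i, j in itertools.combinations(range(len(points)), 2)])
    geometry = dists / (dists.max() or 1.0)

    return np.concatenate([np.concatenate(texture), geometry])


# Illustrative usage with hypothetical `image_paths` and `labels`:
# pairs = [(extract_features(cv2.imread(p, cv2.IMREAD_GRAYSCALE)), lab)
#          for p, lab in zip(image_paths, labels)]
# X = np.vstack([f for f, _ in pairs if f is not None])
# y = [lab for f, lab in pairs if f is not None]
# clf = SVC(kernel="linear").fit(X, y)
```

This sketch relies on a frontal-only detector and generic features; per the abstract, the actual IED instead uses multi-view face and fiducial point detectors and its own combination of point-based texture and geometry features to handle pose, illumination, and size variations in practical data.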

Impact and interest:

0 citations in Scopus

ID Code: 84135
Item Type: Journal Article
Refereed: Yes
Keywords: Affective classification, Facial expression recognition, Image, Application
DOI: 10.1007/s11042-015-2497-5
ISSN: 1380-7501
Divisions: Current > Schools > School of Electrical Engineering & Computer Science
Current > Schools > School of Information Systems
Current > QUT Faculties and Divisions > Science & Engineering Faculty
Copyright Owner: Copyright 2015 Springer Science+Business Media New York
Copyright Statement: The final publication is available at Springer via http://dx.doi.org/10.1007/s11042-015-2497-5
Deposited On: 14 May 2015 01:27
Last Modified: 17 May 2016 05:59
