Using temporal features of observers' physiological measures to distinguish between genuine and fake smiles

Md. Zakir Hossain, Tom Gedeon, Ramesh Sankaranarayana

Research output: Contribution to journal › Article

Abstract

Future affective computing research could be enhanced by enabling the computer to recognise a displayer's mental state from an observer's physiological states, and using this information to improve recognition algorithms. In this paper, an observer's physiological signals are analysed to distinguish displayers' genuine from fake smiles. Thirty smile videos were collected from four benchmark databases and classified as showing genuine or fake (half/half) smiles. Forty observers viewed the videos. We recorded four physiological signals: pupillary response (PR), electrocardiogram (ECG), galvanic skin response (GSR), and blood volume pulse (BVP). A number of temporal features were extracted after a few processing steps, and features minimally correlated between genuine and fake smiles were selected using an NCCA (canonical correlation analysis with neural network) system. Finally, classification accuracy was found to be as high as 98.8% from PR features using a leave-one-observer-out process. In comparison, the best current image processing technique [1] on the same video data was 95% correct. Observers were 59% (on average) to 90% (by voting) correct in their conscious choices. Our results demonstrate that humans can non-consciously recognise the quality of smiles 4% better than current image processing techniques and 9% better than the conscious choices of groups.
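The pipeline summarised above (temporal features per observer, correlation-based feature selection, then leave-one-observer-out evaluation) can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the data are synthetic placeholders, the SVM classifier is an assumption, and scikit-learn's plain linear CCA stands in for the neural-network-based NCCA described in the abstract.

import numpy as np
from sklearn.cross_decomposition import CCA
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.svm import SVC

# Placeholder data matching the abstract's setup: 40 observers x 30 videos,
# half fake (0) and half genuine (1); the feature count is an arbitrary assumption.
n_observers, n_videos, n_features = 40, 30, 12
rng = np.random.default_rng(0)
X = rng.normal(size=(n_observers * n_videos, n_features))   # temporal features
y = np.tile(np.repeat([0, 1], n_videos // 2), n_observers)  # smile labels
groups = np.repeat(np.arange(n_observers), n_videos)        # observer ID per row

scores = []
for train, test in LeaveOneGroupOut().split(X, y, groups):
    # Project features onto the component most correlated with the label;
    # the paper instead learns this mapping with a neural network (NCCA).
    cca = CCA(n_components=1).fit(X[train], y[train].reshape(-1, 1))
    clf = SVC().fit(cca.transform(X[train]), y[train])
    scores.append(clf.score(cca.transform(X[test]), y[test]))

print(f"mean leave-one-observer-out accuracy: {np.mean(scores):.3f}")

On the reported human figures, per-video majority voting is why group accuracy (90%) far exceeds the average individual accuracy (59%). A quick binomial check, under the unrealistic assumption that the 40 observers choose independently at 59% accuracy, puts the majority vote at roughly 84% correct, in the same ballpark as the reported 90%:

from math import comb
p = 0.59  # average individual accuracy from the abstract
print(sum(comb(40, k) * p**k * (1 - p)**(40 - k) for k in range(21, 41)))  # ~0.84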

Original language: English
Journal: IEEE Transactions on Affective Computing
DOI: 10.1109/TAFFC.2018.2878029
Publication status: Accepted/In press - Jan 1 2018

Keywords

  • Affective Computing
  • Databases
  • Electrocardiography
  • Face
  • Fake Smile
  • Feature extraction
  • Genuine Smile
  • Observers
  • Physiological Signals
  • Physiology
  • Temporal Features
  • Videos

ASJC Scopus subject areas

  • Software
  • Human-Computer Interaction

Cite this

@article{fbbacaaea9394f17a74e3319048df4be,
title = "Using temporal features of observers' physiological measures to distinguish between genuine and fake smiles",
abstract = "Future affective computing research could be enhanced by enabling the computer to recognise a displayer's mental state from an observer's physiological states, and using this information to improve recognition algorithms. In this paper, an observer's physiological signals are analysed to distinguish displayers' genuine from fake smiles. Overall, thirty smile videos were collected from four benchmark database and classified as showing genuine/fake (half/half) smiles. Overall, forty observers viewed videos. We generally recorded four physiological signals: pupillary response (PR), electrocardiogram (ECG), galvanic skin response (GSR), and blood volume pulse (BVP). A number of temporal features were extracted after a few processing steps, and minimally correlated features between genuine and fake smiles were selected using NCCA (canonical correlation analysis with neural network) system. Finally, classification accuracy was found to be as high as 98.8{\%} from PR features using a leave-one-observer-out process. In comparison, the best current image processing technique [1] on the same video data was 95{\%} correct. Observers were 59{\%} (on average) to 90{\%} (by voting) correct by their conscious choices. Our results demonstrate that humans can non-consciously recognise the quality of smiles 4{\%} better than current image processing techniques and 9{\%} better than conscious choices of groups.",
keywords = "Affective Computing, Databases, Electrocardiography, Face, Fake Smile, Feature extraction, Genuine Smile, Observers, Physiological Signals, Physiology, Temporal Features, Videos",
author = "Hossain, {Md. Zakir} and Tom Gedeon and Ramesh Sankaranarayana",
year = "2018",
month = "1",
day = "1",
doi = "10.1109/TAFFC.2018.2878029",
language = "English",
journal = "IEEE Transactions on Affective Computing",
issn = "1949-3045",
publisher = "Institute of Electrical and Electronics Engineers Inc.",

}

TY - JOUR

T1 - Using temporal features of observers' physiological measures to distinguish between genuine and fake smiles

AU - Hossain, Md. Zakir

AU - Gedeon, Tom

AU - Sankaranarayana, Ramesh

PY - 2018/1/1

Y1 - 2018/1/1

N2 - Future affective computing research could be enhanced by enabling the computer to recognise a displayer's mental state from an observer's physiological states, and using this information to improve recognition algorithms. In this paper, an observer's physiological signals are analysed to distinguish displayers' genuine from fake smiles. Thirty smile videos were collected from four benchmark databases and classified as showing genuine or fake (half/half) smiles. Forty observers viewed the videos. We recorded four physiological signals: pupillary response (PR), electrocardiogram (ECG), galvanic skin response (GSR), and blood volume pulse (BVP). A number of temporal features were extracted after a few processing steps, and features minimally correlated between genuine and fake smiles were selected using an NCCA (canonical correlation analysis with neural network) system. Finally, classification accuracy was found to be as high as 98.8% from PR features using a leave-one-observer-out process. In comparison, the best current image processing technique [1] on the same video data was 95% correct. Observers were 59% (on average) to 90% (by voting) correct in their conscious choices. Our results demonstrate that humans can non-consciously recognise the quality of smiles 4% better than current image processing techniques and 9% better than the conscious choices of groups.

AB - Future affective computing research could be enhanced by enabling the computer to recognise a displayer's mental state from an observer's physiological states, and using this information to improve recognition algorithms. In this paper, an observer's physiological signals are analysed to distinguish displayers' genuine from fake smiles. Thirty smile videos were collected from four benchmark databases and classified as showing genuine or fake (half/half) smiles. Forty observers viewed the videos. We recorded four physiological signals: pupillary response (PR), electrocardiogram (ECG), galvanic skin response (GSR), and blood volume pulse (BVP). A number of temporal features were extracted after a few processing steps, and features minimally correlated between genuine and fake smiles were selected using an NCCA (canonical correlation analysis with neural network) system. Finally, classification accuracy was found to be as high as 98.8% from PR features using a leave-one-observer-out process. In comparison, the best current image processing technique [1] on the same video data was 95% correct. Observers were 59% (on average) to 90% (by voting) correct in their conscious choices. Our results demonstrate that humans can non-consciously recognise the quality of smiles 4% better than current image processing techniques and 9% better than the conscious choices of groups.

KW - Affective Computing

KW - Databases

KW - Electrocardiography

KW - Face

KW - Fake Smile

KW - Feature extraction

KW - Genuine Smile

KW - Observers

KW - Physiological Signals

KW - Physiology

KW - Temporal Features

KW - Videos

UR - http://www.scopus.com/inward/record.url?scp=85055709183&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85055709183&partnerID=8YFLogxK

U2 - 10.1109/TAFFC.2018.2878029

DO - 10.1109/TAFFC.2018.2878029

M3 - Article

AN - SCOPUS:85055709183

JO - IEEE Transactions on Affective Computing

JF - IEEE Transactions on Affective Computing

SN - 1949-3045

ER -