
Research publications repository


Determination of “Neutral”–“Pain”, “Neutral”–“Pleasure”, and “Pleasure”–“Pain” Affective State Distances by Using AI Image Analysis of Facial Expressions

original article
Creative Commons BY license
published version
  • no other version
File can be accessed.
Author
Prossinger, Hermann
Hladký, Tomáš
Boschetti, Silvia (ORCID: 0000-0002-8048-4062; WoS: HKN-7812-2023; Scopus: 57830722300)
Říha, Daniel (ORCID: 0000-0001-5142-4485; Scopus: 57195970660)
Binter, Jakub (ORCID: 0000-0001-5304-2130; WoS: GRB-0175-2022; Scopus: 56281373300)


Publication date
2022
Published in
Technologies [online]
Volume / Issue
10 (4)
ISBN / ISSN
ISSN: 2227-7080
Collections
  • Faculty of Humanities
  • Faculty of Science

This publication has a published version with DOI 10.3390/technologies10040075

Abstract
(1) Background: In addition to verbalizations, facial expressions advertise one's affective state. There is an ongoing debate concerning the communicative value of the facial expressions of pain and of pleasure, and to what extent humans can distinguish between these. We introduce a novel method of analysis by replacing human ratings with outputs from image analysis software. (2) Methods: We use image analysis software to extract feature vectors of the facial expressions neutral, pain, and pleasure displayed by 20 actresses. We dimension-reduced these feature vectors, used singular value decomposition to eliminate noise, and then used hierarchical agglomerative clustering to detect patterns. (3) Results: The vector norms for pain-pleasure were rarely less than the distances pain-neutral and pleasure-neutral. The pain-pleasure distances were Weibull-distributed and noise contributed 10% to the signal. The noise-free distances clustered in four clusters and two isolates. (4) Conclusions: AI methods of image recognition are superior to human abilities in distinguishing between facial expressions of pain and pleasure. Statistical methods and hierarchical clustering offer possible explanations as to why humans fail. The reliability of commercial software, which attempts to identify facial expressions of affective states, can be improved by using the results of our analyses.
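
The abstract outlines the analysis pipeline: feature-vector extraction from facial-expression images, dimension reduction, SVD-based noise removal, pairwise distances between affective states, a Weibull fit of those distances, and hierarchical agglomerative clustering. Below is a minimal, hedged Python sketch of that pipeline on synthetic stand-in feature vectors; the array shapes, truncation rank, and cluster count are illustrative assumptions, not the authors' actual parameters or code.

```python
# Hedged sketch (not the authors' code): SVD denoising, pairwise affective-state
# distances, a Weibull fit of those distances, and hierarchical agglomerative
# clustering, as described in the abstract. Feature matrices are synthetic
# stand-ins for the outputs of the image analysis software used in the study.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import weibull_min

rng = np.random.default_rng(0)
n_actresses, n_features = 20, 128          # assumed dimensions, for illustration only

# Synthetic feature vectors per affective state (rows = actresses)
neutral  = rng.normal(0.0, 1.0, (n_actresses, n_features))
pain     = rng.normal(0.5, 1.0, (n_actresses, n_features))
pleasure = rng.normal(0.4, 1.0, (n_actresses, n_features))

def denoise(matrix, rank=10):
    """Suppress noise by truncating the SVD to the leading singular components."""
    u, s, vt = np.linalg.svd(matrix, full_matrices=False)
    s[rank:] = 0.0
    return u @ np.diag(s) @ vt

neutral_d, pain_d, pleasure_d = (denoise(m) for m in (neutral, pain, pleasure))

# Per-actress distances (vector norms) between affective states
d_pain_pleasure    = np.linalg.norm(pain_d - pleasure_d, axis=1)
d_pain_neutral     = np.linalg.norm(pain_d - neutral_d, axis=1)
d_pleasure_neutral = np.linalg.norm(pleasure_d - neutral_d, axis=1)

# Weibull fit of the pain-pleasure distances (location fixed at zero)
shape, loc, scale = weibull_min.fit(d_pain_pleasure, floc=0.0)
print(f"Weibull shape={shape:.2f}, scale={scale:.2f}")

# Hierarchical agglomerative clustering of the noise-free pain-pleasure distances
Z = linkage(d_pain_pleasure.reshape(-1, 1), method="ward")
labels = fcluster(Z, t=4, criterion="maxclust")
print("cluster sizes:", np.bincount(labels)[1:])
```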
Keywords
image processing, artificial intelligence, facial expressions, affective state expression, facial pain expression, facial pleasure expression, BDSM videos, hierarchical agglomerative clustering, autoencoder neural network
Permanent link
https://hdl.handle.net/20.500.14178/1687
Show publication in other systems
WOS:000845304900001
SCOPUS:2-s2.0-85147557847
License

Full text of this result is licensed under: Creative Commons Attribution 4.0 International

