
Facial expressions to identify post-stroke: A pilot study

dc.contributor.author: Oliveira, Guilherme C. [UNESP]
dc.contributor.author: Ngo, Quoc C.
dc.contributor.author: Passos, Leandro A. [UNESP]
dc.contributor.author: Oliveira, Leonardo S. [UNESP]
dc.contributor.author: Papa, João P. [UNESP]
dc.contributor.author: Kumar, Dinesh
dc.contributor.institution: Universidade Estadual Paulista (UNESP)
dc.contributor.institution: Royal Melbourne Institute of Technology
dc.date.accessioned: 2025-04-29T18:36:11Z
dc.date.issued: 2024-06-01
dc.description.abstract: Background and Objective: Timely stroke treatment can limit brain damage and improve outcomes, but this depends on early recognition of the symptoms. However, stroke cases are often missed by first responder paramedics. One of the earliest external symptoms of stroke appears in facial expressions. Methods: We propose a computerized analysis of facial expressions using action units to distinguish between post-stroke and healthy people. Action units enable analysis of subtle and specific facial movements and are interpretable in terms of facial expressions. RGB videos from the Toronto Neuroface Dataset, recorded during standard orofacial examinations of 14 people with post-stroke (PS) and 11 healthy controls (HC), were used in this study. Action units were computed using XGBoost trained on the HC data, and classification was performed using regression analysis for each of the nine facial expressions. The analysis was performed without manual intervention. Results: The results were evaluated using leave-one-out validation. The accuracy was 82% for the Kiss and Spread expressions, with the best sensitivity of 91% in differentiating PS from HC. The features corresponding to mouth muscles were the most suitable. Conclusions: This pilot study has shown that our method can detect PS based on two simple facial expressions. However, it needs to be tested in real-world conditions, with people of different ethnicities and with smartphone recordings. The method has the potential for computerized assessment of videos by first responders using a smartphone to perform screening tests, which can facilitate the timely start of treatment.
dc.description.affiliation: School of Sciences São Paulo State University
dc.description.affiliation: School of Engineering Royal Melbourne Institute of Technology
dc.description.affiliationUnesp: School of Sciences São Paulo State University
dc.description.sponsorship: Stiftelsen Promobilia
dc.description.sponsorship: Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
dc.description.sponsorship: Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
dc.description.sponsorshipId: CAPES: 001
dc.description.sponsorshipId: FAPESP: 2013/07375-0
dc.description.sponsorshipId: FAPESP: 2019/07665-4
dc.description.sponsorshipId: FAPESP: 2023/10823-6
dc.description.sponsorshipId: FAPESP: 2023/14197-2
dc.description.sponsorshipId: FAPESP: 2023/14427-8
dc.identifier: http://dx.doi.org/10.1016/j.cmpb.2024.108195
dc.identifier.citation: Computer Methods and Programs in Biomedicine, v. 250.
dc.identifier.doi: 10.1016/j.cmpb.2024.108195
dc.identifier.issn: 1872-7565
dc.identifier.issn: 0169-2607
dc.identifier.scopus: 2-s2.0-85191663309
dc.identifier.uri: https://hdl.handle.net/11449/298117
dc.language.iso: eng
dc.relation.ispartof: Computer Methods and Programs in Biomedicine
dc.source: Scopus
dc.subject: Facial action unit
dc.subject: Facial expression
dc.subject: Machine learning
dc.title: Facial expressions to identify post-stroke: A pilot study
dc.type: Article
dspace.entity.type: Publication
relation.isOrgUnitOfPublication: aef1f5df-a00f-45f4-b366-6926b097829b
relation.isOrgUnitOfPublication.latestForDiscovery: aef1f5df-a00f-45f4-b366-6926b097829b
unesp.author.orcid: 0000-0002-9698-2445 [1]
unesp.author.orcid: 0000-0002-8071-5342 [2]
unesp.author.orcid: 0000-0003-3602-4023 [6]
unesp.campus: Universidade Estadual Paulista (UNESP), Faculdade de Ciências, Bauru
