Please use this identifier to cite or link to this item: https://rda.sliit.lk/handle/123456789/1408
Full metadata record
DC Field                      Value                                          Language
dc.contributor.author         Nadeeshani, M.                                 -
dc.contributor.author         Jayaweera, A.                                  -
dc.contributor.author         Samarasinghe, P.                               -
dc.date.accessioned           2022-02-25T09:32:42Z                           -
dc.date.available             2022-02-25T09:32:42Z                           -
dc.date.issued                2020-12-10                                     -
dc.identifier.isbn            978-1-7281-8412-8                              -
dc.identifier.uri             http://rda.sliit.lk/handle/123456789/1408      -
dc.description.abstract       With the recent advancements in deep learning techniques, attention has been given to training and testing facial emotions through highly complex deep learning systems. In this paper, we apply machine learning techniques that require fewer resources to produce comparable results for emotion prediction. Because the underlying technique for emotion prediction in this research is based on the clinically recognized Facial Action Coding System (FACS), a further analysis is given of the contribution of each Action Unit (AU) to the predicted emotion. This analysis would complement and strengthen existing work, and serve as a key resource for addressing many health issues related to facial muscle movements.      en_US
dc.language.iso               en                                             en_US
dc.publisher                  2020 2nd International Conference on Advancements in Computing (ICAC), SLIIT      en_US
dc.relation.ispartofseries    Vol.1;                                         -
dc.subject                    Facial Action Coding System                   en_US
dc.subject                    Action Units                                   en_US
dc.subject                    emotion prediction                             en_US
dc.subject                    machine learning                               en_US
dc.subject                    K Nearest Neighbor classifier                  en_US
dc.title                      Facial Emotion Prediction through Action Units and Deep Learning      en_US
dc.type                       Article                                        en_US
dc.identifier.doi             10.1109/ICAC51239.2020.9357138                 en_US
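The abstract above describes predicting emotions from FACS Action Units with a lightweight classifier (the subject terms name a K Nearest Neighbor classifier) and analysing each AU's contribution to the predicted emotion. As a rough illustration only, not the paper's implementation, the sketch below trains scikit-learn's KNeighborsClassifier on synthetic AU intensity features and estimates per-AU contributions with permutation importance; the AU names, the synthetic data, and the labelling rule are all assumptions made for the example.

```python
# Minimal sketch (not the authors' code) of KNN-based emotion prediction from
# FACS Action Unit features, plus a rough per-AU contribution estimate.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(0)

# Hypothetical AU intensity features on a 0-5 scale, such as an AU detector
# (e.g. OpenFace) might produce. Columns stand in for five common AUs.
au_names = ["AU01", "AU04", "AU06", "AU12", "AU15"]
X = rng.uniform(0.0, 5.0, size=(300, len(au_names)))

# Toy labels: call it "happiness" when cheek raiser (AU06) plus lip corner
# puller (AU12) are strong, else "sadness" -- a crude stand-in for the kind
# of FACS-based mapping the paper builds on.
y = np.where(X[:, 2] + X[:, 3] > 5.0, "happiness", "sadness")

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(X_train, y_train)
print("accuracy:", knn.score(X_test, y_test))

# Per-AU contribution via permutation importance: how much held-out accuracy
# drops when a single AU column is shuffled.
result = permutation_importance(knn, X_test, y_test, n_repeats=10, random_state=0)
for name, imp in zip(au_names, result.importances_mean):
    print(f"{name}: {imp:.3f}")
```

On this toy data, AU06 and AU12 should dominate the importance scores, mirroring the kind of per-AU contribution analysis the abstract describes; with real data, the AU features would come from a face-analysis pipeline rather than a random generator.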
Appears in Collections:2nd International Conference on Advancements in Computing (ICAC) | 2020
Research Publications - Dept. of Information Technology

Files in This Item:
File                                                                   Description       Size       Format
Facial_Emotion_Prediction_through_Action_Units_and_Deep_Learning.pdf   Until 2050-12-31  427.25 kB  Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.