Please use this identifier to cite or link to this item: https://rda.sliit.lk/handle/123456789/3292
Title: Qualitative Analysis of Automated Visual Tracking of Objects Through Head Pose Estimation
Authors: Abeysinghe, A
Arachchige, I. D
Samarasinghe, P
Dhanawansa, V
Velayuthan, M
Keywords: Qualitative Analysis
Automated Visual Tracking
Object Tracking
Head Pose Estimation
Issue Date: 9-Dec-2022
Publisher: IEEE
Citation: A. Abeysinghe, I. D. Arachchige, P. Samarasinghe, V. Dhanawansa and M. Velayuthan, "Qualitative Analysis of Automated Visual Tracking of Objects Through Head Pose Estimation," 2022 4th International Conference on Advancements in Computing (ICAC), Colombo, Sri Lanka, 2022, pp. 369-374, doi: 10.1109/ICAC57685.2022.10025053.
Series/Report no.: 2022 4th International Conference on Advancements in Computing (ICAC);
Abstract: An automated approach for object tracking and gaze estimation via head pose estimation is crucial to facilitate a range of applications in the domain of human-computer interfacing; these include the analysis of head movement with respect to a stimulus in assessing one's level of attention. While varied approaches for gaze estimation and object tracking exist, their suitability within such applications has not been justified. To address this gap, this paper conducts a quantitative comparison of existing models for gaze estimation, including Mediapipe, standalone OpenFace, and custom head pose estimation with MTCNN face detection; and for object detection, including the CSRT object tracker, the YOLO object detector, and a custom object detector. The accuracy of the aforementioned models was compared against the annotations of the EYEDIAP dataset, to evaluate their accuracy both relative and non-relative to each other. The analysis revealed that the custom object detector and the OpenFace model are relatively more accurate than the others when comparing the number of annotations, the absolute mean error, and the relationships between x displacement and yaw and between y displacement and pitch, and can therefore be used in combination for gaze tracking tasks.
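One of the comparison metrics named in the abstract, the absolute mean error between a model's head-pose output and the dataset annotations, can be sketched as follows. This is a minimal illustration only; the array names and values are hypothetical and are not taken from the paper or the EYEDIAP dataset.

```python
import numpy as np

# Hypothetical per-frame yaw predictions (degrees) from a head-pose model,
# and the corresponding ground-truth annotations (e.g. EYEDIAP-style labels).
predicted_yaw = np.array([5.2, -3.1, 10.0, 0.4])
annotated_yaw = np.array([4.8, -2.5, 11.2, 0.0])

# Absolute mean error: the mean of the per-frame absolute differences.
abs_mean_error = np.mean(np.abs(predicted_yaw - annotated_yaw))
print(round(float(abs_mean_error), 2))  # prints 0.65
```

The same computation would apply per axis (yaw against x displacement, pitch against y displacement) when ranking models as the paper describes.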
URI: https://rda.sliit.lk/handle/123456789/3292
ISSN: 979-8-3503-9809-0
Appears in Collections:4th International Conference on Advancements in Computing (ICAC) | 2022
Department of Information Technology
Research Papers - IEEE
Research Papers - SLIIT Staff Publications
Research Publications -Dept of Information Technology

Files in This Item:
File Description Size Format
Qualitative_Analysis_of_Automated_Visual_Tracking_of_Objects_Through_Head_Pose_Estimation.pdf
  Until 2050-12-31
868.57 kB Adobe PDF View/Open Request a copy


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.