A FRAMEWORK FOR MARKERLESS FULL BODY HUMAN 3D MONOCULAR POSE ESTIMATION

TOMI, AZFAR (2014) A FRAMEWORK FOR MARKERLESS FULL BODY HUMAN 3D MONOCULAR POSE ESTIMATION. Masters thesis, Universiti Teknologi PETRONAS.


Abstract

Pose estimation is an important pre-processing step in computer vision-based
automatic capture and analysis of human motion. Despite its efficiency in
handling ambiguous situations, the multiple-view approach to pose estimation
incurs high computational cost because it requires a more complex system.
Recently, most work has focused on the low-cost and practical monocular-view
approach, owing to its suitability for common users and its lower system
complexity. However, the monocular view raises several issues: self-occlusion,
which hinders body-part extraction, and undetermined values in reconstructing
the upper and lower limbs, which cause reconstruction errors, especially under
high-noise movement. This thesis therefore presents a framework for real-time
markerless motion capture that tracks full-body human movement for monocular
3D pose estimation. The proposed framework combines top-down and bottom-up
approaches to 3D pose estimation in the monocular view, driven by
end-effectors, and is structured in three stages.

Item Type: Thesis (Masters)
Subjects: Q Science > Q Science (General)
Departments / MOR / COE: Sciences and Information Technology > Computer and Information Sciences
Depositing User: Mr Ahmad Suhairi Mohamed Lazim
Date Deposited: 15 Sep 2021 20:08
Last Modified: 15 Sep 2021 20:08
URI: http://utpedia.utp.edu.my/id/eprint/21124
