
Human Interaction Solutions for Intuitive Motion Generation of a Virtual Manikin

Luca Caltagirone
Göteborg : Chalmers tekniska högskola, 2014. Diploma work - Department of Applied Mechanics, Chalmers University of Technology, Göteborg, Sweden, ISSN 1652-8557; 2014:60, 2014.
[Master's thesis (advanced level)]

The aim of this work was to develop a motion capture algorithm for the Kinect sensor that provides robust, real-time tracking even in situations where the built-in Kinect algorithm performs poorly. The proposed method belongs to the family of model-based algorithms, which establish correspondences between an acquired point cloud and a custom-built body model. Specifically, an extension of the point cloud registration algorithm known as iterative closest point (ICP) to articulated bodies was used in combination with the Gauss-Newton minimization algorithm.

The virtual manikin IMMA (Intelligently Moving Manikins) allows ergonomic studies to be conducted in a simulated assembly line. To verify the motions of the solutions found by the simulation software, a motion capture system could be integrated into its platform. Such a system could also facilitate the analysis of complex assembly operations that are troublesome to solve with the current version of IMMA. Most commercially available motion capture devices, however, are expensive, difficult to operate, and require specialized equipment and software. In recent years, Microsoft has introduced to the market an RGB-D camera, the Kinect 1.0, which provides 3D information without the need for triangulation and integrates a sophisticated motion tracking system based on machine learning algorithms. Unfortunately, this system performs rather poorly in the settings commonly found in assembly lines, where occlusions and self-occlusions are commonplace.

By measuring the normalized residual error per point (NREP), one can determine how well the articulated iterative closest point (AICP) system matches the body model to the point cloud acquired from the sensor.
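The core of each ICP iteration is a nearest-neighbor correspondence search between model points and the acquired cloud, and the NREP is a per-point summary of the remaining residuals. The sketch below illustrates both ideas in a minimal form; the function names, the brute-force search, and the normalization (mean point-to-nearest-neighbor distance) are assumptions for illustration and may differ from the exact definitions in the thesis.

```python
import numpy as np

def nearest_neighbor_sq_dists(model_pts, cloud_pts):
    """For each model point, squared distance to its nearest neighbor in the
    acquired point cloud (brute force for clarity; a k-d tree would be used
    in practice for real-time performance)."""
    # Pairwise squared distances, shape (n_model, n_cloud)
    d2 = ((model_pts[:, None, :] - cloud_pts[None, :, :]) ** 2).sum(axis=-1)
    return d2.min(axis=1)

def nrep(model_pts, cloud_pts):
    """Hypothetical normalized residual error per point: the mean
    nearest-neighbor distance over all model points. A smaller value means
    the body model fits the sensor point cloud more closely."""
    return np.sqrt(nearest_neighbor_sq_dists(model_pts, cloud_pts)).mean()
```

With this normalization, a model that lies exactly on the cloud yields an NREP of zero, and a uniform offset of the model shows up directly as that offset in the score, which makes scores comparable across frames with different point counts.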
The obtained results show that the AICP system is more robust than the Kinect algorithm, producing an NREP approximately seven times smaller on average over a set of 30 unconstrained motion sequences involving occlusions and self-occlusions. Furthermore, the developed system can track a wider range of motions over an extended range. This makes it a potentially better solution for performing motion capture in assembly lines.

Keywords: Computer vision, RGB-D sensor, Gauss-Newton algorithm, articulated ICP, nearest neighbor search

The publication was registered 2015-09-25 and last modified 2016-03-22.

CPL ID: 223202
