Understanding the control strategies that humans employ in everyday activity requires measuring several variables pertaining to different aspects of the interaction with the environment. This work focuses on the validation and uncertainty quantification of a sensor-fusion suite for acquiring biometric information that can be used to estimate the control strategy in a variety of everyday tasks. Specifically, we address the acquisition of biomechanical parameters such as estimates of human ankle mechanical impedance. The sensor-fusion suite we devised consists of a centralized program that synchronizes and integrates the information coming from several transducers. In this work, the system combines a pair of computer-vision cameras. The cameras employ a stereovision algorithm that reconstructs the acquired markers in Cartesian space from their camera-space coordinates using a camera model. Furthermore, a biomechanical model of the human is integrated into the system to reconstruct the tracked movements in joint space. The architecture of the system allows for the integration of transducers beyond those implemented to date, including inertial measurement units, electromyography, and pressure sensors. The system was able to capture the dynamics of an inverted pendulum on a cart mimicking human walking on a treadmill, where the mechanical parameters of the ankle were estimated by simulating a single stance phase.
