At the hardware level, the development consisted of the fabrication of a glove instrumented with SEN-10264 flexible resistive sensors (sensors whose resistance, in ohms, varies with the flexion to which they are subjected), two per finger, for a total of 10 sensors attached to the surface of the glove. After passing through a signal-conditioning (coupling) stage that limits noise, the signals generated by the sensors are acquired through the analog data capture module of a microcontroller. On the software side, the microcontroller was first programmed with a digital filtering stage for the signals acquired from the SEN-10264 flex sensors, followed by a machine learning algorithm that processes them to obtain the individual metacarpophalangeal and interphalangeal joint angles of the evaluated hand. Finally, the sensor data, expressed as angles, are exported to a flat text file so that they can be plotted.
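As an illustration of the processing chain just described, the following Python sketch converts a raw ADC reading from one flex sensor into a resistance (assuming a voltage-divider circuit and a 10-bit converter), smooths it with a simple digital low-pass filter, maps it to a joint angle through a placeholder linear model standing in for the machine learning stage, and writes the resulting angles to a flat text file for plotting. The supply voltage, divider resistor, filter coefficient, regression coefficients, and output file name are assumptions made for illustration, not the calibrated values used in the project.

    # Minimal sketch (not the project firmware) of the described chain:
    # ADC reading -> resistance via a voltage divider -> digital low-pass
    # filter -> angle estimate -> export to a flat text file for plotting.

    V_SUPPLY = 5.0        # assumed supply voltage (V)
    R_FIXED = 10_000.0    # assumed fixed resistor of the voltage divider (ohms)
    ADC_MAX = 1023        # assumed 10-bit analog-to-digital converter
    ALPHA = 0.2           # exponential moving-average coefficient (assumed)

    def adc_to_resistance(adc_value: int) -> float:
        """Convert a raw ADC reading to the flex-sensor resistance in ohms."""
        v_out = V_SUPPLY * adc_value / ADC_MAX
        # Sensor on the high side of the divider:
        # R_flex = R_fixed * (V_supply - V_out) / V_out
        return R_FIXED * (V_SUPPLY - v_out) / max(v_out, 1e-6)

    def ema_filter(samples, alpha=ALPHA):
        """Simple digital low-pass (exponential moving average) filter."""
        filtered, state = [], samples[0]
        for s in samples:
            state = alpha * s + (1 - alpha) * state
            filtered.append(state)
        return filtered

    def resistance_to_angle(r_ohms: float) -> float:
        """Placeholder linear model mapping resistance to an angle in degrees.
        In the project this mapping is learned by a machine learning stage."""
        return 0.004 * (r_ohms - 10_000.0)   # illustrative coefficients only

    if __name__ == "__main__":
        raw_adc = [512, 530, 560, 610, 655, 700, 720, 715]   # simulated readings
        resistances = [adc_to_resistance(a) for a in raw_adc]
        angles = [resistance_to_angle(r) for r in ema_filter(resistances)]
        # Export one angle per line, ready to be plotted.
        with open("mcp_index_angles.txt", "w") as f:
            f.writelines(f"{angle:.2f}\n" for angle in angles)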
Multiple acquisitions were made with the developed system for a total of 10 people; in this way, similar patterns were identified in the metacarpophalangeal and interphalangeal joint angles across the test subjects during basic tasks such as holding different types of objects. Finally, the developed sensor system was validated against two reference sensing systems, the 5DT Data Glove and the Optitrack Flex 3 high-speed cameras, both of which have acquisition protocols predetermined by their manufacturers.
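One simple way the similarity of joint-angle patterns between subjects could be quantified is with a correlation measure between angle traces recorded for the same grasp. The sketch below illustrates this idea with hypothetical traces; it does not reproduce the comparison procedure actually used in the study.

    # Sketch of a similarity measure between two subjects' angle traces.
    # The angle values and the joint chosen are illustrative only.

    from math import sqrt

    def pearson(x, y):
        """Pearson correlation between two equally long angle traces."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = sqrt(sum((a - mx) ** 2 for a in x))
        sy = sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)

    # Hypothetical index-finger metacarpophalangeal angle traces (degrees)
    # for two subjects holding the same object.
    subject_1 = [5, 12, 25, 40, 52, 58, 60]
    subject_2 = [4, 10, 22, 38, 50, 57, 61]

    print(f"similarity (Pearson r) = {pearson(subject_1, subject_2):.3f}")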
Given that the variables of interest are the metacarpophalangeal and interphalangeal joint angles, the 5DT Data Glove delivers only one output signal per finger of the evaluated hand: the manufacturer's software performs a mathematical interpolation and approximation based on the posture of the fingers, which makes this technique incapable of determining whether flexion is occurring at the metacarpophalangeal or at the interphalangeal joint separately. The Optitrack Flex 3 high-speed cameras, on the other hand, generate real-time graphs of the 3D position and trajectory of each joint of the hand; the problem is that the system captures data through markers placed on the hand, whose size causes them to collide with the object during a grasping task, and grasp characterization is further complicated because the viewing angles of the cameras must keep every marker located on the hand joints visible.
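For reference, the following sketch shows how a joint angle can in principle be derived from three 3D marker positions of the kind delivered by a marker-based system such as the Optitrack Flex 3, a computation that is only possible while all three markers remain visible to the cameras. The marker coordinates are hypothetical and do not come from the study's recordings.

    # Sketch: flexion-related angle at a joint from three 3D marker positions
    # (proximal segment end, joint, distal segment end).

    from math import acos, degrees

    def joint_angle(p_prox, p_joint, p_dist):
        """Angle at p_joint formed by the segments toward p_prox and p_dist."""
        v1 = [a - b for a, b in zip(p_prox, p_joint)]
        v2 = [a - b for a, b in zip(p_dist, p_joint)]
        dot = sum(a * b for a, b in zip(v1, v2))
        n1 = sum(a * a for a in v1) ** 0.5
        n2 = sum(a * a for a in v2) ** 0.5
        return degrees(acos(dot / (n1 * n2)))

    # Hypothetical marker positions (cm) around a metacarpophalangeal joint.
    angle = joint_angle((0.0, 0.0, 0.0), (3.0, 0.0, 0.0), (5.5, 1.8, 0.0))
    print(f"included angle at the joint: {angle:.1f} deg")  # 180 deg = full extension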
The individual angles will later be used in a simulation project on a postural biological synergy for some basic object grasps (with a single control command generating movement across several degrees of freedom), through the implementation of an artificial intelligence algorithm.