HAPTICS FOR TEXTURE IDENTIFICATION

Goals and Scope

  1. Investigate the use of vibrotactile haptic feedback to improve texture recognition for upper-limb amputees who wear prostheses. Due to time and resource limitations, we emulate this experience with able-bodied subjects holding a plastic proxy hand.
  2. Build a prototype that provides haptic feedback allowing identification of at least three different textures.

Technical Description

The concept for the texture-identifying prototype is shown as a block diagram in Figure 1. The final design of the device included a vibration sensor placed on a proxy hand that simulated the experience of using a prosthetic hand. The vibration sensor (LDT0-028K, Measurement Specialties) sensed vibrations across different surface textures and sent this information as a voltage signal to the Arduino (UNO, Arduino). The firmware on the Arduino then sent corresponding signals to excite a 10 mm eccentric rotating mass (ERM) vibration motor at various amplitude-frequency pairs. The vibration motor was attached to a wearable band worn on the forearm.

The firmware runs a peak-detection algorithm on the sensor readings. The detected peak values were sampled and a running average was computed. This running average was compared against thresholds corresponding to the vibration amplitudes produced by different textures. In the final design, we set three thresholds based on three surface textures (smooth, medium, and rough), and each threshold triggered a corresponding amplitude-frequency pair on the vibration motor.

Figure 1. Vibration sensor on the proxy arm inputs signals to the Arduino, which then commands the motor driver to drive vibrotactile actuators on the band worn at the forearm.
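
As a concrete illustration of this thresholding scheme, the sketch below shows one way the firmware logic could be written for an Arduino Uno. It is a simplified sketch, not the exact project firmware: the pin assignments, sampling burst length, window size, thresholds, and PWM levels are placeholder values chosen for illustration.

    const int SENSOR_PIN = A0;   // LDT0-028K piezo vibration sensor
    const int MOTOR_PIN  = 9;    // PWM pin driving the ERM motor (e.g., through a transistor)
    const int WINDOW     = 50;   // number of peak values in the running average

    float peakBuffer[WINDOW];    // global array, zero-initialized
    int   bufIndex = 0;

    void setup() {
      pinMode(MOTOR_PIN, OUTPUT);
    }

    void loop() {
      // Simple peak detection: keep the largest reading over a short sampling burst.
      int peak = 0;
      for (int i = 0; i < 20; i++) {
        int v = analogRead(SENSOR_PIN);
        if (v > peak) peak = v;
        delay(2);
      }

      // Running average of the detected peaks.
      peakBuffer[bufIndex] = peak;
      bufIndex = (bufIndex + 1) % WINDOW;
      float avg = 0;
      for (int i = 0; i < WINDOW; i++) avg += peakBuffer[i];
      avg /= WINDOW;

      // Map the averaged peak to one of three texture levels (smooth / medium / rough).
      // For an ERM, the PWM drive level sets a coupled amplitude-frequency pair.
      if (avg < 100)      analogWrite(MOTOR_PIN, 60);    // smooth
      else if (avg < 300) analogWrite(MOTOR_PIN, 150);   // medium
      else                analogWrite(MOTOR_PIN, 255);   // rough
    }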

Design Iterations

Before the final design, we went through several prototype iterations. Initial tests of the hardware setup involved studying the functionality of the vibration sensor and the ERM motor. In the first design iteration, the motor was connected to a haptic motor driver (DRV2605, Texas Instruments). Preliminary testing of the circuitry was carried out on a breadboard (Figure 2).

Figure 2. Circuit for testing the performance of the vibration sensor and vibration motor.

The second design iteration removed the haptic motor driver because the motor was unresponsive when driven through it. We found that the ERM motor does not need a dedicated driver: its vibration amplitude and frequency are coupled to the drive voltage and cannot be controlled independently anyway, so the driver added nothing. During this iteration, we also considered using a force sensor to normalize readings based on how hard the user pressed down on the proxy hand. Although we attempted to integrate a force sensor, it exhibited odd behavior when used together with the vibration sensor: the two sensors became coupled such that whenever one produced a signal, the other produced the same signal with a DC offset of about 30 V. We were unable to resolve this issue and had to discard the force sensor.
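
For reference, a minimal test sketch of the driverless setup might look like the following. The wiring (a PWM pin switching the motor through a transistor) and the drive levels are assumptions for illustration, not the exact circuit shown in Figure 2.

    const int MOTOR_PIN = 9;   // PWM pin switching the ERM motor via a transistor (assumed wiring)

    void setup() {
      pinMode(MOTOR_PIN, OUTPUT);
    }

    void loop() {
      // Step through three drive levels, roughly corresponding to the
      // smooth / medium / rough feedback levels used in the final design.
      analogWrite(MOTOR_PIN, 60);   delay(1000);
      analogWrite(MOTOR_PIN, 150);  delay(1000);
      analogWrite(MOTOR_PIN, 255);  delay(1000);
      analogWrite(MOTOR_PIN, 0);    delay(500);   // motor off
    }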

Evaluation Strategy

After the motorized armband was attached, each subject was briefed on the steps of the experiment:

  1. First, a brief training phase was conducted. During this phase, the experimenter moved the proxy hand over each of the four test textures while the subject could see the corresponding texture and feel the associated motor vibration.
  2. Next, the subject was equipped with noise-cancelling headphones and blindfolded with an eye mask to ensure that the only type of feedback from the textures was tactile.
  3. Finally, the experimenter progressed through the four textures in a randomized order, each time asking the subject to identify the texture based on the motor vibrations.

Figure 3. Hardware setup for testing.

Key Technical Results

The final prototype examined users' ability to distinguish between four different textures: large marbles, small fluffy balls, small beads, and a smooth wooden board. Having the experimenter perform the movement atop the textures kept the velocity roughly constant. In a short user study, subjects correctly identified 76.36% of the presented textures. The textures were presented in a sequence randomized in advance and known only to the experimenters, to keep testing consistent across subjects. We used a simple scoresheet to record each subject's name and whether each texture was identified correctly; it is shown in Figure 4. The Lion King characters on the sheet are decorative placeholders for the textures the subjects experienced. Starting from the left, a subject received a check for each texture identified correctly and an x for each one missed, for a total score out of 5. Subjects filled in their names, and the experimenters filled in the correctness of each response. In total, we tested 11 subjects, with an average of 3.82 correct identifications. This suggests that texture can be conveyed to some degree even with non-optimal motors.

Figure 4. Scoresheet for five textures

Key Technical Problems

We encountered three main technical problems. The first was an unusual coupling between the force sensor and vibration sensor readings. Used alone, each sensor worked fine and gave accurate readings; used together, their outputs coupled with each other and produced spikes during testing. Our assumption was that this was caused by the shared power source, which introduced a common DC offset.

The second issue was an approximately three-second delay between input and output. This delay came mainly from the sampling window used for the moving average, which could be shortened, adapted on the fly, or replaced altogether in a subsequent version of the prototype. Additional latency was likely added by the time the ERM motor took to reach the desired speed. Finally, the mechanical nature of the vibration sensor also introduced delay, since the sensor continued to ring for a period of time after the hand was no longer in contact with the surface.
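
One possible way to cut the averaging-related latency, sketched below under the assumption that the block running average is the bottleneck, is to replace it with an exponential moving average that updates incrementally on every sample.

    const int SENSOR_PIN = A0;
    float ema = 0;              // exponential moving average of the sensor signal
    const float ALPHA = 0.2;    // smoothing factor; higher values react faster

    void setup() {
      Serial.begin(9600);
    }

    void loop() {
      int v = analogRead(SENSOR_PIN);
      ema = ALPHA * v + (1.0 - ALPHA) * ema;   // O(1) update, no sample window to fill
      Serial.println(ema);
      delay(5);
    }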

During testing, we found that the vibrational response of the system was highly dependent on the velocity of hand movement, because running the hand over the same texture at different speeds changes the vibration pattern. If a surface was rough but the spacing between its features was small, the measured amplitudes were smaller and more consistent, even at different velocities. Velocity effects were most pronounced when the spacing between surface features was large (e.g., the large marbles), because the abrupt transitions applied a stronger force to the hand and made the vibration sensor output more variable. To address this, the test procedure used a metronome so that the experimenter maintained a consistent velocity across and within trials. We determined by trial and error that moving in an elliptical pattern spanning the texture boards yielded consistent readings.

Finally, the protocol initially had the subject control the proxy hand; however, we discovered that vibrations transmitted through the proxy hand were not entirely eliminated, making it too easy to “feel” the texture directly through the plastic hand. In later rounds of testing the proxy hand was therefore controlled by the experimenter. Although this does not represent a realistic usage pattern for a prosthesis, it was adopted to isolate whether the vibrotactile motor alone allowed users to distinguish textures after training.

Discussion and Future Work

One of our key takeaways was a deeper appreciation for the level of detail perceived by our hands. As the experiment progressed, it quickly became apparent that our fingertips are adept at quickly discerning differences between multiple texture types. We initially started with an array of twelve textures that were easily distinguishable by hand (four sizes of spheres, three sizes of cotton balls, three grits of sand, and three diameters of jute rope). However, the amplitudes measured by the vibration sensor were only sufficient to distinguish roughly between surfaces of different coarseness. Representations of surface texture would likely have been improved by taking the frequency content of the vibration sensor signal into account and translating it into corresponding motor frequency and amplitude. However, this type of texture representation would require a more sophisticated actuator that can decouple frequency and amplitude (e.g., an LRA).

A more compelling design could add sensors that give subjects more freedom of movement and less dependence on a fixed speed and pressure. Specifically, a force sensor could be incorporated to normalize the user's downward pressure, and an accelerometer could be incorporated to account for velocity and remove the metronome constraint. A more sophisticated algorithm would take these parameters into account when translating texture into motor output.
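
A rough sketch of this proposed normalization is shown below. Since neither sensor made it into the final build, the pin assignments (an FSR on A1, one axis of an analog accelerometer such as an ADXL335 on A2) and the scaling formula are purely illustrative assumptions.

    const int VIB_PIN   = A0;   // vibration sensor (as in the prototype)
    const int FORCE_PIN = A1;   // hypothetical FSR measuring downward pressure
    const int ACCEL_PIN = A2;   // hypothetical analog accelerometer axis (e.g., ADXL335)

    void setup() {
      Serial.begin(9600);
    }

    void loop() {
      int vib   = analogRead(VIB_PIN);
      int force = analogRead(FORCE_PIN);
      int accel = analogRead(ACCEL_PIN);

      // Use the accelerometer's deviation from its assumed rest value (~512 counts at
      // mid-scale) as a rough proxy for movement intensity, and scale the vibration
      // reading by both pressure and movement so that pressing harder or sweeping
      // faster does not read as a rougher texture. The +1 terms avoid division by zero.
      float normalized = (float)vib / (force + 1) / (abs(accel - 512) + 1);
      Serial.println(normalized);
      delay(10);
    }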

The current algorithm uses a combination of peak detection and a running average to determine texture-coarseness thresholds. This approach produces the noticeable delay described above as well as imperfect texture detection. Future work would improve the algorithm so that the delay is reduced and texture is reproduced more accurately. For example, force sensor data could be used to detect the absence of downward force; if no force were detected, the system could conclude that the hand was in free space and shut off the motor promptly. Improving the fidelity of the reproduced texture would require researching texture algorithms that take frequency information into account, which in turn would require a more easily controllable actuator, such as an LRA, since the ERM motor could not keep up with rapid vibration changes.
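
As a sketch of what incorporating frequency information might look like, the snippet below estimates the dominant vibration frequency from zero crossings of the mean-removed sensor signal. This is an illustrative stand-in for a proper spectral analysis and was not part of the project firmware; the window length and sampling delay are assumptions.

    const int SENSOR_PIN = A0;
    const int N = 200;             // samples per analysis window
    const int SAMPLE_US = 500;     // delay between reads (analogRead adds ~0.1 ms on an Uno)

    void setup() {
      Serial.begin(9600);
    }

    void loop() {
      int samples[N];
      long sum = 0;
      unsigned long t0 = micros();
      for (int i = 0; i < N; i++) {
        samples[i] = analogRead(SENSOR_PIN);
        sum += samples[i];
        delayMicroseconds(SAMPLE_US);
      }
      float windowSec = (micros() - t0) / 1.0e6;   // actual window length in seconds
      float mean = (float)sum / N;

      // Count sign changes around the mean; each full cycle produces two crossings.
      int crossings = 0;
      for (int i = 1; i < N; i++) {
        bool prevAbove = samples[i - 1] > mean;
        bool currAbove = samples[i] > mean;
        if (prevAbove != currAbove) crossings++;
      }
      float freqHz = crossings / (2.0 * windowSec);
      Serial.println(freqHz);   // rough dominant-frequency estimate for this window
    }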

Despite the limited nature of the texture feedback provided by the ERM motor, preliminary results indicate that subjects were still able to successfully distinguish between the three thresholds of coarseness with minimal training. Further iterations of the design with an improved algorithm, sensors, and motor feedback could provide a more intuitive representation of remapped texture.

Links and Data

Below are the links for the major components of this project:

Vibration motor datasheet

https://cdn.sparkfun.com/datasheets/Robotics/B1034.FL45-00-015.pdf

Vibration sensor datasheet

http://cdn.sparkfun.com/datasheets/Sensors/ForceFlex/LDT_Series.pdf

Arduino Uno Documentation

https://www.arduino.cc/en/Main/ArduinoBoardUno

Data

Data Figure 1: Boxplots of running average data from one test session (fourteen textures).

Data Figure 2: Boxplots of raw data from one test session (Test User 1, fourteen textures tested).

Surface Textures

Test data was collected for fourteen textures. In addition to the twelve textures below (four different types of materials in different sizes), data was also collected for wood and foam board. Several textures resulted in similar amplitudes, so user testing was conducted with the subset described in the experimental procedure.

[Image: the twelve surface texture samples]

Code (Github):

https://github.com/nkalavak/Haptics-for-Texture-Identification–using-Prosthesis-

