Human beings exercise five senses in our everyday lives: sight or vision, hearing or audition, smell or olfaction, taste or gustation, and touch or tactition. This last sense, also known as tactile sensation, is an incredibly important part of how humans perceive their reality. While technology has allowed us to immerse ourselves in a world of sights and sounds from the comfort of our homes, there's something missing: touch.
In a recent study published in IEEE Transactions on Haptics, researchers at the USC Viterbi School of Engineering have developed a new method for computers to achieve that true texture, with the help of human beings. Called a preference-driven model, the framework uses our ability to distinguish between the details of certain textures as a tool to give their virtual counterparts a tune-up. The research was carried out by three USC Viterbi Ph.D. students in computer science, Shihan Lu, Mianlun Zheng and Matthew Fontaine, as well as Stefanos Nikolaidis, USC Viterbi assistant professor in computer science, and Heather Culbertson, USC Viterbi WiSE Gabilan Assistant Professor in Computer Science.
"We ask users to compare their feeling between the real texture and the virtual texture," Lu, the first author, explained. "The model then iteratively updates a virtual texture so that the virtual texture can match the real one in the end."
Using this preference-driven model, the user is first given a real texture, and the model randomly generates three virtual textures using dozens of variables, from which the user can then pick the one that feels the most similar to the real thing. Over time, the search adjusts its distribution of these variables as it gets closer and closer to what the user prefers. According to Fontaine, this method has an advantage over directly recording and "playing back" textures, as there's always a gap between what the computer reads and what we feel.
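The loop described above can be pictured with a toy sketch (my own illustration, not the team's actual model or code): treat each virtual texture as a small vector of rendering parameters, sample a few candidates around the current guess, let the user pick the one that feels closest, and nudge the sampling distribution toward that pick while narrowing the search. The parameter count, update rule, and `choose` function here are all assumptions for the sake of the demo.

```python
import numpy as np

def preference_search(choose, n_params=3, n_candidates=3, n_rounds=30, seed=0):
    """Toy preference-driven search over texture parameters.

    `choose` stands in for the human in the loop: given a list of
    candidate parameter vectors, it returns the index of the one
    that feels closest to the real texture.
    """
    rng = np.random.default_rng(seed)
    mean = np.zeros(n_params)   # current best guess for the texture parameters
    spread = 1.0                # how widely we sample around that guess
    for _ in range(n_rounds):
        # Generate a few candidate virtual textures around the current guess.
        candidates = [mean + spread * rng.standard_normal(n_params)
                      for _ in range(n_candidates)]
        preferred = candidates[choose(candidates)]
        # Shift the sampling distribution toward the preferred candidate
        # and narrow the search as it converges.
        mean = 0.5 * mean + 0.5 * preferred
        spread *= 0.9
    return mean

# Simulated user: always prefers the candidate nearest a hidden "real" texture.
real = np.array([-1.0, 0.0, 1.0])
simulated_user = lambda cands: int(
    np.argmin([np.linalg.norm(c - real) for c in cands]))
estimate = preference_search(simulated_user)
```

In the actual system the parameters drive a haptic rendering device and `choose` is a person comparing what they feel against the real material; the simulated user above merely lets the sketch run end to end.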
In the future, real textures may not even be required for the model, Lu explained. The way certain things in our lives feel is so intuitive that fine-tuning a texture to match that memory is something we can do inherently just by looking at a photo, without having the real texture for reference in front of us.
One current example of haptic technology is The Hug Project sponsored by Cox. As stated on their website, the project “gives the opportunity for loved ones to embrace with a virtual hug no matter the distance. Partnering with technology innovator, CuteCircuit, Cox manufactured an innovative wearable ‘HugShirt’ programmed to emulate a HUG. Haptic sensors in the shirt connect two people. Isolated individuals will be able to feel the touch of a husband or a grandchild just as if they are there with them.”
What further advancements will scientists make with haptic technology? Only time will tell, and this is why we science!
As always, keep doing science & keep looking up!
Sources: Sensory Trust, IEEE Transactions on Haptics, Cox (1), Cox (2)