Photo from Ching-fu Chen
Researchers from UC San Diego and San Diego State University have developed a pair of "4-D goggles" intended to synchronize the feeling of being touched with visual input from a VR device. The creation is based on the neuroscience team's new study, which maps the regions of the brain dedicated to multisensory integration.
The researchers used functional magnetic resonance imaging (fMRI), which tracks oxygenated blood flow as an indicator of neural activity, to determine which brain areas were activated by a combination of visual and tactile input. Once the relevant regions were localized, the team could gauge which timing of sights and "touches" elicited the most activity.
The VR simulation was quite simple: subjects saw a ball in their virtual environment, which began moving toward them, and a puff of air was delivered to the same side of the face. When the ball started moving at the exact moment the air puff was delivered, subjects didn't perceive the two as being in sync. But delivering the air puff 800 to 1,000 milliseconds after the ball started moving did give the impression of synchrony, and the most brain activity occurred in response to these "lateralized" stimuli. This timing seems to mimic the real-world experience of seeing an object pass by and feeling the breeze from its movement just after. The ability to implement such fine details of stimulus perception could prove to be a big step for realism in VR games and other applications.
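The timing rule described above is simple enough to sketch in code. The snippet below is a hypothetical illustration only, not the study's actual software: the function name, trial structure, and use of a uniform random delay are all assumptions; the only figure taken from the article is the 800–1,000 ms delay range that subjects reportedly perceived as synchronous.

```python
import random

# Delay range (ms) between visual onset and air puff that the study
# reports subjects perceived as "in sync" (value from the article).
SYNC_DELAY_MS = (800, 1000)

def schedule_trial(ball_onset_ms: float) -> dict:
    """Return event times for one hypothetical trial: the ball starts
    moving at ball_onset_ms, and the air puff is scheduled after a
    delay drawn from the synchrony range."""
    delay = random.uniform(*SYNC_DELAY_MS)
    return {
        "ball_onset_ms": ball_onset_ms,
        "puff_onset_ms": ball_onset_ms + delay,
        "delay_ms": delay,
    }

trial = schedule_trial(0.0)
print(trial)
```

In a real VR system the puff would be triggered by a timer on the haptic device rather than computed up front, but the core idea is the same: the tactile event deliberately lags the visual one to match how we experience a passing object and its breeze.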