Swedish robotics specialists at Gleechi are working on a system that lets game developers deliver realistic, real-time, dynamic interaction animations, so that your VR hands can finally grasp in-game objects the way they should.
For the last eight years, Gleechi has been working on enabling robots to use their hands more efficiently, and when VR made it big they realized that virtual reality has the same problem robots do with using hands. Existing workarounds in VR include hands that disappear, objects that jump into your hands, unrealistic gripping angles, and even hands clipping through the object itself while holding it.
Now, Gleechi claims to be well on its way to resolving these issues. The company says its VirtualGrasp technology removes the need for labor-intensive manual hand animation by using a “predictive and adaptive algorithm” that analyses the physical properties of a virtual object, then uses grasp taxonomies to determine the most appropriate and realistic grip for the in-game hand model and snaps the hand to that position. The software is still at an early stage, but it really does seem to work, and seeing it in action you realize just how poor most in-game interactions look. Gleechi explains the technology in this quote:
“Gleechi resolves these problems through a predictive and adaptive algorithm, taking into account physical constraints from the virtual objects and environments that the developers work with. With our award-winning software technology VirtualGrasp™ we fully automate the hand animation process.”
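For a rough sense of what taxonomy-based grasp selection might look like, here is a toy sketch. All class names, grasp labels, and thresholds below are invented for illustration; the actual VirtualGrasp algorithm is proprietary and far more sophisticated than a simple lookup.

```python
# Illustrative sketch only: a toy grasp selector in the spirit of the
# taxonomy-based approach described above. Every name and threshold here
# is hypothetical, not part of Gleechi's actual API.
from dataclasses import dataclass

@dataclass
class VirtualObject:
    shape: str          # e.g. "sphere", "cylinder", "box"
    diameter_cm: float  # characteristic size of the grasped region

def select_grasp(obj: VirtualObject) -> str:
    """Map an object's physical properties to a grasp type from a tiny
    taxonomy; the engine would then snap the hand model to a pose
    matching the chosen grasp."""
    if obj.diameter_cm < 2.0:
        return "precision-pinch"   # small items: thumb + index pinch
    if obj.shape == "sphere":
        return "spherical-grasp"   # balls: fingers spread around the surface
    return "power-grasp"           # default: whole-hand wrap, e.g. a handle

# Example: a tennis ball vs. a pen
print(select_grasp(VirtualObject("sphere", 6.7)))    # spherical-grasp
print(select_grasp(VirtualObject("cylinder", 0.9)))  # precision-pinch
```

The point of the real system is that this mapping is computed automatically from the object's geometry rather than hand-authored per object, which is what eliminates the manual animation work.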
The plan is to make VirtualGrasp available to other developers and to improve hand interaction in VR overall. Gleechi's view is that over the next few years VR will split into two separate directions: one aiming for more realistic worlds with fully functional hand control, and a gaming-focused one where developers concentrate on the game and gameplay but with less realistic hand control.