
IBM and Unity are teaming up to bring Watson’s AI to VR and AR games


IBM and Unity today announced a partnership to bring Watson’s AI functionality to the world’s most popular game engine, which ships with built-in VR/AR support.
The IBM Watson Unity SDK, available for free on the Unity Asset Store, gives Unity developers access to the Watson suite of AI functions. This means millions of developers can now integrate AI into VR and AR games with relative ease, directly within the game engine.
Popular games like Star Trek: Bridge Crew and Pokémon Go are developed in Unity, though few other VR or AR games have managed to crack the mainstream due to the high cost of hardware and a lack of premium titles.
This partnership could change that. With new devices set to enter the ‘mixed reality’ market soon, such as Microsoft’s HoloLens (for which software is developed almost exclusively in Unity) and Magic Leap’s goggles, neither of which requires a tethered computer, the VR/AR space is finally set to heat up.
Of course, we’ve all heard that before: 2016 and 2017 were each supposed to be the breakout year for VR, but we still haven’t seen mainstream adoption. We asked Michael Ludden, Director of Product at IBM Watson Developer Labs & AR/VR Labs, about the problem. He told TNW:
2016 was an immensely successful year for VR. The world’s first real VR/AR headsets hit the market, I mean you could say there was VR in the 90s, but not really … I think the problem was that VCs got very excited and poured a bunch of money into VR and AR. After development they were like “where’s my billions?”
The specific AR/VR use cases for Watson, the IBM AI that has done everything from powering experiences at the Grammy Awards to making its own movies, are numerous and go far beyond gaming.
Ludden pointed out, for example, that a practicing surgeon could stay immersed in a surgery simulator by using voice control. Having to turn and wait for menu popups, or to stop what you’re doing and grab a gamepad to navigate menus and change ‘tools’ during an exercise, breaks immersion.
With Watson on board, the same hypothetical surgery simulator would function much more like the real world. The user could simply say “Hand me a sponge,” and the game engine could process that command using Watson’s speech processing ability.
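As a rough illustration, a minimal Unity script could map a recognized transcript to an in-game action along these lines. This is a sketch, not the SDK’s actual API: the OnTranscript entry point, phrase table, and tool names are hypothetical, and the wiring that feeds it a transcript from Watson’s speech-to-text service is assumed rather than shown.

```csharp
// Minimal sketch: mapping a speech transcript to an in-game action in Unity.
// The transcript is assumed to arrive from a speech-to-text callback (e.g. via
// the Watson Unity SDK); OnTranscript and the tool names are hypothetical.
using System;
using System.Collections.Generic;
using UnityEngine;

public class VoiceToolSelector : MonoBehaviour
{
    // Hypothetical phrase -> action table for the surgery-simulator example.
    private Dictionary<string, Action> commands;

    private void Awake()
    {
        commands = new Dictionary<string, Action>(StringComparer.OrdinalIgnoreCase)
        {
            { "hand me a sponge",  () => EquipTool("Sponge") },
            { "hand me a scalpel", () => EquipTool("Scalpel") },
        };
    }

    // Called with the final transcript returned by the speech service.
    public void OnTranscript(string transcript)
    {
        string phrase = transcript.Trim();
        if (commands.TryGetValue(phrase, out Action action))
        {
            action();
        }
        else
        {
            Debug.Log($"Unrecognized command: {transcript}");
        }
    }

    private void EquipTool(string toolName)
    {
        // Placeholder: swap the active tool without opening a menu.
        Debug.Log($"Equipping {toolName}");
    }
}
```

In a real project, the speech service’s result callback would simply forward its best transcript to OnTranscript, keeping the player’s hands on the task instead of on a menu.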
Watson’s voice recognition, speech-to-text, and image recognition features make for a promising addition to the Unity game engine and will hopefully help propel VR/AR into the mainstream.

For more information on Watson for Unity, you can visit this page.
