ARCore's New Depth API Will Bring The Next Generation Of Hide-and-Seek To Phones
AR apps on Android have always struggled to sense depth properly and distinguish between foreground and background in the physical world. Whenever an AR object is added, it is simply drawn on top of the entire scene in front of the viewer, whether or not something should realistically block your view of it. After a developer preview phase that began last year, Google has released its new Depth API for ARCore to all Android and Unity developers.
The new interface distinguishes real-world foreground from background so that digital objects are properly occluded. To build the depth map, the API uses a depth-from-motion algorithm similar to the one behind the Portrait mode bokeh in Google Camera. The other great feature enabled by depth sensing is simulated physics, such as the ability to throw a virtual object down a real staircase and watch it bounce realistically.
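The occlusion idea can be illustrated with a toy sketch (a hypothetical helper, not ARCore's actual API): for each pixel, the renderer compares the virtual object's distance from the camera against the depth map's estimate for the real scene, and draws the virtual pixel only where it is closer.

```python
def composite_with_occlusion(camera_row, depth_row, virtual_color, virtual_depth):
    """For one row of pixels, draw the virtual color only where the
    virtual object is closer to the camera than the real-world depth
    estimate for that pixel; otherwise keep the camera pixel."""
    return [virtual_color if virtual_depth < real_depth else real_color
            for real_color, real_depth in zip(camera_row, depth_row)]

# A virtual cube 1.5 m away, rendered over a row of real pixels:
# the wall at 2.0 m sits behind the cube, the chair at 1.0 m in front of it.
row = composite_with_occlusion(
    camera_row=["wall", "wall", "chair"],
    depth_row=[2.0, 2.0, 1.0],
    virtual_color="cube",
    virtual_depth=1.5,
)
# → ["cube", "cube", "chair"]
```

The chair pixel survives because its measured depth is smaller than the cube's, which is exactly why a virtual character can now disappear behind real furniture.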
In addition to Google Search and ARCore Depth Lab, a demo app built specifically to highlight the new API, the first product to receive an occlusion-enabled update is Houzz, an app that lets you furnish your home with AR furniture. There's also the TeamViewer Pilot app, which uses AR to remotely help those who aren't computer savvy. Five Nights at Freddy's is the first game to take advantage of the API, letting certain characters hide behind real-world objects to become even more frightening. Snapchat has also updated its Dancing Hotdog and Undersea World lenses to take advantage of occlusion.