Google shows off stunning new AR features coming to web and mobile apps soon

Google has been quietly working to enhance its augmented reality platform, ARCore, since its official launch early last year. Now, the company says it's ready to unveil a number of next-generation upgrades to depth detection and physics, which promise to make AR experiences seem much more realistic in the future.

The upgrades, part of ARCore's all-new Depth API, will soon allow developers to perform what's known as occlusion, which is when a synthetic object can be blocked from view by other real-world objects in a scene. Place a virtual cat in your living room, for example, and you'll see it disappear from view when you angle your camera in a way that puts a bed, a table, or another object in between.
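For developers, the switch itself is small: occlusion rides on a depth mode that apps opt into on the ARCore session, and the heavy lifting happens inside the platform. Here's a minimal sketch in Kotlin of what enabling it looks like, assuming an Android app that already holds an ARCore Session (the class and method names follow ARCore's published API, but the surrounding app code is assumed):

```kotlin
import com.google.ar.core.Config
import com.google.ar.core.Session

// Sketch: turn on ARCore's automatic depth mode when the device
// supports it, so occlusion can draw on the resulting depth data.
fun enableDepthIfSupported(session: Session) {
    val config = session.config
    // Depth works on many, but not all, ARCore-certified phones.
    if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
        config.depthMode = Config.DepthMode.AUTOMATIC
        session.configure(config)
    }
}
```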

The result is a more believable scene, because the depth detection happening under the hood means your smartphone better understands every object in a scene and how far apart each object is from the others. Google says it's able to do this by optimizing existing software, so you won't need a phone with a particular sensor or type of processor. It also all happens on the device itself, without relying on any help from the cloud. So long as you have a phone that supports ARCore, which is just about every new Android phone released in the past few years, you'll be able to access these new features.
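Under that depth mode, each camera frame comes paired with a depth image that maps every pixel to an estimated distance. As a rough illustration, assuming the DEPTH16 image format that Android defines (the lower 13 bits of each sample hold the range in millimeters, the upper 3 bits a confidence value), reading one sample might look like this:

```kotlin
import android.media.Image
import java.nio.ByteOrder

// Sketch: read the estimated distance, in millimeters, at pixel (x, y)
// of a DEPTH16 depth image such as one acquired from an AR frame.
fun depthMillimetersAt(depthImage: Image, x: Int, y: Int): Int {
    val plane = depthImage.planes[0]
    val byteIndex = x * plane.pixelStride + y * plane.rowStride
    val sample = plane.buffer.order(ByteOrder.LITTLE_ENDIAN).getShort(byteIndex)
    // DEPTH16: lower 13 bits are range in millimeters, upper 3 bits confidence.
    return sample.toInt() and 0x1FFF
}
```

Occlusion then falls out naturally: if the real-world depth at a pixel is closer than the virtual object's depth at that same pixel, the renderer simply skips drawing that part of the object.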

We've seen occlusion on mobile phones before. Pokémon Go creator Niantic showed off a video of an occlusion demo featuring a little virtual Pikachu darting around an urban plaza, dashing in between objects and blending seamlessly with the environment. But it was just a video, not a demo that members of the press could see running on a device and operating in real time.

During a meeting with members of Google's AR division, we were able to play around with real-time demos the team built to show off the new depth technology. Granted, it was in a test environment Google had set up for the demo, but the technology does work. In fact, it'll be available as part of updates to a home design app and Google's own AR in Search feature.

We also got to check out some demos of the Depth API's new capabilities that won't be showing up in commercial apps or services yet, but Google says those advancements will be made available to developers in the future, after it works more closely with developers and other collaborators to polish some of its approaches.

These go beyond occlusion and into more realistic physics and 3D mapping. Google has developed a way for AR objects to interact with the real world more realistically, move through an environment the way a real-world 3D object would, and interact with surfaces the way you might expect physical matter to. For example, in the demo we got to experience, we were able to create colorful shaped blocks out of thin air that would bounce off virtually any surface, even the handlebars of an exercise bike.
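Google hasn't published how its demos implement this, but the basic idea a depth map enables is easy to sketch: real-world surfaces become geometry that virtual objects can collide with. Purely as an illustration, with every name below hypothetical and no ARCore dependencies, a toy version of a bounce along the camera's depth axis could look like this:

```kotlin
// Toy illustration only: a virtual object moving away from the camera
// bounces when it would pass through real-world geometry. depthOfScene
// stands in for a lookup into the device's depth map; all names here
// are hypothetical, not part of any Google API.
data class Block(var z: Float, var vz: Float)

fun step(block: Block, depthOfScene: Float, dt: Float) {
    block.z += block.vz * dt
    if (block.z >= depthOfScene) {   // reached a real surface
        block.z = depthOfScene
        block.vz = -block.vz * 0.8f  // bounce back with some energy loss
    }
}
```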

Google also made a mini-game of sorts showing off the ability of AR objects to move through an environment by going around and over real-world objects, as well as the new Depth API's surface interaction capabilities. It involved a cooking robot that engages in a food fight with you, one that takes into account the furniture and walls of the environment, with desserts leaving realistic splatters on surfaces.

Google isn't making these demos available to the public, but the company says it hopes app makers will build similarly improved experiences once it's ready to release the updated Depth API to all developers. The company doesn't have a timeline for when it expects to release the toolset more broadly, but it's likely these capabilities will start appearing in apps and AR web experiences sometime next year.