Google Shows Off Stunning New AR Features Coming to the Web and Mobile Apps Soon

ARCore, Google’s SDK for creating augmented reality experiences, launched in 2018. Since then, Google has been working quietly to enhance and improve its AR platform. The company now says it is ready to show off some of the next-generation features and upgrades it has been building, features that should make augmented reality experiences more immersive and realistic in the coming years.

The upgrades Google is bringing to ARCore will let app developers implement occlusion. Using the new Depth API, a virtual object placed in the real-world environment can be hidden from view by real physical objects. For example, if a virtual cat or dog is placed in your bedroom, the animal will disappear from view when your smartphone’s camera is angled so that a physical object like a chair, table, or bed comes between the camera and the virtual pet.
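
At its core, depth-based occlusion comes down to a per-pixel comparison: the virtual object is only drawn where it is closer to the camera than the real surface at that pixel. The Kotlin helper below is an illustrative sketch of that test, not Google’s implementation; the function name and parameters are hypothetical, and it assumes a real-world depth value sampled from the Depth API’s depth image (which stores distances in millimetres) and a virtual fragment depth in metres.

```kotlin
// Illustrative sketch of the per-pixel occlusion test behind depth-based AR.
// Names and parameters are hypothetical, not part of the ARCore API.
// realDepthMillimeters: depth of the real surface at this pixel, as sampled
//                       from the Depth API's depth image (millimetres).
// virtualDepthMeters:   distance from the camera to the virtual object at
//                       the same pixel, in metres.
fun shouldDrawVirtualFragment(realDepthMillimeters: Int, virtualDepthMeters: Float): Boolean {
    val realDepthMeters = realDepthMillimeters / 1000f
    // Draw the virtual fragment only if it sits in front of the real surface;
    // otherwise the chair, table, or bed "occludes" the virtual pet.
    return virtualDepthMeters < realDepthMeters
}
```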

The result is more realistic AR scenes. During an AR session, the smartphone can now distinguish far more reliably between objects in a scene and estimate the distance between them. According to Google, the new occlusion feature was achieved purely through software improvements, so it does not require a phone with special depth sensors or a particular type of processor. The depth computation happens on the device itself, with no dependency on the cloud: as long as your existing Android smartphone is compatible with ARCore, you can try these new features without any problem.
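
For devices that do support it, turning depth on in ARCore is a configuration change on the session rather than new hardware. The snippet below is a minimal Kotlin sketch based on the publicly documented ARCore Session and Config classes; exact method availability depends on the ARCore SDK version you ship against.

```kotlin
import com.google.ar.core.Config
import com.google.ar.core.Session

// Minimal sketch: enable the Depth API for an ARCore session if the device
// supports it. "session" is an already-created com.google.ar.core.Session.
fun enableDepthIfSupported(session: Session) {
    val config = session.config
    config.depthMode =
        if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
            // Depth is estimated on-device from the existing camera; no special
            // depth sensor or cloud processing is required.
            Config.DepthMode.AUTOMATIC
        } else {
            Config.DepthMode.DISABLED
        }
    session.configure(config)
}
```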

Many of us have caught a glimpse of occlusion on smartphones before. Remember Pokemon Go, the popular augmented reality game developed by Niantic? Back in July 2018, Niantic showed a video in which a small virtual Pikachu moved around an urban scene and blended convincingly with the real-world environment. That, however, was only a video, not a live demo that members of the press could try for themselves.

At a briefing arranged by Google’s augmented reality division, it was possible to see several real-time demonstrations the AR team had built to show how the new depth-based technology works. The technology is already being integrated into Houzz, the home design mobile app, as well as into AR results in Google Search.

The home décor items that Houzz displays through its ‘View in My Room 3D’ tool will now support occlusion as well. According to Google, more than 200 million Android devices will also get occlusion for objects that have AR models available in Google search results.

Google also said that the Depth API’s newer capabilities won’t be available in commercial applications just yet. They will be released to developers later, after Google works closely with Android developers and other partners to refine them further. These capabilities go a step beyond occlusion, into three-dimensional mapping and realistic physics.

Google has also worked out an effective way for virtual AR objects to interact more naturally with the real-world environment. A three-dimensional object moves through the real environment just as it would move through the physical space around you. In one demo, for example, blocks of different colours could be conjured out of thin air, and those colourful blocks would bounce off virtually any real surface.
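
To give a feel for what “bouncing off any surface” could mean in practice, here is a deliberately tiny Kotlin sketch of one way such an interaction might be modelled: a block falling away from the camera is stopped, and bounced back, when its depth reaches the real-world depth sampled from the depth map at its screen position. This is a toy illustration under those assumptions, not Google’s physics system, and all names are hypothetical.

```kotlin
// Toy illustration only: a virtual block moving along the camera's viewing ray.
// depthMeters is the block's distance from the camera; velocity is metres per
// second along that ray (positive = moving away from the camera).
data class Block(var depthMeters: Float, var velocityMetersPerSec: Float)

// surfaceDepthMeters: real-world depth at the block's screen position,
// sampled from the depth image and converted to metres.
fun step(block: Block, surfaceDepthMeters: Float, dtSeconds: Float) {
    block.depthMeters += block.velocityMetersPerSec * dtSeconds
    if (block.depthMeters >= surfaceDepthMeters) {
        // The block has reached the real surface: clamp it there and
        // bounce it back toward the camera with some energy loss.
        block.depthMeters = surfaceDepthMeters
        block.velocityMetersPerSec = -block.velocityMetersPerSec * 0.6f
    }
}
```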

Google also built a small game that showcased how AR objects can move over and around real-world objects. The mini-game featured a robo-chef that gets into a food fight with you; it takes the real-world décor of the room into account, and the sweet treats it throws leave realistic marks on the surfaces they hit.

Google has not revealed a specific timeline for when these upgraded APIs will be available to developers.