Apple reveals ARKit 3 with RealityKit and Reality Composer

Apple used the previous two Worldwide Developers Conferences as a platform to showcase its powerful augmented reality SDKs for iOS, ARKit and ARKit 2, along with first-hand demonstrations. At WWDC 2019, Apple unveiled a new set of AR products: ARKit 3, RealityKit, and Reality Composer. Together, these tools make it easier for developers to build AR apps.

Let us see what these three powerful tools are all about.


RealityKit

RealityKit is a sophisticated framework that lets iOS developers blend virtual elements into the real-world environment with ease. Virtual objects are scaled automatically to fit the screen sizes of different iOS devices and work seamlessly on each one, even in shared, multi-user AR experiences. RealityKit gives developers access to high-end capabilities such as spatial audio, animation, physics simulation, and photorealistic rendering. Its native ARKit integration and Swift API make building augmented reality experiences quicker and easier than before.
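As a rough sketch of how concise the RealityKit Swift API is, the snippet below creates an `ARView`, generates a simple box model, and anchors it to a detected horizontal plane (the size, color, and anchor choice here are illustrative, not from the original article):

```swift
import ARKit
import RealityKit
import UIKit

// Create the RealityKit view that renders the AR scene.
let arView = ARView(frame: .zero)

// Anchor content to the first horizontal plane ARKit detects.
let anchor = AnchorEntity(plane: .horizontal)

// Generate a small blue box with a simple non-metallic material.
let box = ModelEntity(
    mesh: .generateBox(size: 0.1),
    materials: [SimpleMaterial(color: .blue, isMetallic: false)]
)

// Attach the box to the anchor and add the anchor to the scene.
anchor.addChild(box)
arView.scene.addAnchor(anchor)
```

RealityKit handles rendering, lighting, and tracking behind the scenes; the developer mostly composes entities and anchors as shown above.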

ARKit 3

It is a disruptive AR platform with state-of-the-art capabilities. The two standout additions in ARKit 3 are people occlusion and real-time motion capture. ARKit 3 also makes tracking multiple faces a reality: up to three faces can be tracked at the same time using the TrueDepth camera available on supported iOS devices. In addition, simultaneous world and face tracking on the back and front cameras at once opens up a new world of possibilities for AR developers. The people occlusion feature delivers a more realistic experience to users and makes things easier for developers.
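The features above are opted into through ARKit session configurations. A minimal sketch, assuming an ARKit 3 (iOS 13) device, might enable multi-face tracking and people occlusion like this:

```swift
import ARKit

// Track up to three faces at once using the TrueDepth camera.
let faceConfig = ARFaceTrackingConfiguration()
faceConfig.maximumNumberOfTrackedFaces = 3

// People occlusion: let real people pass in front of virtual content.
// Check device support before enabling the frame semantics option.
let worldConfig = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    worldConfig.frameSemantics.insert(.personSegmentationWithDepth)
}
```

Each configuration is then passed to an `ARSession` via `session.run(_:)`; only one configuration runs at a time, which is why the simultaneous front/back tracking support in ARKit 3 is notable.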

Reality Composer

It is a robust tool that enables developers to create highly interactive and engaging augmented reality experiences. It can convert existing virtual objects and 3D models into the USDZ file format, which allows those models to work seamlessly on all AR-enabled iOS devices. Developers can choose the actions that occur when a user taps a virtual object or comes into close proximity to it, and they can also make use of spatial audio. Reality Composer can be run from within Xcode or as a standalone iOS app.
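When a Reality Composer project is added to an Xcode target, Xcode generates Swift loading code for each scene. As a hedged sketch, assuming a project file named `Experience.rcproject` containing a scene called "Box" (the default Xcode AR app template names, not ones given in the article), loading the scene looks roughly like this:

```swift
import RealityKit

// "Experience" and "Box" come from the .rcproject file name and scene name;
// Xcode generates the Experience type and its load methods automatically.
do {
    let boxAnchor = try Experience.loadBox()
    // Add the authored scene (with its behaviors and triggers) to the view.
    arView.scene.anchors.append(boxAnchor)
} catch {
    print("Failed to load Reality Composer scene: \(error)")
}
```

The behaviors authored in Reality Composer, such as tap or proximity triggers, are bundled into the loaded anchor, so no extra code is needed to wire them up.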