At WWDC 2017, Apple introduced ARKit, a new framework in iOS 11 that lets developers create augmented reality experiences for iPhone and iPad apps. Apple expects the framework to take apps beyond the screen by letting them interact with the real world in completely new ways. It offers Xcode app templates, scale estimation, ambient lighting estimation, plane detection with basic boundaries, stable motion tracking, and much more.
Apple ARKit Highlights
- ARKit uses Visual Inertial Odometry (VIO) to accurately track the world around it. VIO fuses camera sensor data with CoreMotion data. These two inputs allow the device to sense how it moves within a room with a high degree of accuracy, and without any additional calibration.
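To see where VIO-based world tracking plugs in, here is a minimal sketch of starting an ARKit session in a view controller. It assumes an `ARSCNView` outlet named `sceneView`; the class and property names are illustrative, not from the source.

```swift
import ARKit

class ViewController: UIViewController {
    @IBOutlet var sceneView: ARSCNView!  // assumed to be wired up in a storyboard

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking runs VIO under the hood: camera frames fused
        // with CoreMotion data, no extra calibration step required.
        let configuration = ARWorldTrackingConfiguration()
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```

Pausing the session in `viewWillDisappear` stops camera and motion processing when the view goes off screen.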
- With ARKit, iPhone and iPad can analyze the scene presented by the camera view and find horizontal planes in the room. ARKit can detect horizontal planes like tables and floors, and can track and place objects on smaller feature points as well.
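Horizontal plane detection is opt-in on the session configuration, and detected planes arrive as anchors through the `ARSCNViewDelegate`. A hedged sketch (assuming the view controller from above is the scene view's delegate):

```swift
import ARKit

// Enable horizontal plane detection before running the session.
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = .horizontal
sceneView.session.run(configuration)

// ARSCNViewDelegate callback: called when ARKit detects a new plane
// (e.g. a table or floor) and adds an anchor for it.
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
    print("Detected horizontal plane with extent \(planeAnchor.extent)")
}
```

The `extent` grows as ARKit refines its estimate of the plane's boundaries over time.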
- ARKit also makes use of the camera sensor to estimate the total amount of light available in a scene and applies the correct amount of lighting to virtual objects.
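The light estimate is exposed per frame via `ARFrame.lightEstimate`. As a sketch, you could feed it into a SceneKit light so virtual objects match the room's lighting (the function and parameter names here are illustrative):

```swift
import ARKit

// Apply ARKit's ambient light estimate to a SceneKit light.
func updateLighting(in sceneView: ARSCNView, light: SCNLight) {
    guard let estimate = sceneView.session.currentFrame?.lightEstimate else { return }
    // ambientIntensity is in lumens; around 1000 is neutral lighting.
    light.intensity = estimate.ambientIntensity
}
```

In practice, `ARSCNView` can also do this for you automatically via its `automaticallyUpdatesLighting` property.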
- ARKit requires a device with an Apple A9 or A10 processor. You can take advantage of ARKit optimizations in Metal and SceneKit, as well as in third-party tools like Unity and Unreal Engine.
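With SceneKit as the renderer, placing a virtual object in the tracked scene is a few lines. A minimal sketch, assuming the `sceneView` from the earlier setup:

```swift
import ARKit
import SceneKit

// Add a 10 cm cube half a meter in front of the session's starting position.
// ARKit keeps it anchored in place as the device moves.
let box = SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0)
let node = SCNNode(geometry: box)
node.position = SCNVector3(0, 0, -0.5)  // ARKit world units are meters
sceneView.scene.rootNode.addChildNode(node)
```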
Share your views in the comments section below. We’d love to hear from you.
You might also like our TUTEZONE section, which contains exclusive articles on how you can make your life simpler using technology. Trust me, you’ll be glad you paid it a visit.