At its event on October 13, 2020, Apple introduced the new iPhone 12, whose Pro and Pro Max versions bring the general public a step further into the world of augmented reality thanks to the integration of LiDAR technology and 5G.


LiDAR technology and sensors: what it’s all about


LiDAR technology works like an environment scanner, quickly and comprehensively mapping our surroundings. It can capture up to 100 images at once and automatically estimate the physical dimensions of the objects framed by the phone’s camera. Apple devices with a LiDAR scanner give developers access to ARKit’s Depth API, enabling seamless blending of real-world context and virtual objects in the environment. LiDAR becomes even more powerful when combined with the 5G connectivity of the iPhone 12 Pro, which opens up additional opportunities such as streaming the resulting scans.
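
For developers, the sketch below gives a rough idea of what using that depth data can look like, assuming a LiDAR-equipped device and ARKit’s scene-depth frame semantics; the `DepthReader` class name is purely illustrative.

```swift
import ARKit

/// Minimal sketch: reading LiDAR depth data through ARKit's Depth API.
/// `DepthReader` is an illustrative name, not an Apple class.
final class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Scene depth is only supported on LiDAR-equipped devices (e.g. iPhone 12 Pro / Pro Max).
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }

        let configuration = ARWorldTrackingConfiguration()
        configuration.frameSemantics.insert(.sceneDepth)   // ask ARKit for per-pixel LiDAR depth
        session.delegate = self
        session.run(configuration)
    }

    // Called for every camera frame; sceneDepth carries a per-pixel depth map in meters.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depth = frame.sceneDepth else { return }
        let depthMap: CVPixelBuffer = depth.depthMap       // 32-bit float distances
        let confidence = depth.confidenceMap               // optional per-pixel confidence
        // ... use depthMap / confidence to occlude, measure or reconstruct the scene ...
        _ = (depthMap, confidence)
    }
}
```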


How does LiDAR technology work?


It works on the time-of-flight principle: the scanner emits pulses of light, and its sensors measure the quantity and quality of the returning light as well as the time each pulse takes to reach an object and bounce back. There is no longer any need to wave the phone around so that the iPhone can find a flat surface on which to place virtual objects, and lighting conditions no longer pose a problem for detecting the surrounding space.
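
The underlying arithmetic is simple: since each pulse travels to the object and back, the distance is half the round-trip time multiplied by the speed of light. A minimal Swift illustration of the principle (not Apple’s implementation):

```swift
import Foundation

/// Speed of light in a vacuum, in meters per second.
let speedOfLight = 299_792_458.0

/// Distance to an object from the measured round-trip time of a light pulse.
/// The pulse travels to the object and back, so the one-way distance is half the path.
func distance(fromRoundTripTime seconds: Double) -> Double {
    return speedOfLight * seconds / 2.0
}

// Example: a pulse that returns after 33 nanoseconds
// corresponds to an object roughly 5 meters away.
let meters = distance(fromRoundTripTime: 33e-9)   // ≈ 4.95 m
print(String(format: "%.2f m", meters))
```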


LiDAR sensors on the iPhone 12


What is the impact of LiDAR on our augmented reality experiences?


A first, incidental benefit of the LiDAR sensor concerns photography. The depth information brings the iPhone’s cameras closer to DSLR-grade focusing, enabling better autofocus and the ability to “see in the dark”, capabilities that also play a fundamental role in augmented reality.

A second aspect relates to the integration of LiDAR with Snapchat, whose filters will go beyond the face and be enriched with additional augmented content.

But, most importantly, with the iPhone 12 and LiDAR it will be possible to enjoy far more immersive augmented reality content. Until now, 3D objects have essentially been overlaid between our phone’s screen and the image captured by the camera. Now, thanks to per-pixel depth, digital content can be placed in front of or behind people and objects, with the real world correctly hiding whatever sits behind it.
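
For developers, this kind of occlusion is exposed through ARKit and RealityKit on LiDAR devices. The sketch below shows one plausible way to switch it on, assuming a RealityKit `ARView`; the function name is illustrative.

```swift
import ARKit
import RealityKit

/// Illustrative helper: configure an ARView so that real people and surfaces
/// can hide virtual content placed behind them.
func configureOcclusion(for arView: ARView) {
    let configuration = ARWorldTrackingConfiguration()

    // LiDAR scene reconstruction gives ARKit a mesh of the real environment.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        configuration.sceneReconstruction = .mesh
    }

    // People standing in front of virtual content should hide it, using depth data.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
        configuration.frameSemantics.insert(.personSegmentationWithDepth)
    }

    // Ask RealityKit to occlude virtual objects behind the reconstructed real-world mesh.
    arView.environment.sceneUnderstanding.options.insert(.occlusion)

    arView.session.run(configuration)
}
```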

A final technical note concerns the Neural Engine of the new A14 Bionic chip, which is six times more powerful than its predecessor. This allows faster surface tracking for augmented reality, further enhancing the user experience.

Only the iPhone 12 Pro and iPhone 12 Pro Max (available from November 13) are equipped with LiDAR. Apple’s direction, however, is clear: bring this capability to every iPhone model and, most likely, to the much-anticipated augmented reality glasses expected in early 2022.


LiDAR sensors: cameras and scanner on the iPhone 12


The benefits of LiDAR sensors, in brief


Until now, we have experienced augmented reality (AR) mainly as a form of entertainment for tech enthusiasts. Thanks to the latest generation of smartphones and the devices to come, we are approaching the point where AR becomes an integral part of everyday life. It is no coincidence that Apple, along with other vendors, is taking the next step beyond the phone with its own augmented reality glasses, expected to incorporate LiDAR scanning and the ARKit libraries that ease the development of immersive experiences across many fields. But even with the new iPhone 12 Pro and Pro Max, we can already imagine exciting new scenarios.

The publishing industry, for example, has long been seeking an AR solution to support new services and experiences. Advanced text recognition through the phone’s camera could allow publishers to add extra content without requiring a dedicated application: readers could attach bookmarks or personalized augmented reality notes to their books, whether physical or digital.
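
On Apple devices, text recognition of this kind is already available on-device through the Vision framework. The sketch below shows only the recognition step, assuming a captured page image; the function name and threading are illustrative, and anchoring AR content to the recognized passages is left out.

```swift
import Foundation
import CoreGraphics
import Vision

// Recognize printed text in a captured page image; the matched lines could then be used
// to anchor augmented content next to the corresponding passage.
func recognizeText(in image: CGImage, completion: @escaping ([String]) -> Void) {
    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(lines)
    }
    request.recognitionLevel = .accurate   // favour accuracy over speed for book pages

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```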

Engineers in the automotive industry have been studying intelligent environment-mapping solutions for several years so that autonomous vehicles can navigate without human intervention. Augmented reality is also expected to move inside the vehicle, directly onto the dashboard, enabling augmented driving with instructions displayed on the windshield and no need for a separate headset.

In the retail sector, LiDAR sensors will further enhance the user experience and the virtual selection of items before a visit to the physical store. Ikea has already made strides in this area with an app that lets users place furniture virtually in their own homes, with precise measurements of the available space. LiDAR will simplify the rendering of these environments, adding value to the shopping experience.
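
To give an idea of how such an app might measure a space with LiDAR-backed ARKit, the sketch below raycasts two screen points (for example, two user taps) onto real surfaces and returns the distance between the hits; the function name and the surrounding tap handling are assumptions.

```swift
import ARKit
import RealityKit
import simd

// Rough measuring sketch: raycast from two screen points into the real-world geometry
// and return the distance between the hits, in meters. With the LiDAR scanner,
// .estimatedPlane raycasts resolve quickly and accurately.
func distanceBetween(_ pointA: CGPoint, _ pointB: CGPoint, in arView: ARView) -> Float? {
    guard
        let queryA = arView.makeRaycastQuery(from: pointA, allowing: .estimatedPlane, alignment: .any),
        let queryB = arView.makeRaycastQuery(from: pointB, allowing: .estimatedPlane, alignment: .any),
        let hitA = arView.session.raycast(queryA).first,
        let hitB = arView.session.raycast(queryB).first
    else { return nil }

    // The translation column of each world transform is the hit position in meters.
    let positionA = SIMD3<Float>(hitA.worldTransform.columns.3.x,
                                 hitA.worldTransform.columns.3.y,
                                 hitA.worldTransform.columns.3.z)
    let positionB = SIMD3<Float>(hitB.worldTransform.columns.3.x,
                                 hitB.worldTransform.columns.3.y,
                                 hitB.worldTransform.columns.3.z)
    return simd_distance(positionA, positionB)
}
```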

Furthermore, shopping itself will be transformed. Assisted by LiDAR sensors, grocery shopping could change dramatically: imagine scanning any product in the supermarket and receiving additional information such as its origin, price, and popularity among consumers, or seeing the loyalty points associated with it through a rewards program. You could even visualize all the ingredients of the bakery or deli products on display.

We will witness an imminent evolution of augmented reality glasses in the industrial sector as well. With the precision of an advanced scanner like LiDAR, it will be possible to provide better remote assistance for machinery repairs. The same applies to the medical field.

If LiDAR can help recognize objects from a simple photograph, we can expect developments in the film industry as well. Soon, we may see additional content that is not necessarily narrated by the story’s protagonists.


Apple is striving to secure an important role in the immersive technology landscape by incorporating advanced technologies into the tools we use every day. As consumerization has shown many times before, we believe that through small steps, we will increasingly learn to overlay the real and digital dimensions and navigate hybrid environments. Companies need to prepare early on to understand how to capitalize on a phenomenon destined to become pervasive.