Mar 11, 2022

If you are an Apple enthusiast, or have just been paying close attention to recent headlines, there's a chance you've heard the word LiDAR come up a lot in recent announcements. While LiDAR as a technology has existed for well over 50 years, its adoption by Apple's iPhone lineup is undeniably one of the main reasons behind its recent popularization in headlines all over the world. Still, some of you might be wondering what it means for the company, and what LiDAR is in the first place.

The first thing we need to understand is that LiDAR is not an Apple-exclusive feature, but it is a technology the company is pushing heavily across its catalog, and with good reason. When we and Apple say that their new models have LiDAR technology, what it mostly means is that they come equipped with a LiDAR sensor in their camera array. If you have an iPhone 12 Pro, iPhone 13 Pro, or iPad Pro, you can easily identify this sensor as the black dot near the other lenses. This seemingly inconspicuous dot allows your camera to better identify depth, run Augmented Reality apps, and even increase the quality of your photos, so Apple's dedication to LiDAR iPhone integration is easily justified.

However, that does not completely answer what LiDAR is and what functions it provides for an Apple device, so let's unpack each question one at a time.

What is LiDAR?

The first thing we need to explain properly is what LiDAR technology is and how it is used in a practical setting, so let's begin with the basics. LiDAR is an acronym that stands for "Light Detection and Ranging", and while this name might be confusing at first, it covers everything you need to know about the technology.

Imagine for a second a traditional photograph. While modern photography is highly accurate and detailed, when seen as a source of information there is one element completely absent from it: depth. For both machines and humans, it is impossible to tell the accurate depth of an object relative to the camera, or even to other objects, which is a problem for more advanced tasks like computer modeling or face scanning.

LiDAR can best be seen as an answer to this challenge. A LiDAR system is designed to send light pulses towards its surroundings; these pulses bounce off nearby objects and back into the LiDAR sensor. By measuring how long each pulse takes to come back, the system can calculate the distance between the sensor and each point the light pulse bounced from. This ultimately gives the system X, Y, and Z coordinates for each spot in its surrounding area, which can then be converted into a proper 3D model of the scanned location.
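To make the arithmetic concrete, here is a minimal sketch of the round-trip calculation described above; the function name and the 20-nanosecond sample value are illustrative only.

```swift
import Foundation

/// Speed of light in a vacuum, in meters per second.
let speedOfLight = 299_792_458.0

/// Illustrative time-of-flight calculation: a pulse travels to the target
/// and back, so the one-way distance is half the round trip.
func distance(fromRoundTripTime t: TimeInterval) -> Double {
    return speedOfLight * t / 2.0
}

// A pulse that returns after 20 nanoseconds corresponds to a point
// roughly 3 meters away.
let meters = distance(fromRoundTripTime: 20e-9)
print(String(format: "%.2f m", meters)) // ≈ 3.00 m
```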

In short, if traditional photography is a way to obtain visual information, it is easier to think of LiDAR as a way to obtain spatial information. However, the most important detail to remember is that you can use both technologies together, and that is exactly what the LiDAR camera in the iPhone does.

How does the iPhone work with LiDAR?

So now that we have a clear understanding of how LiDAR works and what it is meant to do, let's go deeper into how Apple's take on LiDAR works. As we mentioned before, Apple's LiDAR is mostly present on its 2020 and later models, and its presence can be easily identified by taking a look at the camera array on your phone or tablet.

On top of the traditional lenses that all smartphones come with, you'll be able to find a small black circle towards the bottom of the array. This is the LiDAR sensor the iPhone is promoting, and it is the key to all the new functions Apple is bringing to the table.

When you use your camera or an AR-enabled application, the sensor activates and begins sending waves of light pulses towards its surroundings. These pulses are completely invisible to the human eye (though they can be detected with night vision) and as such won't cause any discomfort or negatively affect your photos. Each pulse then bounces back towards the source, and from this travel information, together with the device's infrared sensor, the phone can assign a proper measure of depth to each item in its photographs.
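For developers, a hedged sketch of how this depth data can be accessed: on LiDAR-equipped devices, ARKit exposes a per-pixel depth map through its sceneDepth frame semantic. The delegate wiring is omitted and the handle(frame:) helper is just an illustrative name.

```swift
import ARKit

// Minimal sketch: request per-pixel depth from ARKit on LiDAR-equipped devices.
let configuration = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
    configuration.frameSemantics.insert(.sceneDepth)
}

let session = ARSession()
session.run(configuration)

// In an ARSessionDelegate callback, each frame then carries a depth map
// (distances in meters) plus a per-pixel confidence map.
func handle(frame: ARFrame) {
    guard let depth = frame.sceneDepth else { return }
    let depthMap = depth.depthMap            // CVPixelBuffer of Float32 meters
    let confidenceMap = depth.confidenceMap  // CVPixelBuffer of ARConfidenceLevel values
    _ = (depthMap, confidenceMap)
}
```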

Reference: https://developer.apple.com/news/?id=qwhaoe0x

Application areas

By now we have covered everything you need to know about how the iPhone's LiDAR technology works, so it's time to get more practical and look at the main iPhone LiDAR uses that Apple has planned for its devices; after all, a technology is only as useful as its practical applications.

The first area where you'll be able to tell the difference is the iPhone's camera. Apple uses the information gathered with LiDAR to improve photo quality by analyzing the subjects in the frame and the ideal settings for each picture. Through LiDAR, an iPhone can more accurately identify what is in front of its camera and adjust its focus accordingly, even in low-light conditions, considerably improving the quality of your shots.

LiDAR also plays a large role when it comes to AR applications, and it is here that the full capabilities of the technology become readily apparent. AR apps simply load faster on Apple's LiDAR-equipped devices thanks to the richer input and the general speed of the technology. Since most AR apps need to build a map of their surroundings, it can take a traditional smartphone a considerable time to boot them up; Apple's LiDAR models, on the other hand, can complete this process in a much shorter window and enhance the AR experience from the very first minute.
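As a rough illustration of why this matters to developers, the sketch below opts an AR session into ARKit's LiDAR-driven scene reconstruction when the device supports it; the exact options an app enables will vary.

```swift
import ARKit
import RealityKit

// Rough sketch: opt into LiDAR-driven scene reconstruction so virtual content
// can react to real-world geometry almost as soon as the session starts.
let arView = ARView(frame: .zero)
let configuration = ARWorldTrackingConfiguration()

if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    // Only available on LiDAR-equipped devices such as the iPhone 12/13 Pro.
    configuration.sceneReconstruction = .mesh
}

// Let virtual objects be occluded by, and collide with, the scanned mesh.
arView.environment.sceneUnderstanding.options.insert(.occlusion)
arView.environment.sceneUnderstanding.options.insert(.physics)

arView.session.run(configuration)
```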

But perhaps the most exciting part is that 3D mapping is now viable right on an iPhone, and the potential applications of this advance are limitless. LiDAR and other 3D mapping technologies have long been essential for architecture, engineering, and geolocation. There was a time when novel ideas like Google Maps were only possible with dedicated cars carrying LiDAR equipment, but now everybody will be able to walk around and map with a LiDAR-enabled phone right in their pocket.
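To sketch what "walking and mapping" can look like in code, ARKit delivers the reconstructed environment as mesh anchors that an app can collect; the MeshCollector class below is a hypothetical, minimal example that simply counts scanned vertices.

```swift
import ARKit

// Hypothetical, minimal example: collect the mesh chunks ARKit produces
// while you walk around, here just counting the scanned vertices.
final class MeshCollector: NSObject, ARSessionDelegate {
    private(set) var vertexCount = 0

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let meshAnchor as ARMeshAnchor in anchors {
            // Each ARMeshAnchor carries one piece of scanned geometry:
            // vertices, normals, and triangle faces in its local coordinates.
            vertexCount += meshAnchor.geometry.vertices.count
        }
        print("Scanned \(vertexCount) vertices so far")
    }
}
```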

Dioram technology vs Apple LiDAR

While compact LiDAR sensors can indeed bring completely new possibilities to mobile devices, they have a number of significant limitations.

LiDAR sensors have a limited range, do not work well in direct sunlight, are significantly more expensive than the conventional visual sensors used for computer vision, and increase power consumption.

Using conventional camera sensors instead requires very complex algorithms and the latest scientific approaches in computer vision and machine learning. This is more difficult to develop, but this approach avoids the problems that come with LiDAR.

Dioram SLAM One, using only a conventional phone camera and inertial sensors (IMU), is able to generate a depth map with an accuracy comparable to Apple's LiDAR SLAM.

When a hybrid sensor fusion approach is implemented (tight or loose coupling of LiDAR and camera data), the quality of the map and the positioning accuracy increase even further!
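Purely as an illustration of the loose-coupling idea, and not a description of Dioram's actual pipeline, the toy sketch below blends a LiDAR depth reading and a camera-derived depth estimate for the same point, weighted by the confidence each source reports.

```swift
import Foundation

/// Toy illustration of loosely coupled depth fusion (not Dioram's actual
/// algorithm): two depth estimates for the same point are blended,
/// weighted by the confidence each source reports.
func fuseDepth(lidarDepth: Float, lidarConfidence: Float,
               cameraDepth: Float, cameraConfidence: Float) -> Float {
    let totalConfidence = lidarConfidence + cameraConfidence
    guard totalConfidence > 0 else { return 0 }
    return (lidarDepth * lidarConfidence + cameraDepth * cameraConfidence) / totalConfidence
}

// Example: a distant point where the LiDAR reading is weak but the
// learned camera estimate is confident.
let fused = fuseDepth(lidarDepth: 5.2, lidarConfidence: 0.2,
                      cameraDepth: 5.6, cameraConfidence: 0.8)
print(String(format: "Fused depth: %.2f m", fused)) // 5.52 m
```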