If you have an iPhone 12 Pro or Pro Max, chances are you have heard a lot about LiDAR. That large black dot accompanying the triple camera system is the reason Apple claims improved camera performance. Before the iPhone 12 Pro series, Apple included the LiDAR sensor on the iPad Pro. In every formal presentation, Apple proudly shows off advanced AR apps that use the world as a canvas.
Based on Apple's claims, then, LiDAR seems to have plenty of applications in mobile devices. With leaks suggesting the arrival of LiDAR on all iPhone 13 models this year, it seems we need to get used to the idea of the LiDAR sensor in general. It is also expected that Android phones could eventually get a LiDAR sensor of their own.
So, what is LiDAR and why is it such a big deal? Let’s take a closer look.
What is LiDAR?
LiDAR is an abbreviation for Light Detection and Ranging, which in plain English means a sensor that uses lasers to measure distance. A LiDAR sensor fires laser pulses at surfaces and waits for the light to reflect back, measuring the delay in the process to map an environment digitally.
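To make the time-of-flight idea concrete, here is a minimal sketch of the underlying math in Python (an illustration of the principle only, not Apple's actual implementation):

```python
# Time-of-flight distance estimation, simplified: light travels out to a
# surface and back, so the one-way distance is half the round trip.
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_delay(round_trip_seconds: float) -> float:
    """Convert a measured round-trip delay into a one-way distance in meters."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A surface 5 meters away reflects light in roughly 33 nanoseconds:
delay = 2 * 5.0 / SPEED_OF_LIGHT
print(distance_from_delay(delay))  # ~5.0
```

The tiny delays involved are why LiDAR needs dedicated hardware rather than an ordinary camera sensor.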
The technology is similar to the infrared-based Face ID system, but LiDAR can scan distances of up to 5 meters. This allows the phone to perceive its surroundings and locate itself in space.
Are iPhones the only products to use LiDAR?
Honestly, LiDAR has applications in many aspects of our lives beyond sitting on the back of an iPhone or iPad. Many of the self-driving vehicles being tested on roads rely on LiDAR sensors to map the road ahead and help their onboard computers navigate (Tesla is a notable exception, relying on cameras instead). LiDAR has long been in use in autonomous vehicles.
LiDAR also has applications in the field of robotics as well as AR glasses, the latter using it to scan rooms and place virtual objects within their bounds.
How does LiDAR help the iPhone 12 Pro?
To put it simply, LiDAR is an efficient way to map depth. That means the iPhone 12 Pro can use it for camera focusing instead of reserving it for AR applications alone. On the Pro models, the LiDAR system helps with autofocus in low light and even with portrait mode photos, an area where the standard iPhone 12 struggles at times.
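As a rough illustration of why a depth map helps portrait mode, here is a toy Python sketch (the arrays, threshold, and crude "blur" are all made up for illustration; Apple's real pipeline is far more sophisticated). The idea is to keep pixels whose depth marks them as the subject and smooth everything behind them:

```python
import numpy as np

# Hypothetical 1D strip of pixel values plus a per-pixel depth map in meters.
# A real depth map is 2D, but the principle is the same.
pixels = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
depths = np.array([1.0, 1.2, 4.0, 4.5, 5.0])  # subject near, background far

def fake_portrait(pixels, depths, subject_max_depth=2.0):
    """Keep near pixels sharp; replace far pixels with a crude 'blur' (their mean)."""
    background = depths > subject_max_depth
    out = pixels.copy()
    out[background] = pixels[background].mean()
    return out

print(fake_portrait(pixels, depths))  # [10. 20. 40. 40. 40.]
```

Because LiDAR measures depth directly rather than inferring it from image contrast, this kind of subject/background split stays reliable even in low light.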
Then there's AR, which relies on this sensor to improve tracking accuracy. There are several apps on the App Store for scanning a room or offering reasonably accurate measuring tools. Moreover, several AR-based games can now make use of the dedicated hardware to improve their experiences.
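To give a sense of how a measuring app might use per-pixel depth, here is a hypothetical Python sketch based on a simple pinhole camera model (the focal lengths and principal point are made-up values, not real iPhone intrinsics). Two tapped pixels plus their LiDAR depths are back-projected into 3D, and the distance between the resulting points is measured:

```python
import numpy as np

# Assumed pinhole camera intrinsics (illustrative values only).
FX = FY = 500.0        # focal lengths in pixels
CX, CY = 320.0, 240.0  # principal point (image center)

def unproject(u, v, depth_m):
    """Back-project pixel (u, v) at a given depth into 3D camera space."""
    x = (u - CX) * depth_m / FX
    y = (v - CY) * depth_m / FY
    return np.array([x, y, depth_m])

p1 = unproject(320, 240, 2.0)  # tap on one end of a table, 2 m away
p2 = unproject(570, 240, 2.0)  # tap on the other end, same depth
print(np.linalg.norm(p2 - p1))  # 1.0 meter between the two points
```

Without a depth sensor, the depth values here would have to be estimated from camera motion, which is exactly where LiDAR makes such apps faster and more accurate.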
Could LiDAR become a necessity for phones in the future?
The chances of that happening are low. Apple has been pushing augmented reality hard across its ecosystem, and so far the benefits are fairly limited. Apart from AR-based games and some scanning apps, AR as a concept is yet to reach its full potential. Currently, AR in Apple's world is restricted to casual games, education, and basic scanning applications.
However, Google has proven that you don't need dedicated hardware to achieve similar results. On almost all modern Android phones, you can view AR figures in virtual space. Google's algorithms can rely on software intelligence and just a single camera sensor to power decent AR applications. Pokémon Go is a fine example that works just as well on an affordable Android device as on an iPhone with a LiDAR sensor. Of course, having the LiDAR sensor may help developers map the virtual world of Pokémon better in low-light conditions. Similarly, Google Maps relies on AR to show directions overlaid on the real world, if you enable it.
Hence, even though LiDAR seems useful, Android manufacturers could simply rely on Google's software algorithms to handle the necessary AR work.
Does that mean LiDAR is a gimmick?
At this stage, LiDAR has limited applications in the world of smartphones. Other than scanning 3D spaces and placing game elements, few use cases demand LiDAR hardware. That said, Apple's widespread adoption of LiDAR could give developers the incentive to make their apps rely more on this hardware to improve the end-user experience.