What the Hell is LiDAR?

Abstract: There is a rumour going around that Apple will allegedly introduce LiDAR sensors into the iPhone 13. In this article, I want to debunk that rumour. If Apple is reasonable, they won’t introduce LiDAR sensors into their iPhones. Here, I give you the reasons why that is the case: I explain what LiDAR sensors are, why the rumour might have begun, and why it doesn’t hold.


A few minutes ago, the most recent Apple keynote ended, kicking off a week of the annual WWDC (Worldwide Developers Conference). I had been meaning to write this article last week already, but because I managed to compose another iteration of my “How I Work” series, a piece on organisation made the cut instead. However, I’m freshly inspired by Apple’s keynote (and still weirded out by this American tendency to wrap everything in “awesomeness”) and really want to get this out: Why the f*** is there a rumour that Apple will (allegedly) include a LiDAR sensor in its newest iPhones?!

What is LiDAR?

LiDAR is an acronym for “Light Detection and Ranging”, and its resemblance to RADAR (“Radio Detection and Ranging”) is no coincidence. While radars use radio waves to determine the distance and speed of objects (such as aircraft, the most common usage), LiDAR uses light. The principle is very simple: the sensor sends out a light pulse and measures the time it takes the beam to be reflected off some surface and return to the emitter. As you might know, light travels at approximately 300,000 kilometres per second, so if the pulse takes one microsecond to get back to you, you know that it has been reflected off of something roughly 150 metres away (it travelled about 300 metres in total, there and back).
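If you want to see just how little maths is involved, here is a minimal Python sketch of that calculation (the one-microsecond echo is purely an illustrative number, not the spec of any real sensor):

    # Time-of-flight ranging: half the round-trip path is the distance.
    SPEED_OF_LIGHT = 299_792_458  # metres per second

    def distance_from_echo(round_trip_seconds: float) -> float:
        """Distance to the reflecting surface for a given round-trip time."""
        # The pulse travels out and back, so halve the total path length.
        return SPEED_OF_LIGHT * round_trip_seconds / 2

    print(distance_from_echo(1e-6))  # ~149.9 m for a one-microsecond echo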

The technology has three distinct weaknesses that are intrinsic and cannot be overcome with any update: first, it is quite old; second, it’s difficult to handle, especially in automated contexts; and lastly, it’s very expensive. I will be referring a lot to a report I wrote last year for the Institute for Peace Research and Security Policy Hamburg (IFSH). In case you didn’t know: my roots are not just in Sociology, but also in Peace and Conflict Research, so I spent a year researching military software and hardware! You can find the report online here (unfortunately, in German only – but Google Translate should do the job).

Let us begin with the first weakness: age. The technology is pretty old, originating somewhere in the 1960s. Thus, it might not surprise you to hear that it is heavily used for military ends: missiles, jets, and other military assets regularly use laser ranging to find and target enemy vehicles. You might have heard of “laser-guided missiles” – a clear indicator that this kind of technology is at work. Of course, it is also used for civilian ends. One of the most prominent use cases is the laser distance meter for estimating distances (e.g. the size of rooms); such devices are available in many tool shops.

LiDAR, however, is (at least when deployed at scale) much more difficult to handle than radar. The benefit of radar as opposed to LiDAR is that radar can send out a signal that covers much more space: a radar can literally shoot its radio waves into the void, and anything inside that coverage will reflect the waves back to the sensor, which can then tell you what’s out there. A LiDAR sensor, by contrast, is restricted to a single, directed beam of light. In other words: if you want to hide something from a LiDAR sensor, it’s easy – just hope that your adversary doesn’t point the LiDAR directly at you. Hiding from radar is much more difficult.
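To put a rough number on that difference: the share of its surroundings that a sensor can “see” follows directly from the solid angle of its beam. A quick back-of-the-envelope sketch in Python (the 0.25° beam half-angle is a made-up figure for illustration):

    import math

    def sphere_fraction(half_angle_deg: float) -> float:
        """Fraction of the full sphere covered by a cone-shaped beam."""
        # Solid angle of a cone: Omega = 2 * pi * (1 - cos(theta)),
        # divided by the full sphere's 4 * pi.
        theta = math.radians(half_angle_deg)
        return (1 - math.cos(theta)) / 2

    print(f"{sphere_fraction(0.25):.1e}")   # ~4.8e-06: a narrow laser beam
    print(f"{sphere_fraction(180.0):.2f}")  # 1.00: radial emission, all around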

If that example is too abstract, here’s a handy experiment (I hope that it’s dark wherever you are): turn on the flashlight of your smartphone. What area does the flashlight cover? Only the part in front of your smartphone, right? Now take a look at the signal bars of your smartphone (please don’t be in a basement for this): it should tell you it has signal. But the broadcasting tower or router to which your smartphone is connected is probably at a very different angle. (If you’re pointing your smartphone at your router right now, please turn around for a second. Do you still have signal? Yes? I thought so.) Your flashlight uses light (obviously), but WiFi and cellular networks use radio waves, and radio waves have the benefit that they are not restricted to a certain direction. Ever wondered where the name comes from? It goes back to the Latin radius, a ray – and fittingly, these waves radiate out in all directions.

The last drawback of LiDAR sensors is that they are incredibly expensive. I just did a quick search on Amazon, and I wasn’t disappointed: while gyroscopes (the sensors that estimate the rotation of your smartphone and are used on every aircraft to stabilise it) come in four-packs for ten bucks, a single, low-quality LiDAR sensor costs at least forty dollars. LiDAR is thus very expensive, especially given Apple’s quality standards.

Will Apple Introduce LiDAR in Their iPhones?

Given all the info above, will Apple actually introduce LiDAR sensors in the iPhone 13? Honestly, I hope they don’t. I mean, it won’t really impact their insane prices either way, but it simply isn’t necessary anymore. The reason is that you can use a much cheaper sensor to measure distances: a camera.

“But how can you measure distances with camera images? Aren’t these only two-dimensional?!”

Two words: Machine Learning.

Okay, that probably didn’t sound as cool to you as it did in my head, but yes: machine learning is the answer here. And the best part is, you don’t even need machine learning if all you want is a crude approximation instead of measuring to the millimetre. If you have an iPhone, feel free to open the “Measure” app on your phone. It will make use of your iPhone’s camera and allow you to set a start and an end point, and the iPhone will give you a pretty accurate measurement of the distance between those two points. But how does it do that?! Well, if you hold your iPhone still while opening the app, it will ask you to move your iPhone around for a bit before you can start to mark your points. The reason is that the iPhone uses so-called parallax effects to measure distances.

You know parallax effects from travelling by car or train (or, sometimes, even plane): objects that are farther away appear to move more slowly than things directly next to the street or railway. From a plane, you will notice that the tips of tall(er) objects move a tad faster than their base. By calculating how several objects shift in the camera view from frame to frame, your iPhone can estimate how far away they are, and thus let you measure the distance between points using nothing but the camera.
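At its core, that is plain triangulation. Under a simple pinhole-camera model, the relation between the pixel shift (disparity) of a feature and its depth is a one-liner; every number in this sketch is made up for illustration:

    def depth_from_disparity(focal_px: float, baseline_m: float,
                             disparity_px: float) -> float:
        """Pinhole-camera relation: depth = focal length * baseline / disparity."""
        return focal_px * baseline_m / disparity_px

    # A feature shifts 12 px between two frames while the phone moves 5 cm
    # sideways; with a focal length of ~1500 px, it sits ~6.25 m away.
    print(depth_from_disparity(focal_px=1500, baseline_m=0.05, disparity_px=12))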

Using machine learning, you can increase this precision by a huge margin. Tesla introduced this technology back in 2019 (see, again, my report here, page 17), and it allows their cars to navigate fairly well using distance measurements taken continuously by analysing the camera images. Basically, machine learning is used here to step up distance measuring: instead of applying simple formulae to the differences between frames, the model can take into account far more parameters that might help.
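To get a feeling for what such a network does, here is a hedged sketch using MiDaS, an openly available monocular depth-estimation model published on torch.hub. To be clear, this is a public stand-in for the general idea, not Tesla’s or Apple’s in-house model, and “frame.jpg” is just a placeholder file name:

    import cv2
    import torch

    # Load the small MiDaS model and its matching input transforms.
    midas = torch.hub.load("intel-isl/MiDaS", "MiDaS_small")
    midas.eval()
    transforms = torch.hub.load("intel-isl/MiDaS", "transforms")

    # Read a single camera frame (placeholder path) and prepare it.
    img = cv2.cvtColor(cv2.imread("frame.jpg"), cv2.COLOR_BGR2RGB)
    batch = transforms.small_transform(img)

    with torch.no_grad():
        prediction = midas(batch)  # one relative depth value per pixel

    depth_map = prediction.squeeze().numpy()
    print(depth_map.shape)  # a dense (relative) depth estimate from one image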

And iPhones already have the computing power necessary to facilitate this: with the so-called “Neural Engine” (a fancy name for a dedicated machine-learning chip), they are capable of analysing camera images using machine learning. As such, all that’s necessary to get distance measurements as accurate as a LiDAR sensor’s is a simple software update. It is, admittedly, difficult to design and train such a neural network, but as Tesla has proven, it’s certainly possible. So if Apple isn’t completely off here, they will not introduce LiDAR into their iPhones, and will instead ship a model that enables your iPhone to take accurate distance measurements using only the camera.
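That “simple software update” is not hand-waving, either: Apple’s own coremltools library converts a trained network into a Core ML model that the system may then schedule onto the Neural Engine. A minimal sketch, with a tiny placeholder network standing in for a real depth model:

    import coremltools as ct
    import torch

    # Tiny placeholder network; a real depth model would go here.
    model = torch.nn.Sequential(torch.nn.Conv2d(3, 1, kernel_size=3, padding=1))
    model.eval()

    example = torch.rand(1, 3, 256, 256)
    traced = torch.jit.trace(model, example)

    # compute_units=ALL lets Core ML use the CPU, GPU, or Neural Engine.
    mlmodel = ct.convert(
        traced,
        convert_to="mlprogram",
        inputs=[ct.TensorType(shape=example.shape)],
        compute_units=ct.ComputeUnit.ALL,
    )
    mlmodel.save("DepthSketch.mlpackage")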

Why Is The Rumour Out There?

This leaves one last question: why is this rumour out there in the first place? Well, from the sources I’ve read it’s because, allegedly, LiDAR allows for much better integration of AR (augmented reality) capabilities – yet another over-hyped name for something not that fancy. On Android devices, it might actually make sense to include a LiDAR sensor for scanning in three dimensions, but only if those devices have less computing power. If they have an equivalent to Apple’s “Neural Engine”, they won’t need a LiDAR sensor either. Another source I’ve read claims that a LiDAR sensor would “improve the portrait mode” – again, nothing you couldn’t solve with a software update.

I think the rumour is out there because (a) many people don’t have a clue what they’re talking about, and (b) it has a fancy ring to it. Imagine the tag line: “iPhone 13: Now with LiDAR-enhanced capabilities!” Fancy, right? Makes you immediately want to buy it. Additionally, judging from the history of marketing claims that reference a “military-proven design”, it might make for a better sale to have people pay for something “military”, because the military obviously needs high-quality and durable devices, right? But from a purely pragmatic viewpoint, including a LiDAR sensor in an iPhone does not have any benefit that couldn’t be achieved with what’s already there.

Instead, Apple might finally want to integrate their camera bump back into the case. Now that’s an interesting engineering challenge to solve!

Suggested Citation

Erz, Hendrik (2021). “What the Hell is LiDAR?”. hendrik-erz.de, 7 Jun 2021, https://www.hendrik-erz.de/post/what-the-hell-is-lidar.

