The iPhone 12 Pro has a new sensor on its rear camera cluster: a LIDAR. You’ll also hear this referred to by other names, like 3D ToF (Time of Flight) sensor or ToF dot projector. From a technical point of view this is an amazing thing, especially in such a small package. It sends out beams of infrared (invisible to human eyes) light and measures how much time it takes for the light to reflect off whatever it hits and bounce back. This is a mindbogglingly short amount of time, because light travels mindbogglingly fast – 3×10^8, or 300,000,000, metres per second (that’s slightly over 1 billion km/h!). That means the time it takes to travel 1m is about 0.000,000,003,3 of a second (3.3 nanoseconds), and to resolve 1cm differences you need to be able to measure timing differences of around 0.000,000,000,03 of a second (a few tens of picoseconds).
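The arithmetic above is easy to sanity-check. This is just the basic time-of-flight relationship (distance = speed of light × round-trip time ÷ 2) as a minimal sketch – not anything to do with Apple's actual implementation:

```python
C = 299_792_458.0  # speed of light in m/s

def distance_from_round_trip(seconds: float) -> float:
    """One-way distance from a measured round-trip time.
    The pulse travels out AND back, hence the divide by two."""
    return C * seconds / 2.0

# A pulse that returns after ~6.67 nanoseconds hit something ~1m away:
print(round(distance_from_round_trip(6.67e-9), 3))

# And the timing resolution needed to tell 1.00m from 1.01m apart:
dt = 2 * 0.01 / C  # ≈ 6.7e-11 s, i.e. tens of picoseconds
print(dt)
```

Those few tens of picoseconds are why doing this in a phone-sized, phone-powered package is so impressive.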
LIDAR itself as a technology is not new. You might have heard of it in the context of self-driving cars, where it’s used to build a real-time 3D model of the world around the car – but those units cost tens of thousands of dollars each, and are mostly fairly large, with moving parts like spinning mirrors that consume a lot of power.
LIDAR in phones is also not new. The iPhone X, 3 years ago, had a LIDAR sensor on its front-facing camera, which it uses to great effect for accurate and secure “Face ID”. Many Android phones have LIDAR sensors as well. These are all much smaller than self-driving car LIDAR devices, and draw much less power. The trade-off is mostly in range: a self-driving car needs to “see” at ranges of 100m or more, while a phone has different requirements. Instead of a spinning mirror scanning the laser beam around like most self-driving car LIDARs, phones (and, years ago, Microsoft’s Kinect Xbox accessory!) use “dot projectors”, which send out hundreds or thousands of laser pointer-like “dots” at the same time. The density of these dots determines the resolution the device is capable of.
The difference between this new iPhone 12 LIDAR sensor and the one that first appeared on the iPhone X’s front-facing “Face ID” system is range and density. The Face ID sensor is optimised for short ranges of around half to one metre, and to have the necessary resolution to discern facial features accurately (like your eyes and sockets, the edges and length of your nose, your mouth and lips, and your jaw line). The new sensor is best described as “room scale”. It’ll measure distances out to around 5m, but with a lower dot density – so each dot measurement is several cm apart (and further apart at the far end of the 5m range). This means it can’t do accurate facial models of people in your doorway, but what it can do is accurately measure medium and large objects. In practice, you can accurately map walls, doorways, windows, furniture, cats – things like that.
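The “dots get further apart with range” effect falls straight out of the geometry: a fixed number of dots fans out across a fixed field of view, so the spacing between neighbours grows linearly with distance. Here’s a rough sketch of that – the field-of-view angle and dot count are illustrative guesses, not Apple’s actual specs:

```python
import math

def dot_spacing_cm(distance_m: float, fov_deg: float = 70.0,
                   dots_across: int = 64) -> float:
    """Approximate spacing between neighbouring dots at a given range,
    assuming dots are spread evenly across the field of view.
    (fov_deg and dots_across are made-up illustrative numbers.)"""
    width_m = 2 * distance_m * math.tan(math.radians(fov_deg / 2))
    return width_m / (dots_across - 1) * 100  # convert m to cm

# Spacing grows linearly with range:
for d in (0.5, 2.0, 5.0):
    print(f"{d} m -> {dot_spacing_cm(d):.1f} cm between dots")
```

With these (assumed) numbers, you get roughly 1cm spacing at half a metre and over 10cm at the 5m limit – which matches the intuition that the rear sensor is fine for walls and sofas but too coarse for faces at a distance.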
iFixit, the well-known tech device teardown crew, have a really enlightening short section in one of their YouTube videos using an IR camera to show the differences between the iPhone Face ID LIDAR and the iPad Pro’s rear-facing LIDAR (a few months older than, but very similar to, the iPhone 12 Pro’s). The LIDAR-related section of their video starts here: https://youtu.be/xz6CExnGw9w?t=102
This is the rear-facing LIDAR dot pattern. Note how on the wall the dots are around 10 or so cm apart, while on the much closer face they’re still a few cm apart:
This is the Face ID LIDAR dot pattern. These dots are so dense on the face that they’re difficult to distinguish (it’s much more obvious in the moving video if you followed the link above). You can sort of see on the base that the dots are only a few mm apart. The back wall dots are only a cm or so apart, but these are too far away for the Face ID system to reliably measure the range.
And, of course, because the internet loves cat videos – they also did this:
What I expect to see.
Since, for now, this tech is limited to a very small number of phones in people’s pockets, I suspect in the short term we’ll see (at least) four emerging use cases for it:
- Promotional “tech demos”
- Interior design and architecture apps
- Professional/business tools
- Magical camera applications
Update 20201019: Ars Technica have a really good deep dive into LIDAR, and how Apple have gone from a $75,000 price point to something they can ship in phones today. If you want a primer on Vertical Cavity Surface Emitting Lasers and Single Photon Avalanche Diodes, check this out: https://arstechnica.com/cars/2020/10/the-technology-behind-the-iphone-lidar-may-be-coming-soon-to-cars/