The new iPhone 12 Pro has a rear-facing “room scale” LIDAR sensor. It can create point clouds and 3D models at ranges up to around 5m, resolving objects down to about the size of a chair with enough accuracy to model them.

Annotated screenshot from https://www.apple.com/au/iphone-12-pro/
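
If you want to poke at this yourself, ARKit already exposes the LIDAR data through its scene reconstruction and scene depth APIs. Here’s a minimal sketch of opting into both on supported devices (the `ARView` parameter is assumed to be a RealityKit view you already have on screen):

```swift
import ARKit
import RealityKit

// Minimal sketch: ask ARKit to build a live mesh of the room on LIDAR-equipped
// devices, and to deliver per-pixel depth for each camera frame.
func startRoomScan(on arView: ARView) {
    let config = ARWorldTrackingConfiguration()

    // Scene reconstruction (the "room scale" mesh, with per-face labels like
    // wall/floor/ceiling) is only supported on devices with the LIDAR scanner.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.meshWithClassification) {
        config.sceneReconstruction = .meshWithClassification
    }

    // Per-pixel depth from the LIDAR sensor, used by the camera ideas later
    // in this post.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        config.frameSemantics.insert(.sceneDepth)
    }

    arView.session.run(config)
}
```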

Since, for now, this tech is limited to a very small number of the phones in people’s pockets, I suspect we’ll see (at least) four use cases emerge for it in the short term.

  1. Promotional “tech demos”: I’m sure there are many people just like me pitching ideas about LIDAR-enabled AR to companies. If IKEA doesn’t have an amazing demo showing which of their products fit in your lounge room/bedroom before Xmas, I’ll be very disappointed with my industry colleagues.
  2. Interior design and architecture apps: I can easily imagine high-end interior architecture or home extension projects including an iPhone 12 Pro for the customer as a small part of the project budget, letting the architects share 3D-modelled AR with clients, who could see in-progress designs and how they’ll look in their existing rooms. Think of how the current lounge room will look once the staircase for the new second-storey extension is added, or how a project to remove an interior wall and create a large open-plan kitchen/dining room will look, complete with switchable choices of benchtop material and appliance finish. You’d be able to “look through the walls” of your existing lounge room to see how the combined view with the open-plan kitchen would look from anywhere (there’s a short occlusion sketch after this list showing one way to do that). Initially, these sorts of applications will be limited to high-budget projects, where $1500 for a phone is a reasonable expense to increase customer satisfaction and improve communication of complex 3D-modelled concepts.
  3. Professional/business tools: While app developers won’t be able to assume most users have a phone capable of this any time soon, they know that for lots of use cases, other businesses will be perfectly happy to buy an iPhone 12 Pro just for this capability. I imagine lots of apps aimed at professionals like interior designers, painters, carpet layers, and home theatre installers – anyone for whom accurate room measurements, taken extremely quickly and easily, will save time and money. A painter could wave a phone around and get back exact square-metre measurements of all wall surfaces, meaning they can safely order the exact amount of paint required (there’s a rough area-from-mesh sketch after this list). A carpet layer could create floor plans accurate enough to both quote the job and pre-cut carpet before arriving, saving time while installing. A home theatre installer could get cm-accurate measurements of positions for the screen, projector, speakers, and all ancillaries like amps, wiring, and power. That would let them configure speaker delays, room-correction EQ, projector lens focus and throw, and cut speaker, HDMI, and power cables to length before turning up to install.
  4. Magical camera applications (or “cheating”, according to many of my old-school photographer friends): This is what Apple are using it for first: portrait mode now detects where people/heads are in any shot and selectively processes areas of the image based on that, so the background gets that famous bokeh blur that “real photographers” spend years and thousands of dollars on lenses learning how to reliably achieve in a broad range of settings. Wedding photographers are gonna hate this. Instagram models and your selfie-shooting friends are gonna _love_ it. Third-party camera app developers are going to have a field day finding innovative new uses for this; adding LIDAR 3D point clouds into their computational photography algorithms – especially with the enormous processing power of the new phones – will produce some spectacular results (as well as many, many failed experiments and cheesy new Snapchat-like filters). There’s a small depth-mask sketch after this list showing the raw material they’ll be working with. I’m really looking forward to seeing what people like https://halide.cam come up with here.
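
On the “look through the walls” idea in use case 2: once scene reconstruction is running, RealityKit can use the room mesh to hide virtual content behind real geometry, and deliberately switching that off is one way to render a proposed extension through an existing wall. A tiny sketch, assuming the same `ARView` as in the earlier snippet:

```swift
import RealityKit

// Sketch only: toggle whether the reconstructed room mesh occludes virtual
// content. With occlusion off, a 3D model of the proposed open-plan kitchen
// placed behind a real wall stays visible, so the client can "see through"
// the wall from their current lounge room.
func setWallOcclusion(_ enabled: Bool, in arView: ARView) {
    if enabled {
        arView.environment.sceneUnderstanding.options.insert(.occlusion)
    } else {
        arView.environment.sceneUnderstanding.options.remove(.occlusion)
    }
}
```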
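
On the painter example in use case 3, here’s roughly how you could get a wall-area estimate out of the reconstructed mesh. With `.meshWithClassification` enabled, every `ARMeshAnchor` in the session carries triangles tagged as wall, floor, ceiling and so on (you can grab them from `session.currentFrame?.anchors`). This is an illustrative sketch, not production code, and its accuracy depends entirely on how thoroughly the room has been scanned:

```swift
import ARKit
import simd

// Rough sketch: sum the area (in square metres) of every mesh triangle that
// ARKit has classified as a wall. Vertices are in anchor-local coordinates,
// which is fine here because a rigid transform doesn't change area.
func wallArea(in meshAnchors: [ARMeshAnchor]) -> Float {
    var total: Float = 0
    for anchor in meshAnchors {
        let geometry = anchor.geometry
        for faceIndex in 0..<geometry.faces.count {
            guard classification(of: faceIndex, in: geometry) == .wall else { continue }
            let v = vertices(ofFace: faceIndex, in: geometry)
            // Triangle area = half the magnitude of the cross product of two edges.
            total += length(cross(v[1] - v[0], v[2] - v[0])) / 2
        }
    }
    return total
}

// Read the per-face classification byte out of the geometry's Metal buffer.
private func classification(of faceIndex: Int, in geometry: ARMeshGeometry) -> ARMeshClassification {
    guard let source = geometry.classification else { return .none }
    let address = source.buffer.contents() + source.offset + source.stride * faceIndex
    let value = Int(address.assumingMemoryBound(to: UInt8.self).pointee)
    return ARMeshClassification(rawValue: value) ?? .none
}

// Look up the three vertex positions (in metres) for one triangle.
private func vertices(ofFace faceIndex: Int, in geometry: ARMeshGeometry) -> [SIMD3<Float>] {
    let faces = geometry.faces
    let verts = geometry.vertices
    return (0..<faces.indexCountPerPrimitive).map { corner in
        // Face indices are 32-bit; positions are three packed Floats per vertex.
        let indexAddress = faces.buffer.contents()
            + (faceIndex * faces.indexCountPerPrimitive + corner) * faces.bytesPerIndex
        let index = Int(indexAddress.assumingMemoryBound(to: UInt32.self).pointee)
        let vertexAddress = verts.buffer.contents() + verts.offset + verts.stride * index
        let f = vertexAddress.assumingMemoryBound(to: Float.self)
        return SIMD3<Float>(f[0], f[1], f[2])
    }
}
```

A real app would also want to subtract windows and doors (which ARKit classifies separately) before anyone orders paint, but the basic data is all there.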
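
And on the camera side in use case 4: third-party apps get at the LIDAR data through ARKit’s per-frame depth map rather than whatever Apple’s own camera app uses internally. Here’s a toy sketch of that raw material, building a simple foreground/background mask by thresholding depth; the 1.5m cut-off is just an assumed portrait distance, and a real app would feed a mask like this into its blur or segmentation pipeline:

```swift
import ARKit
import CoreVideo

// Toy sketch: with .sceneDepth enabled, each ARFrame carries a small
// floating-point depth map (typically 256x192 on current LIDAR iPhones).
// Returns a row-major width*height mask where true = closer than the
// threshold, i.e. likely the subject of a portrait shot.
func foregroundMask(from frame: ARFrame, thresholdMetres: Float = 1.5) -> [Bool]? {
    guard let depthMap = frame.sceneDepth?.depthMap else { return nil }

    CVPixelBufferLockBaseAddress(depthMap, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

    let width = CVPixelBufferGetWidth(depthMap)
    let height = CVPixelBufferGetHeight(depthMap)
    let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)
    guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return nil }

    var mask = [Bool]()
    mask.reserveCapacity(width * height)
    for y in 0..<height {
        let row = (base + y * rowBytes).assumingMemoryBound(to: Float32.self)
        for x in 0..<width {
            // Depth values are metres from the camera.
            mask.append(row[x] < thresholdMetres)
        }
    }
    return mask
}
```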
