Lidar used to cost $75,000—here’s how Apple brought it to the iPhone
At Tuesday’s unveiling of the iPhone 12, Apple touted the capabilities of its new lidar sensor. Apple says lidar will enhance the iPhone’s camera by allowing faster focus, especially in low-light situations. It may also enable a new generation of sophisticated augmented reality apps.
Tuesday’s presentation offered little detail about how the iPhone’s lidar actually works, but this isn’t Apple’s first device with lidar. Apple first introduced the technology with the refreshed iPad in March. And while no one has done a teardown of the iPhone 12 yet, we can learn a lot from recent iPad teardowns.
Lidar works by sending out laser light and measuring how long it takes to bounce back. Because light travels at a constant speed, the round-trip time can be translated into a precise distance estimate. Repeat this process across a two-dimensional grid and the result is a three-dimensional “point cloud” showing the location of objects around a room, street, or other location.
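The arithmetic behind that process is simple: light travels at a known, constant speed, so a one-way distance is just the round-trip time times the speed of light, divided by two. Here is a minimal sketch of that calculation, plus a toy conversion from a 2D grid of timing measurements into a point cloud. The grid layout, field-of-view angles, and function names are hypothetical illustrations, not a description of Apple's sensor:

```python
import math

C = 299_792_458.0  # speed of light, meters per second

def distance_from_round_trip(t_seconds):
    """Round-trip time -> one-way distance; light covers the path twice."""
    return C * t_seconds / 2.0

def point_cloud(times, h_fov=math.radians(60), v_fov=math.radians(45)):
    """Map a 2D grid of round-trip times (seconds) to (x, y, z) points.

    Each grid cell is assumed to correspond to a pulse fired in a
    distinct direction, spread evenly across a hypothetical field of view.
    """
    rows, cols = len(times), len(times[0])
    cloud = []
    for i, row in enumerate(times):
        for j, t in enumerate(row):
            d = distance_from_round_trip(t)
            # Azimuth/elevation for this cell's firing direction.
            az = (j / (cols - 1) - 0.5) * h_fov if cols > 1 else 0.0
            el = (i / (rows - 1) - 0.5) * v_fov if rows > 1 else 0.0
            # Project the measured range into 3D coordinates.
            x = d * math.cos(el) * math.sin(az)
            y = d * math.sin(el)
            z = d * math.cos(el) * math.cos(az)
            cloud.append((x, y, z))
    return cloud

# A wall 3 meters away reflects light back in about 20 nanoseconds:
t = 2 * 3.0 / C
print(round(distance_from_round_trip(t), 6))  # 3.0
```

Note how tight the timing has to be: at 3 meters, the round trip takes roughly 20 nanoseconds, so resolving centimeter-scale depth differences means measuring time differences of well under a nanosecond.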