Apple is bullish on lidar, a technology that's brand-new to the iPhone 12 family, specifically the iPhone 12 Pro (with the Pro Max following in a few weeks).
Look closely at one of the new iPhone 12 Pro models and you'll see a small black dot near the camera lenses, about the same size as the flash. That's the lidar sensor, and it's a new type of depth sensing that could make a difference in a number of interesting ways.
If Apple has its way, lidar is a term you'll start hearing a lot now, so let's break down what we know, what Apple will use it for and where the technology could go next.
What does lidar mean?
Lidar stands for light detection and ranging, and it has been around for a while. It uses lasers to ping objects and waits for the light to bounce back to the laser's source, measuring distance by the time of flight of each light pulse.
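The time-of-flight arithmetic is simple enough to sketch: the pulse travels to the object and back, so the distance is half the round-trip time multiplied by the speed of light. A minimal illustration (the 33-nanosecond figure is just an example value, not a spec of Apple's sensor):

```python
# Speed of light in meters per second.
C = 299_792_458

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to an object from a light pulse's round-trip time.

    The pulse travels out and back, so the one-way distance is
    half of the total path the light covered.
    """
    return C * round_trip_seconds / 2

# A pulse that returns after roughly 33 nanoseconds hit something
# about 5 meters away -- near the limit of the iPhone's lidar range.
print(round(tof_distance(33.356e-9), 2))
```

Note how short these intervals are: at lidar's working range, round trips last tens of nanoseconds, which is why the sensor needs very precise timing hardware.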
How does lidar sense depth?
Lidar is a type of time-of-flight camera. Some other smartphones measure depth with a single light pulse, whereas a smartphone with this type of lidar sends out waves of light pulses in a spray of infrared dots and measures each one with its sensor, creating a field of points that maps out distances and can "mesh" the dimensions of a space and the objects in it. The light pulses are invisible to the human eye, but you could see them with a night-vision camera.
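Conceptually, each infrared dot yields its own round-trip time, and converting the whole grid gives a per-dot depth map. A toy sketch of that idea (the grid values here are made-up example timings, not real sensor data):

```python
C = 299_792_458  # speed of light, m/s

def depth_map(return_times):
    """Convert a grid of per-dot round-trip times (seconds) into a
    grid of distances (meters), one depth value per infrared dot."""
    return [[C * t / 2 for t in row] for row in return_times]

# A toy 2x3 "spray" of dots: most hit a wall about 3 m away,
# but one dot in the second row hit a nearer object.
times = [
    [20e-9, 20e-9, 20e-9],
    [20e-9, 6.7e-9, 20e-9],
]
for row in depth_map(times):
    print([round(d, 2) for d in row])
```

A real sensor produces far denser grids many times per second; stitching those depth points together over time is what lets software build a 3D "mesh" of the room.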
Isn’t that like Face ID on the iPhone?
It is, but with longer range. The idea is the same: Apple’s TrueDepth camera also shoots out an array of infrared lasers, but it can only work up to a few feet away. The rear lidar sensors on the iPad Pro and iPhone 12 Pro work at a range of up to 5 meters.
Lidar is in lots of other tech
Lidar is a tech that’s popping up everywhere. It’s used for self-driving cars and assisted driving. It’s used for robotics and drones. Augmented reality headsets like the HoloLens have similar tech, mapping out room spaces before layering 3D virtual objects into them. But it also has a pretty long history.
The Xbox’s old depth-sensing accessory, the Kinect, was also a camera with infrared depth scanning. In fact, PrimeSense, the company that helped make the Kinect’s tech, was acquired by Apple in 2013. Now, we have Apple’s face-scanning TrueDepth sensors and its rear lidar camera sensor.
The iPhone 12 Pro’s camera could work better with lidar
Time-of-flight cameras on smartphones tend to be used to improve focus accuracy and speed, and the iPhone 12 Pro does the same. Apple promises better low-light focus, up to 6x faster in dim conditions. The lidar depth sensing is also being used to improve night portrait mode effects.
Better focus is a plus, and there’s also a chance the iPhone 12 Pro could add more 3D photo data to images, too. Although that element hasn’t been laid out yet, Apple’s front-facing, depth-sensing TrueDepth camera has been used in a similar way with apps.
It will also enhance augmented reality
Lidar will allow the iPhone 12 Pro to launch AR apps faster, and build a quick map of a room to add more detail. A lot of AR software takes advantage of lidar to hide virtual objects behind real ones (a technique called occlusion), and to place virtual objects within more complicated room mappings, like on a table or chair.
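Occlusion boils down to a per-pixel depth comparison: the virtual object is drawn only where it is closer to the camera than the real surface lidar measured. A minimal sketch of that test, using made-up depths and color labels for illustration:

```python
def composite_pixel(real_depth, virtual_depth, real_color, virtual_color):
    """Per-pixel occlusion test: show the virtual object only where it
    sits closer to the camera than the real-world surface behind it."""
    return virtual_color if virtual_depth < real_depth else real_color

# A virtual cup at 1.2 m in front of a wall at 3.0 m is visible...
print(composite_pixel(3.0, 1.2, "wall", "cup"))
# ...but it's hidden where a real chair at 0.8 m stands in front of it.
print(composite_pixel(0.8, 1.2, "chair", "cup"))
```

Without a lidar depth map, AR software has to estimate real-world depth from the camera image alone, which is why occlusion is noticeably less reliable on non-lidar phones.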
But there’s a lot of potential beyond that, with a longer tail. Many companies dream of headsets that blend virtual objects with real ones: the AR glasses being developed across the industry will rely on having advanced 3D maps of the world to layer virtual objects onto.
Those 3D maps are being built now with special scanners and equipment, much like the car-mounted scanning rigs that map the world for Google Maps. But there’s a possibility that people’s own devices could eventually help crowdsource that information, or add extra on-the-fly data. Again, AR headsets like the Magic Leap and HoloLens already prescan your environment before layering things into it, and Apple’s ARKit tools work the same way. In that sense, the iPhone 12 Pro and iPad Pro are like AR headsets without the headset part … and that could pave the way for Apple to make its own glasses later.
3D scanning could be the killer app
Lidar can be used to mesh out 3D objects and rooms and layer photo imagery on top, a technique called photogrammetry. That could be the next wave of capture tech for practical uses, or even social media and journalism. The ability to capture 3D data and share that info with others could turn lidar-equipped phones and tablets into 3D-content-capture tools. Lidar could also be used without the camera element to acquire measurements of objects and spaces.
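The measurement use case is straightforward once you have lidar-sampled 3D points: the size of an object is just the distance between points on its edges. A small sketch, with hypothetical points standing in for sensor output:

```python
import math

def point_distance(p, q):
    """Straight-line distance (meters) between two 3D points,
    such as two lidar samples on opposite edges of an object."""
    return math.dist(p, q)

# Hypothetical points sampled on opposite edges of a tabletop,
# both 0.75 m up and 2 m from the camera.
edge_a = (0.0, 0.75, 2.0)
edge_b = (1.2, 0.75, 2.0)
print(point_distance(edge_a, edge_b))  # the table is 1.2 m wide
```

This is essentially what measuring apps do: pick two points in the depth map and report the gap between them, no photo required.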
Apple isn’t the first to explore tech like this in a phone
Google had the same idea in mind when Tango (an early AR platform) was created. Its advanced camera array also had infrared sensors and could map out rooms, creating 3D scans and depth maps for AR and for measuring indoor spaces. Phones equipped with Google Tango were short-lived, replaced by computer vision algorithms that do estimated depth sensing with ordinary cameras, without needing the same hardware. But Apple’s iPhone 12 Pro looks like a more advanced successor.