
Apple wants to make lidar a big deal on the iPhone 12 Pro and beyond. What it is and why it matters




The lidar sensor on the iPhone 12 Pro – the black circle at the bottom right of the camera unit – opens up AR possibilities.

Apple

Apple is bullish on lidar, a technology that's brand new to the iPhone 12 family, specifically to the iPhone 12 Pro and iPhone 12 Pro Max. (The iPhone 12 Pro is on sale now, with the iPhone 12 Pro Max following in a few weeks.)

Take a closer look at one of the new iPhone 12 Pro models, or the most recent iPad Pro, and you'll see a little black dot near the camera lenses, about the same size as the flash. That's the lidar sensor, and it's a new type of depth sensing that could make a difference in a number of interesting ways.

If Apple has its way, lidar is a term you'll start hearing a lot now, so let's break down what we know, what Apple is going to use it for and where the technology could go next.

What does lidar mean?

Lidar stands for light detection and ranging, and it's been around for a while. It uses lasers to ping objects and return to the source of the laser, measuring distance by timing the travel, or flight, of the light pulse.

How does lidar work to understand depth?

Lidar is a type of time-of-flight camera. Some other smartphones measure depth with a single light pulse, whereas a smartphone with this type of lidar sends waves of light pulses out in a spray of infrared dots and can measure each with its sensor, creating a field of points that map out distances and can "mesh" the dimensions of a space and the objects in it. The light pulses are invisible to the human eye, but you could see them with a night vision camera.
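The math underneath is straightforward: light travels at a known, constant speed, so halving a pulse's round-trip time gives the distance to whatever it bounced off. Here's a minimal sketch of that time-of-flight arithmetic in Swift; the names and numbers are illustrative, not Apple's actual sensor internals.

```swift
import Foundation

// Time of flight in a nutshell: a pulse travels out to a surface and back,
// so the one-way distance is (speed of light × round-trip time) / 2.
let speedOfLight = 299_792_458.0  // meters per second

func distance(forRoundTripTime t: TimeInterval) -> Double {
    return speedOfLight * t / 2.0
}

// A surface 5 meters away returns a pulse in roughly 33 nanoseconds.
let roundTrip = 2.0 * 5.0 / speedOfLight
print(distance(forRoundTripTime: roundTrip))  // 5.0
```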


The iPad Pro released in the spring also has lidar.

Scott Stein / CNET

Isn’t it like Face ID on the iPhone?

It is, but with a longer range. The idea is the same: Apple's Face ID-enabling TrueDepth camera also shoots out an array of infrared lasers, but it only works up to a few feet away. The rear lidar sensors on the iPad Pro and iPhone 12 Pro work at a range of up to 5 meters.

Lidar is already in lots of other tech

Lidar is a tech that's popping up everywhere. It's used in self-driving cars, or in assisted driving. It's used in robotics and drones. Augmented reality headsets like the HoloLens 2 have similar tech, mapping out room spaces before layering 3D virtual objects into them. But lidar also has a pretty long history.

Microsoft's old depth-sensing Xbox accessory, the Kinect, was a camera with infrared depth scanning, too. In fact, PrimeSense, the company that helped make the Kinect's tech, was acquired by Apple in 2013. Now, we have Apple's TrueDepth face-scanning sensors and rear lidar camera.


Remember Kinect?

Sarah Tew / CNET

The iPhone 12 Pro's camera could work better with lidar

Time-of-flight cameras on smartphones tend to be used to improve focus accuracy and speed, and the iPhone 12 Pro does the same. Apple promises better low-light focus, up to 6x faster in low-light conditions. The lidar depth sensing is also used to improve night portrait mode effects.

Better focus is a plus, and there's also a chance the iPhone 12 Pro could add more 3D photo data to images, too. Although that element hasn't been laid out yet, Apple's front-facing, depth-sensing TrueDepth camera has been used in a similar way with apps.


Snapchat is already enabling AR lenses using the iPhone 12 Pro's lidar.

Snapchat

It will also enhance augmented reality

Lidar will allow the iPhone 12 Pro to launch AR apps faster, and build a quick map of a room to add more detail. A lot of Apple's AR updates in iOS 14 take advantage of lidar to hide virtual objects behind real ones (called occlusion), and to place virtual objects within more complicated room mappings, like on a table or chair.
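On the developer side, tapping into this takes very little code. Here's a minimal ARKit sketch, assuming a lidar-equipped device running iOS 14; the function name is our own, not Apple's:

```swift
import ARKit

// Sketch: turn on lidar-driven scene meshing and per-pixel depth in ARKit.
// Scene reconstruction and sceneDepth are only supported on lidar-equipped
// devices like the iPhone 12 Pro and 2020 iPad Pro, hence the checks.
func makeLidarConfiguration() -> ARWorldTrackingConfiguration {
    let config = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh  // build a live triangle mesh of the room
    }
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        config.frameSemantics.insert(.sceneDepth)  // depth maps used for occlusion
    }
    return config
}

// Usage: run it on an existing AR session, e.g.
// arView.session.run(makeLidarConfiguration())
```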

But there's extra potential beyond that, with a longer tail. Lots of companies dream of headsets that blend virtual objects with real ones: AR glasses, being worked on by Facebook, Qualcomm, Snapchat, Microsoft, Magic Leap and most likely Apple, will rely on having advanced 3D maps of the world to layer virtual objects onto.

Those 3D maps are being built now with special scanners and equipment, almost like the world-scanning version of those Google Maps cars. But there's also a possibility that people's own devices could eventually help crowdsource that info, or add extra on-the-fly data. Again, AR headsets like Magic Leap and HoloLens already prescan your environment before layering things into it, and Apple's lidar-equipped AR tech works the same way. In that sense, the iPhone 12 Pro and iPad Pro are like AR headsets without the headset part … and that could pave the way for Apple to eventually make its own glasses.


A 3D room scan from Occipital’s Canvas app, powered by lidar depth perception on the iPad Pro. Expect the same for the iPhone 12 Pro, and maybe more.


3D scanning could be the killer app

Lidar can be used to mesh out 3D objects and rooms and layer photo imagery on top, a technique called photogrammetry. That could be the next wave of capture tech for practical uses like home improvement, or even social media and journalism. The ability to capture 3D data and share that info with others could open up these lidar-equipped phones and tablets as 3D content-capture tools. Lidar could also be used without the camera element to acquire measurements for objects and spaces.
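For a sense of what that raw data looks like to an app: with ARKit's sceneDepth frame semantic enabled (as in the earlier sketch), every ARFrame carries a lidar depth map that can be sampled directly. The helper below is a hypothetical illustration, not any particular app's code.

```swift
import ARKit

// Sketch: sample the lidar depth (in meters) at the center of the latest
// ARFrame. Assumes a session already running with the .sceneDepth frame
// semantic on a lidar-equipped device.
func centerDepthInMeters(from frame: ARFrame) -> Float? {
    guard let depthMap = frame.sceneDepth?.depthMap else { return nil }

    CVPixelBufferLockBaseAddress(depthMap, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

    let width = CVPixelBufferGetWidth(depthMap)
    let height = CVPixelBufferGetHeight(depthMap)
    let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)
    guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return nil }

    // The depth map stores one 32-bit float distance per pixel.
    let centerRow = base.advanced(by: (height / 2) * rowBytes)
        .assumingMemoryBound(to: Float32.self)
    return centerRow[width / 2]
}
```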


Remember Google Tango? It had depth sensing, too.

Josh Miller / CNET

Apple is not the first to explore tech like this in a phone

Google had this same idea in mind when Project Tango, an early AR platform that ran on only two phones, was created. The advanced camera array also had infrared sensors and could map out rooms, creating 3D scans and depth maps for AR and for measuring indoor spaces. Google's Tango-equipped phones were short-lived, replaced by computer vision algorithms that do estimated depth sensing on cameras without needing the same hardware. But the iPhone 12 Pro looks like a more advanced successor.






Watch this: iPhone 12, iPhone 12 Mini, Pro and Pro Max explained (9:16)