
Blurred-background photos in a single shot: how does the new iPhone SE camera do it?

Using a single camera to achieve background blur is nothing new: the earlier iPhone XR and the even earlier Google Pixel 2 made similar attempts, and now the new iPhone SE does too.

The camera hardware in Apple’s new iPhone SE is old; the main credit belongs to the new algorithms.

Is the iPhone SE’s camera simply copied from the XR?

From left to right, the camera sensors of the new iPhone SE, iPhone 8 and iPhone XR

The practice of putting “new wine in old bottles” is not new for the iPhone SE. Four years ago, the first-generation iPhone SE reused the iPhone 5s’s design and most of its hardware, keeping only the 6s’s chip, which gave users flagship-level performance at a lower price.

You may argue: doesn’t the iPhone XR also do single-camera blur? Isn’t the SE simply copying it?

In theory, two phones with identical camera hardware should not differ much in camera behavior, but the iPhone XR’s hardware is entirely different.

From this point of view, the fact that the iPhone SE can take portrait-mode photos at all is remarkable: first, it does not take multiple shots; second, it lacks the Face ID hardware the iPhone XR has, so there is no possibility of hardware support.

iPhone SE vs. iPhone 8: both phones have a single rear lens

The iPhone 8 does not support shooting shallow depth-of-field photos with a sharp subject and blurred background, which is what we usually call “portrait mode.”

But if you check Apple’s support page, you will find that the portrait mode missing from the iPhone 8 is supported on the new iPhone SE, even though both phones have only one rear lens with the same specifications.

Under normal circumstances, phones rely on dual cameras to take blurred “portrait mode” photos. Like a pair of human eyes, two lenses in different positions capture the scene from slightly different angles; the difference between the two views (the parallax) lets the phone estimate depth of field, so it can blur the background while keeping the subject sharp.
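To make the parallax idea concrete, here is a minimal sketch of depth-from-stereo using OpenCV’s block matcher. This only illustrates the general technique, not Apple’s pipeline; the filenames, focal length, and baseline are made-up values.

```python
import cv2
import numpy as np

# Two views of the same scene from slightly offset positions
# (hypothetical filenames; any rectified stereo pair works).
left = cv2.imread("left.jpg", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.jpg", cv2.IMREAD_GRAYSCALE)

# For each pixel, block matching finds how far it shifted between
# the two views -- the disparity. A larger shift means a closer object.
matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0

# Depth is inversely proportional to disparity:
# depth = focal_length * baseline / disparity
focal_px = 1000.0   # assumed focal length, in pixels
baseline_m = 0.012  # assumed distance between the two lenses, in meters
depth = np.where(disparity > 0, focal_px * baseline_m / disparity, 0.0)
```

With a depth map like this in hand, the phone can keep near pixels sharp and blur far ones, which is what dual-camera iPhones do in hardware-assisted form.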

The Plus series, as well as the X, XS, and 11 of recent years, all basically rely on a multi-camera system to achieve portrait blur.

So how do iPhones with a single front camera, like the XR, manage portrait shots?


The core lies in the infrared dot-matrix projector of the Face ID system: it can obtain sufficiently accurate depth data, effectively acting as an “auxiliary lens.”

The new iPhone SE has no such projector, which means Apple has made changes at the software level that we cannot see.

Recently, Ben Sandofsky, developer of the third-party camera app Halide, revealed the technical principles behind this, explaining why the new iPhone SE, with the same single-lens specifications as the iPhone 8, can achieve a portrait mode the latter cannot.

They said that the new iPhone SE is likely “the first iPhone that can generate a portrait blur effect using only a single 2D image.”

However, teardowns show that the cameras of the iPhone SE and iPhone XR are not identical, which leads to technical differences between the two.

As for the new iPhone SE, because its sensor is too old, Halide says it cannot rely on the sensor to obtain a parallax (disparity) map; it can basically only rely on the machine learning algorithms of the A13 Bionic chip to estimate a depth map from a single image.

In one sentence: the iPhone SE’s portrait blur is realized entirely by software and algorithms running on the A13 Bionic chip.
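Apple’s model and training data are private, but open-source monocular depth networks show the idea is practical. Here is a small sketch using the publicly available MiDaS model via PyTorch Hub, as a stand-in for what the A13’s algorithm does, not Apple’s actual code:

```python
import cv2
import torch

# Load a small open-source monocular depth model (MiDaS).
midas = torch.hub.load("intel-isl/MiDaS", "MiDaS_small")
midas.eval()

# Matching preprocessing transform published with the model.
transforms = torch.hub.load("intel-isl/MiDaS", "transforms")
transform = transforms.small_transform

img = cv2.cvtColor(cv2.imread("photo.jpg"), cv2.COLOR_BGR2RGB)
batch = transform(img)

with torch.no_grad():
    prediction = midas(batch)
    # Resize the predicted (relative, inverse) depth map back to
    # the original image resolution.
    depth = torch.nn.functional.interpolate(
        prediction.unsqueeze(1),
        size=img.shape[:2],
        mode="bicubic",
        align_corners=False,
    ).squeeze().numpy()
```

The output is a relative depth estimate inferred from a single 2D image, which is conceptually what the iPhone SE does on-device.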


iPhone SE Camera Quality Test

The same picture of a puppy, shot head-on with the iPhone XR and the new iPhone SE

Halide used the iPhone XR and the new iPhone SE to shoot a picture of a puppy (not a real dog, but a flat photo of one, so there is no real depth to measure), and then compared the depth data in the two shots.

They found that the iPhone XR merely performed simple image segmentation to pull out the subject, and did not correctly recognize the puppy’s ears.

Depth data maps: iPhone XR on the left, new iPhone SE on the right

But on the new iPhone SE, with the new algorithm provided by the A13 chip, we get a depth map completely different from the XR’s. Not only does it correctly recognize the puppy’s ears and overall outline, it also separates the background into distinct depth layers.

This kind of depth map is not 100% accurate, though. Halide says the accuracy of the new iPhone SE’s cutout and blur when shooting subjects other than people is not as good as when shooting portraits.

Especially when the subject and the background are hard to tell apart, the advantage of multiple cameras becomes more obvious.
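Once a layered depth map exists, turning it into portrait-style blur is conceptually straightforward: blur each layer in proportion to its depth and composite the layers back together. A rough Python sketch of the idea (an illustration, not Apple’s implementation; the layer count and kernel sizes are arbitrary choices):

```python
import cv2
import numpy as np

def layered_blur(image, depth, layers=4, max_kernel=31):
    """Composite an image from depth layers: the nearest layer stays
    sharp, and farther layers get progressively stronger blur.
    depth is assumed normalized to [0, 1], with 0 = nearest."""
    result = np.zeros_like(image)
    # Quantize the continuous depth map into discrete layers.
    bins = np.clip((depth * layers).astype(int), 0, layers - 1)
    for i in range(layers):
        # Kernel size grows with depth and must be odd for GaussianBlur.
        k = 2 * int(i * max_kernel / (2 * layers)) + 1
        layer = cv2.GaussianBlur(image, (k, k), 0) if k > 1 else image
        result[bins == i] = layer[bins == i]
    return result
```

A multi-camera phone would feed such a function a measured depth map; the iPhone SE feeds it one that the A13 has guessed.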

iPhone SE Camera Quality: Thanks to the A13 Bionic Chip!

In general, the portrait blur achieved by the new iPhone SE represents the limit of what a single-camera phone can do through software optimization. Strictly speaking, the credit goes to the A13 chip: without its machine-learning capabilities, an outdated camera alone would deliver a far weaker shooting experience.

I wonder whether, when the next generation of iPhone SE arrives perhaps four years from now, single-camera phones will still have a place in the industry.

Gaurav Verma
https://techigtv.net