
FaceTime feature in iOS 13 feigns eye contact during video calls – Ars Technica

Look into my eyes —

Is this FaceTime feature a nifty improvement for video chats, or is it just creepy?


[Image: front of an iPhone XS Max. The cameras on iPhones are getting (selectively) smarter.]

Apple introduced several of the headlining features of its upcoming iOS 13 during WWDC, but people playing with the closed beta version have uncovered some additional tools. One newly found addition is FaceTime Attention Correction, which adjusts the image during a FaceTime video call to make it look like a person is looking into the camera rather than at their device’s screen.

In practice, that means that while both you and your contact are looking at each other’s faces, you’ll both appear to be making direct eye contact. Mike Rundle and Will Sigmon were the first to tweet about the find, and they describe it as uncanny, “next-century shit.” Another beta tester, Dave Schukin, posited that the feature relies on ARKit to make a map of a person’s face and use that to inform the image adjustments.

Guys – “FaceTime Attention Correction” in iOS 13 beta 3 is wild.

Here are some comparison photos featuring @flyosity: https://t.co/HxHhVONsi1 pic.twitter.com/jKK41L5ucI

— Will Sigmon (@WSig) July 2, 2019

The feature appears to be rolling out only to the iPhone XS and iPhone XS Max in the current beta. It should see a wider release when iOS 13 officially goes live, likely sometime this fall.

Apple has been introducing more and more features centered on automatically changing images. It has been giving its cameras tools like Smart HDR, which analyzes and composites multiple frames for the “best” shot, and automatic stabilization, which reduces the effect of shaky hands. Usually these tools are optional, although you may need to dig around in your device’s settings to turn them off, since some are on by default.
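Apple hasn’t published the details of how Smart HDR combines its frames, but the general idea behind multi-frame compositing can be sketched in a few lines. The weighting scheme below (favoring well-exposed, mid-gray pixels) and the function name `fuse_exposures` are illustrative assumptions, not Apple’s pipeline:

```python
import numpy as np

def fuse_exposures(frames, sigma=0.2):
    """Blend aligned frames of the same scene (values in [0, 1]),
    weighting each pixel by how well-exposed it is, i.e. how close
    it sits to mid-gray. Over- and underexposed pixels count less."""
    stack = np.stack([f.astype(np.float64) for f in frames])  # (N, H, W)
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * sigma ** 2))
    weights /= weights.sum(axis=0, keepdims=True)             # normalize per pixel
    return (weights * stack).sum(axis=0)

# Toy example: a dark and a bright capture of the same gradient.
scene = np.linspace(0.0, 1.0, 5).reshape(1, 5)
under = np.clip(scene * 0.5, 0, 1)   # underexposed: highlights preserved
over = np.clip(scene * 1.5, 0, 1)    # overexposed: shadows lifted, highlights clipped
fused = fuse_exposures([under, over])
```

The point of the sketch is only that the “best” shot is a per-pixel judgment call made by the algorithm’s weighting function, which is exactly the kind of baked-in aesthetic decision the rest of this piece is about.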

It’s a slick application of Apple’s augmented reality tools, which are admittedly impressive and powerful. But how many people are clamoring for this feature? Is there any real benefit to making it seem like we’re staring into each other’s windows to the soul when we FaceTime?

How iOS 13 FaceTime Attention Correction works: it simply uses ARKit to grab a depth map/position of your face, and adjusts the eyes accordingly.

Notice the warping of the line across both the eyes and nose. pic.twitter.com/U7PMa4oNGN

— Dave Schukin (@schukin) July 3, 2019
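Apple hasn’t documented the adjustment, but Schukin’s ruler test (the line bending as it crosses the eyes and nose) is consistent with a localized image warp: pixels near the eyes are displaced, with the displacement fading to zero away from them. A toy version of that idea, with a made-up `warp_region` helper and a linear falloff chosen purely for illustration, might look like:

```python
import numpy as np

def warp_region(img, center, radius, shift):
    """Vertically shift pixels near `center` by up to `shift` rows,
    with the displacement fading out linearly toward `radius` -- a
    crude stand-in for the localized warp visible in the ruler test."""
    h, w = img.shape
    ys, xs = np.mgrid[0:h, 0:w]
    dist = np.hypot(ys - center[0], xs - center[1])
    falloff = np.clip(1.0 - dist / radius, 0.0, 1.0)   # 1 at center, 0 at edge
    src_y = np.clip((ys - shift * falloff).round().astype(int), 0, h - 1)
    return img[src_y, xs]                               # resample shifted rows

# A straight horizontal line through a blank "face"; warp it near one "eye".
img = np.zeros((9, 9))
img[4, :] = 1.0
warped = warp_region(img, center=(4, 4), radius=4, shift=2)
```

In the output, the line stays put at the image edges but bows near the warp center, which is the same telltale distortion Schukin's clip shows when a straight edge passes in front of the eyes.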

There’s an argument in favor of making the experience of video chatting feel more natural, but there’s an equal argument against forcing an appearance of intimacy or attention. Eye contact is one of those subtle cues that varies from person to person. If someone who rarely meets my gaze seems to suddenly be doing so all the time on a video call, that’s going to make our interaction more surreal, not less.

There’s no reason we should pretend that digital photography is exactly like film photography. The simplicity of point-and-shoot on a smartphone lets anyone capture a scene without special hardware or knowledge. And with editing tools ranging from Snapchat filters to Lightroom, you can tweak an image to fit whatever aesthetic you might desire.

But adding more and more tools to automatically make a visual look more “perfect” is only useful if you believe in that definition of perfection. Do you want idealized or do you want real? That’s not a judgment everyone wants their smartphone to make for them.
