LorenDB · a year ago
I'm genuinely shocked. I assumed that Apple would have foreseen this possibility and locked the Persona's eyes in place while the user was typing, at least for passwords.
generalizations · a year ago
The whole point of the digital face is to look real, though, and freezing the gaze would look unnervingly fake.
dwallin · a year ago
I'm confident they could come up with a filler eye animation algorithm convincing enough to pass muster for short periods of time. Even if hand-coding something didn't quite work out, they certainly have tons of eye tracking data internally they could use to train a small model or optimize parameters.
LorenDB · a year ago
But you could at least dampen or randomize eye travel while the user is looking at the keyboard. Fully reproducing eye movements is a recipe for disaster, and that should have been obvious.
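
Even a naive filter on the gaze vector before it reaches the Persona renderer would probably help. A rough sketch of what I mean (purely illustrative, not Apple's code):

```swift
import simd

// Illustrative sketch only, not Apple's implementation. While a virtual
// keyboard is visible, pass only a fraction of the real gaze travel through
// and add small random jitter, so fixations no longer line up with keys.
struct GazeDampener {
    var lastOutput = SIMD3<Float>(0, 0, -1)   // neutral forward gaze
    let damping: Float = 0.2                  // fraction of real travel to let through
    let jitter: Float = 0.03                  // magnitude of random noise

    mutating func filter(_ trackedGaze: SIMD3<Float>, keyboardVisible: Bool) -> SIMD3<Float> {
        guard keyboardVisible else {
            lastOutput = trackedGaze
            return trackedGaze
        }
        // Move only part of the way toward the real gaze direction...
        var out = simd_mix(lastOutput, trackedGaze, SIMD3<Float>(repeating: damping))
        // ...then perturb it so the residual motion is useless for key inference.
        out += SIMD3<Float>(Float.random(in: -jitter...jitter),
                            Float.random(in: -jitter...jitter),
                            0)
        lastOutput = simd_normalize(out)
        return lastOutput
    }
}
```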
kobalsky · a year ago
add sunglasses to the avatar while typing
magicalhippo · a year ago
Just have them close their eyes? That's what I do when I have to recall my password anyway.
JamesSwift · a year ago
Just do the same thing the external display does and show a 'cloudy eyes' version while the user is interacting with the keyboard.
sli · a year ago
If I were implementing it and wanted to obscure the input, I'd blur the whole screen momentarily, probably with a small message. I really doubt that's ideal for a commercial offering, though. I'm not really worried about unnerving people if I'm using an avatar; that comes with the territory as it is.
bsza · a year ago
Why? Most people are capable of fixating on a single point with basically no perceptible eye movement.
throw10920 · a year ago
It would, wouldn't it?

I'd suggest blurring the face in a "password input context" (like password fields on the web with their redacted display text), but I suspect that that'd go against what Apple wants the Vision Pro experience to look like.

darby_nine · a year ago
Then it shouldn't be used for secure input.
talldayo · a year ago
> I assumed

Oh man, this is my favorite part of the Apple Design Cycle!

1. Apple announces a new feature that is suspiciously invasive and only marginally useful (e.g. iCloud Screening, Find My, OCSP, etc.)

2. Self-conscious, Apple releases a security whitepaper that explains how things should work but doesn't let anyone audit their system

3. Users assume that things are okay because the marketing tells them it is okay, and do not ever consider the potential for an exploit

4. The data leaks, either to advertisers, Apple employees, warrantless government allies, government adversaries or OEM contractors

5. Apple customers attempt to absolve themselves of responsibility ("How was I supposed to know?")

I've seen this process so many times at this point that I'm just apathetic to it all. Maybe one day people will learn to stop assuming the best when there is literally no evidence corroborating it.

KerrAvon · a year ago
What data leaks? What are you talking about?
KerrAvon · a year ago
Please note this is fixed:

> The researchers alerted Apple to the vulnerability in April, and the company issued a patch to stop the potential for data to leak at the end of July

KaiserPro · a year ago
They released AirTags without thinking about stalking, so I'm not that shocked.
thebruce87m · a year ago
This has to be a deliberate lie, since it is so easily proven wrong.

Here is the keynote: https://www.youtube.com/live/JdBYVNuky1M?si=46vw7FG3SjWWBezn

9:25 is when they talk about unwanted tracking.

nicolas_17 · a year ago
What?? It had much better anti-stalking features at launch than its competitors like Tile.


bgirard · a year ago
If you look at the video, it's not only the eyes here. There's a huge head movement too. Having a keyboard so large in your FOV that you have to turn your head to type something is a contributing factor.

I wonder what the accuracy is if you drop the eye tracking and only do head tracking on that demo.

dagmx · a year ago
It would be interesting to see both isolated.

I don’t think eye tracking alone would give you the necessary bounds for inferring the keyboard size. For one, eyes flit around more and also are harder to see.

I also wonder how easily this attack is foiled by different key clusters. E.g. it looks like they're relying on large head movements at opposite ends of the keyboard to infer the bounds.

But keyboard use can be very clustered, which would foil the ability to know how wide the user's keyboard is.

I imagine it also breaks when the user moves the keyboard

sparsely · a year ago
Finally all those banks with randomised input grids on their websites are validated!
generalizations · a year ago
It'd be pretty cyberpunk if the mitigation to this is to have your eyes digitally obscured when typing in sensitive data.
steve1977 · a year ago
And we know the only viable option would be simulated mirror shades
wrboyce · a year ago
But then a would-be attacker could simply read what you type in the reflections!
KineticLensman · a year ago
And perhaps replaced with a cartoon ‘x’ if your life signs terminate while you are using the device
adolph · a year ago
Shades of the Lotus Notes “Visual Hash”

https://security.stackexchange.com/questions/41247/changing-...

fidotron · a year ago
This deserves a separate submission.

That is so bad it almost has to be a deliberate method to extract passwords.

tambourine_man · a year ago
This is remarkable. Enterprise software is its own microcosm of pain.
yodon · a year ago
Eye tracking data is incredibly sensitive and privacy-concerning.

HN tends to dislike Microsoft, but they went to great lengths to build a HoloLens system where eye tracking was both useful and safe.

The eye tracking data never left the device, and was never directly available to the application. As a developer, you registered targets or gestures you were interested in, and the platform told you when the user for example looked to activate your target.
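
Very roughly, the shape of that model looks like this (an illustrative sketch with hypothetical names; real HoloLens development is C#/Unity rather than Swift):

```swift
// Hypothetical sketch of the pattern described above; these are not real
// HoloLens (or visionOS) APIs. The app declares targets; the platform keeps
// the raw gaze stream to itself and only reports discrete activations.
protocol GazeTarget {
    var identifier: String { get }
    func activated()   // called by the platform when the user looks to activate this target
}

struct LoginButton: GazeTarget {
    let identifier = "login-button"
    func activated() {
        print("\(identifier) activated; no gaze coordinates ever reached app code")
    }
}

// Platform-side registry (again, hypothetical): apps can only register
// targets and receive callbacks, never read where the eyes are pointing.
final class GazeTargetRegistry {
    private var targets: [String: GazeTarget] = [:]
    func register(_ target: GazeTarget) { targets[target.identifier] = target }
    func platformDidActivate(_ id: String) { targets[id]?.activated() }
}
```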

Lots of subtlety and care went into the design, so yes, the first six things you think of as concerns or exploits or problems were addressed, and a bunch more you haven't thought of yet.

If this is a space you care about, read up on HoloLens eye tracking.

It's pretty inexcusable if Apple is providing raw eye tracking streams to app developers. The exploits are too easy and too prevalent. [EDIT ADDED: the article is behind a paywall, but it sounds from comments here like Apple is not providing raw eye tracking streams; this is about third parties watching your avatar's eyes to extract your virtual typing while you are on a conference call]

simondw · a year ago
> if Apple is providing raw eye tracking streams to app developers

Apple is not doing that. As the article describes, the issue is that your avatar (during a FaceTime call, for example) accurately reproduces your eye movements.

makeitdouble · a year ago
Isn't that a distinction without a difference? Apple isn't providing your real eye movements, but a 1:1 reproduction of what it tracks as your eye movements.

The exploit requires analysing the avatar's eyes, but since those are replicated rather than natural movements, there should be a lot less noise. And of course, since you need to intentionally focus on specific UI targets, these movements are even less natural and fuzzy than if you were looking at your keyboard while typing.

taneq · a year ago
This is a great example of why ‘user-spacey’ applications from the OS manufacturer shouldn't be privileged beyond other applications: because this bypasses the security layer while lulling devs into a false sense of security.
FrustratedMonky · a year ago
But the technology is there. That is the concern.
diggan · a year ago
Does HoloLens also use a keyboard you can type on with eye movement? If not, this seems unrelated to this attack. If yes, then how would it prevent this attack, where you can see the person's eyes? It doesn't matter whether the tracking data stays on-device if you're broadcasting an image of the face anyway.
voidUpdate · a year ago
Not when I used it; you had to "physically" press a virtual keyboard with your hands.


modeless · a year ago
I disagree strongly. I don't want big tech telling me what I can and can't do with the device I paid for and supposedly own "for my protection". The prohibition on users giving apps access to eye tracking data and MR camera data is paternalistic and, frankly, insulting. This attitude is holding the industry back.

This exploit is not some kind of unprecedented new thing only possible with super-sensitive eye tracking data. It is completely analogous to watching/hearing someone type their password on their keyboard, either in person when standing next to them or remotely via their webcam/mic. It is also trivial to fix. Simply obfuscate the gaze data when interacting with sensitive inputs. This is actually much better than you can do when meeting in person. You can't automatically obfuscate your finger movements when someone is standing next to you while you enter your password.
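
To sketch what I mean (a hedged example; `isSecureInputActive` is a hypothetical flag, not a real visionOS hook):

```swift
import simd

// Hedged sketch, not a real visionOS API: while a sensitive field (password,
// PIN) has focus, feed the avatar a fixed neutral gaze instead of the tracked one.
func avatarGaze(tracked: SIMD3<Float>, isSecureInputActive: Bool) -> SIMD3<Float> {
    isSecureInputActive ? SIMD3<Float>(0, 0, -1) : tracked
}
```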

KaiserPro · a year ago
You are an expert user, so of course you will demand extra powers.

The vast majority of people are not expert users, so for them having safe defaults is critical to their safety online.

> It is completely analogous to watching/hearing someone type their password on their keyboard,

Except the eye gaze vector is being delivered in high fidelity to your client so it can render the eyes.

Extracting eye gaze from normal video is exceptionally hard. Even with dedicated gaze cameras, it's pretty difficult to get under 5 degrees of accuracy (without training or optimal lighting).

spease · a year ago
Apple does not provide eye tracking data. In fact, you can't even register triggers for eye position information; you have to set a HoverEffectComponent for the OS to highlight elements for you.
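
For example, making an entity gaze-highlightable in RealityKit looks roughly like this (a minimal sketch; the OS draws the highlight itself and app code never sees where the eyes are pointing):

```swift
import RealityKit

// Minimal sketch: the app attaches components and the system handles the
// gaze-driven hover highlight entirely on its side.
let button = ModelEntity(mesh: .generateBox(size: 0.05))
button.components.set(InputTargetComponent())        // receives indirect (gaze + pinch) input
button.components.set(CollisionComponent(shapes: [.generateBox(size: [0.05, 0.05, 0.05])]))
button.components.set(HoverEffectComponent())         // OS-rendered highlight when looked at
```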

Video passthrough also isn’t available except to “enterprise” developers, so all you can get back is the position of images or objects that you’re interested in when they come into view.

Even the Apple employee who helped me with setup advised me not to turn my head, but to keep my head static and use the glance-and-tap paradigm for interacting with the virtual keyboard. I don’t think this was directly for security purposes, just for keeping fatigue to a minimum when using the device for a prolonged period of time. But it does still have the effect of making it harder to determine your keystrokes than, say, if you were to pull the virtual keyboard towards you and type on it directly.

EDIT: The edit is correct. The virtual avatar is part of visionOS (it appears as a front camera in legacy VoIP apps), and as such it has privileged access to data collected by the device. Apparently until 1.3 the eye tracking data was used directly for the gaze on the avatar, and I assume Apple has now either obfuscated it or blocked its use during password entry. Presumably this also affects the spatial avatars during shared experiences.

Interestingly, I think the front display blanks out your gaze when you’re entering a password (I noticed it when I was in front of a mirror) to prevent this attack from being possible by using the front display’s eye passthrough.

FrustratedMonky · a year ago
"privacy-concerning"

Like checking out how you're zeroing in on the boobs. What would sponsored ads look like once they also know what you are looking at every second? Even some medical ad, and the eyes check out the actress's body.

"Honey, why am I suddenly getting ads for Granny Porn?"

jayd16 · a year ago
HoloLens 2 certainly has support for passing gaze direction; not sure about the first one.

I think the headsets are pretty much aligned on it being a feature that needs permissions, but they'll provide it to the app with focus.

Apple is a lot more protective.

lostmsu · a year ago
I personally view this as gatekeeping, which should be outright illegal.
dopylitty · a year ago
As far as I know, eye tracking isn't available in visionOS[0].

This article snippet is behind a paywall but it seems like it’s talking about the eyes that are projected on the outside of the device.

So basically it’s no more of an exploit than just tracking someone’s actual eyes.

0: https://forums.developer.apple.com/forums/thread/732552

bookofjoe · a year ago
Go behind the paywall here: https://archive.ph/44zwN
voidUpdate · a year ago
The article is talking about avatars in conference calls, which accurately mirror your eye position. Someone else on that call could record you and extract your keyboard inputs from your avatar.
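
As I understand the paper, the last step is roughly to snap each recovered fixation to the nearest key. A made-up sketch (hypothetical key layout and coordinates, not the authors' code):

```swift
import simd

// Made-up illustration of the final mapping step: fixation points, recovered
// from the avatar's eyes and normalized to the inferred keyboard bounds, are
// assigned to whichever key centre is closest.
let keyCentres: [Character: SIMD2<Float>] = [
    "q": [0.05, 0.8], "w": [0.15, 0.8], "e": [0.25, 0.8],
    "a": [0.10, 0.5], "s": [0.20, 0.5], "d": [0.30, 0.5],
]

func guessKey(for fixation: SIMD2<Float>) -> Character? {
    keyCentres.min { simd_distance(fixation, $0.value) < simd_distance(fixation, $1.value) }?.key
}

let fixations: [SIMD2<Float>] = [[0.06, 0.79], [0.21, 0.52], [0.24, 0.81]]
print(String(fixations.compactMap { guessKey(for: $0) }))   // prints "qse" for these made-up points
```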

Enabling "reader mode" bypasses the paywall in this instance

jrockway · a year ago
I think the underlying flaw here is that pointing your eyes at a virtual keyboard in space to type passwords is just a poor input method. Take away the VR headset and do the same thing and the flaw still exists.

Now I want to make a keyboard where you shine a laser pointer at the key you want to press, and your cat jumping up is what actually triggers the button press.

karlgkk · a year ago
> I think the underlying flaw here is that pointing your eyes at a virtual keyboard in space to type passwords is just a poor input method

FWIW, while you can do that, it's much easier to just poke the keys or use Siri.

A folding Bluetooth keyboard with a built-in trackpad has become a must-have travel accessory for me :)

iwontberude · a year ago
I don’t have letters on any of my keys and switch between keyboard layouts frequently. I never look at my keyboard; am I still vulnerable?
jrockway · a year ago
Definitely not. The keyboard on Apple Vision Pro is an on-screen keyboard you type on with your eyes, and the headset reproduces your eye movements on your Persona avatar (and on the front display). The combination of the two is what leaks your password. If you are just in VR looking at a virtual keyboard to type, it's no big deal. If you are typing on a physical keyboard and people are videotaping your eyes, it's no big deal. The combination of the two is the problem.
LelouBil · a year ago
> I never look at my keyboard

The article title:

> Gaze estimation

It doesn't seem like it

AfurikanTedoku · a year ago
First author here. Here is our pre-print: https://www.arxiv.org/abs/2409.08122

I am happy to answer questions in this thread. :)
