Amazing how well eye tracking works on my phone (15 Pro).
Unfortunately, there seems to be no way to press buttons by blinking, only by "dwelling" on an item for a few seconds, which makes using my phone feel quite hectic and, to me, prone to inadvertent inputs.
I think interfering with a biological necessity that exists to maintain eye health probably isn't a good candidate for HID input. I suspect the user would end up with very dry eyes as they consciously or subconsciously refrained from blinking, even if the trigger were only long blinks (which I often do when my eyes feel the need).
Now I could see a wink working! Left wink, right wink. And, with a wink, you don't lose tracking during the action (just half the signal).
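For what it's worth, ARKit's face tracking already reports per-eye blink coefficients on TrueDepth devices, so a wink-to-click prototype is feasible today, outside the accessibility stack. A rough sketch; the thresholds, debouncing strategy, and callback names are all made up:

```swift
import ARKit

/// Hypothetical wink-to-click prototype using ARKit face tracking
/// (TrueDepth devices only). Thresholds are guesses, not Apple's.
final class WinkDetector: NSObject, ARSessionDelegate {
    let session = ARSession()
    var onLeftWink: (() -> Void)?
    var onRightWink: (() -> Void)?

    // A wink = one eye well past the blink threshold while the other stays open.
    private let closed: Float = 0.8
    private let open: Float = 0.3

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }
        let left = face.blendShapes[.eyeBlinkLeft]?.floatValue ?? 0
        let right = face.blendShapes[.eyeBlinkRight]?.floatValue ?? 0
        if left > closed && right < open { onLeftWink?() }   // left-eye wink
        if right > closed && left < open { onRightWink?() }  // right-eye wink
        // A real implementation would debounce so a wink fires only once,
        // and ignore frames where both eyes close (a normal blink).
    }
}
```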
It's really bad on my iPhone 13. Surprised they released it in this state. After one or two clicks it needs to recalibrate, and there doesn't seem to be a way to not click on things either. No way to change apps. After the third recalibration, I selected "yes" to "would you like to disable eye tracking", and while eye tracking was disabled, I also lost the ability to swipe down for Control Center or swipe up for the App Switcher. I had to restart the phone to get things back to a usable state.
Just tried it on a 16 Pro Max and it’s insanely good. I really expected it to be a lot more random and buggy, but had no problem navigating the phone and “clicking” buttons etc.
This is basically the sort of “future tech” I imagined as a kid. We can now talk to computers and they talk back (ChatGPT, Apple Intelligence, etc.), and flawlessly navigate our portable supercomputers just by looking at them.
It is amazing how well iOS supports these accessibility features, yet it doesn't offer an option to block video autoplay on websites, something that is incredibly distracting for people with ADHD.
Since when? Safari used to be the only browser that forced user interaction prior to autoplay. Are you sure you didn't manually activate a feature flag or have an extension installed?
Cursor tracking is OK, but the implementation seems to simply stand in for a low-level pointing device, i.e., it's very precise and jittery: all attribution and no salience.
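Something like a simple low-pass filter over the raw gaze point would trade that jitter for a bit of lag. A toy sketch (alpha is a made-up tuning constant; a one-euro filter would be the adaptive version):

```swift
import CoreGraphics

/// Toy low-pass filter for a jittery gaze point: exponential smoothing.
/// Lower alpha = steadier pointer, but more lag behind the eye.
struct GazeSmoother {
    private(set) var point: CGPoint?
    var alpha: CGFloat = 0.2

    mutating func update(with raw: CGPoint) -> CGPoint {
        guard let prev = point else { point = raw; return raw }
        let smoothed = CGPoint(x: prev.x + alpha * (raw.x - prev.x),
                               y: prev.y + alpha * (raw.y - prev.y))
        point = smoothed
        return smoothed
    }
}
```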
Also, maybe it should be modal, like Siri: e.g., dwell away to silence it, then dwell on a leading corner to say "Hey, listen..."
Holding the phone seemed to cause problems ("you're not holding it right"). It's probably best with fixed positioning, e.g., attached to a screen (like a Continuity Camera), assuming you're lying down with a fixed head position.
Tracking needs a magnetic (gravity well?) behavior, where the pointer is drawn to UI features (and the user can retract by resisting). Salience weighting could make it quite useful.
It's possible that weighting could piggyback on existing accessibility metadata, or it might require a different application programming model.
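To make the gravity-well idea concrete, here is a hypothetical sketch: every target within a capture radius pulls the pointer toward it, weighted by salience and distance, and looking away past the radius releases the pull. All the names and the falloff curve are invented:

```swift
import CoreGraphics

/// Sketch of "magnetic" pointer behavior: nearby UI targets pull the gaze
/// point toward them, weighted by salience and distance.
struct Target {
    let center: CGPoint
    let salience: CGFloat  // could piggyback on accessibility metadata
}

func attract(_ gaze: CGPoint, toward targets: [Target],
             radius: CGFloat = 60, strength: CGFloat = 0.5) -> CGPoint {
    // Score each target inside the capture radius: near + salient wins.
    let scored: [(Target, CGFloat)] = targets.compactMap { t in
        let dx = t.center.x - gaze.x, dy = t.center.y - gaze.y
        let d = (dx * dx + dy * dy).squareRoot()
        return d < radius ? (t, t.salience * (1 - d / radius)) : nil
    }
    guard let (best, pull) = scored.max(by: { $0.1 < $1.1 }) else { return gaze }
    // Move partway toward the winner. The user "resists" by looking away;
    // once the gaze leaves the radius, the pull releases.
    let k = strength * pull
    return CGPoint(x: gaze.x + k * (best.center.x - gaze.x),
                   y: gaze.y + k * (best.center.y - gaze.y))
}
```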
Similarly, it would be interesting to combine it with voice input that prioritized things near where you are looking.
I'm willing to try, and eager to see how it gets integrated with other features.
I too tried this for a short while and was not impressed. However, I can’t help but wonder how ‘good’ I could get at using it if I invested more time in it. I would love to hear from someone who truly uses this tool regularly. Flying a plane is also quite cumbersome for the first 15 minutes.
There are some downright neat things in the iOS accessibility options. Example: you can set it so that a triple tap on the rear of the phone turns the flashlight on/off. People think it’s witchcraft how fast I can pull the phone out and switch it on without looking down.
I'm on Android and just wish I had a simple button (or switch) dedicated to the flashlight.
Back when I ran LineageOS, it was easy to commandeer a physical button and repurpose it for this. Is it still difficult to do under the stock OS? (Note: I'm already using a double tap of the power button to activate the camera.)
E.g., it was frustrating how, for the longest time, Samsung wouldn't let you repurpose the Bixby button.
That way when a website blinds me at night because they didn't implement dark mode, I do a quick triple-tap and magic-presto, poor man's dark mode! And after I close it, another triple-tap to go back to normal.
I'm really just waiting for the OS and browser to do "fake dark mode" [1] whenever they detect something with a white/light background. Seems like it's about time.
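The "detects something with a white/light background" part is essentially a luminance threshold. A sketch using the WCAG relative-luminance formula (the 0.7 cutoff is a guess):

```swift
import Foundation

/// Rough "should we fake dark mode?" check using the WCAG relative
/// luminance formula. Inputs are sRGB channels in 0...1.
func isLightBackground(r: Double, g: Double, b: Double) -> Bool {
    func lin(_ c: Double) -> Double {  // sRGB channel linearization
        c <= 0.03928 ? c / 12.92 : pow((c + 0.055) / 1.055, 2.4)
    }
    let luminance = 0.2126 * lin(r) + 0.7152 * lin(g) + 0.0722 * lin(b)
    return luminance > 0.7
}

// isLightBackground(r: 1, g: 1, b: 1)        // true  -> invert
// isLightBackground(r: 0.1, g: 0.1, b: 0.12) // false -> leave alone
```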
The physical switch on my old iPhone to toggle silent mode stopped working, and sometimes it toggles itself. I had to set up the triple tap to toggle silent mode because the alternative is like 20 clicks deep in Settings.
On Android I get the flashlight by double pressing the "lock" button. It's my single most useful shortcut, my flashlight is on literally before my phone is out of my pocket.
I love how many neat and handy features are baked into the iOS accessibility settings. Apple should make more noise about these features. I recently discovered there’s a white noise generator baked in, which meant I could get rid of the annoying white noise app I had been using to keep my napping baby asleep. Another recent discovery was the on-screen anti-motion-sickness things that move around with the accelerometer. I get terrible motion sickness, and they make a massive difference.
They do mention them in their press releases, they’re featured during accessibility day and they’re occasionally in ads too. I’m not sure what more noise would be relevant for these features.
The "reduce white point" is a great hidden feature in Accessibility that dims your backlight even further for nighttime reading. You can even assign it to a triple-click or backtap (another great hidden feature).
I played around with this a bit. It doesn’t work amazingly well on my iPhone model (SE 3rd gen), but it’s pretty cool. I don’t think there’s an API to use it in apps yet, but I would love to make an eye-controlled mobile game.
I know of at least one: “Before Your Eyes”! Though I played it on a PC with a webcam, and I believe the mobile version is only available to Netflix subscribers, I would strongly recommend it! It has a well-told story with a very unique means of interaction.
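There's no public API for the new accessibility eye tracking as far as I know, but ARKit's face tracking does expose a gaze estimate (lookAtPoint) on TrueDepth devices, which might be enough for a game. A minimal sketch; projecting the point into screen coordinates is the fiddly part and is left out:

```swift
import ARKit

/// Sketch: reading ARKit's gaze estimate for a hypothetical eye-controlled
/// game. lookAtPoint is in face-anchor space (meters); mapping it onto the
/// screen needs the camera transform and is omitted here.
final class GazeReader: NSObject, ARSessionDelegate {
    let session = ARSession()
    var onGaze: ((simd_float3) -> Void)?

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else { return }  // TrueDepth only
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            onGaze?(face.lookAtPoint)  // estimated point the eyes are focused on
        }
    }
}
```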
Go to Settings > Accessibility > Motion and turn on "Reduce Motion"
Also, audio doesn’t autoplay on any websites. You have to interact with the page first for that to happen.
I thought, “Wow, cool! Quick flashlight!” Instead, the flashlight would frequently turn itself on while in my pocket.
And on the flip side, when I wanted the flashlight to come on, the phone frequently wouldn’t recognize my tapping.
It just ended up being quicker and more convenient to turn on the flashlight from the lock screen.
[1] https://darkreader.org/
https://www.beforeyoureyesgame.com
https://apps.apple.com/us/app/theparallaxview/id1352818700