It's funny how I've witnessed a complete 180° change: in the 1990s, software was supposed to be entirely discoverable and came with a manual that documented literally everything; now there's a ton that's hidden, often without any manual whatsoever... but you can Google everything you need to know.
Probably the single most useful skill I've had to learn in my life is, whenever you wonder something, just Google it. If you don't think your software does X... don't just assume it. Look it up.
I've been astonished at how often a feature was added 3 years ago to a program I've used for 8 years, or there's a secret swipe that avoids a bunch of menus, or an unofficial command-line flag.
And it really leaves me feeling deeply conflicted. Because on the one hand, I still believe in the virtue of learning your tools inside and out. I would read the manual and be proud I knew exactly what every program/language could and couldn't do. But on the other hand, is that really just a waste of time? Programs have so many features now that instead of learning them all via discoverability or a manual, we just learn them by... querying how to do things when we need them.
I don't really like the idea of so fundamentally relying on Google and forums and tutorials and Reddit and YouTube videos as the main way of learning how to use software. But at the same time, software does so much now and adds new things so quickly that it appears to be the only reasonable way.
> we just learn them by... querying how to do things when we need them.
The problem with this is that you don’t learn any new features that you wouldn’t have come up with on your own.
I loved reading the well-written manuals and “on-line” (integrated into the software) help of the 90s (and 80s) that explained all the concepts and features of the particular software. You’d learn a ton of new ideas and possibilities. It was really fun and exciting, and it gave you the perception of a well-designed and well thought-out whole. You built up a consistent mental model of the software, and you thus ended up with the feeling of mastery and control over the software.
Nowadays it feels more like poking the software with a stick in an attempt to build a mental model and discover its capabilities by trial and error, often remaining in doubt about the actual intent of how exactly things are supposed to work.
As annoying as they can sometimes be, the little tips that pop up every now and then with "Hey, did you know you can do this?" are a neat little solution to this. I really like the unobtrusive ones, like the GitHub "ProTip!" at the bottom of the pull requests page.
It does get a bit ridiculous though. There are entire sites and content creators focused solely on VS Code tips. At what point are we gonna end up with engineers working on adding a feature to VS Code that they didn't know already existed?
I swiped up to dismiss the Safari browser on my iPhone this morning and accidentally invoked a half-height bookmarks view. I wasn’t sure if it was a new gesture in iOS 16 or if it had been there for a while. I wasn’t able to reproduce it, and a brief web search didn’t find mention of it in the first few hits. I didn’t really care, so I gave up looking.
If anyone knows how to invoke this feature, I’d love to know. Just not enough to try to find out from checking more than about three search results. Anyway, ahhh discoverability. I remember you…
Haha, yes! I'm a software engineer and it took me a decade of smartphone usage before I realized that long-pressing various keys (e.g. vowels, symbols) shows a pop-up allowing you to enter related characters (e.g. á, æ, ¡, ₹, ±, ·, ½, ²)†. Blew my tiny little mind! If I were using a language that required the variants menu in its native keyboard layout, I might have discovered this a decade before by explicitly searching for how to enter certain characters.
† Typed these on mobile, of course. Android, in case you're wondering.
I feel like I’m reading more manuals than ever. Libraries, CLI tools and other specialized software have well written and very useful docs. Same with pro camera software, music gear software, and more.
For “consumer” software I usually don’t need manuals, I can find what I need via poking; this often includes power user features (when I use something sufficiently often).
Is it possible that what you’re feeling is nostalgia? Have you actually read the manual for your phone?
Because modern software engineering ( cough ) doesn't do well-designed or well-thought-out software any more. It is all about "Move fast and break things"; while the phrase was coined by (or attributed to) Mark Zuckerberg, that mentality or philosophy largely came from Google during the Web 2.0 era.
This makes some sense for web product development, where you want to find market fit and VCs and media were all over it. But it doesn't work as well for desktop and other types of apps, or in systems programming.
And it is funny that we're starting to see pushback against it 10-15 years later.
Have you tried googling "iphone manual"? The first result is the iPhone User Guide [1], which is a comprehensive manual that does document literally everything.
The manual is also available as an 843-page ebook [2], and is accessible via the Tips app, which is preinstalled on the home screen.
It's not quite like the days when a paper manual would be right in the box, but it's close. And I think most people would be unhappy if their sleek new phone came with an 843-page tome (not to mention how much paper that would waste).
That doesn't seem to have a search feature? Not that a paper manual has one, but flicking through quickly is a lot easier on paper than on screen.
I tried to find the answer to a problem I had and it's not there. I inherited an iPhone where the side ringer mute toggle bounces back after setting; it's a mechanical fault. There is no software "unmute" override - unless you use AssistiveTouch. It took quite a while of wading through Google results and well-intentioned-but-clueless advice from people before I found someone who revealed this. The manual just says AssistiveTouch can help adjust volume, so it is technically true I guess, but the keyword "mute" is not there.
I did learn about the triple-click shortcut from that online manual just now, so I should thank it for that (even though it neglects to mention it needs to be enabled first).
It's hard to Google things you don't know exist. And it's hard to Google things that changed. If you've ever had a problem on macOS, Googling is only useful if the problem is new; if it's been an issue in older versions, you're going to find all those questions and maybe some answers, but they likely don't apply because Apple broke it differently now. Other vendors are not immune to this either, of course. I understand Apple actually does have manuals; they just don't print them, and if they reference them, many people (including me) never noticed.
Google only helps if you know what you're looking for and it's a search that it happens to give you good results on. I've increasingly found the latter to be a problem, because when it comes to the point that I need to search online, it's probably not something easily found.
Apple actually does have an extensive and up to date manual on their products which includes all those obscure convenience features. I don’t think much has really changed in this regard. The manual exists, is very easy to read and understand, but people don’t care because they get by without it just fine.
All too often the search results are padded with advice for some other version than the one I am having trouble with, and it is difficult or impossible to craft a query specific enough to weed them out as the information simply isn't there.
Then there is the problem of authors using product- and version-specific jargon that I haven't mastered yet, precisely because one can usually pick up the basic operations without reading a manual - "1. From the customization agent..."
As an aside, the ability to stop yourself and know when to Google something is what’s starting to separate the truly “smart” from everyone else now - whether it’s coding, general knowledge and discussions, or even when you’re just sitting on the toilet musing about some random question.
Especially in coding I see this as the major difference between a good and a great engineer: googling things at the right time to find the best solutions. You don’t have to build it all yourself.
Macs in the 90s, so System 6, 7 and 8, were full of hidden stuff. Clicking stuff while holding option, you would discover all kinds of interesting features and easter eggs.
I think you just articulated what my #1 blocker to mastering vim and/or emacs has been: they're both designed for the "mastery through a thick manual" era. Dashing off to Google answers my immediate query, but it's often far less instructive than a well-organized manual with an index at letting me discover the adjacent thing I actually need.
OK, that's my #2 blocker. My #1 blocker is that I am no longer 15 and have things to do other than just poke about until I master a new text editor.
> My #1 blocker is that I am no longer 15 and have things to do other than just poke about until I master a new text editor.
100% this. Once upon the time my attitude would have been a bratty "you just don't want to do the work to master your tools!" My response to that now is "no the fuck I do NOT, I want problem number 8131 solved yesterday so I can move on to problem number 8132 which btw requires entirely different tooling."
For me, those are ancient tools for ancient times, designed when things really were different. We have other capabilities now; nobody is forced to work on the CLI anymore. I also refuse to spend weeks to "master" my editor.
Typing code is literally the least of the things slowing me down when creating software.
Yes it's great that we can Google/DDG most stuff at a whim. And there are often many great tutorials both written and video searchable online on a lot of software.
That does NOT solve the original discoverability problem.
> I've been astonished at how often a feature was added 3 years ago to a program I've used for 8 years, or there's a secret swipe that avoids a bunch of menus, or an unofficial command-line flag.
THAT is the problem right there. You're likely a typical example of someone who is smart, knowledgeable, experienced, and motivated, probably beyond what used to be called a "power user". Yet great swaths of features, and the work of scores of designers, developers, and testers over years, simply escape our notice.
The result is that vast amounts of productivity are squandered until we discover a feature by chance, accident, or an encounter with some "You didn't know this!" post.
I've occasionally seen introduction wizards be somewhat helpful, but they are never in any kind of depth. It's like they are afraid to show you what the software can really do - but the real reason is some marketing dude thinks (likely sometimes correctly) that if the intro is too long, people will think the software is complicated and won't use it. If that's the case, we're again being dragged down by the lowest (or most ordinary) common denominator.
Solutions? How about the Quick Tour, and even the outline/TOC of a full manual, so we can at least skim it and see if some feature exists and then Google/DDG it...?
I think relying on Google might be going too far, but there is certainly a middle ground. My Linux tool usage vastly improved by switching to a newer shell, with autocomplete, pop-up man pages, syntax checking of arguments before executing, etc. Same with vim and tmux. Also using the modern CLI tool replacements that Rust made popular. I wasted so much time when I was younger memorizing things that provide very small to no advantage over doing the above instead.
So, I'd say: Google can be wrong; the ideal case is to modify your existing tools as needed to maximize discoverability and minimize friction to actual documentation.
This pattern is quite global. Everything is "more available" now, streamed in real time; there's no apparent need for a priori organization since we can correct on the fly. Or so it seems.
Games are patched live, browsers are in constant update, etc.
> I've been astonished at how often a feature was added 3 years ago to a program I've used for 8 years,
This looks suspiciously similar to the lifecycle of a social network, especially how they grow new features and eventually die when users migrate.
Maybe it's a viable strategy for a social network, mobile OS, MMORPG, or anything-as-a-Service to not explain features or educate users, but to grow as users consume added mechanics, until it becomes impossible for new users to catch up and for old users to keep up?
I actually miss picking up books like the Macintosh Bible where you can read through UI features. There was something fun about reading and then trying it in front of the keyboard.
I'm very, very tired of the size of the backlog of stuff I probably ought to Google. It's too freaking long. Just make it easy to do without looking it up!
I really LOVE how Rust approaches this: provide as much help as possible in the compiler error messages. Not sure if they were inspired by Elm, which does this too, but it means you do a lot less context switching between your program and “the manual”/Stack Overflow/whatever else.
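To make the parent's point concrete, here is a small sketch (my own illustration, not from the thread). If the `.clone()` below were a plain move (`let t = s;`), rustc would reject the later use of `s` with `error[E0382]: borrow of moved value`, point at the exact offending expression, and suggest a fix; the version shown has that fix applied and compiles:

```rust
// Illustration of Rust's helpful diagnostics (hypothetical example).
// With `let t = s;` instead of `s.clone()`, rustc emits
//   error[E0382]: borrow of moved value: `s`
// along with a concrete suggestion to clone the value.
fn duplicate(s: String) -> (String, String) {
    let t = s.clone(); // the fix the compiler itself suggests
    (s, t)
}

fn main() {
    let (a, b) = duplicate(String::from("hello"));
    println!("{} {}", a, b); // both copies remain usable
}
```

The exact wording of the diagnostic varies between compiler versions, but the pattern - an error code, a precise source span, and a suggested fix - is what lets you stay in the editor instead of switching over to a manual or a search engine.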
In the 1990s, plenty of people felt obligated to drive to the mall, walk into B Dalton, and purchase a 300 page book to learn how to use their new software. There was a whole section for it.
Today, they can find the equivalent information for free without leaving the device where the software is running.
Hard to say, though, whether those books were in fact published by opportunists who surmised (correctly) that there was an audience out there that was either intimidated by the software manual (or assumed they would be without actually trying to read it), or had pirated the software and so had no manual.
I haven't encountered a professional software product without a manual yet. What you describe applies to software for mass consumption only. It might change in the future, but it would be a bit ridiculous for a company to sell a product for a professional without documentation.
The G in "GUI" stands for "graphical." People forget that when they defend shitty, undiscoverable UI. The examples in this article are right on point. Apple (and others) far too often simply punt and give up on doing the necessary design work, burying functionality behind some bullshit like "long presses" or "gestures" or modifier keys or modifier-key-clicking on a text label that isn't even demarcated as a control.
UI design takes time and thought; that's true. But why does a company with Apple's resources not bring the necessary effort to bear on it? It's not as if they can't afford it. Sure, there are time constraints, but if you're going to fold to those and trowel out shitty UI, then don't presume to publish "human interface guidelines" or other such bibles.
And it's high time to stop tolerating smug douchebags who denigrate any complaint about missing functionality and declare, "Well, you simply long-press Option-Shift-Command-Q to bring up the configuration menu."
As an interface designer I am completely confident that you're looking at old usability through rose-colored glasses. Most software these days doesn't come with a big monolithic manual because it doesn't need one. More people directly use software now than ever before, and most of them will probably never read a single line from a software user manual; not because manuals don't exist, but because they don't have to. The vanishingly small percentage of folks very experienced with interfaces designed for professional software users, often built on old interface flows and metaphors, get frustrated with software designed for everybody else. Your understanding of software usability isn't representative; that's why almost no open source software, no matter how big the manual is, gains anywhere close to the same usage as its commercial equivalents.
That's not what the article says though... do you really think "everybody else" would have discovered the actions described without any sort of help page?
I think modern software does not come with manuals because it simply has severely reduced functionality compared to the old days. For example, when was the last time you used navigation software with "avoid area" and "add custom road" features?
There was always a tradeoff between "more features thus more UX complexity" (power user approach) vs "cut features to keep UX simpler" (UX designer approach). Sadly, the designers won.
You know, I've had experience teaching elderly and computer-illiterate people tech. It was much easier to teach them Windows in early 00s than iOS today, because the former, while looking more complicated, was also very consistent - once you explained the basics like drop-down and context menus and drag and drop, those things generally worked everywhere throughout the system, even in new apps they haven't seen before.
Conversely, with iOS, the problem isn't just that all those techniques are undiscoverable - it's that the tricks you learn for one app will rarely work in another. Well, that, and the fact that they regularly change basic UX, too, like removing the physical home button.
The Back Tap feature is my favorite "hidden" action on the iPhone. You can double (or triple) tap the back of the phone to trigger whatever action you want. For example, toggle the flashlight or lock rotation. Show Spotlight or run a Shortcut.
You have to adjust your grip so that you can robustly tap the back with your index finger (holding the phone firmly between the other four fingers and your palm/base of your thumb). I agree that it’s not really practical.
> I think Apple does a decent job telling people about iOS features across the User Guide, Tips.app, and the Apple Support YouTube channel.
I have a nit-pick with the article: all of the listed features are features for which an obvious, more intuitive interface already exists, even if it may be clunkier.
E.g. Calculator: the way “real” IRL calculators change the number you type is with the “clear” button. iPhones support that. Personally, I didn’t know about the swipe, and I think they should add a backspace button, because swiping is confusing and younger people may not understand “clear” either. But you don’t need to swipe to use the calculator.
Keyboard cursor: This feature was demo’ed live when it was released, and it added a more dexterous way to position the typing cursor besides just touching the text where you want (old way, still works). Touching the screen where you want to type is pretty discoverable, and has been a stable feature for a while. It matches how mice and word processors on desktops would work, and seems intuitive to just tap on screen where you want to type anyways.
Safari Tab Change: You can just go to the safari tab page, available by touching a permanent on-screen button. Also, the swipe to change tab feature mimics the operating system’s swipe to change app experience, where the URL bar is indicative of the system’s handlebar. (Also demo’ed live when the feature was launched).
> E.g. Calculator: the way “real” IRL calculators change the number you type is with the “clear” button. iPhones support that.
The screen recording of the calculator app seems to just have "C", which I would assume deletes the whole number, not one digit. That's not the same action.
The Android calculator has a backspace button, that removes a single digit. It's clearly recognizable as a backspace, just like on any other on-screen keyboard input.
> 2. Move the insertion point by dragging around the keyboard.
Anyone know how to go down doing this? Left, right, and up are all good, but I can’t figure out how to go down consistently. The 1.5mm below the space bar is really fiddly.
What do you do when you are using a traditional laptop trackpad, you're dragging an item downward but your finger is already at the bottom of the trackpad?
(Hint: the amount the cursor moves does not only depend on the dragged distance on the trackpad.)
Now do the same thing for the on-screen virtual trackpad.
Is this to work around the way Apple has utterly and bafflingly broken the insertion-point handling in iOS?
It worked fine for years, and now suddenly it takes five tries to invoke the "select all", "paste", or whatever menu. And moving the insertion point? Cumbersome shitshow.
Note that enabling "Back Tap" will cause a daemon to run in the background at all times, taking a couple percent CPU (to detect this action, of course). I don't have very good numbers on the battery life impact, but my guess is that it costs perhaps half an hour or so. Might be worth keeping in mind when deciding whether to enable this feature.
> Note that enabling "Back Tap" will cause a daemon to run in the background all times taking a couple percent CPU
Is this true? I was under the impression that iPhones (and most modern high end devices) had dedicated hardware to monitor these sort of features, and triggers a system event in the OS (the same way you don’t need a daemon to check for a keyboard or power button).
I had the screenshot function mapped to back double tap when I attempted to use an iPhone last year; the result was a gallery full of screenshots triggered by putting the phone on a table.
Wow, these are great! The back tap feature seems really handy, and it even worked with my case on. I guess they're doing it by sensing phone movement/rotation rather than some sort of touch sensor on the back.
The actions listed in the article are fairly non-essential, but many fundamental iOS functions also have discoverability issues, as well as some head-scratching design choices. For example:
- seeing notifications requires swiping down specifically from the top left
- turning on the flashlight requires swiping down specifically from the top right. Swiping down from top left also gets you a flashlight button, but it is not actionable. The flashlight button on the home screen is also not actionable.
- universal search requires swiping to the left of the app pages. Swiping past the right end of the app pages also gets you a search bar, but it will only show you apps
- Seeing your open apps requires swiping up slowly from the bottom of the page.
- turning the phone off requires holding two unrelated buttons
Mobile UX generally is heavily-dependent on gestures, which inherently creates discoverability challenges, but it seems like Apple goes out of its way to hide every basic function behind a very specific swipe. It makes me wonder if they intentionally design their products to be exclusively usable by tech-savvy people.
Another possible explanation is that the design requires users to develop muscle memory over time, which improves the experience in the long run and makes competitors feel unnatural.
Did you perhaps skip the on-boarding process? A lot of what you mention has been part of the Getting to know your iPhone on-boarding since they were introduced on the iPhone X. Specifically where to pull down from and pull up from.
I’m not sure what you mean by the flashlight not being actionable. Notification shade/Lock Screen buttons are long press actions to prevent accidental activation.
On iOS 16 there’s now a button on the launcher to invoke search in case people want to avoid swiping
You say they design their products to be used by tech-savvy people, but gestural design is meant to become intuitive after the first tutorial. Indeed, the only people I know who struggle with it are tech-savvy people who skip the on-boarding, in much the same way that people who think they’re handy disregard IKEA instructions, etc.
I am not talking about myself personally. Users like my mom, who have trouble navigating AirBnb, are going to have trouble remembering 8 different swipe actions, even if they go through a tutorial. A designer needs to anticipate users skipping tutorials or just forgetting them. A tutorial is not an alternative to usable design.
I’ve been using iOS devices for years and getting what I want when I swipe down is a constant roll of the dice. Did I do it in the ambiguous zone for the control center or did I do it where I’ll get recent notifications? Oh wait if I’m on my phone instead of my tablet I need to swipe up for the control center. Why it needs to be different on the tablets I’ve never really known, it used to be from the bottom and it was really nice for it to be consistent. Maddening.
I don't know how it is in other countries, but here in Japan, when you buy a phone in a mobile carrier shop, they set it up for you, which means any on-boarding that exists on the phone OS is skipped entirely.
The home button has like 11 different uses based on context and how you press it. It's just insane.
I kinda secretly judge UX people that enjoy iOS. And somehow almost all of them are Mac/iPhone users. I think that is a bit responsible for the uptick in form over function lately.
> - seeing notifications requires swiping down specifically from the top left
Not really, swiping from anywhere on the top (except top-right) works. This is consistent with the other gestures: swipe up from the lock screen to unlock it (just like you would open a roller shutter); swipe down to lock it again -- and see the notifications. You can also simply hit the side button.
> - turning on the flashlight requires swiping down specifically from the top right
You can turn it on from the lock screen, as others have pointed out.
> - turning the phone off requires holding two unrelated buttons
Yes, so that you don’t turn it off by mistake. Smartphones are not devices you turn on and off; outside of OS updates they stay on forever.
> It makes me wonder if they intentionally design their products to be exclusively usable by tech-savvy people.
I would say the opposite. When I think of tech-savvy people, I think about mouse and keyboard; when I think of non-tech-savvy ones, I think about touch displays and gestures (and voice).
My 92-yo grandfather barely knows how to make his printer work but has no trouble remembering the most common swipe gestures: it’s hard to forget about pinch to zoom or swipe down to 'close' the phone and swipe up to 'open' it again.
I've been using iOS devices for a decade and a half, and empirically I'm worse than 50/50 at successfully invoking the show-me-notifications gesture. Hitting the power button twice in sequence (lock -> wake) has been my heuristic.
I suspect that designers are tasked with too many features on too small of a device. On a desktop with a mouse + keyboard there is a potential for so many shortcuts ... you have to get much more creative on a touch-only device.
I don't endorse it however. I think pop-up menus (even the much maligned hamburger) are at least a way to make many more features discoverable. I dislike "gestures".
I think this constraint actually created some of the best UI ever. Desktop designers had gotten lazy and just stacked tool bar over tool bar and then sidebars too with hundreds of icons everywhere.
Was it really more discoverable to have the icon always on screen but embedded among so many other useless icons?
I think many of these are inherently poweruser features though. Most non tech-savvy people will rarely have to turn off their phone or clear their open apps (really if the memory eviction system is good this shouldn't even be a concern).
Some of the other stuff you mentioned is discoverable in my opinion. Swiping down from home gives you a search bar. Notifications being accessible by swiping down from the left is admittedly not as good as swiping down from the top bar in the older iPhones, but I think still more discoverable than the Windows notification system that requires you to click a button.
Turning off the phone is useful if you are low on power and don't have a way to charge. I do this sometimes when traveling or in the city.
When I go through open apps, it is usually to get back to something I was doing, rather than to clear it. Finding what you were just looking at 2 seconds ago shouldn't be considered a power user feature.
From time to time you need to reboot the phone to get some nonworking app or system function unstuck. I regularly have to guide family members to reboot their phone or tablet, and have to google each time which button combination has to be pressed on the particular iPhone or iPad model.
What I find most annoying in iOS UX is how much information is tucked behind the "Share" icon. Having to click the share button to access "Find on Page" is super unintuitive.
To add insult to injury, sometimes the share sheet is really, really slow! Like several seconds with no UI feedback to indicate anything is going on! That’s quite frustrating when I want to quickly invoke my password manager or search on page.
I agree, the share sheet is just too confusing. Favorites management, my password manager, air drop, texting, find on page, and the kitchen sink are all in there and are quite undiscoverable if you don’t know to look for them.
To add some confusion on top of that, some features and extensions go in the similarly un-discoverable “aA” button in the URL bar, so it’s not even like everything goes in the share sheet. That button is really tough to remember even exists since it goes away when you scroll.
You don’t have to use Share for Find on Page. Just type in the address bar, then “On this Page” will be the bottom option. Obviously, that doesn’t seem to be sufficiently discoverable as well.
The share button also has actions that vary unnecessarily.
For example, in SFSafariViewController there is no “add to home screen” button, but in actual safari there is. Despite them both being web view experiences controlled entirely by Apple.
(And you can’t detect SFSafariViewController vs Safari as a web app, so good luck onboarding users for your PWA.)
Have you looked at Apple’s Shortcuts app, which allows some customisation of the share actions? E.g. it is the way to run JavaScript snippets on a web page in Mobile Safari (equivalent to javascript: bookmark URLs in desktop browsers). I’ve only tried it on iPad, but I presume it works the same on iPhone (it would be surprising if it didn’t).
The Shortcuts App is the way to access some deep and very unobvious functionality.
It's a little ironic how at the advent of the iPhone, the desktop os was mocked as being clunky and full of those things you just have to know, rather than being discoverable. The iPhone was limited and simple. And it really helped adoption of smartphones. Now it's assumed everybody is already familiar with smartphones and welcomes yet another shortcut or gesture to make usage quicker. Take Android and the removal of the three buttons at the bottom, in exchange for some gestures. Just so the interface looks cleaner. Imagine someone who has never used a smartphone before would be starting out with that. Just hand them the phone and see how long it takes them to return to the home screen after opening the first app.
Not to say this is good or bad, just an observation mostly.
The removal of the three dots completely throws me for a loop. I simply could not get the gestures to stick in my brain. And not to sound entitled, but... I shouldn't have to?
Anyway I'm ride or die for the three dots and am pretty worried they'll kill off the option to bring them back.
I don't use gestures, never have, and if possible never will. The phone works 100% fine for me as it is (Samsung S22 Ultra); I am as efficient with it as I want and need to be. Whatever works for you is how you should use the product.
Plus, let's not be pathetic about wasting our lives on our phones; real life happens outside screens. It's good to keep reminding oneself of this little truth, regardless of how many shiny new gimmicks manufacturers bring out to keep us glued to their products and ad-based services.
The iPhone got some undeserved credit for being intuitive just because it was so limited. It did not even have copy/paste. Even now, it is far from obvious how to use undo (tap with three fingers). It’s when there are many features that great UI/UX designers shine.
Late to this thread - while 3-finger tap pops up the text editing menu, you can undo with a three-finger swipe from right-to-left, and redo is left-to-right. Also if you have something highlighted you can do a 3-finger pinch-in to copy, do it twice to cut, and the reverse (pinch-out?) to paste.
I like the way Xiaomi enabled gestures on my phone. The default was the three button layout, with the option to switch to gestures. The moment you enable gestures, you get a little interactive tutorial on how to use them so you don't get confused.
Android at least lets you re-enable the old UI in most cases. And then there's the ability to install custom launchers etc that also helps maintain sanity with major releases.
I moved to iOS over a year ago after using Android exclusively, and I agree that this was a discoverability downgrade. Editing text with various multi-finger gestures, shaking the device, or even putting the entire Photos app into a different mode to select multiple items are outright downgrades to what Android does.
The iPhone pioneered finger-touch on-screen keyboards while everyone else was still typing with a stylus on a tiny keyboard, but since then they seem to have stagnated. iOS 16 only got haptic feedback on its keyboard in 2022(!), and I still make more typing errors than on Android with Gboard.
> iOS 16 just got haptic feedback on its keyboard in 2022
Wow thanks for letting me know. I finally have haptics again after switching to an iPhone 6 years ago.
I totally agree with iOS having a worse typing experience. On Android with Gboard I could swipe/type extremely quickly with few errors. On iOS I make a mistake every few words, and the swipe accuracy is significantly worse.
I know that Gboard is available on iOS, but I've had a lot of trouble with custom keyboards there, so I've given up.
Well, screen real estate on a mobile device is limited, so the number of actions that are easy to discover needs to be limited too - otherwise the UI would be cluttered. I find Apple good at balancing this. Notice how the actions the article mentions are just quicker alternatives to things one can already do another way.
That excuse doesn't hold for the calculator example, where the 0 button takes up two spaces for no good reason; they could easily have added a backspace button instead.
> That excuse doesn't hold for the calculator example, where the 0 button takes up two spaces for no good reason; they could easily have added a backspace button instead.
It would be a really weird place for a backspace button and you would always tap on it by mistake.
Not 'just' quicker: easier, and less frustrating. Those are important qualities. This is my experience of using the long-spacebar technique for placing the cursor.
Digressing: an AI noticing that someone has moved the cursor several times without ever typing anything could pop up a "Want me to show you other ways to move the cursor?" dialog, and teach the hard-to-discover technique. I await the day ...
The Calculator app could suggest that I can swipe on the display at the top if it notices that I'm using 'C' often. This would not even need an AI. I wonder if there are UI frameworks that support the detection and suggestion of "better" actions?
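The detection half of this arguably needs no AI at all: a counter and a threshold would do. A minimal sketch of the idea (every name here is invented; this is not any real UI framework's API):

```javascript
// Hypothetical "teachable moment" detector: if the user repositions the
// cursor several times without typing anything in between, offer to teach
// the hidden long-press-spacebar trackpad gesture.
class GestureHintDetector {
  constructor(threshold, onHint) {
    this.threshold = threshold; // fiddly cursor moves before we speak up
    this.onHint = onHint;       // callback that would show the dialog
    this.moves = 0;             // moves since the last keystroke
    this.hinted = false;        // only offer once
  }
  cursorMoved() {
    this.moves += 1;
    if (!this.hinted && this.moves >= this.threshold) {
      this.hinted = true;
      this.onHint("Want me to show you other ways to move the cursor?");
    }
  }
  keyTyped() {
    this.moves = 0; // typing means the repositioning succeeded
  }
}

// Five repositionings in a row with no typing triggers the suggestion.
let shown = null;
const detector = new GestureHintDetector(5, msg => { shown = msg; });
for (let i = 0; i < 5; i++) detector.cursorMoved();
console.log(shown); // -> "Want me to show you other ways to move the cursor?"
```

The same shape would cover the calculator case: count presses of 'C' and suggest the swipe.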
This seems crucial to me - most of these 'nondiscoverable' actions are shortcuts, and there is another discoverable way to achieve the same outcome. They are similar to keyboard shortcuts in that way, they just help power users who know them.
(The calculator backspace seems to be an exception, which is why I also dislike it.)
There isn't a backspace in the calculator, but the swipe is still (basically) a shortcut for something which does have a button. Pressing the C button clears the current input entirely, after all.
The worst degradation of my iPhone UX has been when they added some Siri smart search stuff to searching. I'm someone who's given up on memorizing the layout of my apps, and I always just swipe down and search by the name of the app. This used to be extremely efficient: usually a single letter was enough to have the app in the top 3 results.
Now they also search a bunch of other apps, and I think Siri does something with your searches, possibly even looking for web results. All I really want is for the app results to show up first and not be bogged down by all the other stuff being searched.
Edit: Seems I'm not the only one.[0][1] Also, what I was talking about was called "Spotlight search"; now it's under "Siri & Search". Doesn't seem like there's a real solution.
[0] https://discussions.apple.com/thread/7887520
[1] https://arstechnica.com/civis/viewtopic.php?f=19&t=1483913
Yeah sorry I should've specified I already tried turning Siri results off and it didn't help. I see you can also turn off "Show Content in Search" for each app... I guess I can spend 10 minutes doing that on my 100+ apps
Nowadays it feels more like poking the software with a stick in an attempt to build a mental model and discover its capabilities by trial and error, often remaining in doubt about the actual intent of how exactly things are supposed to work.
It does get a bit ridiculous though. There are entire sites and content creators focused solely on VS Code tips. At what point are we gonna end up with engineers working on adding a feature to VS Code that they didn't know already existed?
If anyone knows how to invoke this feature, I’d love to know. Just not enough to try to find out from checking more than about three search results. Anyway, ahhh discoverability. I remember you…
† Typed these on mobile, of course. Android, in case you're wondering.
For “consumer” software I usually don’t need manuals, I can find what I need via poking; this often includes power user features (when I use something sufficiently often).
Is it possible that what you’re feeling is nostalgia? Have you actually read the manual for your phone?
This makes some sense for web product development, where you want to find market fit and VCs and media were all over it. But it doesn't work as well for desktop and other types of apps, or in systems programming.
And it is funny that we're starting to see pushback against it 10-15 years later.
https://www.ableton.com/en/manual/welcome-to-live/
Wish more software companies were this thorough.
The manual is also available as an 843-page ebook [2], and is accessible via the Tips app, which is preinstalled on the home screen.
It's not quite like the days when a paper manual would be right in the box, but it's close. And I think most people would be unhappy if their sleek new phone came with an 843-page tome (not to mention how much paper that would waste).
[1] https://support.apple.com/en-gb/guide/iphone/welcome/ios
[2] https://books.apple.com/book/id6443146864
I didn't know that. I thought Tips was an app that gave you a couple dozen tips, and nothing else.
I tried to find the answer to a problem I had and it's not there. I inherited an iPhone where the side ringer mute toggle bounces back after setting; it's a mechanical fault. There is no software "unmute" override - unless you use AssistiveTouch. It took quite a while of wading through Google results and well-intentioned-but-clueless advice before I found someone who revealed this. The manual just says AssistiveTouch can help adjust volume, so it is technically true I guess, but the keyword "mute" is not there.
I did learn about the triple-click shortcut from that online manual just now, so I should thank it for that (even though it neglects to mention it needs to be enabled first).
It's hard to Google things you don't know exist. And it's hard to Google things that changed. If you've ever had a problem on macOS, Googling is only useful if the problem is new; if it's been an issue in older versions, you're going to find all those questions and maybe some answers, but they likely don't apply because Apple broke it differently now. Other vendors are not immune to this either, of course. I understand Apple actually does have manuals; they just don't print them, and if they reference them anywhere, many people (including me) never noticed.
Note that iPhones have a manual too, it’s easy to navigate and quite comprehensive. E.g. here’s a page that describes the “trackpad” action: https://support.apple.com/en-gb/guide/iphone/iph3c50f96e/16.... .
Then there is the problem of authors using product- and version-specific jargon that I haven't mastered yet, precisely because one can usually pick up the basic operations without reading a manual - "1. From the customization agent..."
Especially in coding, I see this as the major difference between a good and a great engineer: googling things at the right time to find the best solutions. You don't have to build it all yourself.
https://wiki.preterhuman.net/The_Macintosh/Newton_Easter_Egg...
OK, that's my #2 blocker. My #1 blocker is that I am no longer 15 and have things to do other than just poke about until I master a new text editor.
100% this. Once upon the time my attitude would have been a bratty "you just don't want to do the work to master your tools!" My response to that now is "no the fuck I do NOT, I want problem number 8131 solved yesterday so I can move on to problem number 8132 which btw requires entirely different tooling."
That does NOT solve the original discoverability problem.
> I've been astonished at how often a feature was added 3 years ago to a program I've used for 8 years, or there's a secret swipe that avoids a bunch of menus, or an unofficial command-line flag.
THAT is the problem right there. You're likely a typical example of someone who is smart, knowledgeable, experienced, and motivated, probably beyond what used to be called a "power user". Yet great swaths of features, and the work of scores of designers, developers, and testers over years, simply escape our notice.
The result is that vast amounts of productivity are squandered until we discover a feature by chance, by accident, or through an encounter with some "You didn't know this!" post.
I've occasionally seen some introduction wizards be somewhat helpful, but they are never in any kind of depth. It's like they are afraid to show you what the software can really do — but the real case is some marketing dude thinks (likely sometimes correctly) that if the intro is too long, people will think it's complicated and won't use it. If that's the case, we're again being dragged down by the lowest (of most ordinary) common denominator.
Solutions? How about the Quick Tour, and even the outline/TOC of a full manual, so we can at least skim it and see if some feature exists and then Google/DDG it...?
So, I'd say: Google can be wrong. The ideal is to modify your existing tools as needed to maximize discoverability and minimize friction to actual documentation.
Games are patched live, browsers are in constant update, etc.
This looks suspiciously similar to the lifecycle of a social network, especially how they grow new features and eventually die when users migrate.
Maybe it's a viable strategy for a social network, mobile OS, MMORPG, or anything-as-a-service to not explain features or educate users, but to grow as users consume added mechanics, until it becomes impossible for new users to catch up and for old users to keep up?
Today, they can find the equivalent information for free without leaving the device where the software is running.
UI design takes time and thought; that's true. But why does a company with Apple's resources not bring the necessary effort to bear on it? It's not as if they can't afford it. Sure, there are time constraints, but if you're going to fold to those and trowel out shitty UI, then don't presume to publish "human interface guidelines" or other such bibles.
And it's high time to stop tolerating smug douchebags who denigrate any complaint about missing functionality and declare, "Well, you simply long-press Option-Shift-Command-Q to bring up the configuration menu."
I think modern software does not come with manuals because it simply has severely reduced functionality compared to the old days. For example, when was the last time you used navigation software with "avoid area" and "add custom road" features?
There was always a tradeoff between "more features thus more UX complexity" (power user approach) vs "cut features to keep UX simpler" (UX designer approach). Sadly, the designers won.
Conversely, with iOS, the problem isn't just that all those techniques are undiscoverable - it's that the tricks you learn for one app will rarely work in another. Well, that, and the fact that they regularly change basic UX, too, like removing the physical home button.
https://support.apple.com/guide/iphone/back-tap-iphaa57e7885...
I think Apple does a decent job telling people about iOS features across the User Guide, Tips.app, and the Apple Support YouTube channel.
The "undiscoverable" features in the article are all there.
Delete the last digit: If you make a mistake when you enter a number, swipe left or right on the display at the top.
https://support.apple.com/guide/iphone/calculator-iph1ac0b5c...
Turn the onscreen keyboard into a trackpad.
1. Touch and hold the Space bar with one finger until the keyboard turns light gray.
2. Move the insertion point by dragging around the keyboard.
https://support.apple.com/guide/iphone/type-with-the-onscree...
To access other open tabs, you can swipe left or right on the tab bar.
https://youtu.be/30tfnCxLWSg?t=21
I have a nit-pick with the article, because all of the listed features are features where an obvious, more intuitive interface exists, but may be clunkier.
E.g. Calculator: the way "real" IRL calculators change the number you type is with the "clear" button. iPhones support that. Personally, I didn't know about the swipe, and I think they should add a backspace button, because swiping is confusing and younger people may not understand "clear" either. But you don't need to swipe to use the calculator.
Keyboard cursor: This feature was demo’ed live when it was released, and it added a more dexterous way to position the typing cursor besides just touching the text where you want (old way, still works). Touching the screen where you want to type is pretty discoverable, and has been a stable feature for a while. It matches how mice and word processors on desktops would work, and seems intuitive to just tap on screen where you want to type anyways.
Safari Tab Change: You can just go to the safari tab page, available by touching a permanent on-screen button. Also, the swipe to change tab feature mimics the operating system’s swipe to change app experience, where the URL bar is indicative of the system’s handlebar. (Also demo’ed live when the feature was launched).
The screen recording of the calculator app seems to just have "C", which I would assume deletes the whole number, not one digit. That's not the same action.
The Android calculator has a backspace button that removes a single digit. It's clearly recognizable as a backspace, just like on any other on-screen keyboard.
Anyone know how to go down doing this? Left, right, and up are all good, but I can’t figure out how to go down consistently. The 1.5mm below the space bar is really fiddly.
You can whip it upwards to the top of the screen which then gives you the entire screen to scroll down (precisely).
(Hint: the amount the cursor moves does not only depend on the dragged distance on the trackpad.)
Now do the same thing for the on-screen virtual trackpad.
It worked fine for years, and now suddenly it takes five tries to invoke the "select all", "paste", or whatever menu. And moving the insertion point? Cumbersome shitshow.
Is this true? I was under the impression that iPhones (and most modern high end devices) had dedicated hardware to monitor these sort of features, and triggers a system event in the OS (the same way you don’t need a daemon to check for a keyboard or power button).
This sounds like bullshit to anyone who knows how interrupts work.
A tap on the back looks like a very brief acceleration on the axis coming out of the screen. Like a reverse free-fall.
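For what it's worth, such a tap is plausibly detectable with nothing fancier than a spike test on accelerometer samples. A toy sketch of that heuristic (thresholds and sample data are invented for illustration; real phones run this sort of thing on a low-power motion coprocessor, not in app code):

```javascript
// Toy back-tap detector: flag a sample that jumps far above the recent
// baseline on the z-axis (out of the screen) and settles again right away.
// Returns the index of the spike sample, or -1 if none looks tap-like.
function detectTap(zSamples, { spike = 2.0, quiet = 0.3 } = {}) {
  for (let i = 1; i < zSamples.length - 1; i++) {
    const jump = Math.abs(zSamples[i] - zSamples[i - 1]);   // sudden rise
    const settle = Math.abs(zSamples[i + 1] - zSamples[i - 1]); // back to baseline
    if (jump > spike && settle < quiet) return i; // brief spike, then quiet
  }
  return -1;
}

// Mostly-flat gravity readings (~1g) with one sharp blip in the middle.
const samples = [1.0, 1.02, 0.98, 3.5, 1.01, 0.99, 1.0];
console.log(detectTap(samples)); // -> 3 (the spike sample's index)
```

A sustained shake or free-fall fails the "settles immediately" check, which is roughly why a tap is distinguishable at all.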
- Seeing notifications requires swiping down specifically from the top left.
- Turning on the flashlight requires swiping down specifically from the top right. Swiping down from the top left also gets you a flashlight button, but it is not actionable. The flashlight button on the home screen is also not actionable.
- Universal search requires swiping to the left of the app pages. Swiping past the right end of the app pages also gets you a search bar, but it will only show you apps.
- Seeing your open apps requires swiping up slowly from the bottom of the screen.
- Turning the phone off requires holding two unrelated buttons.
Mobile UX generally is heavily-dependent on gestures, which inherently creates discoverability challenges, but it seems like Apple goes out of its way to hide every basic function behind a very specific swipe. It makes me wonder if they intentionally design their products to be exclusively usable by tech-savvy people.
Another possible explanation is that the design requires users to develop muscle memory that improves the experience in the long run and makes competitors feel unnatural.
I’m not sure what you mean by the flashlight not being actionable. Notification shade/Lock Screen buttons are long press actions to prevent accidental activation.
On iOS 16 there’s now a button on the launcher to invoke search in case people want to avoid swiping
You say they design their products to be used by tech savvy people but gestural design is designed to become intuitive after the first tutorial. Indeed the only people I know who struggle with it are tech savvy people who skip the on-boarding. In much the same way that people who think they’re handy disregard ikea instructions etc
Here’s a talk on the thought process behind a lot of their “fluid” design https://developer.apple.com/wwdc18/803
I don't know how it is in other countries, but here in Japan, when you buy a phone in a mobile carrier shop, they set it up for you, which means any on-boarding that exists on the phone OS is skipped entirely.
The button is actionable, it just takes a long press to avoid accidental activation.
I kinda secretly judge UX people that enjoy iOS. And somehow almost all of them are Mac/iPhone users. I think that is a bit responsible for the uptick in form over function lately.
Not really, swiping from anywhere on the top (except top-right) works. This is consistent with the other gestures: swipe up from the lock screen to unlock it (just like you would open a roller shutter); swipe down to lock it again -- and see the notifications. You can also simply hit the side button.
> - turning on the flashlight requires swiping down specifically from the top right
You can turn it on from the lock screen, as others have pointed out.
> - turning the phone off requires holding two unrelated buttons
Yes, so that you don’t turn it off by mistake. Smartphones are not devices you turn on and off; outside of OS updates they stay on forever.
> It makes me wonder if they intentionally design their products to be exclusively usable by tech-savvy people.
I would say the opposite. When I think of tech-savvy people, I think of mouse and keyboard; when I think of non-tech-savvy ones, I think of touch displays and gestures (and voice).
My 92-year-old grandfather barely knows how to make his printer work, but has no trouble remembering the most common swipe gestures: it's hard to forget pinch-to-zoom, or swipe down to 'close' the phone and swipe up to 'open' it again.
I don't endorse it however. I think pop-up menus (even the much maligned hamburger) are at least a way to make many more features discoverable. I dislike "gestures".
Was it really more discoverable to have the icon always on screen but embedded among so many other useless icons?
Some of the other stuff you mentioned is discoverable in my opinion. Swiping down from home gives you a search bar. Notifications being accessible by swiping down from the left is admittedly not as good as swiping down from the top bar in the older iPhones, but I think still more discoverable than the Windows notification system that requires you to click a button.
When I go through open apps, it is usually to get back to something I was doing, rather than to clear it. Finding what you were just looking at 2 seconds ago shouldn't be considered a power user feature.
I agree, the share sheet is just too confusing. Favorites management, my password manager, air drop, texting, find on page, and the kitchen sink are all in there and are quite undiscoverable if you don’t know to look for them.
To add some confusion on top of that, some features and extensions go in the similarly un-discoverable “aA” button in the URL bar, so it’s not even like everything goes in the share sheet. That button is really tough to remember even exists since it goes away when you scroll.
Photos has an option to mark photos as hidden, so they won't show up in random places.
It's meant for more sensitive photos.
If you want to mark a picture as hidden, yep, it's under the share icon.
For example, in SFSafariViewController there is no “add to home screen” button, but in actual safari there is. Despite them both being web view experiences controlled entirely by Apple.
(And you can’t detect SFSafariViewController vs Safari as a web app, so good luck onboarding users for your PWA.)
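For context on what a page actually can detect: iOS Safari exposes the non-standard `navigator.standalone` flag for home-screen web apps, and browsers expose the `display-mode` media query, but nothing distinguishes SFSafariViewController from full Safari. A sketch of that gap (the function name is invented; the environment objects are injected so it can run outside a browser, where you'd pass `window.navigator` and `window.matchMedia`):

```javascript
// What a web page can tell about its container, and where it goes blind.
function installContext(nav, matchMedia) {
  if (nav.standalone === true) return "home-screen-webapp"; // iOS-only flag
  if (matchMedia("(display-mode: standalone)").matches) return "pwa";
  // This is the gap the comment describes: a page rendered inside
  // SFSafariViewController and a page in full Safari both end up here,
  // so you can't tell whether "Add to Home Screen" is even available.
  return "browser-or-sfsafariviewcontroller";
}

// Simulated full-Safari (or in-app browser) environment on iOS:
const result = installContext(
  { standalone: false },
  query => ({ matches: false })
);
console.log(result); // -> "browser-or-sfsafariviewcontroller"
```

So PWA onboarding prompts can only be shown optimistically, never targeted at "real" Safari.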
> The Shortcuts App is the way to access some deep and very unobvious functionality.
Meanwhile, in Safari, things constantly switch between the Share icon and the "Aa" menu in the URL bar!
What multi-finger text editing gestures am I missing out on?
Undo the last edit: Swipe left with three fingers, then tap Undo at the top of the screen.
Redo the last edit: Swipe right with three fingers, then tap Redo at the top of the screen.
https://support.apple.com/guide/iphone/type-with-the-onscree...
Although it’s true what they say in your [1] link about the speed.