jhhh · 3 months ago
I understand the desire to fix user pain points. There are plenty to choose from. I think the problem is that most of the UI changes don't seem to fix any particular issue I have. They are just different, and when some changes do create even more problems, there's never any configuration option to disable them. You're trying to create a perfect, coherent system for everyone, absent the ability to configure it to our liking. He even mentioned how unpopular making things configurable is in the UI community.

A perfect pain-point example was mentioned in the video: text selection on mobile is trash. But each app seems to have a different solution, even from the same developer. Google Messages doesn't allow selecting anything smaller than an entire message. Some other apps have opted into a 'smart' text select which, when you select text, guesses and seemingly randomly group-selects adjacent words. And lastly, some apps will only ever select a single word when you double-tap, which seemed to be the standard on mobile for a long time. All of this is inconsistent, and often I'll want to do something like look up a word and realize: oh, I can't select the word at all (Google Messages), or the system 'smartly' selected 4 words instead, or it did what I want and actually just picked one word. Each application designer decided they wanted to make their own change and made the whole system fragmented and worse overall.

PunchyHamster · 3 months ago
> He even mentioned how unpopular making things configurable is in the UI community.

Inability to imagine that someone might have a different idea about what's useful is a general plague of the UI/UX industry. And there seems to be zero care given to usage by users who have to use the app for more than 30 seconds a day. The productivity-vs-learning-time curve is basically flat, and low, with the exception being pretty much "the tools made by X for X", like programming IDEs.

ryandrake · 3 months ago
Back in the 90s, you had a setting for everything! It was glorious. This trend of deliberately not making things configurable is the worst, and we can’t seem to escape it because artists are in charge of the UI rather than human interaction professionals.

App designers need to understand that their opinions on how the app should look and work are just that: opinions. Opinions they should keep to themselves.

stephenlf · 3 months ago
Convention over configuration is a powerful idea. Most people don’t want to twiddle with configs. The power user approach is the way to go.
jauntywundrkind · 3 months ago
> that it did what I want and actually just picked one word. Each application designer decided they wanted to make their own change and made the whole system fragmented and worse overall.

This is the trouble. It's been decades of the OS becoming less and less relevant. Apps have more power, more will to build their own thing.

And there's less and less personal computing left. There are the design challenges, the UX being totally different. But the OS used to be a common substrate that the user could use to do things. And the OS has just vanished, vanished, vanished, receded into the sea, leaving these apps to totally dominate the experience, apps that are so often little more than thin clients to some far-off cloud system, to basically some corporation's mainframe.

The OS's relevance keeps shrinking, and it's awful for users. Why bother making new UX for the desktop, if the capabilities budget is still entirely on the side of the app? What actually needs to change isn't the UX of the desktop or other OS paradigms (mobile); it's a fundamental shift in taking power out of the mainframe and having a personal computer that's worth a damn, that again has more than a quantum of capability imbued in it that it can deliver to the user.

(My actual hope is that someday the web can do some of this, because apps have nearly always been a horrible thing for users: they give them no agency, no control, and are pre-baked to be only what is delivered to the user.)

porkbrain · 3 months ago
Text selection used to be frustrating on mobile for me too until Google fixed it with OCR. I get to just hold a button briefly and then can immediately select an area of the screen to scan text from, with a consistent UX. Like a screenshot but for text.
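For anyone wondering what that looks like under the hood, the flow is roughly: grab the pixels, run text recognition, and map the recognized lines back to screen coordinates so they become selectable. Below is a minimal sketch of that idea using Google's ML Kit text-recognition API; it's an illustration of the technique, not the actual Pixel implementation, and the screenshot capture itself is assumed to happen elsewhere.

```kotlin
// Sketch: turn a screenshot bitmap into selectable text regions via OCR.
// Assumes the ML Kit text-recognition dependency (com.google.mlkit:text-recognition)
// in an Android app; how the screenshot is captured is not shown here.
import android.graphics.Bitmap
import android.graphics.Rect
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.text.TextRecognition
import com.google.mlkit.vision.text.latin.TextRecognizerOptions

// One recognized line of text plus where it sits on screen,
// so a UI layer could draw selection handles over it.
data class SelectableLine(val text: String, val bounds: Rect?)

fun recognizeTextFromScreenshot(
    screenshot: Bitmap,
    onResult: (List<SelectableLine>) -> Unit
) {
    val recognizer = TextRecognition.getClient(TextRecognizerOptions.DEFAULT_OPTIONS)
    val image = InputImage.fromBitmap(screenshot, /* rotationDegrees = */ 0)

    recognizer.process(image)
        .addOnSuccessListener { result ->
            // Flatten blocks into lines; each line keeps its bounding box
            // so taps/drags on screen can be mapped back to the text.
            val lines = result.textBlocks.flatMap { block ->
                block.lines.map { line -> SelectableLine(line.text, line.boundingBox) }
            }
            onResult(lines)
        }
        .addOnFailureListener {
            onResult(emptyList()) // OCR failed; nothing becomes selectable
        }
}
```

The interesting part of the real feature isn't the OCR call, it's that the selection UI drawn over those bounding boxes is the same everywhere, which is exactly why it feels consistent across apps.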
taskforcegemini · 3 months ago
They are using OCR for selecting plain text?
clearleaf · 3 months ago
This is such an indictment of modern technology. No offense is meant to you for doing what works for you, but it is buck wild that this is the "fix" they've come up with. As somebody learning about this for the first time it sounds equivalent to a world where screenshotting became really hard so people started taking photos of their screen so they could screenshot the photo. How could such a fundamental aspect of using a computer become so ridiculous? It's like satire.
bathtub365 · 3 months ago
Does it automatically scroll down while selecting if the text is larger than the screen?
supportengineer · 3 months ago
That’s how I do it on the iPhone as well. I take a screen shot first.

You can count on it, it is reliable, it always works.

ahartmetz · 3 months ago
>Text selection on mobile is trash

Doesn't have to be - Blackberry BB10 had damn near solved it. I think they had some patents on it, but these should have expired, and I noticed some corresponding changes in Android. But it's still far from being as good as BB10. What BB10 had was a kind of combined cursor and magnifying glass that controlled really well, plus the ability to tap the thing left or right to move one letter at a time.

johanyc · 3 months ago
https://www.youtube.com/watch?v=Mi-XrtAmRuk

it doesn't look very easy to use in the demo tbh

diziet_sma · 3 months ago
Universal search on Google Pixels has solved a lot of the text selection problems on Android for me, with the exception being selecting text which requires scrolling.
linguae · 3 months ago
I enjoyed this talk, and I want to learn more about the concept of “learning loops” for interface design.

Personally, I wish there were a champion of desktop usability like Apple was in the 1980s and 1990s. I feel that Microsoft, Apple, and Google lost the plot in the 2010s due to two factors: (1) the rise of mobile and Web computing, and (2) the realization that software platforms are excellent vehicles for milking users for cash by pushing ads and services on a captive audience. To elaborate on the first point, UI elements from mobile and Web computing have been applied to desktops even when they are not effective, probably to save development costs, and probably because mobile and Web UI elements are seen as “modern” compared to an “old-fashioned” desktop. The result is a degraded desktop experience in 2025 compared to 2009, when Windows 7 and Snow Leopard were released. It’s hamburger menus, title bars becoming toolbars (making it harder to identify areas to drag windows), hidden scroll bars, and memory-hungry Electron apps galore, plus pushy notifications, nag screens, and ads for services.

I don’t foresee any innovation from Microsoft, Apple, or Google in desktop computing that doesn’t have strings attached for monetization purposes.

The open-source world is better positioned to make productive desktops, but without coordinated efforts, it seems like herding cats, and it seems that one must cobble together a system instead of having a system that works as coherently as the Mac or Windows.

With that said, I won’t be too negative. KDE and GNOME are consistent when sticking to Qt/GTK applications, respectively, and there are good desktop Linux distributions out there.

gtowey · 3 months ago
It's because companies are no longer run by engineers. The MBAs and accountants are in charge and they couldn't care less about making good products.

At Microsoft, Satya Nadella has an engineering background, but it seems like he didn't spend much time as an engineer before getting an MBA and playing the management advancement game.

Our industry isn't what it used to be, and I'm not sure it ever could be again.

linguae · 3 months ago
I feel a major shift happened in the 2010s. The tech industry became less about making the world a better place through technology, and more about how best to leverage power to make as much money as possible, making the world a better place be damned.

This also came at a time when tech went from being considered a nerdy obsession to tech being a prestigious career choice much like how law and medicine are viewed.

Tech went from being a sideshow to the main show. The problem is once tech became the main show, this attracts the money- and career-driven rather than the ones passionate about technology. It’s bad enough working with mercenary coworkers, but when mercenaries become managers and executives, they are now the boss, and if the passionate don’t meet their bosses’ expectations, they are fired.

I left the industry and I am now a tenure-track community college professor, though I do research during my winter and summer breaks. I think there are still niches where a deep love for computing without being overly concerned about “stock line go up” metrics can still lead to good products and sustainable, if small, businesses.

vjvjvjvjghv · 3 months ago
I have heard a big factor is that a lot of the newer devs don't really use a desktop OS outside of work. So for them, developing a desktop OS is more of an abstract project, like me developing software for medical devices which I never use myself.
Normal_gaussian · 3 months ago
It's great to hear from someone who thinks these people still care! It has rarely been my experience, but I haven't been everywhere yet.
XorNot · 3 months ago
GTK's dedication to killing the standard top bar menu layout is intensely irritating.

We now have giant title bars to accommodate the hamburger menu button, which opens a list of...standard menu bar sub menu options.

You could fit all the same information into the same screen real estate using the original, tested paradigm.

Ekaros · 3 months ago
Somehow I just thought about this with VS Code. The native Windows version has the top command bar. One of my Linux VMs does not... It saves an entire line of text... As if I would care about that much vertical space on a modern screen.

On the other hand, Vivaldi, which I am trying on my Android phone, has this stupid thick bar at the bottom, with essentially bookmarks, back, home, forward and tabs buttons... taking significantly more visual space...

I am really not sure what is going on in total...

scottjenson · 3 months ago
I've given dozens of talks, but this one seems to have struck a chord, as it's my most popular video in quite a while. It's got over 14k views in less than a day.

I'm excited so many people are interested in desktop UX!

ChuckMcM · 3 months ago
I think you did a great job of bringing fairly nuanced problems into perspective for a lot of people who take their interactions with their phone/computer/tablet for granted. That is a great skill!

I think a fertile area for investigation would also be 'task-specific' interactions. In XDE[1], the thing that got Steve Jobs all excited, the interaction models are different if you're writing code, debugging code, or running an application. There are key things that always work the same way (cut/paste for example) but other things that change based on context.

And echoing some of the sentiment I've read here as well, consistency is a bigger win for the end user than form. By that I mean even a crappy UX is okay if it is consistent in how it's crappy. I heard a great talk about Nintendo's design of the 'Mario world' games and how the secret sauce was that Mario physics are consistent, so as a game player, if you know how to use the game mechanics to do one thing, you can guess how to use them to do another thing you've not yet done. Similarly with UX: if the mechanics are consistent, they give you a stepping-off point for doing a new thing you haven't done, using mechanics you are already familiar with.

[1] Xerox Development Environment -- This was the environment everyone at Xerox Business Systems used when working on the Xerox Star desktop publishing workstation.

NetOpWibby · 3 months ago
Fantastic talk, I found myself nodding in agreement a lot. In my research on next-generation desktop interfaces, I was referred to Ink & Switch as well and man, I sure wish they were hiring. I missed out on the Xerox and Bell Labs eras. I'm also reading this book, "Inventing the Future" by John Buck that details early Apple (there's no reason the Jonathan Computer wouldn't sell like hotcakes today, IMHO).

In my downtime I'm working on my future computing concept[1]. The direction I'm going for the UI is context awareness and the desktop being more of an endless canvas. I need to flesh out my ideas into code one of these days.

P.S. Just learned we're on the same Mastodon server, that's dope.

---

[1]: https://systemsoft.works

calmbonsai · 3 months ago
I concur, though per my earlier post I do feel "desktop stagnation" is inevitable and we're already there. You were channeling Don Norman https://jnd.org/ in the best of ways.
wiether · 3 months ago
I'm not into UX much, but listening to someone with such experience and knowledge in their craft, giving a well structured and coherent talk, without shouting or trying to sell anything felt... peaceful?

It was only at the end that I realized I had just spent 40 minutes watching the video.

Thanks for sharing it with us!

az09mugen · 3 months ago
Thanks for that nice talk, it felt like a breath of fresh air with basic and simple, yet powerful, but alas "forgotten" concepts of UX.

Will look into your other talks.

Deleted Comment

Deleted Comment

p_ing · 3 months ago
This was a great talk! I'm interested to hear what you believe the desktop UX ('human interface') issues actually are, today.

Rightly, your talk was not about specific issues or specific solutions. But as a desktop user (macOS primarily, Windows secondary though mostly historically, and KDE a distant third), beyond the mishmash of different UIs (i.e. Windows 11 still presenting Windows 3.x-era dialogs) or just outright dumb decisions such as making everything transparent, what is it that you want to solve for people in the desktop space to make them more /productive/ than they currently are? Especially now that our primary vehicle for information creation and sharing is not the desktop, but the web browser alone?

averynicepen · 3 months ago
This was a really fantastic talk and kept me riveted for 40 minutes. Where can I find more?
scottjenson · 3 months ago
I post nearly everything on my blog, jenson.org
agumonkey · 3 months ago
where can we find advanced ux labs? i'm tired of the figma trend
pjmlp · 3 months ago
It was quite interesting.
analogpixel · 3 months ago
Why didn't Star Trek ever tackle the big issues, like them constantly updating the LCARS interface every few episodes to make it better, or having Geordi La Forge re-writing the warp core controllers in Rust?
thaumaturgy · 3 months ago
Because, and this is something that a lot of tech-obsessed Trek fans never seem to really come to terms with, Trek didn't fetishize technology.

In the Trek universe, LCARS wasn't getting continuous UI updates because they would have advanced, culturally, to a point where they recognized that continuous UI updates are frustrating for users. They would have invested the time and research effort required to better understand the right kind of interface for the given devices, and then... just built that. And, sure, it probably would get updates from time to time, but nothing like the way we do things now.

Because the way we do things now is immature. It's driven often by individual developers' needs to leave their fingerprints on something, to be able to say, "this project is now MY project", to be able to use it as a portfolio item that helps them get a bigger paycheck in the future.

Likewise, Geordi was regularly shown to be making constant improvements to the ship's systems. If I remember right, some of his designs were picked up by Starfleet and integrated into other ships. He took risks, too, like experimental propulsion upgrades. But, each time, it was an upgrade in service of better meeting some present or future mission objective. Geordi might have rewritten some software modules in whatever counted as a "language" in that universe at some point, but if he had done so, he would have done extensive testing and tried very hard to do it in a way that wouldn't've disrupted ship operations, and he would only do so if it gained some kind of improvement that directly impacted the success or safety of the whole ship.

Really cool technology is a key component of the Trek universe, but Trek isn't about technology. It's about people. Technology is just a thing that's in the background, and, sometimes, becomes a part of the story -- when it impacts some people in the story.

cons0le · 3 months ago
>Because the way we do things now is immature. It's driven often by individual developers' needs to leave their fingerprints on something, to be able to say, "this project is now MY project", to be able to use it as a portfolio item that helps them get a bigger paycheck in the future.

AKA resume-driven development. I personally know several people working on LLM products who, in private, admit they think LLMs are scams.

jfengel · 3 months ago
Most of Trek's tech is just a way to move the story along. Transporters were introduced to avoid having to land a shuttle. Warp drive is just a way to get to the next story. Communicators relay plot points.

Stories which focus on them as technology are nearly always boring. "Oh no the transporter broke... Yay we fixed it".

PunchyHamster · 3 months ago
That's fetishizing Star Trek a bit - they had touch interfaces for controlling the ship in the middle of combat, with explosions and everything shaking around, which is hardly optimal both in and out of combat (imagine levitating a hand across a touch panel for hours on end).
amelius · 3 months ago
I still wonder why not everybody was lingering in the holodeck all the time.

(equivalent of people being glued to their smartphones today)

(Related) This is one explanation for the Fermi paradox: Alien species may isolate themselves in virtual worlds

https://en.wikipedia.org/wiki/Fermi_paradox

Mistletoe · 3 months ago
Isn't it probably just that they don't really have money in Star Trek so there is no contract promising amazing advances in the LCARS if we just pay this person or company to revamp it? If someone has money to be made from something they will always want to convince you the new thing is what you need.
bena · 3 months ago
LCARS was technically a self-adapting system that was personalized to a degree per user. So it was continuously updating itself. But in a way to reduce user frustration.

Now, this is really because LCARS is "Stage Direction: Riker hits some buttons and stuff happens".

lo_zamoyski · 3 months ago
> continuous UI updates are frustrating for users […] It's driven often by individual developers' needs to leave their fingerprints on something, to be able to say, "this project is now MY project", to be able to use it as a portfolio item that helps them get a bigger paycheck in the future.

Yes, although users also judge updates by what is apparent. Imagine if OS UIs didn’t change and you had to pay for new versions. So I’m sure UI updates are also partly motivated by a desire to signal improvements.

Deleted Comment

dragonwriter · 3 months ago
> In the Trek universe, LCARS wasn't getting continuous UI updates

In the Trek universe, LCARS was continuously generating UI updates for each user, because AI coding had reached the point that it no longer needs specific direction, and it responds autonomously to needs the system itself identifies.

krapp · 3 months ago
>In the Trek universe, LCARS wasn't getting continuous UI updates because they would have advanced, culturally, to a point where they recognized that continuous UI updates are frustrating for users.

Not to be "that guy", but LCARS wasn't getting continuous UI updates because that would have cost the production team money, and for TNG at least would have often required rebuilding physical sets. It does get updated between series, as part of setting the design language for each series.

And Geordi was shown constantly making improvements to the ship's systems because he had to be shown "doing engineer stuff."

calmbonsai · 3 months ago
Trek needs to visibly "sci-fi-up" extant tech in order to have the poetic narrative license to tell its present-day parables.

Things just need to "look futuristic". They don't actually need to have practical function outside whatever narrative constraints are imposed in order to provide pace and tension to the story.

I forget who said it first, but "Warp is really the speed of plot".

PunchyHamster · 3 months ago
Case in point - nobody sensible would put realtime ship controls on a touchscreen if the designed use of it was combat or complex human-driven manoeuvres.
Findecanor · 3 months ago
I have often thought that Star Trek is supposed to show a future in which computer technology and user interfaces have evolved to a steady state that doesn't need to change that much, and which is superior to our own in ways that we don't yet understand. And because it hasn't been invented yet, the show does not invent it either.

It is for the audience to imagine that those printed transparencies back-lit with light bulbs behind coloured gel are the most intuitive, easy to use, precise user interfaces that the actors pretend that they are.

RedNifre · 3 months ago
Because the LCARS GUI is only for simple recurring tasks, so it's easy to find an optimal interface.

Complex tasks are done vibe coding style, like La Forge vibe video editing a recording to find an alien: https://www.youtube.com/watch?v=4Faiu360W7Q

I do wonder if conversational interfaces will put an end to our GUI churn eventually...

PunchyHamster · 3 months ago
Conversational interfaces are slow and will still be slow even if AI latency were zero.

It might be a nice way of handling complex, one-off tasks for personnel unfamiliar with all the features of the system, but for fast day-to-day stuff, a button per function will always be king.

rzerowan · 3 months ago
Mostly I believe it's that the writers envisioned, and were able to worldbuild in such a way, that the tech was not a subject but rather part of the scenery/background, with the main focus being the people and their relationships. Additionally, in some cases where alien tech was interfaced with the characters in the story, some UI/code rewrites were written in; for example in DS9, where the Cardassian interfaces/AI are frustrating to Chief O'Brien, and his efforts to remedy/upgrade them get a recurring role in the story.

Conversely, recent versions have taken the view of foregrounding tech, aided by flashy CGI, to handwave through a lot. Basically using it as a plot device when the writing is weak.

JuniperMesos · 3 months ago
Man, I should hope that the warp core controllers on the USS Enterprise were not written in C.

On the other hand, if the writers of Star Trek The Next Generation were writing the show now, rather than 35-40 years ago - and therefore had a more expansive understanding of computer technology and were writing for an audience that could be relied upon to understand computers better than was actually the case - maybe there would've been more episodes involving dealing with the details of Future Sci-Fi Computer Systems in ways a programmer today might find recognizable.

Heck, maybe this is in fact the case for the recently-written episodes of Star Trek coming out in the past few years (that seem to be much less popular than TNG, probably because the entire media environment around broadcast television has changed drastically since TNG was made). Someone who writes for television today is more likely to have had the experience of taking a Python class in middle school than anyone writing for television decades ago (before Python existed), and maybe something of that experience might make it into an episode of television sci-fi.

As an additional point, my recollection is that the LCARS interface did in fact look slightly different over time - in early TNG seasons it was more orange-y, and in later seasons/Voyager/the TNG movies it generally had more of a purple tinge. Maybe we can attribute this in-universe to a Federation-wide UX redesign (imagine throwing in a scene where Barclay and La Forge are walking down a corridor having a friendly argument about whether the new redesign is better or worse immediately before a Red Alert that starts the main plot of the episode!). From a television production standpoint, we can attribute this to things like "the set designers were actually trying to suggest the passage of time and technology changing in the context of the show", or "the set designers wanted to have fun making a new thing" or "over the period of time that the 80s/90s incarnations of Star Trek were being made, television VFX technology itself was advancing rapidly and people wanted to try out new things that were not previously possible" - all of which have implications for real-world technology as well as fake television sci-fi technology.

bigstrat2003 · 3 months ago
> recently-written episodes of Star Trek coming out in the past few years (that seem to be much less popular than TNG, probably because the entire media environment around broadcast television has changed drastically since TNG was made)

That's probably part of it. But the larger part is that new Star Trek is very poorly written, so why is anyone going to bother watching it?

AndrewKemendo · 3 months ago
Because it’s a fantasy space opera show that has nothing to do with reality
sprash · 3 months ago
Unpopular take: Windows 95 was the peak of Desktop UX.

GUI elements were easily distinguishable from content and there was 100% consistency down to the last little detail (e.g. right click always gave you a meaningful context menu). The innovations after that are tiny in comparison and more opinionated (things like macOS making the taskbar obsolete with the introduction of Exposé).

fragmede · 3 months ago
Heh, given the number of points you've probably gotten for that comment, I don't think it's that unpopular. Win 98 was my jam, but it looks hella dated today. As you said, buttons were clearly marked, and menus were navigable via keyboard; there was some support for themes and custom coloring, and UIs were designable via a GUI builder in VB or Visual Studio using MFC, which was very resource-friendly compared to using Electron today. Smartphones and tablets, and even the wide variety of screen sizes, didn't exist yet, so it was a simpler time. I can't believe how much of a step back Electron is for UI creation compared to MFC, though MFC wasn't cross-platform and elements were usually absolutely positioned instead of the relative, resizable layout that's required today.
kvemkon · 3 months ago
> buttons were clearly marked

Recently some UI ignored my click on an entry in the list of a drop-down button. It turned out this drop-down button was additionally a normal button if you press it in the center. Awful.

> UI creation compared to MFC

Here I'd prefer Borland with (Pascal) Delphi / C++ Builder.

> relative resizable layout that's required today.

While it should be beneficial, the reality is awful. E.g. why is the URL input field on [1] so narrow? But if you shrink the browser window width, the text field eventually becomes wide! That's completely against expectations.

[1] https://web.archive.org/save

SoftTalker · 3 months ago
I would say Windows 2000 Pro, but that really wasn't too different from Windows 95. The OS was much better though, being based on NT.
Telaneo · 3 months ago
I don't think it's a stretch to call it the UI language of 95, while 2000 just adds more functionality within the bounds of that framework. Add in the Win7 search bar in the start menu and the OS not crashing, and you still haven't really done anything of note with the UI beyond staying within its framework. It's still a Win95 UI.

Meanwhile, WinXP started to fiddle with the foundation of that framework, sometimes maybe for the better, sometimes maybe for the worse. Vista did the same. 7 mostly didn't and instead mostly fixed what Vista broke, while 8 tried to throw the whole thing out.

throaway45425 · 3 months ago
KDE Plasma is better than all of these right now. Cinnamon also.
porise · 3 months ago
I didn't watch the 45 min video but does he mention tiling environments? They have solved every complaint I had before.

I can immediately swap to the exact windows I want without tabbing, I can rebind everything to pull up whatever application I want, and I can even switch a window to floating.

Deleted Comment

DonHopkins · 3 months ago
Golan Levin quotes Joy Mountford in his "TED Talk, 2009: Art that looks back at you":

>A lot of my work is about trying to get away from this. This is a photograph of the desktop of a student of mine. And when I say desktop, I don't just mean the actual desk where his mouse has worn away the surface of the desk. If you look carefully, you can even see a hint of the Apple menu, up here in the upper left, where the virtual world has literally punched through to the physical. So this is, as Joy Mountford once said, "The mouse is probably the narrowest straw you could try to suck all of human expression through." (Laughter)

https://flong.com/archive/texts/lectures/lecture_ted_09/inde...

https://en.wikipedia.org/wiki/Golan_Levin

https://www.flong.com/

https://en.wikipedia.org/wiki/Joy_Mountford

https://www.joymountford.com/

rustcleaner · 3 months ago
IMO a good interface should behave like it gives priority to the universe of workflows, like it respects the universality of Universal Machines and doesn't merely resemble an appliance. Siemens NX, MSOffice/LibreOffice, Adobe CS6 Master Collection, and KDE 3.5 (Konqueror > Dolphin) are all examples off the top of my head of applications which, in varying amounts, respected this universality principle in UI/UX design. They were fields to put your workflows together on, like ComfyUI is in image generation. Gnome has been the antithesis of this. Any kind of "simplified design" just seems to me like an artist has control and not an engineer-mathematician.

Great UI/UX will foster emergence in habits and workflows, AND AVOID BREAKING MUSCLE MEMORY AT ALMOST ANY COST! Terrible UI/UX will create hard but beautiful chutes to push cattle through and into the money fleecing machine.