hota_mazi · 6 years ago
I remember booting Windows 95 for the first time when it came out and being completely gobsmacked at how good it looked and felt.

I was still a heavy Amiga fan back then, even though I was painfully aware that my favorite computer of all time was slowly falling behind. But it was still able to do preemptive multitasking, something that was still unavailable on most OSes at the time (except for Windows NT). AmigaOS was definitely better than Windows 3, of that I was very convinced (and quite distraught that despite this technical superiority, Windows 3 reigned supreme).

All of my convictions got shattered the fateful day I booted Windows 95. The UI was beautiful, and preemptive multitasking worked fine despite the memory and CPU constraints of the time. I just couldn't get enough of launching various apps on Windows 95 just to see how they looked.

On that very day, I thought "This is it, Amiga is truly dead".

I sold my Amiga and bought a Windows box in the weeks that followed, with a heart that was both heavy and excited.

Sohcahtoa82 · 6 years ago
My dad was a huge Amiga fan. I always felt that the Amiga was a bit ahead of its time. It was doing SNES graphics and sound in the NES era.

It's a shame it never caught on beyond the Video Toaster being used in TV/movie studios.

ido · 6 years ago
You're right, but to nitpick, a more appropriate comparison would be the Sega Megadrive/Genesis - it had very similar hardware and capabilities to the original Amiga (1000/500/early 2000s).
NeedMoreTea · 6 years ago
Then I burned my first CD on Windows 95 and ended up with a coaster because I moved the mouse or Windows played a sound, so my "obsolete" Amiga ended up sticking around for another half decade... The Amiga simply crashed far less and could do pre-emptive multitasking properly. A screen saver or music app didn't ruin CD burning. I finally migrated to Windows with Windows 2000, as Windows was consistently worse until then -- at everything.

The Win 95 box, bought at insane cost as I specced top-end everything including SCSI and 10k drives, became my first FreeBSD server - v2.1 IIRC - within months of (finally) buying into Windows. Kept the subscription to the BSD CD releases, to support them, until they shuttered it...

Granted, I was on 040 A3000 and A4000 machines with 24-bit retargetable graphics by the time of Windows 95, and was no longer earning my living off the Amiga by then.

hota_mazi · 6 years ago
An Amiga / Windows flame war in 2019, yay!
Timberwolf · 6 years ago
I'm fascinated by how many of the things they identified as helpful to new users were later reverted. Looking at my Windows 10 desktop, all of the work they did to find a solution where all users identified a program as "running" has been undone - it's back to icons with only a subtle indicator to distinguish between those that represent shortcuts and those that represent a running program.

The Start Menu is now assumed knowledge - that the furthest-left icon on the taskbar is "special" and does something different from all the other shortcuts. I guess the modern equivalent of "Start" is "Type here to search" in the Cortana bar, but that's not a great experience for new users. We all know its propensity to search Bing at the slightest provocation, but if you try some natural "never used a computer before" things like searching for "power off" or "shut down", you get some quite unhelpful results. (The former wants me to set up a power plan; the latter directs me to Add or Remove Programs.)

It feels like computer use is assumed knowledge in 2019 - that everyone who buys a computer already knows what the Windows logo represents, how minimising and overlapping work, what the difference between an app icon and a notification tray icon is, and so on. Microsoft no longer feel the need to design so much for people buying their first ever family computer, having never even used one before. Probably true in the first world, but I wonder if this holds up globally?

pdonis · 6 years ago
I think one key factor is that, whereas this article talks about making Windows easy for home users being a priority because of the huge untapped market there at the time, home users are no longer a real concern of MS: they make their real money from large corporate deployments of tens or hundreds of thousands of copies of Windows, not from individuals. MS basically assumes that once everyone is using a Windows computer at work, they will have a Windows computer at home as well; they will learn how to use it at work because they will have to, and their employer will train them as needed to accomplish work tasks, which then gives them enough basic knowledge to use their home computer. In other words, MS has outsourced the job of training people how to use computers to their corporate customers, so their design process no longer worries about it.
chungus_khan · 6 years ago
In my experience it is definitely a problem that comes up a lot when introducing Windows 10 to people who have never used a computer before. They don't know what is running and what isn't, they don't know which window is focused if there are several on screen at once, they are afraid to poke around in the Start menu in a way they weren't when it was simpler and clearer, and in general they don't understand the meanings of icons and menus that used to be properly labeled. Windows 95 was a lot clearer and more discoverable for someone who had never used a graphical computer.
dspillett · 6 years ago
> ... people who have never used a computer before. They don't know which window is focused if there are several on screen at once

This isn't just a problem for beginners: I'm an old hand and I sometimes can't tell what Windows 10 has given focus to when I have several things on the go, especially over two or more screens where window overlap isn't the obvious go-to clue. The distinction between focused and not is sometimes so close to non-existent it might as well be completely non-existent (like the titlebar text+icons being a slightly different shade of grey), and it varies from app to app (even amongst Microsoft's output), so there is not one set visual cue to follow.

It definitely used to be better than this, including in Windows land.

When I get around to it (so probably never!) I intend to write a little tool that scans for the top-most window and draws a bright border around/over it somehow[†]. I know this is possible (and probably not difficult) as I did some similar hacky window decorating back in the Win2K days[‡], but I've been almost entirely a database+infrastructure fellow for more than a decade and my desktop dev knowledge has rotted terribly.

[†] An always-on-top window positioned so it is a line across the top of the focused window would do; four such objects, one for each side, would be the easy hacky way to achieve a border. A single drawing surface with transparency and mouse click-through would be cleaner, but with my current skillset, more faff working out the relevant API jiggery-pokery or finding a library that wraps it nicely already.

[‡] Using Delphi. Anyone else remember that? Does it still exist in a similar form?
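
For the curious, a rough Win32 C sketch of the four-bar approach from [†] (untested and written from memory, so treat the details with suspicion; a WinEvent hook on EVENT_SYSTEM_FOREGROUND would be cleaner than the polling timer):

    /* focusframe.c - draw four bright bars around the focused window.
       Build (MSVC): cl focusframe.c user32.lib gdi32.lib */
    #include <windows.h>

    #define BAR 4  /* border thickness in pixels */
    static HWND g_bars[4];

    static void PositionBars(void)
    {
        RECT r;
        HWND fg = GetForegroundWindow();
        if (!fg || !GetWindowRect(fg, &r)) return;
        /* top, bottom, left, right */
        MoveWindow(g_bars[0], r.left, r.top, r.right - r.left, BAR, TRUE);
        MoveWindow(g_bars[1], r.left, r.bottom - BAR, r.right - r.left, BAR, TRUE);
        MoveWindow(g_bars[2], r.left, r.top, BAR, r.bottom - r.top, TRUE);
        MoveWindow(g_bars[3], r.right - BAR, r.top, BAR, r.bottom - r.top, TRUE);
    }

    int WINAPI WinMain(HINSTANCE hInst, HINSTANCE hPrev, LPSTR cmd, int show)
    {
        WNDCLASSA wc = {0};
        wc.lpfnWndProc   = DefWindowProcA;
        wc.hInstance     = hInst;
        wc.hbrBackground = CreateSolidBrush(RGB(255, 64, 64)); /* bright red */
        wc.lpszClassName = "FocusBar";
        RegisterClassA(&wc);

        for (int i = 0; i < 4; i++) {
            /* LAYERED+TRANSPARENT makes the bars click-through;
               NOACTIVATE stops them stealing the very focus they indicate. */
            g_bars[i] = CreateWindowExA(
                WS_EX_TOPMOST | WS_EX_LAYERED | WS_EX_TRANSPARENT |
                WS_EX_TOOLWINDOW | WS_EX_NOACTIVATE,
                "FocusBar", NULL, WS_POPUP | WS_VISIBLE,
                0, 0, 0, 0, NULL, NULL, hInst, NULL);
            SetLayeredWindowAttributes(g_bars[i], 0, 255, LWA_ALPHA);
        }

        SetTimer(NULL, 0, 100, NULL);  /* hacky: poll 10x a second */
        MSG msg;
        while (GetMessageA(&msg, NULL, 0, 0) > 0) {
            if (msg.message == WM_TIMER) PositionBars();
            DispatchMessageA(&msg);
        }
        return 0;
    }

The bars will happily frame the taskbar too, so a real tool would want to filter what counts as an interesting window.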

deepspace · 6 years ago
> people who have never used a computer before

To be fair, back in 1995 the number of people who had never used a computer before was huge, so it made sense for companies like Microsoft to cater to their needs.

Today though, the fraction of people who truly have no computer experience must be utterly minuscule, so I can see that MS does not see the beginner experience as a priority any more.

sedatk · 6 years ago
> They don't know what is running and what isn't

And they shouldn't. That's the idea Windows 10 tries to convey. Windows Store apps already stay in the background even after you close them. You shouldn't care which app is running or not. Clicking an app's icon brings it up, and that's the only thing the user needs to know.

iOS and Android already work that way.

dmos62 · 6 years ago
As much as I dislike the general Windows UX, which is largely because I can't run a custom window-manager, I feel like I'm the only person that thinks Windows 10 is a big improvement. Maybe we don't have to look at it as a subjective like/dislike type of thing and just say that for me it's an improvement.

I use the classic non-contracted-to-an-icon task-bar, in the small variant. That's just to say that my task-bar looks and feels the same as it would on Windows 95.

The big improvement for me is the start screen. I use the fullscreen start-menu (I call it screen since it's fullscreen), where I put the shortcuts I want easy access to. It was introduced in Windows 8, but later the defaults were reverted to a 95-style mini-menu.

I have desktop disabled (desktop icons, more specifically), because I find that a desktop has a negative effect on organization, so all my icons are in the fullscreen start.

When I need a shortcut that's not pinned to my start, Win-S pops up the search dialog.

Apart from bloat, I'm pretty content with this UI.

fortran77 · 6 years ago
I was a big fan of Windows 95 (I was at the Launch Event!). I was working at Adobe then, porting Mac apps over to Windows.

But I also like Windows 10. In fact, I switched "back" from Mac OSX to Windows 10 several years ago when there was no decent "Pro" desktop option from Apple anymore and am completely happy. I then switched laptops over to Windows 10, too, from Macbooks because there are some great Windows laptops out now.

The UI is very good. Sure, every now and then you delve deep into control panel and see an old wonky UI that's a holdover from a prior OS, but 99.5% of the time, everything is consistent, stable, and rational.

gbrown · 6 years ago
Maybe it's my particular setup, but I've had nothing but problems using a Windows 10 workstation which I routinely remote into. Graphical scaling is unstable and screwy, I constantly have to select "Reset View" in Outlook, programs routinely decide to start off-screen, and my two identical monitors behave differently for reasons I've been unable to discover in Settings.

I'd just use Linux, but at my organization everything is Outlook-centric.

enriquto · 6 years ago
You could certainly run a custom window manager in Win95. That's what I did during the best years of my life. No idea if it's still possible with the windowses of today.
com2kid · 6 years ago
    HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon

Go to the Shell value and replace it with whatever you want.

This has been supported since at least Windows 2000!
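
Something like this from an elevated prompt should do it (from memory, so double-check before logging out; "mywm.exe" is a placeholder for whatever shell binary you want):

    reg add "HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon" /v Shell /t REG_SZ /d "mywm.exe" /f

Setting Shell back to "explorer.exe" restores the stock desktop.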

whalesalad · 6 years ago
Win10 is particularly bad. I have recently started to maintain some Windows infrastructure and have been simulating a new deployment with Server 2019, Win 10 clients, Win 7 clients, etc... and the start menu on modern Windows is absolutely unbearable. That, combined with the constant full-screen takeover modals for basic permissions and the default icon-collapse taskbar... it's definitely a high bar for a new user.
uryga · 6 years ago
you can change the way permission modals display to make them a pop-up window w/o the fullscreen! makes them much less jarring
kevin_thibedeau · 6 years ago
Don't forget that the start menu has regressed into a CLI because it is too ponderous to find anything by graphical navigation. Regressed, because typed queries take seconds to complete.
Timberwolf · 6 years ago
I loved this feature in Windows 7, because all it did there was an exact substring match over the items in the Start menu. As a result, it was instant.

The Windows 10 one, where it's trying to do a Bing search, look through the Microsoft Store for apps to buy, and who knows what else... I can see what they're trying to do, but if you have 24 years of ingrained habit of only using Start to launch programs, it's annoying having the feature made so CPU- and disk-intensive to add options you don't use. At least Microsoft have put a lot of work into improving it. Using some of the early iterations on a slow spinning-rust laptop would frequently result in the process searching your installed programs timing out, leaving only the web search option.

fortran77 · 6 years ago
On both Mac and Windows, I start programs by typing the first few letters in the search box and clicking.
amyjess · 6 years ago
So pretty much the only thing I boot into Windows for nowadays is to play FF14.

My routine is always the same: log in, open Chrome if it didn't open already (happens 50% of the time), and then open the start menu, type fi, and click the first icon that appears.

vbezhenar · 6 years ago
A similar thing happened at Apple with iOS 7. Before that, they used skeuomorphic design - realistic images representing UI elements - which was supposed to help people using computing devices for the first time. E.g. a leather notepad to represent the reminders application, 3D buttons, etc. They abandoned this design paradigm and switched to a minimalist one, probably because most of their users had become computer-literate and no longer needed those analogues to understand the UI.
goalieca · 6 years ago
I think going with a simple icon is far better. Consider the design of symbols such as "corrosive", "icy road", or "S-curve ahead". They do not look like real-world images, but they are so wonderfully intuitive. We do not need to stare at the detail to understand what they mean.

Likewise, the new icons are brilliant. The reminders icon is basically a list, music is a musical note, etc. The only one that really sucks is "photos" - it looks more like a colour wheel.

sedatk · 6 years ago
They reverted some of those things because the paradigms around them have changed, such as "running applications". That paradigm is dead on iOS and Android and is slowly dying on desktop operating systems too. We're getting past the times when we had to carefully consider which apps to keep running to conserve memory, CPU, etc. That's not something users should be thinking about.

Modern equivalent of the Start Menu is still the start menu. When you click on it, it shows a list of applications available on the system. Shut down is also the nearest available option there.

So, I disagree with your sentiment that Windows 10 has rolled back good ideas. Windows 10 is simply the best Windows to date ever in all aspects.

xyzzy_plugh · 6 years ago
The problem is that it is worse. The system begins guessing what users want, and that means the system will be wrong. It often is.

For example, on my Pixel 3, Spotify is terminated if I open the camera app around 80% of the time. Is it a resource issue? Who knows! The system gives me no insight.

Is this what users really want? I'm very doubtful.

JohnFen · 6 years ago
> Windows 10 is simply the best Windows to date ever in all aspects.

Wow.

My opinion is very, very different. The best Windows to date, in my opinion, is Win 7. I even consider Win 95 to be better than Win 10.

Asooka · 6 years ago
But have there been long and deep usability studies, with users of all ages and experience levels, to determine if this is the most comfortable and easiest-to-use system? Or did Steve Jobs go on stage, show the iPhone tailored to his preferences and usage style, and then everyone copied that? Because I kind of feel like the systems you describe are doing little more than cargo-cult programming mixed with envy for whatever is "cool" at the time.
fludlight · 6 years ago
The main issue is organizational incentives. Microsoft has thousands of employees in the Windows division who need to justify their jobs by constantly changing things. They call this innovation, but it's really just busywork to minimize the chance that they get fired or their boss's boss's boss loses clout in the company. The major innovations in Windows over the past 20 years have been increased stability, an app store, better security, easier control in corporate deployments, and some cloud and local backup features; the rest was mostly unnecessary.
swiley · 6 years ago
Man, if you assume knowledge and expect people to type what they want why not just use a decent CLI?
mackrevinack · 6 years ago
press the "start" button to stop your computer? i really doubt this was this helpful to new users. it does make sense for most other things though, like launching programs and maybe even the restart option, but for the shutting down option its fairly illogical so its strange that it took them so long to switch to an icon instead.
Asooka · 6 years ago
I have so far only seen computer nerds complain that "shutdown" is under "start". Regular users have no problem starting the shutdown process.
londons_explore · 6 years ago
I love the fact that one of the usability issues identified in Windows 25 years ago is still the case today.

* If you "cut" a bit of text, it disappears from your document, and unless you later paste that text, it's gone forever.

* If you "cut" a file, but never paste it, the file stays in its original location, contrary to the users expectations.

This kind of thing is really the implementation details showing through into the UI (the clipboard is not a place on disk, and cannot have files and directories moved into it)

kitsunesoba · 6 years ago
This is likely why in macOS, you can’t cut files but instead have to copy them and then move them with Command-Shift-V (holding down Shift changes the title of the Paste menu item) — the intention to move the file is indicated on the pasting half of the operation, dodging the Cut issue entirely.

Also, cut never really made metaphorical sense for files… you don’t break out the scissors to transfer documents from one folder to another.

jxdxbx · 6 years ago
I get the menu item to change its name with Option, not Shift.

Incidentally I've been using a Mac for decades, but I never remember to check what modifiers do. I'd always just copy and then go back and delete.

I think the right way to deal with this is to have a fancy clipboard manager / shelf, frankly. Cut things and they're put on the shelf; paste pastes the top thing on the shelf. I suppose lots of people might just end up with cluttered shelves and lose track of stuff, but the shelf concept is a good GUI power-user thing.

Wowfunhappy · 6 years ago
What happens if you "cut" an important file, and then forget to paste it before copying something else? You could easily lose many hours of work!

That can happen with text too, of course, but because text takes up more screen real-estate, selecting lots of text "feels" dangerous, so you're less likely to accidentally screw yourself.

--

When you "cut" a file in early Windows, did it disappear from the original location, or did the icon fade out as on modern platforms? IMO, the fade makes it clear what is happening, and is a great example of how minor visual tweaks can be imbued with lots of meaning.

JonathonW · 6 years ago
Windows 95 behaved like Windows does today: "cutting" a file faded out the icon until it was pasted someplace else.

Prior versions of Windows did not have an equivalent to the "cut" operation; for example, Windows 3.1's File Manager moved files via a "File > Move..." menu option, which opened a dialog with two text fields to specify the source and destination. (Copying behaved similarly; the document-editing-inspired Ctrl-C, Ctrl-V, and Ctrl-X bindings used today were all introduced for file management in the Windows world in Windows 95.)

derefr · 6 years ago
I remember Shell Scrap Object Files (https://en.wikipedia.org/wiki/Shell_Scrap_Object_File). (Most people didn't realize this was a thing. I'm not at a Windows PC at the moment: is it still a thing?)

I've always thought it'd make a lot of sense, for any modern OS, if every time you cut something, you'd be creating a .scrap on the desktop; and then, if you later pasted it, the .scrap would go away. Desktop-as-spool-directory. Same as the way macOS tried to use the desktop as a spool for screenshots until recently (where now the screenshot spool has become a mysterious locationless storage displayed in an ephemeral modal window.)

londons_explore · 6 years ago
This doesn't work well with multiple drives. If you cut an object from a USB stick, should it get transferred entirely to your desktop (maybe taking many hours for big files), for you to then paste it somewhere else on the USB stick? (When a direct move would be near-instant.)

If instead you want a .scrap directory on every volume, you have a new inconsistency - if something is cut, and the device removed before the paste happens, the paste will fail.

unicornfinder · 6 years ago
I'm honestly impressed that they thought so in depth about these things. That said, I'm of the opinion that a clipboard manager should be a standard part of an OS and would arguably solve this problem.
Wowfunhappy · 6 years ago
This wouldn't have been considered a problem in the 90s, but my concern with clipboard managers is that they would end up containing a lot of sensitive information, particularly passwords from my password manager.

(Not that this isn't a concern otherwise, but there's a difference between a pasteboard that holds one piece of information at a time and immediately purges old information, and one that logs history for an extended period.)

zamadatix · 6 years ago
Windows 10 has a clipboard manager with clipboard history now. It's largely just an awareness issue at this point.
judah · 6 years ago
I've used Windows for decades and didn't know about this until today.

Start -> Clipboard Settings -> Clipboard History

Once on, Win+V shows clipboard history. Super cool.

ShamelessC · 6 years ago
I had no idea! How do you access it?
gambler · 6 years ago
Worth remembering: the original windowed interface on Xerox machines was a view into underlying system objects. It was designed around a unified vocabulary of interactions that allowed the user to message those objects and also direct inter-object communication:

https://www.youtube.com/watch?v=Cn4vC80Pv6Q

The Xerox -> Apple -> Microsoft interface transfer preserved nothing of those core concepts. The UI became a crutch developers grudgingly added to the system for "those stupid users". Thus, most software engineers today are still convinced that a teletype emulation is the best possible interface to the underlying OS. Also, normal users are treated as second-class citizens in their own systems.

swiley · 6 years ago
> Also, normal users are treated as second-class citizens in their own systems.

I would argue that’s an artifact of corporate software development culture and not GUI design.

ChrisSD · 6 years ago
Settings > System > Clipboard

Make sure "Clipboard history" is turned on. It will also tell you how to access the manager: "Press the Windows logo key + V".

g4d · 6 years ago
A recent example of this for me is using Abaqus (an FE solver). When using it to set up models, I usually set all of the boundary conditions and contact interactions in a script. It wasn't until I was helping a colleague (who was using the GUI) with a problem that I realised some of the options just aren't available through the GUI.
tolger · 6 years ago
I really like the Windows 95 UI. That's why Windows 2000 was my favorite version of Windows, ever. It was very consistent and functional throughout.
coldpie · 6 years ago
Yeah, despite 20 years of "progress", I don't see today's desktops being any more usable than Win 95's, for the typical mouse-and-keyboard setup. There's a couple niceties that have been introduced since then, mostly keyboard shortcuts for window management tasks, but otherwise it's all change for the sake of change, and change for the sake of advertisements.
shantly · 6 years ago
Launch-programs-by-search becoming common is the only significant advance I can think of since the 90s. I've added some keyboard window management to my personal workflow since then but I don't know any non-geeks who do that—hell, most of them don't use launch-by-search either.
zamadatix · 6 years ago
Compositing was huge. I agree most of the usability stuff is around the window management improvements though. Overall I wouldn't say there has been a lot of change; you could teleport a Windows 95 user into the present and they'd immediately be up and running with the current Windows 10 desktop.

Deleted Comment

sdegutis · 6 years ago
Personally I think VS Code is the epitome of the evolution of the ~~Windows~~ desktop UI, and it's one of my favorite UIs ever.
shantly · 6 years ago
I really miss that design. It's a lot more relaxing to look at than modern "flat" UI, IMO. Depth and consistent use of UI elements make it so easy to tell what's what. Modern UI seems to be designed to look like a static glossy brochure, even when it's full of interactive elements.
pier25 · 6 years ago
I agree. Even if 95/98/2000 look outdated by today's standards, these are still the best designed Windows IMO.

Windows 10 is really a Frankenstein.

airstrike · 6 years ago
The fact that the Settings app and the Control Panel are allowed to coexist baffles me
Sohcahtoa82 · 6 years ago
Windows 2000's UI was a slightly evolved 95 UI, but I think Win7 is my favorite. I used the classic theme so it looked like 2000.

And since most home users had no experience with 2000, they saw the theme and thought I was still running 98 or ME in 2010.

gruez · 6 years ago
>That's why Windows 2000 was my favorite version of Windows, ever

You could still get the same look (with the "classic" theme) up until Windows 7.

Kwpolska · 6 years ago
You get the (IMO ugly) look, but none of the feel. The old start menu, the old explorer, etc. are not available with the classic theme.
tomjen3 · 6 years ago
I installed Lubuntu on an older ultra-portable machine, and while not extremely pretty, the UI is extremely snappy -- and mostly resembles Win 95. I do miss the ability to search for a program by name, but that is about it.
worble · 6 years ago
Lubuntu is using LXQt these days, right? I'm running it on Arch, but by default that DE should support menu searching out of the box, and if it doesn't, I expect it's been configured out for some reason. Definitely worth playing around to bring it back.
cmrdporcupine · 6 years ago
One of the trends that W95 started that I still find objectionable is putting the 'X' close button gadget next to the other window controls, so sloppy mousing can lead to accidental closures when the intent was minimization or maximization.

This is one thing that the original Lisa and MacOS got right (and NeXTstep, and GEM and I think AmigaOS) but W95 and its successors did not. The close button was on the far left, and the other actions on the right.

(EDIT: my recollection was wrong about the Lisa, I think. Its window controls were not as clear as Mac OS)

Unfortunately OS X inexplicably adopted the W95 conventions. And in the first Aqua releases made it even worse by hiding the functionality icons until mouse-over.

elweston2 · 6 years ago
It was annoying. The problem they had was that the upper left already had a control there: the system menu. In some programs it is still 'there' but hidden; you can see it if you left-click on the upper-left corner. They could not get rid of it, as some old Win3.x programs went trolling around in that menu and changed it.
paulmooreparks · 6 years ago
Double-clicking the upper-left corner closed the window, a carryover from earlier versions. I think it still does this, to support programs that auto-click (quick-and-dirty corporate IT apps and the like). They couldn't put a single-click control in the same spot.
cmrdporcupine · 6 years ago
I guess? Couldn't they just have moved that control to the right? Or put minimize & maximize next to it and had close on right, like in NeXTstep?

Somehow everyone thought it was a great idea to copy them.

I change this in most of my window managers on Linux, but Chrome stubbornly insists on doing its own window controls.

Deleted Comment

criddell · 6 years ago
> putting the 'X' close button gadget next to the other window controls

My keyboard has volume up (Fn-F11) right beside shut down (Fn-F12). I've accidentally turned my computer off a few times.

swiley · 6 years ago
If we're going to talk about window managers, CWM is definitely the best: the mouse is only used for sloppy positioning and focus changing (not even topping/level changing). Everything else is done via the comparatively precise keyboard.
AnIdiotOnTheNet · 6 years ago
I feel like Microsoft... well, basically everyone in tech really, hasn't actually given a damn about the user experience of their products for quite a while now. Personal computing used to be about enabling people to use technology to make their lives better and OSs like Win95 were focused on allowing the user to leverage the power of computing for themselves. Nowadays, computing is apparently about herding users like cattle so you can get them to look at more ads and harvest their sweet sweet data. Developers stopped caring about user experience because thinking of users as people would make their job of treating them like cattle harder.
Timberwolf · 6 years ago
Every once in a while I'll catch myself trying to read web content in the tiny window between the sidebar ads, the floating video, the cookie control panel and the "other content you might like" block... and I'll stop, pull back, and ask myself, "how did computing get this BAD?"

Especially in the web context, I think it's as you say: the real product has little to do with the task you came to the site for, so everything is trying to distract you away from that task. The end experience is as if the people who used to run warez pages with 9 giant buttons to download dodgy IE toolbars and one 20px link to get the actual file grew up and got jobs running mainstream news sites.

hombre_fatal · 6 years ago
> Every once in a while I'll catch myself trying to read web content in the tiny window between the sidebar ads, the floating video, the cookie control panel and the "other content you might like" block... and I'll stop, pull back, and ask myself, "how did computing get this BAD?"

Seems pretty nice to me: I have control over my browser. I can simply elect to use any "Reader Mode" extension (or feature of the browser). Or use an anti-annoyances list on uBlock to avoid all sorts of cosmetic warts.

I can even configure my adblocker to remove entire parts of the layout like sidebars and navbars. It's amazing.
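
As a concrete (hypothetical) example, uBlock Origin's cosmetic filters are just domain##selector rules, so a couple of lines in "My filters" like

    example-news.com##.sidebar
    example-news.com##.floating-video

hide whole layout regions (the site and class names here are made up), and the built-in element picker will even write the rules for you.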

I don't have much choice when I'm using a poorly designed native app. And I'm thankful I am able to rely less and less on native apps. The computing experience is only getting better and better in my eyes.

mschuster91 · 6 years ago
At least the cookie panels are not the fault of IT; they're the fault of greedy marketers and the clueless politicians trying to rein them in.

For the inventor of floating video ads, let's just say I hope that hell exists.

nsxwolf · 6 years ago
I can't even read the article for more than a few seconds because I get redirected to one of those full page "Your computer has a virus" ads that won't let you hit the back button.
LeftHandPath · 6 years ago
> the real product has little to do with the task you came to the site for, so everything is trying to distract you away from that task

This is really the biggest thing I had against the reddit redesign (and why I love the simple UI of Hacker News). It cleared away space for ads on either side, made the platform as a whole better suited to image- or video-content (dissuading users from using text posts on the site for discussion), and ultimately felt - exactly as you described - "distracting". It made it blatantly clear that the site was for upvoting pictures, getting inundated with political propaganda from whichever side is currently paying more, and ultimately wasting the user's time.

Facebook is feeling the same way.

I've literally gotten to the point where I separate "good content" from "time-wasting content" by browser - anything that I consider educational, time-sensitive, or otherwise important (... I have a list of pinned tabs: Schwab, ThinkOrSwim, Financial Times, Bloomberg, The Economist, the Wall Street Journal) goes in Opera.

Anything that I save solely for my spare time goes in Firefox. All of my social media accounts go in Firefox (Google is the only thing I knowingly allow to track me between sites in Opera). Reddit and Hacker News go in Firefox (although sometimes I browse HN without logging in in Opera, so that I don't have to switch browsers to log into news sites with paywalls).

It makes it really easy to keep myself focused when I need to do something important, and it also makes it easier to live with some of the more annoying (but security-bolstering) browser settings I currently use.

smacktoward · 6 years ago
The peak of systematic thinking about UX in tech was arguably the 1984 Macintosh Human Interface Guidelines (https://www.amazon.com/exec/obidos/ASIN/0201622165/guidebook...).

Unfortunately it only took a few years for Apple to start sliding away from their own guidelines, and Windows was always less rigorous to begin with. And then the Web came along and blew the state of UX back to the Stone Age. Sigh.

Lammy · 6 years ago
The Mac is great and all, but I've used a 1984 Mac and I don't think I want to adhere to a guideline centered around running a single fullscreen application at a time.
pvg · 6 years ago
That's an odd thing to say at the tail end of a decade that's seen very rapid and broad adoption of new interaction technologies - touch and voice, to pick a couple. The former in particular is a radical departure from the desktop metaphor. These things take substantial research and development efforts. The result is, in the industrialized world, not merely a computer in everyone's hand but a far more 'personal' and capable device than any Win 95 PC ever was.
cesarb · 6 years ago
> not merely a computer in everyone's hand but a far more 'personal' and capable device than any Win 95 PC ever was.

I don't know if it's just me, but my feeling is that a Win 95 PC is far more "personal" than any modern smartphone. With a smartphone, you are limited by whatever the mass-produced OS and apps allow (and it's getting worse - more recent OS releases allow much less access to the filesystem, for instance), while a Win 95 PC was wide open and could be customized the way you wanted - in the extreme, the whole operating system could be easily replaced.

That is: when I use my smartphone, I feel like I'm a guest at a Google-owned hotel. When I use my PC, I feel like I'm at my own home.

ssss11 · 6 years ago
This is so accurate. Thank you. I’ve lost all passion for technology as an industry, unless I find a suitable opportunity to fix the current issues.
pishpash · 6 years ago
You can thank Google and Facebook for that. Tech changed permanently with the 2.0 batch.
tomc1985 · 6 years ago
It's the appliance vs. tool mentality. Computers were a tool, but now they're just an appliance
starsinspace · 6 years ago
And they figured all of this out without so-called "telemetry", back in Win 3.1. Isn't that amazing. Almost as if today's "telemetry is absolutely necessary for improving UX" mantra isn't actually true...
BenjiWiebe · 6 years ago
Telemetry could be an indicator of when they've pushed the user too far in swallowing the horrible design decisions.