> For a long time the X Window System had a reputation for being difficult to configure. In retrospect, I’m not 100% sure why it earned this reputation, because the configuration file format, which is plain text, has remained essentially the same since I started using Linux in the mid-1990s.
It's because X's config files were asking you questions that there was no good way of knowing the answers to other than trial-and-error. (After all, if there was some OS API already available at the time to fetch an objectively-correct answer, the X server would just use that API, and not ask you the question!)
An example of what I personally remember:
I had a PS/2 mouse with three buttons and a two-axis scroll wheel ("scroll nub"). How do I make this mouse work under X? Well, X has to be told what each signal the mouse can send corresponds to. And there's no way to "just check what happens", because any mouse calibration program relies on the X server to talk to the mouse driver — there wasn't yet any raw input-events API separate from X — so in the default X configuration that assumes a two-button mouse, none of the other buttons get mapped to an X input event, and the calibration program won't report anything when you try the other parts of the mouse.
So instead, you have to make a random guess; start X; see if the mouse works; figure out from the particular way it's wrong what you should be telling X instead; quit X; edit the config file; restart X; ...etc.
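For anyone who never saw one, the stanza you'd be guessing at looked roughly like this (a sketch from memory, not an exact config; the Protocol line was the usual guess, and "ZAxisMapping" is what turned wheel motion into button events):

    Section "InputDevice"
        Identifier "Mouse0"
        Driver     "mouse"
        Option     "Protocol"     "IMPS/2"     # the guess: PS/2? IMPS/2? ExplorerPS/2?
        Option     "Device"       "/dev/psaux"
        Option     "Buttons"      "7"
        Option     "ZAxisMapping" "4 5 6 7"    # wheel up/down and tilt -> buttons 4-7
    EndSection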
(And now imagine this same workflow, but instead of something "forgiving" like your mouse not working, it's your display; and if you set a resolution + bit-depth + refresh rate that add up to more VRAM than you have, X just locks up the computer so hard that you can't switch back to a text console and have to reboot the whole machine.)
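To make the VRAM arithmetic concrete (numbers mine, purely illustrative): a 2 MB card of that era could manage 1024x768 at 16-bit color, but not at 32-bit.

    1024 x 768 x 2 bytes (16bpp) = 1,572,864 bytes ~ 1.5 MB   (fits in 2 MB)
    1024 x 768 x 4 bytes (32bpp) = 3,145,728 bytes ~ 3.0 MB   (does not)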
Yup, things are so much better now that they just work. Except when they don't, because now it's harder to do anything about it.
I've lost count of the number of Linux machines I've seen that won't offer the correct resolution for a particular monitor (typically locked to 1024x768 on a widescreen monitor).
I don't know whether the problem's with Linux, Xorg, crappy BIOSes or crappy monitors - but even now I occasionally resort to an xorg.conf file to solve such issues.
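For anyone hitting this today: when EDID is missing or mangled, the fix is usually a Monitor section that supplies the timing by hand. A sketch (the Modeline is just the output of `cvt 1920 1080 60`, and "HDMI-1" stands in for whatever name `xrandr` reports for your output):

    Section "Monitor"
        Identifier "HDMI-1"
        # standard CVT timing, from running: cvt 1920 1080 60
        Modeline "1920x1080_60.00"  173.00  1920 2048 2248 2576  1080 1083 1088 1120  -hsync +vsync
        Option   "PreferredMode" "1920x1080_60.00"
    EndSection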
Do you work with a lot of KVMs? Directly plugged monitors usually just work thanks to EDID info, but cheap KVMs frequently block that signal and cause problems. It's rare for a monitor plugged directly into the computer to have problems these days, even on Linux.
> For a long time the X Window System had a reputation for being difficult to configure.
Honestly I thought it was hard to configure because until I used Linux, my X terminals didn’t need to be configured at all!
I may be misremembering but I think my NCD terminal used bootp and probably a little tftp, then off it went. The hardest part was probably finding an Ethernet drop to plug it into.

Now - get off my lawn!
You surfaced memories of childhood me installing RedHat 5.2, carefully selecting packages and X config options, getting it wrong, not knowing how to get back to that magical installation UI, and reinstalling the OS just to have another crack at it.
Eventually I figured out how to launch that xconfig utility and found some sane defaults, and was thrilled when I finally saw the stippling pattern or even a window manager.
The manual that came with your laptop of 25 years ago isn't going to tell you whether your touchpad is Alps or Synaptics, or which PS/2 protocol it imitates.
To the people down-voting you: X is from a time when devices actually came with manuals. When the people using it were engineers and scientists and reading a datasheet or a manual was a normal thing to them.
I think this started around the '90s, when devices turned into magic black-box consumables that are expected to "just work" while being undiagnosable when they don't.
I think, for at least the first 30 years of my life, every Linux system I built was made from "hand-me-down" hardware. First hardware from my parents, then from various friends, then finally from my own expired projects.
When I was young (eleven!), this meant that we'd get a new computer, and now the very old computer it replaced could be repurposed as a "playground" for me to try various things — like installing Linux on — rather than throwing it out. (My first Linux install was Slackware 3.4 on a Pentium 166 machine. Not the best hardware for 1998!) Nary a manual in sight; of course my parents didn't keep those, especially for something like a monitor.
When I was a teenager, this meant getting hand-me-down hardware from friends who had taken the parts out of their own machines as they upgraded them. Never thought to ask for manuals, of course. (Also, sometimes I just found things like monitors lying on the side of the road — and my existing stuff was so old that this "junk" was an upgrade!)
And during my early adulthood, my "main rig" was almost always a Windows or (Hackintoshed) macOS machine. So it was still the "residue" of parts that left that rig as it got upgraded that came together to form a weird little secondary Linux system. (So I could have kept the manuals at this point; but by then, the manuals weren't needed any more, as everything did become more PnP.)
It's only very recently that I bought a machine just to throw Linux on it. (Mostly because I wanted to replace my loud, power-sucking Frankenstein box with one of those silent little NUC-like boxes you can find on Amazon that have an AMD APU in them, so I could just throw it into my entertainment center.) And funny enough... this thing didn't come with a manual, or even a (good) data-sheet! (Which is okay for HDMI these days, but meant that it was pretty hard to determine, e.g., how many PCIe lanes are collectively allocated to the two M.2 slots on the board.)
> You didn't have to guess, you just had to read the specs in the manual that didn't come with your equipment.
Hey you missed a word so I added it in for you. Most consumer PC equipment definitely did not come with any documentation covering the sort of stuff X's config file was asking about.
When that documentation was available at all, it was something you could only get by contacting the manufacturer. But you couldn't mention the word "Linux", because the CS rep would give a blanket "we don't support Linux" and you'd get nothing.
In the very early 90s, my dad started using some sort of Unix again (I don't know if it was an early Linux or a BSD of some sort). Up until that point, I'd only ever seen him use Windows 3.1 or some raw terminal/TTY emulator.
It was winter, and suddenly his screen was a fuzzy grey with funny-looking windows, instead of the comforting (to me) Windows teal.
At the time, it represented to me a change into the unknown. As it was (I assume) the start of a new contract (my dad worked at home a lot), it was also a time of financial pressure.
So I hated X, and how it looked. It was, to me, the equivalent of a brutalist housing block: well built, sure, but foreboding to look at.
Later, when I was using Linux myself (around Red Hat 5/6), if you suddenly saw that you were dropping into a "natural" X, it was a sign that you'd fucked up the window manager, or that the switch between GNOME and E (or whichever one you were trying) had gone wrong.
The stipple and X cursor are forever ingrained into my memories. I remember so vividly how, back in 1998, I installed my first Linux distro (SuSE 6-ish), and after some configuring I typed "startx" and then BOOM! Grey "unix-y" weirdness for a minute or two, and then KDE 1. It will never not hit me with immense levels of nostalgia whenever I come across it, which admittedly is not very often these days.
Wow, I could have written that exact comment myself. Those were the happy startx moments, after you got the monitor sync rates right. The myth went that if you got them wrong, you could fry the monitor. I remember the SuSE package came with a couple of pins: one Tux and one SuSE chameleon. I preserved them for a long time, but I moved way too many times. Fun times. Thanks for the nostalgia :)
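The sync rates in question were two lines in the Monitor section, copied from the manual if you had it and guessed otherwise. A sketch, with ranges typical of a mid-90s CRT (illustrative, not from any specific model):

    Section "Monitor"
        Identifier  "Generic CRT"
        HorizSync   30-64    # kHz; the dangerous one to get wrong
        VertRefresh 50-90    # Hz
    EndSection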
That part about "...you wouldn’t want to wing it with the configuration, because allegedly you could break your monitor with a bad Monitor setting" -- strike the "allegedly"! Or at least, let me allege it from personal experience: I did that to one monitor, in the early 1990s. You could smell the fried electronics from across the room.
For the interested: CRT monitors have a high-voltage power supply which uses an oscillator. Cheap(er) monitors allegedly reused the horizontal sync frequency for the power supply oscillation, to save an oscillator, so if the horizontal sync frequency was very different from expected, or worse, completely stopped, it could burn out the HV power supply.
Has anyone tested this hypothesis? It could also be that the horizontal sync itself burns out, although that seems less likely.
(In even more detail: Like any other switching power supply, the HV supply in a CRT runs on a two-phase cycle: first, a coil, which creates electrical inertia, is connected to the power source, allowing current to build up. Then the current is suddenly shut off, and the force of the coil attempting to keep it flowing creates a very high voltage, which is harvested. If the circuit gets stuck in phase one, the current never stops increasing, until it's limited by the circuit's resistance, much higher than it's supposed to be. The excessively high current overheats and burns out the switching component. Anyone working on switching power supplies will have encountered this failure mode many times.)
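A back-of-the-envelope version of that voltage spike, with illustrative numbers of my own:

    % inductor law: the voltage across a coil is v = L di/dt
    \[ v = L\,\frac{di}{dt} \approx L\,\frac{\Delta i}{\Delta t}
       = 1\,\mathrm{mH} \times \frac{2\,\mathrm{A}}{0.2\,\mu\mathrm{s}} = 10\,\mathrm{kV} \]
    % switch stuck closed instead: i(t) = (V_in/L) t just keeps ramping
    % until resistance (and then smoke) limits it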
It is not really about saving one oscillator, but about two things:
- saving the drive circuitry for the flyback, which is usually combined with the horizontal deflection amplifier. Such a design probably also simplifies the output filter for horizontal deflection, as the flyback primary is part of that filter.
- synchronizing the main PSU of the display to the horizontal sync, so as to make various interference artifacts in the image stay in place instead of slowly wandering around, which would make them more distracting and noticeable.
It is not that hard to see the whole CRT monitor as essentially one giant SMPS that produces a bunch of really weird voltages with weird waveforms. In fact, if you take apart a mid-90s CRT display (without OSD), the actual video circuitry is one IC, a few passives, and a lot of overvoltage protection; the rest of the thing is power supply and the deflection drivers (which are kind of also a power supply, as the required currents are significant).
Your (parenthesized) explanation of switching power supplies made a lot of "secondhand knowledge" click in my head -- like, for instance, why there's lots of high-frequency noise in the DC output. Thank you!
I was briefly pleased with the ability to run an 8" monitor that looked like the kind on 90s cash registers at the impressively high resolution of 1024x768. Then after about 10 seconds it blinked out, smelled like burning electronics, and never worked again.
Neal Stephenson's Cryptonomicon made reference to a hacker dubbed The Digi-Bomber, who could make his victims' CRT monitors implode in front of them by remotely forcing a dangerously bad configuration.
In the early ChromeOS days, when they were thinking about which graphics stack to use, the quiet but definitive top manager said that if they picked X11, he'd better not see any stipple on boot. It's such a funny comment that it stayed with me, because it really captures how seeing that stipple is such a symbol of "I guess you're booting X11 now", and his insight that it's not what he wanted the first impression of the product to be.
My understanding is the root weave is a pattern designed to be hard on your monitor (a CRT, when it was designed). It is ugly as sin, but that tight flip from black to white was intended to expose any weakness in the driving beam, either from misconfiguration or from failing components, where another pattern might obscure the problem. I think it is also rough on LCDs, where a misbehaving one really sparkles on the weave.
I am not sure why it was the default; I suspect it was to give you a chance to see how your monitor was behaving on a fresh install, and you were expected to set the background to something else. I still run the root weave on my desktop (it's OpenBSD with their Xenocara, where it is still the default), but I also run a tiling window manager, so I only actually see the root window once in a blue moon.
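(The "set it to something else" step was one line in ~/.xinitrc; plain xsetroot also brings the weave back if you run it with no arguments:)

    xsetroot -solid grey40   # replace the root weave with a flat colour
    xsetroot                 # reset to the default weave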
The stipple pattern always reminded me of the pattern on Sun workstation mousepads. For those of you who don't remember: Sun workstations had optical mice, but they're not like the Intellimouse-derived ones we enjoy today that can track on any suitably textured surface, even your pant leg. They had to go on a special mousepad with a clear, slick glass or plastic surface and a special dot pattern underneath that the optosensor would use to reckon movement. I think even getting the upper surface dirty or fingerprinted could negatively mess with the tracking (like smudging a CD could affect playback).
The Mouse Systems mouse used on Sun workstations had two LEDs (and matching sensors) of two different colors, and the solid mouse pad had vertical bars of one color ink, and horizontal of the other. You can take it from there.
Inventor / founder Steve Kirsch used some of the proceeds to fund Frame Technology, which then went on to be sold to Adobe. And then Infoseek (a search-engine also-ran), sold to Disney; and then Abaca (anti-spam), sold to Proofpoint.
> In the old days, it used to be that mouse, keyboard, video card, monitor, fonts, plugin+module data, etc. needed to be spelled out in detail in /etc/X11/XF86Config.
Man does it make me feel old that the /etc/X11/XF86Config days don't feel like the 'old days' to me. That stipple takes me back to using TWM on Sun3 workstations because OpenWindows was too slow.
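For anyone who never saw the file the quote is describing, it was a pile of sections roughly like these (just the skeleton; the details were per-machine):

    Section "Files"         # FontPath entries
    Section "Module"        # Load "glx" and friends
    Section "InputDevice"   # one each for keyboard and mouse
    Section "Monitor"       # HorizSync / VertRefresh / Modelines
    Section "Device"        # the video card and its driver
    Section "Screen"        # ties Device to Monitor; depth and modes
    Section "ServerLayout"  # wires all of the above together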
Yes, it takes me back to configuring my X session for the first time on an NCD X terminal in the computer lab at uni, connected to the school's Sun and DEC servers. It was so much better than all the vt220 serial terminals, and they were "scary" enough that it was surprisingly easy to get one.
Saw the stipple just last week on a (presumably) failed startup of an airplane's seat back entertainment system. Not the X cursor but the normal X11 arrow. Recognized it immediately and was, in my own way, entertained.
> I've lost count of the number of Linux machines I've seen that won't offer the correct resolution for a particular monitor (typically locked to 1024x768 on a widescreen monitor).

I've been using Linux for over 20 years, Xorg for most of that time, and I've never had any issues with screen resolution.
You didn't have to guess, you just had to read the specs in the manual that came with your equipment.
For a trip down memory lane, read through the XFree86 Video Timings HOWTO (https://tldp.org/HOWTO/XFree86-Video-Timings-HOWTO/index.htm...). Getting stuff to work in the Good Old Days was _not_ easy.
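The heart of that HOWTO is one line of arithmetic: the dot clock has to pay for the blanking intervals as well as the visible pixels. Using the standard VESA 1024x768@60 mode as the worked example:

    refresh = dot clock / (htotal x vtotal)
            = 65,000,000 / (1344 x 806) ~ 60 Hz

    Modeline "1024x768" 65.0  1024 1048 1184 1344  768 771 777 806  -hsync -vsync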
Ahh Linux people. Some things will never change.
> It was, to me, the equivalent of a brutalist housing block: well built, sure, but foreboding to look at.

I kinda like it now though.
Yes
Taste, it is a subjective thing
That's why I loved it
Also modern devs: Show HN: My proposal for a modern terminal that supports 24bit color, inline graphics, and video