retrac · 3 months ago
I think some younger people might have never really seen a CRT. And they're positively rare now. I encountered a CRT TV in the hospital waiting room recently and was a bit startled to see one. So for those only passingly familiar, if you get the opportunity, spend a bit of time experimenting with it visually. Jiggle your eyes, look away suddenly, and then back, and try oblique angles. Maybe you'll see what they mean about "you just can't recreate that glow".

It's hard to describe but the image is completely ephemeral. All display technologies involve sleight-of-hand that exploits visual illusion and persistence of vision to some degree, but the CRT is maybe the most illusory of the major technologies. It's almost entirely due to persistence of vision. With colour TV and fast phosphors the majority of the light energy is released within a few milliseconds of the spot being hit by the beam. If you had eyes that worked at electronic speeds, you would see a single point drawing the raster pattern while varying in brightness.

A bit of TEMPEST trivia: The instantaneous luminosity of a CRT is all you need to reconstruct the image. Even if it's reflected off a wall or through a translucent curtain. You need high bandwidth, at least a few megahertz, but a photodiode is all that's necessary. The resulting signal even has the horizontal and vertical blanking periods right where they should be. Only minor processing (even by old school analog standards) is required to produce something that can be piped right into another CRT to recreate the image. I'd bet it could be done entirely in DSP these days.
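Sketching the all-DSP version (purely illustrative; it assumes you already know the line rate and frame geometry, and the sample rate and NTSC-ish constants below are hypothetical placeholders):

```python
# Toy reconstruction of a raster image from a single photodiode capture.
# Assumes NTSC-ish timing and that the capture is already roughly aligned
# to the start of a frame -- a real implementation would recover sync by
# locating the dark blanking intervals in the signal itself.
import numpy as np

SAMPLE_RATE = 20_000_000        # photodiode ADC rate, 20 MS/s (hypothetical)
LINE_FREQ   = 15_734.0          # NTSC horizontal line rate, Hz
LINES       = 525               # scanlines per frame
OUT_WIDTH   = 640               # horizontal resolution to render

def reconstruct_frame(samples: np.ndarray) -> np.ndarray:
    """Fold a 1-D luminance capture into a 2-D image, one row per scanline."""
    samples_per_line = SAMPLE_RATE / LINE_FREQ
    rows = []
    for line in range(LINES):
        start = int(round(line * samples_per_line))
        end   = int(round((line + 1) * samples_per_line))
        if end > len(samples):
            break
        # Resample this scanline's worth of brightness to a fixed width.
        src = samples[start:end]
        x = np.linspace(0, len(src) - 1, OUT_WIDTH)
        rows.append(np.interp(x, np.arange(len(src)), src))
    img = np.array(rows)
    img -= img.min()            # normalize to 0..1 for display
    if img.max() > 0:
        img /= img.max()
    return img
```

Real TEMPEST work also has to recover the sync phase from the signal itself (the blanking intervals show up as the darkest stretches), but the folding step is essentially just this.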

everdrive · 3 months ago
Once we lost CRTs we began this insane race for all these duct tape solutions; all sorts of anti-aliasing, lighting, etc, and a resolution race that can only be described as pathological. Try modern games in low resolution with minimal effects on a CRT. You'll be surprised how much better they look, but also how much better they run; you don't need such a powerful rig when you can turn the graphical effects down for a CRT.
Aurornis · 3 months ago
> all sorts of anti-aliasing, lighting, etc, and a resolution race that can only be described as pathological

I enjoy CRT nostalgia now and then, but modern high resolution games are absolutely amazing. The blurry, low resolution, low refresh rate CRT look is fun for old games, but playing in 4K at 100+ fps on a modern monitor is an amazing experience on its own.

speeder · 3 months ago
I got very sad when my CRT monitor died. I was using a Radeon RX 380X, partly because it was one of the few cards that still had analog output.

Then I went and played lots of recent games at lower resolutions, but could turn on lots of expensive effects even with such an underpowered card, because I could run low-res with anti-aliasing disabled and no scaling and still get decent results.

But the true pleasure was playing, for example, Crypt of the NecroDancer on that screen; the game felt so easy. I eventually stopped playing after that screen died. I could never nail the timing anymore on modern screens; the response time just isn't the same.

Waterluvian · 3 months ago
Also don’t forget to rub your palm across the screen to collect the fuzzies that built up.
BizarroLand · 3 months ago
My favorite thing to do around Christmas was to take the aluminum foil garland strands and place them on the CRT screen.

Static would hold them in place, and when someone went to manually turn on the TV, the strands would suddenly become energized with thousands of volts of static electricity from the ray gun in the tube and leap out like an electric snake to zap the everloving snot out of the unfortunate bastard who turned the TV or monitor on.

Which was usually me, because I liked the challenge, but if you ever have that combo available around kids who didn't grow up with CRTs and don't know any better, it would be a good prank.

black_knight · 3 months ago
Ah, I can feel it from just reading your comment! That’s a feeling I haven’t felt in a while!
gadders · 3 months ago
Put a magnet by the screen as well.
x187463 · 3 months ago
The Slowmo Guys on YouTube have a great video showing the CRT scanlines.

https://www.youtube.com/watch?v=3BJU2drrtCM

auselen · 3 months ago
I might miss the visual aspects of CRTs, but most of them had a coil whine or some kind of crackling sound. Maybe as TVs or screens for gaming consoles they were fun, but as monitors I don't miss the heat burning my face.
emh68 · 3 months ago
Sometimes I think about the bizarre path computer technology took.

For instance, long-term storage. It would stand to reason that we'd invent some kind of big electrical array, and that's the best we could hope for. But hard drive technology (which relies on crazy materials technology for the platter and magnets, crazy high-precision encoders, and crazy physics like floating a tiny spring over the cushion of air created by the spinning platter) came in and blew all other technology away.

And, likewise, we've had liquid crystal technology since the 70s, and probably could have invented it sooner, but there was no need, because Cathode Ray Tube technology appeared (a mini particle accelerator in your home! Plus the advanced materials science to bore the precision electron beam holes in the screen grid, the phosphor coating, the unusual deflection coil winding topology, and leaded glass to reduce x-ray exposure for the viewers) and made all other forms of display unattractive by comparison.

It's amazing how far CRT technology got, given its disconnect from other technologies. The sophistication of the factories that created late-model "flat-screen" CRTs is truly impressive.

The switch to LCDs/LEDs was in a lot of ways a step back. Sure, we don't have huge 40lb boxes on our desks, but we lost the ultra-fast refresh rate enabled by the electron beam, not to mention the internal glow that made computers magical (maybe I'm just an old fuddy-duddy, like people in the 80s who swore that vinyl records "sounded better").

Someday, maybe given advances in robotics and automation, I hope to start a retro CRT manufacturing company. The problems are daunting, though, such as the unavailability of the entire supply chain (you can't even buy an electron gun; it would have to be made from scratch) and environmental restrictions (leaded glass probably makes the EPA perk up and take notice).

mrandish · 3 months ago
> like people in the 80s who swore that vinyl records "sounded better"

I'm not one of those people who ever thought vinyl sounded better than a properly recorded and mastered digital version, and I've always believed a high-bandwidth digital audio signal chain can recreate the "warmth" and other artifacts of tube compressors well beyond the threshold of human perception. However, a broadcast-quality, high-definition CRT being fed a pristine hi-def analog RGB signal can still create some visuals which current flat screens can't. This is only controversial because most people have never seen that kind of CRT; they were incredibly rare.

I got to see one of the broadcast production CRTs made to support NHK's analog high-definition video format in the 90s, directly connected to HD broadcast studio cameras, and the image quality was simply sensational. It was so much better than even the best consumer CRT TVs that it was another thing entirely. Of course, it cost $40,000 and only a few dozen were ever made, but it was only that expensive because these were prototypes made years before digital hi-def would be standardized and begin mass production.

In fact, I think if it were A/B compared next to a current high-end consumer flat screen, a lot of people would say the CRT looks more pleasing and overall better. For natural imagery a CRT could render the full fidelity and sharpness of a 1080 image but without that over-crisp 'edginess' today's high-end flat screens have. And those "cathode rays" can render uniquely rich and deep colors vs diodes and crystals. Of course, for synthetic images like computer interfaces and high-dpi text, a flat screen can be better, but for natural imagery we lost something which hasn't yet been replaced. I'd love to see an ultra high-end CRT like that designed to display modern uncompressed 4K 12-bit HDR digital video.

speeder · 3 months ago
I had a music teacher that insisted analog recordings were different.

One day she said there was a simple way to prove it. Certain stringed instruments have strings that move on their own to the correct note if you put them near a source of a similar sound. If you put one of these instruments in front of a speaker playing from an analog source, the strings move; play the exact same music from a digital source on the same speaker and the strings stop moving, even if to most humans it sounds exactly the same.

Sadly I never had the gear to test this, I am not a professional musician and was learning from that person as a hobby (she is a teacher for professional musicians).

ThrowawayTestr · 3 months ago
Have you looked at any high end OLEDs lately?
estimator7292 · 3 months ago
I think what's even more interesting is how CRTs evolved. Conceptually it was like going from an incandescent bulb to a single LED to a seven-segment display and then to an LCD. The progression from the neon bulb up to HD CRT tubes is pretty much linear! We started with "magic eye" tubes, a sort of radial bar graph, then tubes that could raster a single line, which we used for oscilloscopes. Then monochrome 2D raster, and then more and more complex color raster systems.

It's pretty neat how smoothly the technology progressed through every intermediate step. There weren't many huge revolutionary leaps, just steady progress

thworp · 3 months ago
Imo OLED has completely eclipsed CRT by now.

I don't know enough to say where CRTs could be today if they had gotten the development $ that went into other tech. But to be as good as OLEDs they would have had to find something other than phosphor as the inner coating.

For response times, CRTs will always remain the king of dark-to-light transitions, but afterglow on bright-to-dark transitions would always be a factor unless a different coating was developed. OLEDs have no such issues. Subjectively, the claimed <0.1 ms response times are real and there are zero artifacts: no afterglow, no ghosts, just extremely sharp and defined motion.

imcritic · 3 months ago
Ghosting from long display of a static image is real.
TacticalCoder · 3 months ago
> It's amazing how far CRT technology got

And China is still building, today, brand new CRT boards for CRT TVs and monitors. You can buy them on AliExpress.

I don't know if CRT themselves are still being built though.

I'm hanging on to my vintage arcade cab from the 80s with its still-working huge CRT screen. I hope I fail before that thing does (and I hope it doesn't fail anytime soon!).

Dylan16807 · 3 months ago
> The switch to LCDs/LEDs was in a lot of ways a step back. Sure, we don't have huge 40lb boxes on our desks, but we lost the ultra-fast refresh rate enabled by the electron beam, not to mention the internal glow that made computers magical (maybe I'm just an old fuddy-duddy, like people in the 80s who swore that vinyl records "sounded better").

CRTs don't have particularly good refresh rates. There is very little delay on the output scan, but 99% of the time the delays built into rendering make that irrelevant compared to fast screens using other technologies. And the time between scans doesn't go very low.

I have no idea what you mean by internal glow.

sokoloff · 3 months ago
The heated filament in many old CRTs would glow orange.
trenchpilgrim · 3 months ago
Some images to demonstrate how retro games look on CRT vs unfiltered on a modern display:

https://x.com/ruuupu1

https://old.reddit.com/r/crtgaming/comments/owdtpu/thats_why...

https://old.reddit.com/r/gaming/comments/anwgxf/here_is_an_e...

Modern emulators have post-processing filters to simulate the look, which is great. But it's not quite the same as the real thing.

majormajor · 3 months ago
Blowing things up to that size is not representative.

Back when I first started playing things on emulators we were using 12" to 20" CRTs or LCDs with much higher resolution than a TV, so whether CRT or LCD the pixels were chunkier.

None of the nostalgia is how I remember it at all.

The average CRT TV had crap color and poor brightness, and going from that and the flicker of 1-to-1 size NTSC on a 20-something-inch TV to an emulated "chunkier pixel" rendition on a progressive-scan 72+ Hz 1024x768-or-higher CRT or an LCD looked way better.

Take the side-by-side pictures and zoom WAY out on a high-res screen, or go stand several feet away from your monitor so that they're the size they were designed and expected to be seen at, and the vast majority of the perceived improvement from making the CRT subpixels visible goes away. Then put them into motion, especially vertical motion, and those lines in between, and losing half the lines on each frame, become more noticeable and distracting.

The 4th image there, of the yellow monster, is a good example. Even zooming to 50% on my high-res display makes the "bad" version suddenly look way sharper and more detailed, as the size starts to show how often "rounded dots with gaps between them" just looks like fuzziness instead of "better".

And these comparisons tend to cherry-pick, and don't show examples of things that lose clarity as a result of the subpixels and scanlines instead of gaining it.

thaeli · 3 months ago
I'm the same way. The scanlined, subpixeled versions just look terrible to me.
Theodores · 3 months ago
The article concerns 'PVMs', not a phrase I remember from the period, even though we had hundreds of Sony D1 monitors, which were the pinnacle of 'professional digital monitors'.

These were different beasts from civilian TVs, even top-of-the-line Trinitrons. They had none of the RF circuitry of a regular TV, and the inputs were typically component or, in the late nineties, digital, but not the digital we know today; that signal came down one BNC connector.

We had an outside broadcast company which had massive trucks full of screens for televising sports, concerts and public events. A new boss decided to outfit the new trucks with domestic TVs rather than the expected mega-expensive D1s. The trucks did not last long, much to the amusement of the crew. The TVs rattled themselves to pieces before they made it to their first event.

Unlike the civilian TVs, the Sony D1 monitors were designed to be repaired. We had people for that, and you could often see the innards of one of them if you went to see the engineers in their den. They generally did not need to be repaired, but if you have hundreds of the things then you raise the odds of having a few that need a little bit of servicing.

In the studio environment they were rack mounted with air conditioning and extremely neat cabling to some type of desk where you had the buttons to choose what camera, VT or other source went to the screen. Lighting in the gallery was also controlled, so the picture you saw was the definitive picture, with no fiddling of brightness or contrast necessary. The blacks were black, which flat screens were only really able to achieve decades later with AMOLED.

In the basement with the DigiBeta tape machines we had smaller D1s in the racks, often with an adjacent oscilloscope. You could tell if the content was 'adult material' by the oscilloscope, which I always found amusing.

The magic of TV in that era was the analog nature of the CRT. The studio set was usually very battered and yet you could put a few tens of thousands of watts of lighting onto it for the cameras to show something beautiful on the D1 monitors. The advent of HD was problematic in this regard as every dent and scratch would show, along with every wrinkle and blemish on the presenters' faces.

Video games of the era were designed around the hardware; in Europe this meant 720 x 576 PAL, with lots of that image as 'overscan'. Note that JPG was also designed for the magic of analog, with slow CPUs. You can change the lookup table in JPG to make it work for digital and fast CPUs, but only MozJPEG does that.

You mention flickering, and most CRTs would be flickery, think of electrical shops of the era and what you would see out of the corner of your eye. Clearly you would not want this in a studio gallery lest anyone collapse with an epileptic fit. In Europe we had 50Hz rather than 60Hz, so, even with interlacing, flicker was a thing but only in the electrical shop, not in the studio gallery. This had more to do with genlock (for analog) than phosphor persistence trickery.

Regarding the article, I am pleased that the D1 monitors of old have found a new fan base that truly appreciate them. In period we put a lot of work into our setups and, to this day, I struggle to come to terms with all of that expertise and expense having gone forever.

In broadcasting there has always been an 'old guard' that can remember the 'good old days'. I now feel like one of those fuddy duddies!!!

dangson · 3 months ago
This helps validate my memories of SNES and PS1 games looking so much better when I was a kid than on an emulator today.
anthk · 3 months ago
With 25% scanlines on PC CRTs they looked pretty close to TVs. On LCDs, forget it. Not even close, even with CRT filters.
StopDisinfo910 · 3 months ago
I played SNES and PS1 games on a CRT. I played them on LCD and OLED TVs. I can’t tell the difference.

I mean, I can tell that HDMI cables never introduce chromatic aberration, which was quite common on those old TVs when the SCART cables I used to use got old, and I never had an LCD screen catch fire, something which happened to me twice with aging CRT screens.

I really don't get the nostalgia, or whatever it is called when some of the people who think it was better back then weren't even born at the time.

cobbzilla · 3 months ago
Absolutely. I love playing Atari 2600 games, and it seems sacrilegious to play on anything but an old-school CRT TV.

Also, I’ve heard a CRT is required for NES light-gun games like Duck Hunt. Anyone know if this is true? I don’t have an NES, and if I did, I’d hook it up to my CRT, so I still wouldn’t know the answer :)

toast0 · 3 months ago
The NES light gun works with the properties the CRT provides. Roughly what happens is: when you pull the trigger, the next frame is all black, and then there's one frame per target with a white square for that target. If you're on target, the photodetector (photodiode? photoresistor?) will register a step change when the beam hits the white square, and the game code is looping to detect that. If the light comes late, it won't count; if it's not a big enough change, it won't count. If the screen was too bright during the black frame (or you were pointing at a light the whole time), it won't count.

Most modern displays are going to show the square too late, some might not be bright enough.

If you have an LED matrix and the right driving circuitry, you could probably replicate the timing, and that might work too, but I've not seen it done.
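A toy model of that detection loop, with invented thresholds and callback names (real game code polls the Zapper's light-sense bit while the target frames are being drawn, but the logic is the same shape):

```python
# Toy model of the Zapper detection scheme described above. The thresholds
# and the draw_frame/read_light_sensor callbacks are hypothetical; they stand
# in for "render a frame" and "read the gun's light sensor".
def detect_hit(read_light_sensor, targets, draw_frame,
               dark_threshold=0.1, lit_threshold=0.6):
    """Return the index of the target the gun is aimed at, or None."""
    # Frame 1: all black. If the sensor already sees light, the player is
    # pointing at a lamp (or the screen is too bright) -- reject the shot.
    draw_frame(black=True)
    if read_light_sensor() > dark_threshold:
        return None
    # Then one frame per target, each drawing only that target as a white
    # square. The first frame that produces a big enough step change in
    # detected light identifies which target the gun is aimed at.
    for i, target in enumerate(targets):
        draw_frame(white_square=target)
        if read_light_sensor() > lit_threshold:
            return i
    return None
```

The reason this breaks on modern displays follows directly: the white square shows up a frame or two late because of scaling and processing latency, so the step change arrives after the game has stopped looking for it.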

More details and options for LCDs https://www.retrorgb.com/yes-you-can-use-lightguns-on-lcds-s...

vintermann · 3 months ago
Yes, light guns/light pens actually relied on vertical/horizontal sync of the CRT screen to identify the position you pointed at, so they won't work on a modern screen.
nomel · 3 months ago
> But it's not quite the same as the real thing.

To be fair, with modern "retina" HDR displays, it should be very very close.

mrob · 3 months ago
The most important element of the CRT look is the fast phosphor decay. This is why CRTs have so little sample-and-hold blur. No other hardware can simulate it perfectly, but a 480Hz OLED display comes close:

https://blurbusters.com/crt-simulation-in-a-gpu-shader-looks...
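The rough idea behind such simulation (not the actual Blur Busters shader, whose details I won't claim to reproduce): split each source frame into N subframes, light each band of rows brightly in the subframe where it gets "scanned", and let it decay exponentially afterwards. A toy NumPy illustration, with made-up decay constant and subframe count:

```python
# Toy rolling-scan / phosphor-decay approximation: split one source frame
# into n_sub subframes, each showing a bright band of freshly "scanned"
# rows plus an exponentially decaying afterglow of rows scanned earlier.
import numpy as np

def crt_subframes(frame: np.ndarray, n_sub: int = 8, decay: float = 0.25) -> list[np.ndarray]:
    """frame: HxW (or HxWx3) float image in 0..1. Returns n_sub subframes."""
    h = frame.shape[0]
    band = max(1, h // n_sub)              # rows "scanned" per subframe
    subframes = []
    for s in range(n_sub):
        weights = np.zeros(h)
        for row in range(h):
            scanned_in = min(row // band, n_sub - 1)  # subframe that scans this row
            age = (s - scanned_in) % n_sub            # subframes since it was scanned
            weights[row] = decay ** age               # exponential afterglow
        # Boost so average brightness over a full cycle matches the input.
        gain = n_sub * (1 - decay) / (1 - decay ** n_sub)
        sub = frame * weights.reshape(-1, *([1] * (frame.ndim - 1))) * gain
        subframes.append(np.clip(sub, 0.0, 1.0))
    return subframes
```

Note the clip at the end: keeping the average brightness constant needs a gain well above 1, which is why this trick wants lots of brightness headroom (and a very high refresh rate) to avoid crushing highlights.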

hulitu · 3 months ago
> it should be very very close

It should. It isn't. For some obscure reason, VGA colours look different on every modern LCD.

EvanAnderson · 3 months ago
I regret taking all my old tube monitors to Goodwill back in the mid-2000s. I saved a Commodore 1942, at least, but I sent all the rest away to die.

I appreciate the CRT modeling in emulators, but a hardware device that passes through a display signal and provides sub-frame CRT artifacting and phosphor modeling (particularly if it supported 240p) would be bitchin'.

thaeli · 3 months ago
FPGA-based devices that can do this, and do it quite well, do exist; they're just expensive. The RetroTINK-4K Pro is the top of the line as of this writing, but it's a $750 converter.
reactordev · 3 months ago
I've been searching for years for the old CRT Viewsonic Mac monitor that used RGB inputs. (Might have been CMYK.) You plug the DIP head into the video out and attach each individual connector to the color connectors on the back. The thing was massive, easily over 24". Beige plastic that we all love.

Growing up, my dad was a Mac guy and he had all kinds of Apple stuff. The weird page-sized monitor, a Performa 600, trackball mouse, ergonomic keyboard. Granted, my father was in software and this was the early 90s, but it _definitely_ defined my initial passion for computers.

I’ve been looking for this monitor so that I can restore his setup. I have his Performa, peripherals, and restored those. I just need that giant monitor he used to use.

My father passed away last year. My world has been different ever since.

detritus · 3 months ago
It'd've been RGB :)

But you've now got me mulling over the implications of a CMYK-based subtractive colour process 'monitor'. I'm guessing the refresh rate wouldn't be too hot..!

pansa2 · 3 months ago
> It'd've been RGB :)

Yeah, sounds like one of those CRT monitors that has separate BNC connectors for each of RGBHV.

throw0101d · 3 months ago
If in the Toronto, Canada area, the television museum may be worth a visit:

* https://mztv.com

Pre-WW2 televisions seem to be quite rare:

> To put this special set in some context, there are more 18th century Stradivarius violins in existence than pre-World War II TVs and, to make it that bit rarer, this TV has only had two owners. “I’ve handled 38 pre-war tells and this is the finest and even comes with the original invoice,” said Bonhams specialist Laurence Fisher. “It cost a huge amount and the owner must have had wealth and means…It is a very rare thing and there are collectors who would love to have it.”

* https://newsfeed.time.com/2011/04/05/do-not-adjust-your-set-...

retrac · 3 months ago
That museum looks exciting!

> Pre-WW2 televisions seem to be quite rare

Pre-war television was rare. Electronic television came to maturity during the Great Depression. Early TVs were horribly expensive and given the economic situation of the 1930s almost no one could afford them! The UK started regular TV broadcasts in 1936 and by 1939 there were about 20,000 televisions in the whole country. And that was much further along than France, Germany, or the USA.

Most of those 20,000 were scrapped or sold during the war or they served as TVs faithfully into the 40s and even 50s, and then were retired. There are only a handful of working televisions of that vintage in the world.

cesaref · 3 months ago
Back in the 80s, as the home computer revolution got going, computers were typically wired up to small, cheap, portable TVs as a display device. These TVs used shadow masks, and the computer video output was typically modulated to a TV signal, and the TV was 'tuned' to the computer. All of this added large amounts of blur and distortion even before the signal was displayed on the TV.

By the mid 80s, it was maybe more typical to buy a dedicated CRT monitor, and the computer connected via composite, or maybe even an RGB feed to the monitor, allowing higher resolution and much improved quality.

For the well-heeled, this route also led to the holy grail: a Trinitron tube!

At each of these changes, the aesthetic of the display technology changed, but probably the best memories come from the original blurry stuff, since that was the magical moment of actually getting something out of a home computer.

qingcharles · 3 months ago
For a long time my only "monitor" when I was a kid was a 12" B+W TV for my ZX Spectrum. On my birthday, the day I got it, I was allowed to hook it up to the family's 14" color TV, but after that it was back to the B+W for the next couple of years!

(funnily enough, when I finally got a PC years later, the only monitor I could afford was a Philips monochrome VGA -- I guess they now sell for multiples of the original retail price? https://www.ebay.com/itm/176945464730)

jader201 · 3 months ago
I’ve got a 27” CRT right next to my 65” LG OLED C9 (which is starting to feel ancient, too).

It sits in a cabinet that currently holds an NES, SNES, N64, GameCube, and PS2.

It doesn't get a ton of playtime, but when my now 21- and 18-year-old sons were young, I'd play those consoles quite a bit (they were already retro even then), and as they got older, they would too.

My oldest is particularly fond of the retro consoles and playing on the CRT, so he’ll hop on it when he gets the itch for something retro.

I feel like there’s a charm that will never fade, not only with retro consoles, but also playing them on a CRT.

I’ll never get rid of our CRT.

My oldest son wouldn’t let me, even if I wanted to.