Many moons ago I was on a constrained internet connection -- I set up a repeater by hanging an old phone over my curtains so it could catch Wi-Fi from the cafe across the street, and connected to the phone's internet over Bluetooth.
I had like 2KB/s.
This made most of the internet unusable, but it turns out the parts I care about are text. So I just browsed it through a text browser.
This didn't really work either, because it turns out web protocols don't work very well over 2KB/s.
So I browsed the internet by connecting to a $1 VPS (very fast internet!) over Mosh (which is like SSH, but more efficient and resilient). That way, only the actual bytes of rendered text were sent to me.
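In case anyone wants to replicate this, here's a minimal sketch of the setup; the username and hostname are placeholders, and it assumes mosh and w3m are installed on the VPS:

# one-time, on the VPS: install the server-side tools
$ sudo apt install mosh w3m

# from the flaky connection: mosh keeps the session alive across
# heavy packet loss and roaming, unlike plain ssh
$ mosh user@vps.example.com

# browse inside the remote session; only the rendered text
# crosses the slow link
$ w3m https://news.ycombinator.com/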
I mostly browsed HN and the linked articles at that point.
The browser that rendered HN the best in those days was w3m. I remember it had indentation and even mouse / scrolling support. I tried lynx as well and it was good, but I went with w3m in the end.
I see w3m hasn't been updated in 15 years, but it's probably still better for reading HN, whose UI hasn't changed for longer than that! I will have to give them both a spin :)
Yeah, mosh works really well in those kinds of scenarios. My provider once had an outage where they dropped 50% of all packets, which rendered most of the internet completely unusable. I was able to connect to my VPS via mosh (it took 6 attempts since it uses SSH for the initial handshake), and then mosh + w3m worked essentially the same as if no packet drops even existed. Feels like magic.
For HN there's gopher://hngopher.com and, for the main web, links preserves the cascaded formatting for threads. Yes, I used mosh at 2.7 KB/s too, against a pubnix, but I also used gopher for tons of services.
Thanks to bitlbee I could talk with my SO and relatives over Telegram, and https://brutaldon.org allowed me to read posts on Mastodon.
> set up a repeater by hanging an old phone over my curtains so it could catch Wifi from the cafe across and connected to the phone's internet over bluetooth.
I had no idea this was possible. Can you explain why this works? Sounds fascinating.
Basically you are making a network connection via Bluetooth. Depending on the OS you run, there's probably a guide such as [1] for setting up Bluetooth tethering, with the phone acting as a modem for the internet connection.
Of course it relies on both devices being able to create and maintain a good Bluetooth connection.
[1] https://www.lifewire.com/internet-on-laptop-with-a-bluetooth...
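On Linux, that pairing-plus-tethering dance looks roughly like this; a sketch assuming bluez and bluez-tools are installed, with a placeholder for the phone's Bluetooth address:

# pair and trust the phone (XX:XX:XX:XX:XX:XX is a placeholder)
$ bluetoothctl
[bluetooth]# power on
[bluetooth]# scan on
[bluetooth]# pair XX:XX:XX:XX:XX:XX
[bluetooth]# trust XX:XX:XX:XX:XX:XX
[bluetooth]# quit

# join the phone's network access point (NAP); this creates a bnep0 interface
$ bt-network -c XX:XX:XX:XX:XX:XX nap

# get an IP address on the new interface
$ sudo dhclient bnep0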
It would be nicer if phones could run two Wi-Fi networks at the same time, allowing a mix of leeching and hotspotting, but I guess in practical terms it's a one-in-a-thousand type of demand.
I still used Lynx as my default browser while working on ships until 2020. Satellite internet connections at sea were slow and very expensive, which made Lynx a good choice. But it turned out that the text-based, distraction-free browsing could be a better experience than the same site in a modern browser. And a few sites still serve text versions, like text.npr.org. I liked Lynx enough that I would still use it back on land until the habit faded.
Proud user of noscript/basic (x)html browsers here.
Lynx and links (and I wanted to _code_ my own using netsurf libraries).
Restoring noscript/basic (x)html will only happen with hardcore regulation (or "tariffs"/"gigantic fines"... same same...).
This is critical for the web, since it would make developing real-life alternative browsers a reasonable task from many pertinent perspectives.
The current technical landscape of the web is a disaster: a cartel of 2.5 absurdly and grotesquely gigantic web engines, written in the most complex computer language out there, which requires a compiler on the same complexity level... and there are only 2 of them, from roughly the same cartel/mob.
It seems that technical interop of the web via a very simple standard, stable in time and good enough to do the job, is a 'competitive' issue of the small vs the big, and should be handled by regulating administrations.
Remember, tons of web sites were noscript/basic (x)html compatible and already doing a more than good enough job... without insane technical dependencies...
Very early in my Linux days in the early 2000s I was bound and determined to learn how to use Lynx, as I thought the skill would be a necessity for maintaining servers -- being able to look up issues online and whatnot.
Little did I realize that 99% of the time I would be SSHed in from a full desktop with a standard browser, and Lynx has just been kind of a fun novelty for me.
It is unfortunate that modern web development has led to websites so complex that they either break entirely or look terrible in text-based browsers like Lynx. Take Mastodon, for example:
$ lynx https://mastodon.social/
[…]
To use the Mastodon web application, please enable JavaScript.
Alternatively, try one of the native apps for Mastodon for your
platform.
The C2 Wiki does not load either:
$ lynx https://wiki.c2.com/
[…]
javascript required to view this site
why
To their credit, at least they use the <noscript> tag to display the above notices. Some websites don't even bother with that. But there are many old school websites that still load fine to varying degrees:
lynx https://danluu.com/ # Mostly okay but some needed spaces missing
lynx https://en.wikipedia.org/ # Okay, but a large wall of links on top
lynx https://irreal.org/blog/ # Renders fine
lynx https://libera.chat/ # Mostly fine
lynx https://news.ycombinator.com/ # Of course!
lynx https://sachachua.com/ # Mostly fine
lynx https://shkspr.mobi/ # Renders really well
lynx https://susam.net/ # Disclosure: This is mine
lynx https://norvig.com/ # A classic!
lynx https://nullprogram.com/ # Also pretty good
If you have more examples, please comment, and I'll add them to this list in the two-hour edit window I have.
While JavaScript has its place, I believe that websites that focus on delivering primarily text content could prioritise working well in TUI browsers. Sometimes testing it with text-based browsers may even show fundamental issues with your HTML. For example, several times, I've seen that multiple navigation links next to each other have no whitespace between them. The links may appear like this:
HomeBlogRSSAboutCodebergMastodon
Or, in a list of articles, dates and titles may appear jammed together:
14 Mar 2025The Lost Art of Dual Booting
15 Mar 2025Some Forgotten Features of Gopher
16 Mar 2025My Favourite DOS Games
The missing spaces aren't obvious in a graphical browser due to the CSS styling hiding the issue, but in a text-based one, the issue becomes apparent. The number of text-based web users may be shrinking, but there are some of us who still browse the web using tools like lynx, w3m, and M-x eww, at least occasionally.
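A quick way to catch this, assuming lynx is installed (the markup below is made up for illustration):

$ lynx -dump -nolist -stdin <<'EOF'
<nav><a href="/">Home</a><a href="/blog/">Blog</a><a href="/about/">About</a></nav>
EOF
HomeBlogAbout

A literal space (or visible separator) between the anchors fixes the text rendering, and styled layouts usually won't notice, since nav CSS tends to control the spacing anyway.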
Unlike computer interfaces, the web was never text-first. It was graphical from the start. The first browser was in a GUI, not a terminal.
Sites have been hobbled/broken on Lynx since the very beginning. It's neat and can be convenient to have a browser that works in your terminal for simple stuff, but the web was never designed for that. It's natural and to be expected that many sites will break. The burden is really on Lynx to do what it can to support sites as they are, rather than sites to try to build compatibility with Lynx.
It's kind of like, there are programs to "view" a PDF in the terminal, or at least its text content. But PDF authors shouldn't be expected to optimize the order text is presented for those programs. That's not what PDF was ever meant for, even if you can get it to work sometimes.
Given the web's much wider remit than PDF, it has support for accessibility tools and much better non-visual handling, so the comparison isn't entirely fair, I think.
If a website doesn’t handle lynx well, there’s a good chance it doesn’t handle accessibility well either.
> It's natural and to be expected that many sites will break.
There is nothing "natural" about software development at all. It was an active choice to hobble the internet as a browser to sell ads via interactive apps.
Back in the day my 28.8k modem came bundled with a book on HTML, which is how I learned to make my first personal web site.
Even back then, the book recommended testing your web site in Lynx for two reasons:
1. Web sites are supposed to gracefully degrade when viewed in browsers without support for advanced features.
2. Accessibility matters, and while most of us don't have access to or know how to operate screen readers, if we can comfortably view and navigate a web site in Lynx there's a pretty decent chance that it'll be usable with a screen reader.
It's been ~30 years since then and those reasons still apply just as well. For the vast majority of web sites which do not have any need to be interactive webapps, there's not really any good reason for them not to be perfectly usable in a text-only browser, and if a site is both readable and navigable in a text browser it should also be with a screen reader.
By the time it caught on, HTML did allow for very graphical web pages, and that's not even considering how popular Flash Player was when the internet had its initial growth spurt. Technically, very early versions of HTML, especially predating CSS, were closer to epub than modern HTML, but there were so few web pages it's not a meaningful argument against your supposition.
I think what's more important is that a significant portion of web pages have no more complex layout than a newspaper or slideshow, so why not make them easy to parse? Not only would it make browsing in Lynx easier, but it would work well with screen readers, which are the only way some people can browse web pages.
I think a significant difference is that in the early days, the content was predominantly text, with styling/images/multimedia to embellish the content. But today it feels like a large proportion of websites put the embellishments first, the text content is thin and you often have to hunt it out.
Of course the web has evolved and has uses other than reading/absorbing information (some of them great) and multimedia content is valid, but it does seem to have become harder to find substance in amongst all the style.
When I'm surfing the web it's still usually words that I'm looking for. I think that may be going out of fashion.
Wasn't lynx around before the graphical browsers? I remember first using the web through a VAX terminal and lynx. You could download images but had to launch a viewer.
I think you guys are talking across one another; you are both correct, from my point of view. OP is correct that allowing websites to script their rendering (by enacting standards that allowed such a thing) was a mistake. You are also correct that the web has been visual from the start.
The early web was not terminal based, but that doesn't automatically mean it was "graphical". HTML was meant to be processed by any number of different user agents. A lot of early HTML tags were semantic in nature in order to convey the intent of the element. Conveying intent allows non-graphical user agents, meaning everything from spiders to screen readers to AI agents, to use a page with the same capability.
People abusing HTML for the purpose of styling is the whole reason for CSS existing. A well written HTML document should have a very clean structure. The CSS has the ability to do all the crazy graphical styling. The old CSS Zen Garden was an amazing demonstration of that, an incredibly well structured and mostly semantic HTML document could have any number of crazy styles only by varying the CSS.
Bullshit HTML loaded with bullshit CSS frameworks generated by megabytes of bullshit JavaScript is a complete failure of web devs to master the medium. Unless a web page is absolutely reliant on a graphical interface (Google Maps, a game, etc) there's no reason that it shouldn't render passably in lynx. Even in those cases it should have noscript guardrails to provide at least an explanation as to why it can't work.
I have to add https://nullprogram.com, just because of the care the author took to have it work better in lynx[1]:
> Just in case you haven't tried it, the blog also works really well with terminal-based browsers, such as Lynx and ELinks. Go ahead and give it a shot. The header that normally appears at the top of the page is actually at the bottom of the HTML document structure. It's out of the way for browsers that ignore CSS.
Which works perfectly, including navigation (next/prev/parent). The perfect way to use javascript to enhance a site (collapsing threads etc) but not require it.
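Here's a minimal sketch of that trick (not nullprogram's actual markup): the header sits last in the HTML source, and a couple of CSS lines lift it to the top for graphical browsers, while lynx, which ignores CSS, renders it at the bottom, out of the way:

$ lynx -dump -nolist -stdin <<'EOF'
<html><head><style>
  /* CSS-capable browsers display the header first */
  body { display: flex; flex-direction: column; }
  header { order: -1; }
</style></head><body>
<main><p>Article text comes first in the source.</p></main>
<header><a href="/">Home</a> <a href="/feed/">Feed</a></header>
</body></html>
EOF
Article text comes first in the source.

Home Feed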
HN is hosted on a single machine in a colo somewhere (with a backup elsewhere), yet has far more value than the majority of sites 100 times as complex.
Because HN's value is the value of the comments, and those are a scarce resource. Making a great website (for whatever your definition of great is) doesn't guarantee that it will become valuable.
All this to say, HN shouldn't be an example to blindly follow.
I agree with your javascript complaints, and I use a graphical browser.
I do prefer to surf with JS disabled, and in most cases it actually works pretty well.
But the lack of a non-JS Mastodon has pretty much stopped me from reading posts on the system. I can surf GitHub, and this site, with no JS, but Mastodon is a no go.
The conversion of the internet into a scam-ad distribution system is the primary culprit behind the massive proliferation of JS, along with the use of overly complex "frameworks" for what could often be static HTML. I don't know what Mastodon's excuse is...
When I started using the WWW in 01992 the majority of Web users were probably using text-based browsers, and specifically Lynx, because that was what the University of Kansas was using for its campuswide information service (CWIS). Mosaic didn't exist yet, and most people accessing the internet were using either dumb terminals like I was (typically in my case a VT-100 or CIT-101 clone of it) or dialup terminal emulators like Procomm+.
I made an e-commerce platform that has zero JavaScript. It is PHP only. Additionally, cgit uses JavaScript for updating idle time, but you do not need it; just refresh the page.
But yeah, I wish people were more hesitant to over-use JavaScript.
> The missing spaces aren't obvious in a graphical browser due to the CSS styling hiding the issue, but in a text-based one, the issue becomes apparent.
Why would this be considered "an issue" or "a problem in your HTML"? TUI browsers are really a fun novelty and not much more; I'd be shocked if even the largest sites in the world receive more than 1000 visits per day from lynx or eww or any of the others combined. Unless you have a compelling reason to think that your site will be used by terminal browser fans, there is no reason whatsoever to care about how your HTML renders without CSS. Even screen readers would have no problem reading links that aren't separated by spaces.
I maintained it for a while, then delegated the DNS to someone else, but they didn't maintain it either, so I swapped it back. ~I'll update it when I get a chance.~
edit: Updated with the correct version and some small HTML tweaks
Probably Fantasque because I've (to the best of my memory) never installed Cosmic Sans (and I made the screenshot, obvs.) but I do occasionally use Fantasque for terminals.
But since the screenshot needs updating, I'm open to suggestions for what font to use this time.
Should we blame an old-timey basic webpage for its lack of complexity, or should we blame a modern browser for not accommodating the web in its simplest form?
And, well, gopher://magical.fish is unbeatable.
I think I did manage to get about 5k/s by placing it at various points around the room, but it was mostly dropped packets and 1k down.
Oh, and reddit: gopher://gopherddit.com
HN: gopher://hngopher.com
But, as they stated, connecting to a public Unix (pubnix) over mosh was and is magic.
chromium --headless example.com --disable-gpu --run-all-compositor-stages-before-draw --dump-dom --virtual-time-budget=10000 --window-size=800,600 | sed 's|<head>|<head><base href="https://example.com/">|g' | lynx -stdin
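(For anyone puzzling over that pipeline: --virtual-time-budget gives the page's JavaScript time to run before --dump-dom serializes the resulting DOM, and the injected <base> tag is what lets lynx resolve the page's relative links; the absolute, quoted href shown here is my assumption -- point it at whatever site you're actually dumping.)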
I wouldn't be surprised if it's growing in absolute numbers; in relative numbers it stays at essentially 0%, where it always was.
For the c2wiki, there are clones of it which work with plain text, but I can't remember the alternative domain. You can DDG/GG it, tho.
> Access Denied - Sucuri Website Firewall
...
> Block reason: Access from your Country was disabled by the administrator.
For that reason I don't think it's a good page to recommend.
https://github.com/gregkh/cosmic-sans-neue which was later renamed Fantasque due to hate mail.
Which phone browser renders it in an unreadable manner?
* saving webpages as text with the links nicely organized at the bottom, and
* calling it from mutt (MUA) to display HTML parts of mail messages.
It works great and it's consistent.
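For the mutt use case, the usual wiring is a mailcap entry; a minimal sketch assuming lynx (w3m works the same way with its own flags):

# ~/.mailcap -- render text/html parts through a text browser
text/html; lynx -dump -force_html %s; copiousoutput

# ~/.muttrc -- show HTML parts inline without prompting
auto_view text/html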