Readit News
frognumber commented on The Nobel Prize and the Laureate Are Inseparable   nobelpeaceprize.org/press... · Posted by u/karakoram
frognumber · a month ago
I had a physics professor I worked with who had a Nobel Prize.

He didn't win it. It was won by a team of students / collaborators / mentees, who felt he deserved it. I can't disagree with them. Among the nicest people in the world.

I don't think anyone meant it in the sense of "You're a Nobel Prize Winner," so much as "We couldn't have done this without your mentorship, and you deserve to hold onto this." He certainly doesn't consider himself to be a Nobel Prize winner.

frognumber commented on ASCII characters are not pixels: a deep dive into ASCII rendering   alexharri.com/blog/ascii-... · Posted by u/alexharri
frognumber · a month ago
This was painful to read. It becomes better and simpler with a basic signals & systems background:

- His breaking up images into grids was a poor man's convolution. Render each letter. Render the image. Dot product.

- His "contrast" setting didn't really work. It was meant to emulate a sharpen filter: convolve with a kernel appropriate for the letter size. He operated over the wrong dimension (intensity, rather than X-Y).

- Dithering should be done with something like Floyd-Steinberg: You spill over errors to adjacent pixels.
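The "render each letter, render the image, dot product" idea can be sketched in a few lines. This is a toy example with hypothetical 2x2 glyph bitmaps; a real renderer would rasterize actual font glyphs at the cell size:

```python
# Toy sketch of "render each letter, render the image, dot product".
# The 2x2 glyph bitmaps below are hypothetical stand-ins, not real
# font renders.
GLYPHS = {
    " ": [[0.0, 0.0], [0.0, 0.0]],
    ".": [[0.0, 0.0], [0.5, 0.0]],
    "#": [[1.0, 1.0], [1.0, 1.0]],
}

def best_glyph(cell):
    """Pick the glyph closest to the cell patch. Maximizing
    2*dot(b, c) - ||b||^2 is the same as minimizing ||b - c||^2,
    so bright glyphs don't automatically win on raw dot product."""
    def score(bitmap):
        dot = sum(b * c
                  for brow, crow in zip(bitmap, cell)
                  for b, c in zip(brow, crow))
        energy = sum(b * b for row in bitmap for b in row)
        return 2 * dot - energy
    return max(GLYPHS, key=lambda ch: score(GLYPHS[ch]))

print(best_glyph([[1.0, 1.0], [1.0, 1.0]]))  # → #
print(best_glyph([[0.0, 0.0], [0.6, 0.0]]))  # → .
```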

Most of these problems have solutions, and in some cases, optimal ones. They were reinvented, perhaps cleverly, but not as well as those standard solutions.
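Floyd-Steinberg in particular is short enough to write out. A minimal sketch, assuming a grayscale image as nested lists of floats in [0, 1]:

```python
def floyd_steinberg(img):
    """Quantize a grayscale image (nested lists of floats in [0, 1])
    to 0.0/1.0, spilling each pixel's quantization error onto its
    not-yet-visited neighbors with the standard 7/16, 3/16, 5/16,
    1/16 weights."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(h):
        for x in range(w):
            old = out[y][x]
            new = 1.0 if old >= 0.5 else 0.0
            out[y][x] = new
            err = old - new
            if x + 1 < w:
                out[y][x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    out[y + 1][x - 1] += err * 3 / 16
                out[y + 1][x] += err * 5 / 16
                if x + 1 < w:
                    out[y + 1][x + 1] += err * 1 / 16
    return out

# A flat 50%-gray patch dithers to an alternating pattern.
print(floyd_steinberg([[0.5, 0.5], [0.5, 0.5]]))
# → [[1.0, 0.0], [0.0, 1.0]]
```

For ASCII art you'd quantize each cell to the nearest glyph brightness rather than pure black/white, but the error-spilling step is the same.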

Bonus:

- Handle the above as a global optimization problem. Possible with 2026-era CPUs (and even more so, GPUs).

- Unicode :)

frognumber commented on Roam 50GB is now Roam 100GB   starlink.com/support/arti... · Posted by u/bahmboo
Someone1234 · a month ago
I'm actually a huge fan of "unlimited slow speeds" as a falloff, instead of a cliff.

Aside from the fact it allows you to work with Starlink to buy more fast speed, it also allows core stuff to continue to function (e.g. basic notifications, non-streaming web traffic, etc).

frognumber · a month ago
Years ago, I picked a cell carrier because of this. When I ran out, it switched to O(200kbps), which is fine for email, basic web search, etc.

It was actually a bit ironic that, at the time, you could burn through the whole high-speed quota in seconds or minutes if you went to the wrong web page. Most carriers would cut you off or bill you an arm and a leg after that.

frognumber commented on The worst possible antitrust outcome   pluralistic.net/2025/09/0... · Posted by u/leotravis10
tomComb · 5 months ago
I think the ad model as the dominant model for the internet consumer was discovered long before Google, but otherwise I agree with you.
frognumber · 5 months ago
I think you're wrong, and you're underestimating the transformational impact of AdWords.

Free internet existed before paid internet, true, but mostly because people did things for other motives (like fun). AltaVista was a tech demo for DEC. Good information was found on personal web pages, most often on .edu sites.

Banner ads existed, but they were confined to the sketchy corners of the internet. Think today's spam selling Viagra. Anyone credible didn't want to be associated with them.

What Google figured out was:

1) Design. Discreet text ads didn't make them look sketchy. This discovery came about by accident, but that's a longer story.

2) Targeting. Search terms told them which ads to show.

I can't overstate the impact of #2. Profits went up many-fold over prior ad models. This was Google's great -- and ultra-secret -- discovery. For many years, they were making $$$, while cultivating a public image of (probably) bleeding $$$ or (at best) making $. People were doing math on how much revenue Google was getting based on traditional web advertising models, while Google knew precisely what you were shopping for.

By the time people found out how much money Google's ad model was making, they had market lock-in.

frognumber commented on John Carmack's arguments against building a custom XR OS at Meta   twitter.com/ID_AA_Carmack... · Posted by u/OlympicMarmoto
ronald_petty · 5 months ago
Not saying these are perfect, but consider reviewing the work of groups like the Internet Society or even IEEE sectors. Boots on the ground to some extent such as providing gear and training. Other efforts like One Laptop Per Child also leaned into this kind of thinking.

What could it mean for a "tech" town to be born, especially with the techniques and tools we have today? While the dream has not really borne out yet (especially at the village level), I would argue we could do even better in middle America with this thinking: small college towns. While there's a bit of an existing gravity well, you could make a focused effort to get a flywheel going (redoing mini Bell Labs around the USA to solve regional problems could be a start).

Yes, it takes decades. My only thought on that is, many (dare I say most) people don't even have short-term plans, much less long-term plans. It takes visionaries with nerves and wills of steel to stay on paths that make things happen.

Love the experiment idea.

frognumber · 5 months ago
Pick a university, and give them $1B to never use Windows, MacOS, Android, Linux, or anything other than homebrew?

To kick-start it, give them machines with Plan 9, ITS, or an OS based on Lisp / Smalltalk / similar? Or just microcontrollers? Or replicate 1970s-era university computing infrastructure (where everything was homebrew)?

Build out coursework to bootstrap from there? Perhaps scholarships for kids from the developing world?

frognumber commented on John Carmack's arguments against building a custom XR OS at Meta   twitter.com/ID_AA_Carmack... · Posted by u/OlympicMarmoto
827a · 5 months ago
Continuing the thought experiment: There's an interesting sort-of contradiction in this desire: I, being dissatisfied with some aspect of the existing software solutions on the market, want to create an isolated monastic order of software engineers to ignore all existing solutions and build something that solves my problems; presumably, without any contact from me.

It's a contradiction very much at the core of the idea: should I expect that the operating system my monastic order produces be able to play Overwatch or open .docx files? I suspect not; but why? Because they didn't collaborate with stakeholders. So they might need to collaborate with stakeholders; yet that was the very thing we were trying to avoid by making this an isolated monastic order.

Sometimes you gotta take the good with the bad. Or, uh, maybe Microsoft should just stop using React for the Start menu; that might be a good start.

frognumber · 5 months ago
An isolated monastic order in the hills around the Himalayas should ideally be completely isolated from Overwatch and .docx files.
frognumber commented on John Carmack's arguments against building a custom XR OS at Meta   twitter.com/ID_AA_Carmack... · Posted by u/OlympicMarmoto
01HNNWZ0MV43FF · 6 months ago
If it's so bloated then just start cutting

Whatever expertise you need to prune a working system is less than the expertise you'll need to create a whole new one and then also prune it as it grows old

frognumber · 5 months ago
Absolutely not.

Software is bloated in part because it's built in layers. People wrap things over, and over, and over. Stripping down layers is nigh-impossible later. Starting from scratch is easy.

Starting from scratch fails in practice because you don't get feature parity in time short enough for VC (or grant) funding cycles.

If we built a tech tree around 200MHz, 32MB machines, except for things like ML and video, we'd have a tech tree which did everything existing machines do, only 10x more quickly and in 0.1% of the memory. Machines back then were fine for word processing, spreadsheets, all the web apps I use on a daily basis (not as web apps), etc.

Need would drive people to rebuild those, but with a few fewer layers.

frognumber commented on John Carmack's arguments against building a custom XR OS at Meta   twitter.com/ID_AA_Carmack... · Posted by u/OlympicMarmoto
ksec · 6 months ago
Love this idea, and I'm wondering where that low cost-of-living place would be. But genuinely asking:

What problem are we trying to solve that is not possible right now? Do we start from hardware at the CPU ?

I remember an ex-Intel engineer once said: you could learn about all the decisions behind modern ISA and CPU uArch design, along with GPUs and how it all works together, but by the time you have done all that and could implement a truly better version from a clean sheet, you are already close to retiring.

And that is assuming you have the professional opportunity to learn about all of this, implement, fail, make mistakes, relearn, etc.

frognumber · 5 months ago
> Love this idea and wondering where that low cost of living place would be

Parts of Africa and India are very much like that. I would guess other places are too. I'd pick a hill station in India, or maybe some place higher up in sub-Saharan Africa (above the insects).

> What problem are we trying to solve that is not possible right now?

The point is more about identifying the problem, actually. An independent tech tree will have vastly different capabilities and limitations than the existing one.

Continuing the thought experiment -- to be much more abstract now -- if we had placed an independent colony of humans on Venus 150 years ago, it's likely computing would be very different. If the transistor hadn't been invented, we might have optical, mechanical, or fluidic computation, or perhaps some extended version of vacuum tubes. Everything would be different.

Sharing technology back-and-forth a century later would be amazing.

Even when universities were more isolated, something like 1995-era MIT computing infrastructure was largely homebrew, with fascinating social dynamics around things like Zephyr, interesting distributed file systems (AFS), etc. The X Window System came out of it too, more or less, which in turn allowed for kinds of remote access unlike those we have with the cloud.

And there were tech trees built around Lisp-based computers / operating systems, Smalltalk, and systems where literally everything was modifiable.

More conservatively, even the interacting Chinese and non-Chinese tech trees are somewhat different (WeChat, Alipay, etc. versus WhatsApp, Venmo, etc.)

You can't predict the future, and having two independent futures seems like a great way to have progress.

Plus, it prevents a monoculture. Perhaps that's the problem I'm trying to solve.

> Do we start from hardware at the CPU ?

For the actual thought experiment, too expensive. I'd probably offer monitors, keyboards, mice, and some kind of relatively simple, documented microcontroller to drive those. As well as things like ADCs, DACs, and similar.

Zero software, except what's needed to bootstrap.

frognumber commented on John Carmack's arguments against building a custom XR OS at Meta   twitter.com/ID_AA_Carmack... · Posted by u/OlympicMarmoto
frognumber · 6 months ago
John describes exactly what I'd like someone to build:

"To make something really different, and not get drawn into the gravity well of existing solutions, you practically need an isolated monastic order of computer engineers."

As a thought experiment:

* Pick a place where cost-of-living is $200/month

* Set up a village which is very livable. Fresh air. Healthy food. Good schools. More or less for a cost that someone rich could sponsor without too much sweat.

* Drop a load of computers with little to no software, and little to no internet

* Try reinventing the computing universe from scratch.

Patience is the key. It'd take decades.

frognumber commented on US Intel   stratechery.com/2025/u-s-... · Posted by u/maguay
georgeburdell · 6 months ago
If I may add my view as a formerly high-achieving semiconductor worker, the kind Intel would benefit greatly from having right now: a lot of us pivoted to software and machine learning to earn more money. My first 2 years as a software engineer earned me more RSUs than a decade in semiconductors did. Semiconductor work is not prestigious in the U.S., despite its strategic importance. By contrast, it is highly respected and relatively well remunerated in the countries doing well in it.

From this lens, the silver lining of the ongoing software layoffs may be that they stem the bleeding of semiconductor workers into that field. If Intel were really smart, they'd be hiring right now the people they couldn't get or retain 3-5 years ago.

frognumber · 6 months ago
As a former EE, it's not just pay.

The cog-in-a-machine corporate culture is not fun. Tech culture is much healthier.

There's no upside to big electronics companies here.
