I'm sad. I'm a software guy, not much of a hardware expert, so please bear with me if I'm too pessimistic. However, I feel that if trends like this continue, it might be the end of enthusiast-level personal computing as we know it. No more heading down to the electronics store to purchase RAM, motherboards, processors, GPUs, storage, and other components. We're going to be limited to locked-down terminals connected to cloud services, both provided by a small number of multinational corporations. If we're lucky, we might still have USB peripherals.
The sad thing is that we enthusiasts are a small market compared to the overwhelming majority of computer users who don't mind locked-down devices, at least until they've been bitten by the restrictions; by then, if the only alternative is retrocomputing, it's too late. For decades we enthusiasts have been able to benefit from other markets with overlapping needs such as gaming, workstations, and corporate servers. However, many on-premise servers have been replaced by cloud services, the workstation market has been subsumed by the broader PC market, and PC gaming has faced challenges, from a push toward locked-down consoles to a GPU market squeezed by competition with cryptocurrency mining and now AI.
One of the things I'm increasingly disappointed with is the dominance of large corporations in computing. It seems harder for small players to survive in this ecosystem. Software has to deal with network effects and large companies owning major platforms, and building your own hardware requires tons of capital.
I wonder if it's even possible for a company to make 1980s-era electronics without massive capital expenditures? How feasible is it for a small company to manufacture the equivalent of a Motorola 68000 or Intel 386?
I'd like to see a market for hobbyist computing by hobbyist computer shops, but I'm not sure it's economically feasible.
Jump in the DeLorean and head to the 1980s / 1990s
I've seen things going the opposite way. It's only recently that an average person could jump on eBay, get assembled low-level electronic modules/boards for cheap, and build them into their own projects.
Yes, you'll probably have difficulty walking into a STORE to buy PC components, but only because online shopping has been killing local shops for decades now. You'll find it easy to get that stuff online, for better prices.
PCs, since the very start, have been going through a process of becoming ever more integrated each generation. Not too many people install sound cards, IDE controllers, etc., anymore. CPUs, GPUs, and RAM are about the only holdouts not integrated on the motherboard these days. It's possible that could change, if CPUs and GPUs become fast enough for 99% of people, and RAM gets cheap enough that manufacturers can put more on-board than 99% of people will need. And while you might not be happy about that kind of integration, it comes with big price reductions that help everyone. But we're not there yet, and I can't say how long down the road that might be.
Not my experience. I've been able to go to a local store to buy PC components for more than 35 years now, and last did so to upgrade the RAM in my laptop to be eligible for Win11. Online-only was not cheaper, and the local store had it available the same day. The local store does have an online presence and is a chain, though.
Mouse replacement on a weekend because the old one broke: same story (button smashed in, not usable at all any longer). Online was not cheaper, nothing same-day was available at any price, and Amazon delivery without Prime meant no next day either. The local chain store had it for immediate pickup and I was gaming again in 30 minutes.
> It's only recently that an average person could jump on eBay, get assembled low-level electronic modules/boards for cheap, and build them into their own projects.
People have been tinkering with electronic/electric modules for decades:
* https://en.wikipedia.org/wiki/Heathkit
* https://en.wikipedia.org/wiki/DigiKey
> Yes, you'll probably have difficulty walking into a STORE to buy PC components, but only because online shopping has been killing local shops for decades now.
Rather: very commonly the local shops don't stock the parts that I would like to buy, and it is often hard to find out beforehand which kind of very specialized parts the local shop does or doesn't stock.
True story concerning electronic components: I went to some electronics store and wanted to buy a very specialized IC, which they didn't stock. But since the sales clerk could see my passion for tinkering with electronics, he covertly wrote down the address of a different, very small electronics store, including instructions for which tram line to take to get there (I was rather new to the city), which stocks a lot more stuff that tinkerers love. I guess the sales clerk was as disappointed with the range of goods his employer had decided to concentrate on as I was. :-)
On the other hand, lots of former PC-component stores now have whole rows of shelves full of mobile phone cases instead. I get that these have high sales margins, but no thanks ...
Thus, in my opinion it is not online shopping that killed local shops, but the fact that local shops simply don't offer and stock the products that I want to buy.
I feel this vibe. It's part of the reason I invested in a monster home workstation: a Threadripper 9995WX, 768GB of ECC RAM, and a 96GB Blackwell Pro. I expect it may easily be the last proper home PC I buy that is scaled in line with the scaling I grew up with in the 80s and 90s.
Increasingly, what we have are mobile terminals - possibly with a dock for monitor connections - for remote big iron. And the continuous push from governments for more control - seemingly synchronized demands for age gating (i.e. requiring IDs) and chat snooping - makes me think this remote hardware won't really be yours before long.
Windows, caught up in the LLM mania, is to be left by the wayside too.
The Terry Davis tinfoil-hat version of me has a theory that this wider industry trend of pushing consumers away from general-purpose home computers toward remote datacenters accessed from locked-down mobile/thin edge devices is supported by both industry and governments because:
Number one, you become a recurring subscription instead of a one and done deal, making it incredibly profitable for industry
And number two, the government can more easily snoop on your data when it's all in the cloud versus a HDD box in your closet.
Granted, I think we're far away from that future, but I do feel that's the future the powers that be desire. They can use various mechanisms to force that behavior, like convenience and pricing - for example, making PC parts too expensive for consumers while subsidizing cloud and mobile devices to accelerate the move. Once enough consumers only know how to use Apple or Google devices, they'll be less inclined to spend more money to build a PC and learn what a Linux is.
Whoa, that definitely sounds like a monster rig! Out of curiosity, how much did that cost? Blackwell alone can be $10k+! It's definitely an investment, especially since it may soon become a relic, not in terms of specs, but in terms of what manufacturers will actually end up making.
It's not just market forces. Computers are actually too subversive for the powers that be to allow mere citizens to have them.
Give citizens computers and they have encryption. This alone gives them a fighting chance against police, judges, three letter agencies, militaries.
Give citizens computers and they can wipe out entire sectors of the economy via the sheer power of unrestricted copying.
The future is bleak. Computer freedom is dying. Everything the word "hacker" ever stood for is dying. Soon we will no longer have our own systems, we will have no control, we will be mere users of corporation and government systems. Hacking will go extinct, like phreaking.
This fact brings me a profound sadness, like something beautiful is about to perish from this earth. We used to be free...
Alright, this might be taking things a little too far… the free software movement is stronger than it’s ever been and hardware is also more accessible than it’s ever been. Losing this one company is simply not a death knell for the entire enthusiast computing market.
I too am very sad. This has been my brand of RAM for a long time. I’m also more of a software guy than a hardware guy, but I appreciate being able to have high-performance gaming-class systems for my work. They run circles around much of the stuff my colleagues run, including deployments in various cloud environments.
This isn't "we are only going to sell products to big companies" nor is it "we are only going to make products that are locked down appliances"... the "consumer" versions of these products are just cheaper unreliable parts that lack the error correction or power loss protection of the enterprise parts, and in a world where we routinely have disks with tens of terabytes of storage in computers with a terabyte of memory, the argument for why "consumer" grade parts can even make sense to exist is pretty weak. It maybe sucks that, in the very short term, there will be a quick uptick in prices, but focusing on the better parts will also help bring their prices down... and, fwiw, they really aren't that bad to begin with: you can build a micro-ATX machine with a Xeon or an EPYC in it for what feels like a pretty reasonable price.
Apple CPUs come with their own GPUs on die and RAM in the chip package now. How much more is going to be put on the chip and assembled in increasingly fine grained processes?
Apple puts the RAM in the chip package because they integrate the GPU, and then they want to be able to have multiple channels to feed the GPU without having that many slots. (Their entry level models also don't have any more memory bandwidth than normal PC laptops and there is no real reason they couldn't use a pair of SODIMMs.)
But low end iGPUs don't need a lot of memory bandwidth (again witness Apple's entry level CPUs) and integrating high end GPUs makes you thermally limited. There is a reason that Apple's fastest (integrated) GPUs are slower than Nvidia and AMD's fastest consumer discrete GPUs.
And even if you are going to integrate all the memory, as might be more justifiable if you're using HBM or GDDR, that only makes it easier to not integrate the CPU itself. Because now your socket needs fewer pins since you're not running memory channels through it.
Alternatively, there is some value in doing both. Suppose you have a consumer CPU socket with the usual pair of memory channels through it. Now the entry level CPU uses that for its memory. The midrange CPU has 8GB of HBM on the package and the high end one has 32GB, which it can use as the system's only RAM or as an L4 cache while the memory slots let you add more (less expensive, ordinary) RAM on top of that, all while using the same socket as the entry level CPU.
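To make that cache idea concrete, here's a minimal average-memory-access-time sketch in Python; the hit rate and latency numbers are invented for illustration, not measurements of any real part:

    # Effective latency of on-package HBM acting as an L4 cache in front of
    # ordinary DIMMs. All numbers below are made-up illustrative values.
    def amat(l4_hit_rate: float, l4_ns: float, dram_ns: float) -> float:
        """Weighted average: hits served from HBM, misses fall through to DIMMs."""
        return l4_hit_rate * l4_ns + (1.0 - l4_hit_rate) * dram_ns

    print(amat(0.90, 60.0, 100.0))  # -> 64.0 ns with a 90% L4 hit rate
    print(amat(0.00, 60.0, 100.0))  # -> 100.0 ns, the DIMM-only baseline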
And let's apply some business logic to this: Who wants soldered RAM? Only the device OEMs, who want to save eleven cents' worth of slots and, more importantly, overcharge for RAM and force you to buy a new device when all you want is a RAM upgrade. The consumer and, more than that, the memory manufacturers prefer slots, because they want you to be able to upgrade (i.e. to give them your money). So the only time you get soldered RAM is when either the device manufacturer has you by the short hairs (i.e. Apple if you want a Mac), or the consumers aren't paying attention and accidentally buy a laptop with soldered RAM when competitors are offering similar ones for similar prices but with upgradable slots.
So as usual, the thing preventing you from getting screwed is competition and that's what you need to preserve if you don't want to get screwed.
I think this is an incorrect zero-sum mindset. Yes, in the short term there is a fixed supply of GPUs, RAM, etc. But in the long run the money from AI and crypto is invested in building factories, researching new nodes, etc. These investments will lead to better and cheaper products trickling down to everyone.
> I wonder if it's even possible for a company to make 1980s-era electronics without massive capital expenditures? How feasible is it for a small company to manufacture the equivalent of a Motorola 68000 or Intel 386?
Very feasible but it would have to be redesigned around the cell libraries used in newer nodes since the i386 was manufactured on >1um size nodes.
Prototypes would cost around $1-2k per sq mm at 130nm and $10k per sq mm at 28nm (min order usually around 9 sq mm). Legacy nodes are surprisingly cheap, so non-recurring engineering will generally be the bulk of the cost. The i386 was originally >104 sq mm, but at 1um, so you could probably fit the entirety of an i386 clone in 1-2 sq mm of a 130nm chip. Packaging it in the original footprint and the lead times on prototypes would probably be more annoying than anything.
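A minimal sketch (Python) of that arithmetic, assuming ideal area scaling (area proportional to feature size squared) and taking $1.5k/mm^2 as a midpoint of the quoted 130nm range; real layouts shrink less because of pads, I/O, and cell-library overhead:

    # Back-of-the-envelope prototype cost for an i386-class clone on a legacy
    # node, using the die size and shuttle prices quoted above.
    ORIGINAL_AREA_MM2 = 104.0   # i386 die area on its original ~1um process
    MIN_ORDER_MM2 = 9.0         # typical minimum shuttle order

    def scaled_area_mm2(node_um: float) -> float:
        """Ideal-scaling estimate of the die area on a smaller node."""
        return ORIGINAL_AREA_MM2 * (node_um / 1.0) ** 2

    for node_um, usd_per_mm2 in [(0.130, 1_500), (0.028, 10_000)]:
        area = scaled_area_mm2(node_um)
        billed = max(area, MIN_ORDER_MM2)  # shuttles bill at least the minimum
        print(f"{node_um * 1000:.0f}nm: ~{area:.2f} mm^2, "
              f"prototype run ~${billed * usd_per_mm2:,.0f}")

With those assumptions the clone lands around 1.8 mm^2 at 130nm, so the 9 mm^2 minimum order (~$13.5k) dominates the silicon cost.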
At this point it's getting ridiculously easy to justify acquiring such a fab. I wonder if there are existing players in other categories (hardware or even software) considering trying to get into this field. I get that someone like OVH may not have the capital yet to handle something like this, but I wouldn't be surprised if they could find a good company to partner with - and the same goes for pretty much everyone else.
I'm really wondering about hardware as well, but today's tech is surprising in its scale and requirements. I wouldn't be surprised if we could do mid-'70s tech as hobbyists today - but further than that...
This is a classic slippery slope fallacy. Micron's reversible exit from one of its businesses clearly does not signify the end of the PC era. As long as the demand for DIY PCs persists, there will be suppliers providing the products needed. If you follow the industrial memory market, you probably know that it is currently experiencing a severe supply shortage. I think Micron's decision simply reflects the current market situation.
No one is saying that it's the sole culprit. But when average PCs start costing $3000+ from now on, it seems like the end of an era.
You should look into what’s happening with DIY robotics because it looks eerily similar to what I experienced in the early to mid 90s with PC hardware and software
And you can do way more than just host a bbs with robots
I think you should split the problem into hardware and software parts:
- Hardware
We won't have any hardware without secure boot and we won't have the signing keys. Signed firmware is required for everything x86, everything Apple, everything mobile, probably everything ARM too. Rockchip ARM could still boot without signed firmware last time I checked a few years ago, but I'm not sure about the newer CPUs.
[ Short story: I have an Asus Tinkerboard S. It came with an unprovisioned emmc onboard. Just empty. A few years ago I made the mistake of trying the official OS from Asus. It automatically set up the CPU to boot from emmc first and provisioned the emmc with boot0, boot1 and rpmb. These partitions can't be written and can't be removed. Their creation is a one-way operation according to emmc standards. Now I have to keep the emmc masked because it won't boot otherwise. So beware of any devices with emmc. ]
You can, of course, use MCUs for general computing today. The ESP32 is pretty powerful - probably 4 times faster than a 486, certainly more powerful than the i386 or 68000 you suggested. The big problems here are memory, graphics, and software. No MMU means a completely new OS is required. Linux (uClinux) could boot at some point without an MMU, but it won't fit in 540KB of memory. MCUs can access external memory (PSRAM), but via slow buses, and it's paged. Also there are no high-speed buses for graphics.
There is some hope coming from the Chinese CPUs. AFAIK, they don't support secure boot at all. I'm planning on getting one as soon as their proprietary firmware/UEFI/ACPI can be replaced by u-boot.
- Software
It's useless to make an i386 or 68000 today. There's nothing but old software for them. Not even Linux has i386 support anymore. This is an even bigger problem than hardware. Much, much bigger. To have any hope of a minimally useful computing platform, we need a working browser on that platform. This is an absolute requirement. There's no way around this. I had to abandon Win98, then WinXP, and soon Win7, because of no working browser.
Linux is generally usable today, but as soon as Linus retires, it's going to fall into the same user-lockdown as all the others. The basic infrastructure is all in place: secure boot, root login basically deprecated, access rights and security settings administered by the distro and package manager rather than the user, no per-program firewall rules, automatic updates as standard, surveillance infrastructure via udev (hardware events), dbus (software events) and gtk accessibility (input events), etc. Linus fought hard to keep them outside the kernel, but he won't live forever.
To have any hope of privacy and/or freedom on our personal computers, we need to turn the security paradigm completely on its head: users are not the threat - programs are. The user should log in as root (or the system account in Windows), and every program, including services, should run under its own limited account with no access to the rest of the system, especially the user's files.
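As a toy sketch of that inversion - nothing any mainstream distro ships - here's roughly what per-program accounts could look like. It's Python for readability, must run as root, and the prog- naming scheme and useradd flags are my assumptions; per-account firewall rules and filesystem ACLs are left out entirely:

    # Give each program its own locked-down account and drop to it before exec.
    import os
    import pwd
    import subprocess
    import sys

    def ensure_program_account(program: str) -> pwd.struct_passwd:
        """Create (once) a no-login system account dedicated to one program."""
        account = f"prog-{program}"  # hypothetical naming convention
        try:
            return pwd.getpwnam(account)
        except KeyError:
            subprocess.run(
                ["useradd", "--system", "--shell", "/usr/sbin/nologin",
                 "--home-dir", f"/var/lib/{account}", "--create-home", account],
                check=True)
            return pwd.getpwnam(account)

    def run_as_program(program: str, argv: list[str]) -> None:
        """Fork, drop to the program's own UID/GID, and exec it."""
        entry = ensure_program_account(program)
        if os.fork() == 0:            # child
            os.setgid(entry.pw_gid)   # drop group first, then user
            os.setuid(entry.pw_uid)
            os.execvp(argv[0], argv)  # never returns on success
        os.wait()

    if __name__ == "__main__":
        run_as_program(sys.argv[1], sys.argv[2:])  # e.g. prog.py demo id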
Of all OSs today, Gentoo's portage is probably the easiest package manager to tweak into creating accounts and groups for programs instead of users, and Gobo Linux has the best separation of programs in its filesystem. I'd love to see a merger of the two.
Hobbyist computing? Ha! First get a browser working.
I'm more optimistic when I think about SBC manufacturers and the plenty of other manufacturers wanting to serve this market; companies like Framework (warts and all - I don't think they're perfect) didn't really exist a couple of decades ago.
I'm actually a big fan of Apple hardware (when you crunch the numbers for base spec machine and when you're able to get discounts, the price/performance for the half-life you get is incredible), but I'm also planning to get back into home-brew builds a bit more over the next year: I need to build a NAS, a home lab, I might look at a gaming rig... and I'm far from alone.
So yes, it's a niche market, but a profitable one for a lot of players, and one that Micron will be glad is still available to them when the data centre bubble bursts.
I don't think any SBCs will be a replacement for quite some time. At what point will there be a Raspberry Pi with 128 GB of RAM and a 16-core/32-thread CPU? That's what I'm running right now, and I really hope I don't have to downgrade in the future.
It’s never been easier to be a hobbyist or a small electronics company. Honestly I don’t know what you’re talking about.
Micron is exiting this business because it’s a commodity. It’s everywhere. There are numerous companies producing parts in this space. Their investments are better spent on other things.
> I wonder if it's even possible for a company to make 1980s-era electronics without massive capital expenditures? How feasible is it for a small company to manufacture the equivalent of a Motorola 68000 or Intel 386?
I don’t know what your threshold is for massive capital expenditure, but you could get a tapeout for a 68000 clone more easily than at any point in history. There are now even multi-project wafer shuttle services that will let you get a little piece of a shared wafer for experimenting with making your own chips at very low prices. There are even accessible tools available now.
The hobby and small scale situation is better now than at any point in history. I don’t know how anyone can be sad about it unless they’re ignoring reality and only speculating based on the most cynical online takes about how the future is going to do a complete 180.
> Micron is exiting this business because it’s a commodity. It’s everywhere. There are numerous companies producing parts in this space. Their investments are better spent on other things.
Numerous as in plenty? Or basically three? Samsung, SK Hynix and Micron make up over 90% of the market share of DRAM. Micron saying goodbye to the consumer market basically leaves us with yet another duopoly.
Here is a video of Sam Zeloof, who now runs Atomic Semi with Jim Keller. It likely took thousands of dollars to make his own custom Z2 chip, which only has about 1200 transistors and is nowhere near the likes of the 68k or Intel 386. They might have more advanced stuff now at Atomic Semi, but they haven't announced anything.
It is interesting that your comment made no mention of single-board computers (SBCs) like the Raspberry Pi. These are the more likely future of hobbyist computing, as the price to develop custom PCBs is lower than ever. Yes, you need to buy the components, but a day will come when you shop by parts, some LLM arranges them on an SBC, and you review the PCB layout and click a button to place an order with PCBWay or a competitor. X days later a tiny board appears at your house, built exactly to spec, and runs Linux. The coolest part: the cost is so low that millions and millions more (young) people can join the hobbyist computing party.
Wow. They're not selling off the business, they're totally exiting it.
This is a big loss. Crucial offered a supply chain direct from Micron. Most other consumer DRAM sources pass through middlemen, where fake parts and re-labeled rejects can be inserted.
> They're not selling off the business, they're totally exiting it.
From what I understand, OpenAI just bought out a significant proportion of the capacity of Samsung and Hynix, which is the big reason prices just spiked. They're two of the three DRAM manufacturers, Micron being the third.
That gives us a good idea as to what Micron is doing here: They have contracts with their other customers to supply DRAM at whatever price they previously negotiated, but now prices are higher, and they don't have to honor supply contracts with Crucial because they own it. So they're taking all the supply that would have gone to Crucial and selling it to the high bidder instead.
Spinning off the brand in that context doesn't work, because then "Crucial" would need to come with supply contracts or they'd have no access to supply and have nothing to sell. Moreover, the supply constraint isn't likely to be permanent, and whenever prices come back down then Micron would still own the brand and could start selling under it again, which they couldn't if they sold it.
> Moreover, the supply constraint isn't likely to be permanent, and whenever prices come back down then Micron would still own the brand and could start selling under it again, which they couldn't if they sold it.
Why not just announce limited supply, then, instead of exiting?
This seems like an "automaker invests more in its financing arm, because it's the most profitable" concentration mistake, pivoting toward an industry with wide concerns over its intermediate-term financial sustainability.
They are rerouting consumer RAM to enterprise for the server build-up - for higher margins, I'm sure. The MAG7 will happily pay more, but poor consumers like us can't - this is more bad news for us.
Wondering if we're going to have a situation in the future where we end up having to buy the hand-me-downs from industry after they're done with them (and thus kind of outdated tech)? Kind of seems like the days of building your own PC are numbered.
This is like developers shifting from building homes targeted at homeowners to building build-to-rent neighborhoods for Blackrock and company xD
Have you ever confused BlackRock with Blackstone? Despite their similar-sounding names, these two financial powerhouses represent distinct approaches to investment management. Major news organizations and sector researchers describe the claim as unfounded and often rooted in confusion between BlackRock Inc. and the private-equity firm Blackstone Inc.
Should countries have an upper limit on the ratio of server:client memory supply chain capacity? If no one can buy client hardware to access the cloud, how would cloud providers survive after driving their customers to extinction?
It shouldn't be possible for one holding company (OpenAI) to silently buy all available memory wafer capacity from Samsung and SK Hynix, before the rest of civilization even has the opportunity to make a counteroffer.
What if we realize that 8 GB of memory is actually a tremendous amount, and experience a resurgence in desktop operating systems as people begin to prioritize memory for productive computation again instead of using up a gigabyte for a chat client?
> Should countries have an upper limit on the ratio of server:client memory supply chain capacity? If no one can buy client hardware to access the cloud, how would cloud providers survive after driving their customers to extinction?
You mean a central planning, command and control economy? There is a lot of history of countries trying these things and they don’t have the outcome you want.
DRAM manufacturing is a global business. If one country starts imposing purchase limits for whatever reason, the DRAM consumers are going to laugh as they move their data centers and operations to another country that doesn’t try to impose their laws on a global market.
> It shouldn't be possible for one holding company (OpenAI) to silently buy all available memory wafer capacity from Samsung and SK Hynix, before the rest of civilization even has the opportunity to make a counteroffer.
Good news: That’s not how markets work. DRAM manufacturers don’t list a price and then let OpenAI buy it all up. Contracts are negotiated. Market prices fluctuate.
No supplier of anything is going to let all of their inventory disappear to one buyer without letting the market have a chance to bid the price higher.
These suggestions seem to me part of an absurd struggle against basic market economics.
This isn’t antitrust because the companies aren’t reselling it to you at a much higher price after cornering the market (cough cough Ticketmaster & scalpers).
It’s these pesky PC things that people do bad things like piracy with /s
> Most other consumer DRAM sources pass through middlemen, where fake parts and re-labeled rejects can be inserted.
Large DIMM vendors are definitely not buying through middlemen.
Any vendor consuming a lot of RAM chips over a threshold will be negotiating contracts with RAM chip manufacturers. It’s not that hard even at medium scale.
And when the AI boom pops, Micron is going to lose out on the consumer market. This is a horrible business decision. All they had to do was increase the price.
I think others will pick up the slack. The Chinese seem to be pretty good at producing competent Flash/DRAM products (I don't think they are behind that much). They also don't seem to be all in on the AI craze, so maybe we will buy their stuff if nothing else?
https://www.klevv.com/ken/main
And don't forget about https://www.nanya.com/en/
While I never had a problem with https://semiconductor.samsung.com/dram/module/ , I think they will be rare/more expensive now, or 'soonish'.
For Chinese CXMT and YMTC there is https://www.biwintech.com/
We live in interesting times!
(Cackling madly...)
Just looked at standard desktop: still no 64GB 5600MT/s modules. CUDIMMs are missing 32GB.
> And don't forget about Nanya
BTW, what is the status of Elpida now?
With RAM, you can verify pretty quickly what you are getting.
I really wouldn't want to buy from any new NAND vendor until a bunch of years after they've built a reputation. It's too scary to get a decent bargain SSD that then secretly dies really early, or doesn't actually have anywhere near the endurance it claims.
Yep. I've followed Micron since before Y2k. I've seen the ups and downs of their stock. Seen a CEO literally crash and burn (RIP).
This is a mistake. The consumer business is a bet, which is what they excel at. Yes, it's not printing money right now, but it is an option. Exiting the consumer business means they may miss insights into the next hot consumer trend.
The game for large companies like this should be to keep small bets going and, literally, just survive. That's what Micron was doing, and that's what NVIDIA did for the better part of a decade. Now both are printing money.
Yet Micron has decided it's retiring from placing more bets. Mistake.
What is there to sell? The brand itself has value I guess, but that "Direct Access" goes out the door the second they sell it, so there's no value specifically in that to anyone else.
Crucial is primarily a marketing and support company; they didn't really make anything, although there was a small engineering team that did DIMM/module binning, but mostly contracted out heatsinks to glue to Micron DIMMs. On the SSD side of things, they used Phison controllers with Micron flash, just like pretty much any other consumer SSD that isn't Samsung or SK/Solidigm.
Corsair, G.Skill, GeIL, etc. don't buy components from Crucial; they get them from Micron. Crucial closing its doors has no bearing on that as far as we can tell.
They probably considered dumping it on some Private Equity firm or something, but likely decided to keep the IP in case they decide to resurrect it in the future should the market landscape change.
It sucks that they're winding down Crucial, but it makes sense. Frankly, I'm surprised they didn't pull the trigger sooner, and by sooner I mean years sooner.
Maybe they'll keep the brand to resurrect some years in the future, when the surge of data centers has faded away and they have to find customers among consumers again.
I feel like the "democratization of technology" is on the back slide. For the longest time, we had more and more access to high end technology at very reasonable price points.
Now it feels like if you're not Facebook, Google, OpenAI, etc. etc. computation isn't for you.
I hope this is just a blip, but I think there is a trend over the past few years.
I also hope it's just a blip, but I don't actually think it is.
The democratization of technology was something that had the power to break down class barriers. Anyone could go get cheap, off the shelf hardware, a book, and write useful software & sell it. It became a way to take back the means of production.
Computing being accessible and affordable for everyone = working class power.
That is why its backsliding. Those in power want the opposite, they want to keep control. So we don't get to have open devices, we get pushed to thin clients & locked boot loaders, and we lose access to hardware as it increasingly only gets sold B2B (or if they do still sell to consumers, they just raise prices until most are priced out).
When the wealthy want something, that something becomes unavailable to everyone else.
> Those in power want the opposite, they want to keep control. So we don't get to have open devices, we get pushed to thin clients & locked boot loaders
While it's undeniable that MAFIAA et al have been heavily lobbying for that crap... the problem is, there are lots of bad actors out there as well.
I 'member the 00s/10s, I made good money cleaning up people's computers after they fell for the wrong porn or warez site. Driver signatures and Secure Boot killed entire classes of malware persistence.
I don't see how it can be a blip if AI actually turns out to be successful. They'll likely gobble up any loose hardware for their datacenters until only scraps are left - or until the AI bubble pops, if AGI isn't achieved in the next few years and stock values fall off a cliff.
I'm not a fan of ultra big tech, but I don't get the concern here exactly.
What high end technology do you want that you can't get?
In the 90s, I paid nearly $10k for a high-end PC. Today, I can get something like an Nvidia RTX Pro 6000 Blackwell for ~$8k, with 24,064 CUDA cores and 96 GB RAM, that's capable of doing LLM inference at thousands of tokens per second.
I realize the prices from this example are a bit steep for many people, but it's not out of line historically with high-end hardware - in fact the $10k from the 90s would be something like $25k today.
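For what it's worth, a one-line sanity check of that comparison; the 2.5x multiplier is my rough CPI-based ballpark for mid-1990s dollars, consistent with the ~$25k figure above:

    # $10k in the mid-1990s expressed in today's dollars, under an assumed
    # cumulative inflation multiplier of ~2.5x.
    then_usd, multiplier = 10_000, 2.5
    print(f"${then_usd:,} then ~= ${then_usd * multiplier:,.0f} today")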
My point is I don't see how "if you're not Facebook, Google, OpenAI, etc. etc. computation isn't for you." I'd love an example if I'm missing something.
Software has been moving in the right direction. Tons of open source projects for every application imaginable. But hardware has gotten more closed. You can't replace batteries in phones, they get pre-loaded with state level spyware, laptops today have about the same hard drive space as 10 years ago to drive cloud usage, and GPUs and now memory seem to be becoming increasingly cost prohibitive for consumers.
It's likely that all the mega cloud and AI companies want regular people forced to go to them for solutions, and they'll buy up any company that might make it possible to avoid that. They will use a small percentage of the trillions being thrown at them to eliminate the companies that allow self-hosting or let mid-tier providers thrive.
A little tinfoil-hat conspiracy, I suppose, but the big companies saw nobodies become incredibly wealthy over the last decade, and this is the now-established companies protecting their position by limiting technology.
Crucial was always a brand that I associated with quality, and I used their memory to upgrade several MacBooks back when it was still possible to upgrade the memory on MacBooks.
That being said, the only SSD I’ve ever had fail on me was from Crucial.
In recent builds I have been using less expensive memory from other companies with varying degrees of brand recognizability, and never had a problem. And the days of being able to easily swap memory modules seem numbered, anyway.
I've long (very, very long) been a storage snob. Originally via the IBM UltraStar drives, and continued with the Intel SSDs. Even with good backups, a drive failure is often a pain in the ass. Slightly less so with RAID.
IBM really locked me in on the Ultrastar back in the mid '90s. Sure, it has proven itself to be a great product. But some of the first ones I bought, one of the drives arrived failed. I called the vendor I bought it from and they said they wouldn't replace it, I'd have to get a refurb from IBM. So I called IBM, when I told them I had just bought it they said I needed to call the place I bought it from because otherwise I'd get a refurb. I explained I had already called them. "Oh, who did you buy it from?" I told them. "Can you hold on a minute?" ... "Hi, I've got [NAME] on the line from [VENDOR] and they'll be happy to send you a replacement."
My most memorable RAM upgrade was adding 512KB to an Atari ST in 1988. Had to suck the solder out of 16x(16+2) factory flow-soldered through-holes, then solder in the 16 individual RAM chips and their decoupling capacitors. I was a teenager and hadn’t soldered before. I had no one to show me how, so I got a book from the library with pictures.
Was a huge relief that the machine came up successfully. But then it would lock up when it got warm, until I found the dodgy joint.
Was a very stressful afternoon, but a confidence builder!
I bet there are many people whose sole experience inside a computer is popping in some DIMMs. I’ll be kinda sad if those days are also gone. On the other hand, super integrated packages like Apple’s M-series make for really well-performing computers.
And before that, I duct-taped the insanely large 16KB RAM extension (up from 1KB) to my Sinclair ZX81 (which I'd also assembled and soldered from a kit) so it wouldn't reset at the slightest movement :)
That's because the SSD business was little more than a carbon copy of most other consumer non-Samsung or SK/Solidigm brands. They've been a Phison controller with some cheap NAND flash under a different coat of paint for generations now - or, in the case of the portable/external ones, that plus a third-party enclosure and IO module that they'd contracted out. In terms of hardware, this sub-business-unit was no more "Micron" than Corsair is (support may be a different story). Enterprise SSDs and consumer ones diverged years ago, and today are about as different from one another as GPUs are from CPUs.
The only real difference between Crucial RAM and Micron's unbuffered RAM was which brand's sticker they put on it, with some binning and QA on the higher-end enthusiast SKUs and a heatsink. This sub-business-unit was almost entirely redundant to Micron.
> And the days of being able to easily swap memory modules seem numbered, anyway.
I keep seeing people say this in threads across platforms discussing this news, and it baffles me. Why?
All the higher-margin non-consumer markets are moving away from socketed RAM toward integrated RAM, for performance and manufacturing-cost reasons. It's hard to see the motivation for spending some of their limited foundry time on products that are only of interest to lower-margin direct consumers if this keeps up.
I also had a Crucial SSD fail. I believe it was either 256GB or 512GB SATA, around 2013-2014. Right around the same time OCZ released a batch of SSDs that were so bad they went out of business, despite being a leader in performance. It was a fairly large story about defective silicon. Good lesson in not being too loyal to brand names.
What a disaster for Micron. Having a consumer facing brand is 'crucial' for brand awareness. Micron is the smallest of the big 3 in DRAM and the only one in America. They're going to be swallowed up and replaced by CXMT.
The brand-aware "consumers" are really just DIY PC builders, which is a relatively small group. The enterprise DRAM business is doing so well that Micron just doesn't see the consumer market as worth chasing.
This is bad for consumers though since DRAM prices are skyrocketing and now we have one less company making consumer DRAM.
The people who occupy B2B RAM-buying jobs are not aliens from another planet. Brand awareness in consumer markets, especially ones so closely tied to people's jobs (nerds gonna nerd), is going to have a knock-on effect. It's not like a clothing brand or something.
Considering how many people don't realize Crucial is a Micron brand, or that Micron components are in a lot of non-crucial consumer brand products, I'd argue it wasn't that crucial.
Especially considering that there's little innovation in the consumer DRAM and SSD spaces vs their enterprise counterparts that Micron can flex their talent in.
Micron had infinite brand awareness in the electronics industry long before they made SSDs. Heck they don't even use their own name for those products. They've been a memory vendor for more than 40 years and they're the only vendor with US domestic memory fabs. Something tells me their future will be just fine. Disclosure: Micron stock holder.
Every low-end IoT box made in China will be 'encouraged' to use CXMT, aided by state subsidies. This will shrink the market for market-price DRAM. When the AI bubble pops, DRAM makers will discover the importance of diversification.
Almost certainly this is because of a windfall for Micron, at least in the short term. Datacenter memory demand is going through the roof, and that was where margins were highest already. It makes no sense to continue to try to milk a consumer brand that can be sold at, what, a 20% markup over generics?
Most likely Micron was planning this forever, and the current market conditions are such that it's time to pull the trigger and retool everything for GPU memory.
Micron is chasing AI glory. Their stock valuation has no room for consumer business, which is a distraction.
You can’t think about companies like it’s 2024. We’re in a gilded age with unlimited corruption… Anything can happen. They can sign a trillion dollar deal with OpenAI, get acquired by NVidia, merge with Intel, get nationalized by Trump, etc.
You’re exactly wrong. In the race to supply AI data centers, there is no “consumer” (in the sense I think you mean) making or influencing a buying decision. Without a clear path to increasing supply, why take $1 when you can have $6 or $7?
That's planned, not a disaster. They've deprioritized brand awareness. Siemens, for instance, doesn't need brand awareness; if they did, they'd pick an English name.
I don't know their breakdown for consumer vs enterprise, but the Crucial brand is consumer focussed. Obviously enterprise at this point is incredibly lucrative.
I am a huge believer in AI, but the build-out right now, justified or not, will definitely hit a slowdown at some point. Not being diverse in their customer base could really hurt them later on. Sometimes you keep something going for tomorrow's business even if it is costing you something today.
I hear the cries of a thousand people in marketing right now. Building a brand takes time. I could see this if they thought they needed to re-invent the brand and, to help with that, were strategically taking a break, but that seems like a stretch.
AI being a useful component of our futures (depending on how much we as a society shun slop) and the current scale of AI investment being a line-go-up bubble are complementary claims, not mutually exclusive ones.
There's a huge difference between "AI" and "tech bros and finance guys getting amazed by an LLM that talks back to them, without realising it's just a language model and not intelligence, and chucking at it the massive piles of cash they had lying around the world to evade taxes, in a pyramid scheme of colossal scale". We are currently heading more and more toward the latter, and when it crashes it will sow so much distrust, and curse the "AI" name so badly, that we'll probably get a decades-long AI winter afterwards. In the end none of this nonsense will help the world get better AI any time soon.
Already, most "AI researchers" outside the big corps have turned over the last 3 years from "people training their models and doing research" into "webdevs plugging into other people's APIs to use LLMs they don't know crap about". When, not if, the big AI bubble bursts, the damage done to the sector will be immense.
The sad thing is that we enthusiasts are a small market compared to the overwhelming majority of computer users who don't mind locked-down devices, or at least until they've been bitten by the restrictions, but if there are no alternatives other than retrocomputing, then it's too late. For decades we enthusiasts have been able to benefit from other markets with overlapping needs such as gaming, workstations, and corporate servers. However, many on-premise servers have been replaced by cloud services, the workstation market has been subsumed by the broader PC market, and PC gaming has faced challenges, from a push toward locked-down consoles to challenges in the GPU market due to competition with cryptocurrency mining and now AI.
One of the things I'm increasingly disappointed with is the dominance of large corporations in computing. It seems harder for small players to survive in this ecosystem. Software has to deal with network effects and large companies owning major platforms, and building your own hardware requires tons of capital.
I wonder if it's possible even for a company to make even 1980s-era electronics without massive capital expenditures? How feasible is it for a small company to manufacture the equivalent of a Motorola 68000 or Intel 386?
I'd like to see a market for hobbyist computing by hobbyist computer shops, but I'm not sure it's economically feasible.
Yes, you'll probably have difficulty walking into a STORE to buy PC components, but only because online shopping has been killing local shops for decades now. You'll find it easy to get that stuff online, for better prices.
PCs, since the very start, have been going through a process of being ever more integrated each generation. Not too many people install sound cards, IDE controllers, etc., anymore. CPUs, GPUs, and RAM are about the only holdouts not integrated on the motherboard these days. It's possible that could change, if CPUs and GPUs becomes fast enough for 99% of people, and RAM gets cheap enough that manufacturers can put more on-board than 99% of people will need. And while you might not be happy about that kind of integration, it comes with big price reductions that help everyone. But we're not there yet, and I can't say how long down the road that might be.
Mouse replacement on a weekend coz old one broke same story (button smashed in and not usable at all any longer). Online not cheaper, no same day available at any price, Amazon delivery without Prime no next day either. Local chain store had it for immediate pickup and I was gaming again in 30 minutes.
People have been tinkering with electronic/electric modules for decades:
* https://en.wikipedia.org/wiki/Heathkit
* https://en.wikipedia.org/wiki/DigiKey
Rather: very commonly the local shops don't stock the parts that I would like to buy, and it is often hard to find out beforehand which kind of very specialized parts the local shop does or doesn't stock.
True story concerning electronic components: I went to some electronic store and wanted to buy a very specialized IC, which they didn't stock. But since the sales clerk could see my passion for tinkering with electronica, he covertly wrote down an address of a different, very small electronics store including instructions which tram line to take to get there (I was rather new to the city), which stocks a lock more stuff that tinkerers love. I guess the sales clerk was as disappointed with the range of goods that his employer has decided to concentrate on as I was. :-)
On the other hand, lots of former stores for PC component now have whole lots of shelf rows with mobile phone cases instead. I get that these have high sales margins, but no thanks ...
Thus, in my opinion it is not online shopping that killed local shops, but the fact that local shops simply don't offer and stock the products that I want to buy.
Increasingly, what we have are mobile terminals - possibly with a dock for monitor connections - for remote big iron. And the continuous push from governments for more control - seemingly synchronous demands for age gating (i.e. requiring IDs) and chat snooping - males me think this remote hardware won't really be yours before long.
Windows, caught up in the LLM mania, is to be left by the wayside too.
Number one, you become a recurring subscription instead of a one and done deal, making it incredibly profitable for industry
And number two, the government can more easily snoop on your data when it's all in the cloud versus a HDD box in your closet.
Granted, I think we're far away from that future, but I do feel that's the future the powers that be desire and they can use various mechanism to force that behavior, like convenience, and pricing, like for example making PC parts too expensive for consumers and subsidizing cloud and mobile devices to accelerate the move, and once enough consumers only know how to use Apple or Google devices they'll be less inclined to spend more money to build a PC and learn what a Linux is.
Give citizens computers and they have encryption. This alone gives them a fighting chance against police, judges, three letter agencies, militaries.
Give citizens computers and they can wipe out entire sectors of the economy via the sheer power of unrestricted copying.
The future is bleak. Computer freedom is dying. Everything then word "hacker" ever stood for is dying. Soon we will no longer have our own systems, we will have no control, we will be mere users of corporation and government systems. Hacking will go extinct, like phreaking.
This fact brings me a profound sadness, like something beautiful is about to perish from this earth. We used to be free...
Deleted Comment
But low end iGPUs don't need a lot of memory bandwidth (again witness Apple's entry level CPUs) and integrating high end GPUs makes you thermally limited. There is a reason that Apple's fastest (integrated) GPUs are slower than Nvidia and AMD's fastest consumer discrete GPUs.
And even if you are going to integrate all the memory, as might be more justifiable if you're using HBM or GDDR, that only makes it easier to not integrate the CPU itself. Because now your socket needs fewer pins since you're not running memory channels through it.
Alternatively, there is some value in doing both. Suppose you have a consumer CPU socket with the usual pair of memory channels through it. Now the entry level CPU uses that for its memory. The midrange CPU has 8GB of HBM on the package and the high end one has 32GB, which it can use as the system's only RAM or as an L4 cache while the memory slots let you add more (less expensive, ordinary) RAM on top of that, all while using the same socket as the entry level CPU.
And let's apply some business logic to this: Who wants soldered RAM? Only the device OEMs, who want to save eleven cents worth of slots and, more importantly, overcharge for RAM and force you to buy a new device when all you want is a RAM upgrade. The consumer and, more than that, the memory manufacturers prefer slots, because they want you to be able to upgrade (i.e. to give them your money). So the only time you get soldered RAM is when either the device manufacturer has you by the short hairs (i.e. Apple if you want a Mac) or the consumers who aren't paying attention and accidentally buy a laptop with soldered RAM when their competitors are offering similar ones for similar prices but with upgradable slots.
So as usual, the thing preventing you from getting screwed is competition and that's what you need to preserve if you don't want to get screwed.
Very feasible but it would have to be redesigned around the cell libraries used in newer nodes since the i386 was manufactured on >1um size nodes.
Prototypes would cost around $1-2k per sq mm at 130nm and $10k per sq mm at 28nm (min order usually around 9 sq mm). Legacy nodes are surprisingly cheap, so non-recurring engineering will generally be the bulk of the cost. The i386 was originally >104 sq mm but at 1um, so you could probably fit the entirety of a i386 clone in 1-2 sq mm of a 130nm chip. Packaging it in the original footprint and the lead times on prototypes would probably be more annoying than anything.
I'm really wondering about hardware as well but today's tech is surprising in it's scale and requirements - I wouldn't be surprised if we could do mid 70's tech as hobbyist today - but further than that...
No one is saying that it's the sole culprit. But when average PCs start costing $3000+ from now on, it seems like the end of an era.
You should look into what’s happening with DIY robotics because it looks eerily similar to what I experienced in the early to mid 90s with PC hardware and software
And you can do way more than just host a bbs with robots
- Hardware
We won't have any hardware without secure boot and we won't have the signing keys. Signed firmware is required for everything x86, everything Apple, everything mobile, probably everything ARM too. Rockchip ARM could still boot without signed firmware last time I checked a few years ago, but I'm not sure about the newer CPUs.
[ Short story: I have an Asus Tinkerboard S. It came with an unprovisioned emmc onboard. Just empty. A few years ago I made the mistake of trying the official OS from Asus. It automatically set up the CPU to boot from emmc first and provisioned the emmc with boot0, boot1 and rpmb. These partitions can't be written and can't be removed. Their creation is a one-way operation according to emmc standards. Now I have to keep the emmc masked because it won't boot otherwise. So beware of any devices with emmc. ]
You can, of course, use MCUs for general computing today. ESP32 is pretty powerful - it's probably 4 times faster than a 486, certainly more powerful than an i386 or a 68000 you suggested. The big problem here is memory, graphics and software. No MMU requires a completely new OS. Linux (uclinux) could boot at some point without a MMU, but it won't fit in 540KB memory. MCUs can access external memory (PSRAM), but via slow buses and it's paged. Also there are no hi-speed buses for graphics.
There is some hope coming from the chinese CPUs. AFAIK, they don't support secure boot at all. I'm planning on getting one as soon as their proprietary firmware/UEFI/ACPI and be replaced by uboot.
- Software
It's useless to make a i386 or 68000 today. There's nothing but old software for them. Not even linux has i386 support anymore. This is an even bigger problem than hardware. Much, much bigger. To have any hope of a minimally useful computing platform, we need a working browser on that platform. This is an absolute requirement. There's no way around this. I had to abandon Win98, then WinXP, and soon Win7 because of no working browser.
Linux is generally usable today, but as soon as Linus retires, it's going to fall into the same user-lockdown like all others. The basic infrastructure is all in place: secure boot, root login basically deprecated, access rights and security settings administered by the distro and package manager, not the user, no per-program firewall rules, automatic updates as standard, surveillance infrastructure via udev (hardware events), dbus (software events) and gtk accessibility (input events), etc. Linus fought hard to keep them outside the kernel, but he won't live forever.
To have any hope of privacy and/or freedom our personal computers we need to turn the security paradigm completely on it's head: users are not the threat - programs are. The user should login as root (or system account in Windows) and every program, including services, should run under their own limited accounts with no access to the rest of the system, especially user's files.
Of all OSs today, Gentoo portage is probably the easiest package manager to tweak into creating accounts and groups for programs instead of users and Gobo Linux has the best separation of programs in it's filesystem. I'd love to see a merger of these two.
Hobbyist computing? Ha! First get a browser working.
Deleted Comment
I'm actually a big fan of Apple hardware (when you crunch the numbers for base spec machine and when you're able to get discounts, the price/performance for the half-life you get is incredible), but I'm also planning to get back into home-brew builds a bit more over the next year: I need to build a NAS, a home lab, I might look at a gaming rig... and I'm far from alone.
So yes, it's a niche market, but a profitable one for a lot of players, and one that Micron will be glad is still available to them when the data centre bubble bursts.
Jump in the DeLorean and head to the 1980s / 1990s
Micron is exiting this business because it’s a commodity. It’s everywhere. There are numerous companies producing parts in this space. Their investments are better spent on other things.
> I wonder if it's possible even for a company to make even 1980s-era electronics without massive capital expenditures? How feasible is it for a small company to manufacture the equivalent of a Motorola 68000 or Intel 386?
I don’t know what your threshold is for massive capital expenditure, but you could get a tapeout for a 68000 clone easier than at any point in history. There are now even takeout services that will let you get a little piece of a shared wafer for experimenting with making your own chips for very low prices. There are even accessible tools available now.
The hobby and small scale situation is better now than at any point in history. I don’t know how anyone can be sad about it unless they’re ignoring reality and only speculating based on the most cynical online takes about how the future is going to do a complete 180.
Numerous as in plenty? Or basically three? Samsung, SK Hynix and Micron make up over 90% of the market share of DRAM. Micron saying goodbye to the consumer market basically leaves us with yet another duopoly.
Here is a video of Sam Aloof who now runs Atomic Semi with Jim Keller. It likely took thousands of dollars to make his own custom Z2 chip that only has 1200 transistors and its nowhere near the likes of the 68k or Intel 386. They might have more advanced stuff now at Atomic Semi but they haven't announced anything
Dead Comment
This is a big loss. Crucial offered a supply chain direct from Micron. Most other consumer DRAM sources pass through middlemen, where fake parts and re-labeled rejects can be inserted.
From what I understand, OpenAI just bought out a significant proportion of the capacity of Samsung and Hynix, which is the big reason prices just spiked. They're two of the three DRAM manufacturers, Micron being the third.
That gives us a good idea as to what Micron is doing here: They have contracts with their other customers to supply DRAM at whatever price they previously negotiated, but now prices are higher, and they don't have to honor supply contracts with Crucial because they own it. So they're taking all the supply that would have gone to Crucial and selling it to the high bidder instead.
Spinning off the brand in that context doesn't work, because then "Crucial" would need to come with supply contracts or they'd have no access to supply and have nothing to sell. Moreover, the supply constraint isn't likely to be permanent, and whenever prices come back down then Micron would still own the brand and could start selling under it again, which they couldn't if they sold it.
Why not just announce limited supply, then, instead of exiting?
This seems like a "automaker invests more in financing arm, because it's the most profitable" concentration mistake, towards an industry with wide concerns over intermediate term financial sustainability.
This is like developers shifting from building homes targeted at homeowners to building build-to-rent neighborhoods for BlackRock and company xD
It shouldn't be possible for one holding company (OpenAI) to silently buy all available memory wafer capacity from Samsung and SK Hynix, before the rest of civilization even has the opportunity to make a counteroffer.
You mean a centrally planned, command-and-control economy? There is a lot of history of countries trying these things, and they don’t have the outcome you want.
DRAM manufacturing is a global business. If one country starts imposing purchase limits for whatever reason, the DRAM consumers are going to laugh as they move their data centers and operations to another country that doesn’t try to impose their laws on a global market.
> It shouldn't be possible for one holding company (OpenAI) to silently buy all available memory wafer capacity from Samsung and SK Hynix, before the rest of civilization even has the opportunity to make a counteroffer.
Good news: That’s not how markets work. DRAM manufacturers don’t list a price and then let OpenAI buy it all up. Contracts are negotiated. Market prices fluctuate.
No supplier of anything is going to let all of their inventory disappear to one buyer without letting the market have a chance to bid the price higher.
This isn’t antitrust because the companies aren’t reselling it to you at a much higher price after cornering the market (cough cough Ticketmaster & scalpers).
It’s these pesky PC things that people do bad things like piracy with /s
Large DIMM vendors are definitely not buying through middlemen.
Any vendor consuming a lot of RAM chips over a threshold will be negotiating contracts with RAM chip manufacturers. It’s not that hard even at medium scale.
https://www.klevv.com/ken/main
And don't forget about https://www.nanya.com/en/
While I never had a problem with https://semiconductor.samsung.com/dram/module/, I think they will be rarer/more expensive now, or 'soonish'.
For Chinese CXMT and YMTC there is https://www.biwintech.com/
We live in interesting times!
(Cackling madly...)
Just looked at standard desktop memory: still no 64GB 5600MT/s modules, and CUDIMMs are missing a 32GB option.
> And don't forget about Nanya
BTW, what is the status of Elpida now?
I really wouldn't want to buy from any new NAND vendor until years after they've built a reputation. It's too scary to get a decent bargain SSD that secretly dies really early, or doesn't have anywhere near the endurance it claims.
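For what it's worth, the endurance math is easy to sanity-check yourself; here's a minimal sketch, where the TBW and workload figures are assumptions you'd swap for your own drive's spec sheet:

    # Rough SSD lifetime estimate from a claimed endurance rating.
    # All numbers are assumptions; plug in your own drive and workload.
    claimed_tbw = 600           # vendor-claimed endurance, terabytes written
    daily_writes_gb = 50        # assumed host writes per day, in GB
    write_amplification = 2.0   # assumed flash-level amplification factor

    nand_tb_per_year = daily_writes_gb * write_amplification * 365 / 1000
    years_to_limit = claimed_tbw / nand_tb_per_year
    print(f"~{years_to_limit:.1f} years to hit the rated {claimed_tbw} TBW")
    # => ~16.4 years -- which is exactly why a drive shipping with a fraction
    #    of its claimed endurance can take years to expose the lie.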
This is a mistake. The consumer business is a bet, and one they excel at. Yes, it's not printing money right now, but it is an option. Exiting the consumer business means they may miss insights into the next hot consumer trend.
The game for large companies like this should be to keep small bets going and, literally, just survive. That's what Micron was doing, and that's what NVIDIA did for the better part of a decade. Now both are printing money.
Yet Micron has decided it's retiring from placing more bets. Mistake.
Crucial is primarily a marketing and support company; they didn't really make anything. There was a small engineering team that did DIMM/module binning, but the heatsinks glued onto Micron DIMMs were mostly contracted out. On the SSD side of things, they used Phison controllers with Micron flash, just like pretty much any other consumer SSD that isn't Samsung or SK/Solidigm.
Corsair, G.Skill, GeIL, etc. don't buy components from Crucial; they get them from Micron. Crucial closing its doors has no bearing on that as far as we can tell.
They probably considered dumping it on some Private Equity firm or something, but likely decided to keep the IP in case they decide to resurrect it in the future should the market landscape change.
It sucks that they're winding down Crucial, but it makes sense. Frankly I'm surprised they didn't pull the trigger sooner, and by sooner I mean years sooner.
Now it feels like if you're not Facebook, Google, OpenAI, etc. etc. computation isn't for you.
I hope this is just a blip, but I think there is a trend over the past few years.
The democratization of technology was something that had the power to break down class barriers. Anyone could go get cheap, off the shelf hardware, a book, and write useful software & sell it. It became a way to take back the means of production.
Computing being accessible and affordable for everyone = working class power.
That is why it's backsliding. Those in power want the opposite; they want to keep control. So we don't get open devices, we get pushed to thin clients and locked bootloaders, and we lose access to hardware as it increasingly gets sold only B2B (or, if they do still sell to consumers, they just raise prices until most people are priced out).
When the wealthy want something, that something becomes unavailable to everyone else.
While it's undeniable that MAFIAA et al have been heavily lobbying for that crap... the problem is, there are lots of bad actors out there as well.
I 'member the '00s/'10s: I made good money cleaning up people's computers after they fell for the wrong porn or warez site. Driver signatures and Secure Boot killed entire classes of malware persistence.
In a naive way, when rich entities are interested in a limited resource it's basically over.
Somehow I can see a parallel with the housing crisis, where prices go higher and higher.
I can't see either of them ending anytime soon unless there is a major paradigm shift in our lives.
What high end technology do you want that you can't get?
In the 90s, I paid nearly $10k for a high-end PC. Today, I can get something like an Nvidia RTX Pro 6000 Blackwell for ~$8k, with 24,064 CUDA cores and 96 GB RAM, that's capable of doing LLM inference at thousands of tokens per second.
I realize the prices from this example are a bit steep for many people, but it's not out of line historically with high-end hardware - in fact the $10k from the 90s would be something like $25k today.
My point is I don't see how "if you're not Facebook, Google, OpenAI, etc. etc. computation isn't for you." I'd love an example if I'm missing something.
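For the curious, the inflation adjustment above is a one-liner; the CPI levels here are rough assumptions, not official figures:

    # Rough inflation adjustment of a 1990 PC price to today's dollars.
    # CPI index values are approximate assumptions, not official BLS numbers.
    price_1990 = 10_000.0
    cpi_1990 = 131.0    # assumed US CPI-U level, 1990
    cpi_2025 = 320.0    # assumed US CPI-U level, 2025

    price_today = price_1990 * (cpi_2025 / cpi_1990)
    print(f"${price_1990:,.0f} in 1990 is roughly ${price_today:,.0f} today")
    # => roughly $24,427, in the same ballpark as the $25k figure above.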
A little tinfoil-hat conspiracy, I suppose, but the big companies saw nobodies become incredibly wealthy over the last decade, and this is the now-established companies protecting their position by limiting access to technology.
That being said, the only SSD I’ve ever had fail on me was from Crucial.
In recent builds I have been using less expensive memory from other companies with varying degrees of brand recognizability, and never had a problem. And the days of being able to easily swap memory modules seem numbered, anyway.
IBM really locked me in on the Ultrastar back in the mid '90s. Sure, it has proven itself to be a great product. But some of the first ones I bought, one of the drives arrived failed. I called the vendor I bought it from and they said they wouldn't replace it, I'd have to get a refurb from IBM. So I called IBM, when I told them I had just bought it they said I needed to call the place I bought it from because otherwise I'd get a refurb. I explained I had already called them. "Oh, who did you buy it from?" I told them. "Can you hold on a minute?" ... "Hi, I've got [NAME] on the line from [VENDOR] and they'll be happy to send you a replacement."
It was a huge relief that the machine came up successfully. But then it would lock up when it got warm, until I found the dodgy joint.
Was a very stressful afternoon, but a confidence builder!
I bet there are many people whose sole experience inside a computer is popping in some DIMMs. I’ll be kinda sad if those days are also gone. On the other hand, super integrated packages like Apple’s M-series make for really well-performing computers.
And before that, I duct-taped the insanely large 16KB RAM expansion (up from 1KB) onto my Sinclair ZX81 so it wouldn't reset with the slightest movement; I'd also assembled and soldered the machine itself from a kit :)
The only real difference between Crucial RAM and Micron's unbuffered RAM was which brand's sticker they put on it, plus some binning and QA and a heatsink on the higher-end enthusiast SKUs. This sub-business-unit was almost entirely redundant to Micron.
> And the days of being able to easily swap memory modules seem numbered, anyway.
I keep seeing people say this in threads across platforms discussing this news, and it baffles me. Why?
This is bad for consumers though since DRAM prices are skyrocketing and now we have one less company making consumer DRAM.
Especially considering that there's little innovation in the consumer DRAM and SSD spaces vs their enterprise counterparts that Micron can flex their talent in.
Almost certainly this is because of a windfall for Micron, at least in the short term. Datacenter memory demand is going through the roof, and that was where margins were highest already. It makes no sense to continue to try to milk a consumer brand that can be sold at, what, a 20% markup over generics?
Most likely Micron was planning this forever, and the current market conditions are such that it's time to pull the trigger and retool everything for GPU memory.
You can’t think about companies like it’s 2024. We’re in a gilded age with unlimited corruption… Anything can happen. They can sign a trillion dollar deal with OpenAI, get acquired by NVidia, merge with Intel, get nationalized by Trump, etc.
Sounds to me like they are using the tried-and-true method of selling equipment to the people rushing for gold.
Their 'smaller' market, SSDs, has an estimated 13% of global NAND revenue.
https://counterpointresearch.com/en/insights/global-dram-and...
https://counterpointresearch.com/en/insights/global-nand-mem...
I don't know their breakdown for consumer vs enterprise, but the Crucial brand is consumer focussed. Obviously enterprise at this point is incredibly lucrative.
We're gonna need a bigger pin.
Consumers are so annoying. And by consumers, I mean "anyone can get an API key for the latest model."
We cut down our trees to build more AI datacenter sculptures to please the AI gods.
Or the nuclear craze of the '50s, when radioactive material was stuffed into everything: toothpaste, cream, etc.
It really is just dotcom all over again.
Already most "AI researchers" outside of the big corps have basically turned in the last 3 years from "people training their models and doing research" to "webdev plugging into other people's APIs to use LLMs they don't know crap about". When, not if, the big AI bubble bursts, the damage done to the sector will be immense
Diversification is resilience.
Putting consumer on hold makes some sense. An exit? This will be written about in business books.