orev · a year ago
I’m glad they explained why RAM has become soldered to the board recently. It’s easy to be cynical and assume they were doing it purely for profit (which might be a nice side effect), but it’s good to know that there’s also a technical reason to solder it. Even better to know that it’s been recognized and a solution is being worked on.
OJFord · a year ago
I didn't find that a particularly complete explanation - and the slot can't be closer to the CPU because? - I think it must be more about parasitic properties of the card edge connector on DIMMs being problematic at lower voltage (and higher frequencies) or something. Note the solution is a ball grid connection and the whole thing's shielded.

I suppose, in fairness to the explanation it does give, the other thing that footprint allows is a shorter path for the pins that would otherwise be near the ends of the daughter board (e.g. on a DIMM), since they can all go roughly straight across (on multiple layers) instead of taking a longer diagonal according to how far off centre they are. But even if that's it, that's what I mean by it seeming incomplete. :)

Tuna-Fish · a year ago
> and the slot can't be closer to the CPU because?

All the traces going into the slot need to be length-matched to obscene precision, and the physical width of the slot and the room required by the "wiggles" made in the middle traces to length-match them restrict how close you can put the slot. Most modern boards are designed to place it as close as possible.

LPCAMM2 fixes this by having a lot of the length-matching done in the connector.
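
To put rough numbers on it (back-of-the-envelope only: the per-inch delay and the skew budget below are assumed values, not from any datasheet):

    # Rough trace-skew budget for a DDR5-class bus (illustrative numbers only)
    DATA_RATE_MT_S = 6400                       # assumed transfer rate
    UNIT_INTERVAL_PS = 1e6 / DATA_RATE_MT_S     # one bit time, ~156 ps

    # Assume only a small slice of the unit interval is available for routing skew;
    # the rest is eaten by jitter, setup/hold margins, crosstalk, etc.
    SKEW_BUDGET_PS = 0.05 * UNIT_INTERVAL_PS    # ~8 ps, an assumption

    # FR-4 propagation delay is on the order of 150-180 ps per inch.
    DELAY_PS_PER_MM = 170 / 25.4                # ~6.7 ps/mm, assumed

    allowed_mismatch_mm = SKEW_BUDGET_PS / DELAY_PS_PER_MM
    print(f"unit interval: {UNIT_INTERVAL_PS:.0f} ps")
    print(f"allowed length mismatch: ~{allowed_mismatch_mm:.2f} mm")

Which lands in the region of a millimetre of allowed mismatch per signal, hence all the wiggles.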

throwaway48476 · a year ago
Competes with space for VRMs.
smolder · a year ago
Yeah, the furthest RAM chip on a DIMM can only get so close to the CPU given the form factor, and the other traces need to match that length. Distance is critical, and edge connectors sure don't help.
klysm · a year ago
I didn’t really appreciate the insanity of the electrical engineering involved in high frequency stuff till I tried to design some PCBs. A simplistic mental model of wires and interconnects rapidly falls apart as frequencies increase
drivingmenuts · a year ago
The problem is getting manufacturers to implement the new RAM standard. While the justifications given are great for the consumer, I didn't see any reason for a manufacturer to sign on.

They are going to lose money when people buy new RAM, rather than a whole new laptop. While processor speeds and size haven't plateaued yet, it's going to take a while to develop significant new speed upgrades and in the meantime, the only other upgrade is disk size/long-term storage, which, aside from Apple, they don't totally control.

So, why should they relinquish that to the user?

cesarb · a year ago
> While the justifications given are great for the consumer, I didn't see any reason for a manufacturer to sign on. [...] So, why should they relinquish that to the user?

It makes sense that the first ones to use this new standard would be Dell and Lenovo. They both have "business" lines of computers, which usually offer on-site repairs (they send the parts and a technician to your office) for a somewhat long time (often 3 or 5 years). To them, it's a cost advantage to make these computers easier to repair. Having the memory (a part that fails not infrequently) in a separate module means they don't have to replace and refurbish the whole logic board, and having it easy to remove and replace means less time spent by the on-site technician (replacing the main logic board or the chassis often means dismantling nearly everything before it can be removed).

AnthonyMouse · a year ago
> They are going to lose money when people buy new RAM, rather than a whole new laptop.

You're thinking about this the wrong way around.

Suppose the user has $800 to buy a new laptop. That's enough to get one with a faster processor than they have right now or more memory, but not both. If they buy one and it's not upgradable, that's not worth it. Wait another year, save up another $200, then buy the one that has both.

Whereas if it can be upgraded, you buy the new one with the faster CPU right away and upgrade the memory in a year. Manufacturer gets your money now instead of later, meanwhile the manufacturer who didn't offer this not only doesn't sell to you in a year, they just lost your business to the competition.

makeitdouble · a year ago
I'd see two angles:

- the manufacturers themselves benefit from easier-to-repair machines. If Dell can replace the RAM and send back the laptop in a matter of minutes instead of replacing the whole motherboard and having it salvaged somewhere else, it's a clear win.

- prosumers will be willing to invest more in a laptop that has a better chance of surviving a few years. Right now we all expect parts to fail within 2 to 3 years on the higher end, and budget accordingly. You need a serious reason to buy a 3000$/€ laptop that might be dead in 2 years. Knowing it could weather a RAM failure without a manufacturer repair is a plus.

bugfix · a year ago
Even if it's just Lenovo using these new modules, I still think it's a win for the consumer (if the modules aren't crazy expensive).
rock_artist · a year ago
Unlike Apple, which is only in indirect competition on computer hardware, PC makers compete head to head, so if Lenovo starts doing it, it becomes a marketing point. Then Asus, HP, and Dell would try to get it too.

So it's a chicken-and-egg situation: if it turns out to matter to consumers, the others might end up playing catch-up.

7speter · a year ago
These companies did plenty well 12+ years ago when users could upgrade their systems' memory.
kjkjadksj · a year ago
They can have their technical fig leaf to hide behind, but in practice, how many watts are we really saving between LPDDR5 and DDR5? Is it worth the e-waste tradeoff of a laptop we can't modularly upgrade to meet our needs? I would guess not.
masklinn · a year ago
> how many watts are we really saving between LPDDR5 and DDR5?

From what I gathered, it's around a watt per module when idling (which is when it's most critical): the sources I found seem to indicate that DDR5 always runs at 1.1V (or more, but probably not in laptops), while LPDDR5 can be downvolted. That's roughly an extra 10% of idle power consumption per module.
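
To put that in battery-life terms (simple arithmetic; the 60 Wh battery and 10 W whole-system idle draw are assumptions picked to match the "10% of idle power" figure above):

    # Battery-life impact of ~1 W of extra idle draw (illustrative figures only)
    BATTERY_WH = 60.0       # assumed battery capacity
    IDLE_POWER_W = 10.0     # assumed whole-system idle draw with LPDDR5
    EXTRA_RAM_W = 1.0       # the ~1 W penalty discussed above

    lpddr_hours = BATTERY_WH / IDLE_POWER_W
    ddr_hours = BATTERY_WH / (IDLE_POWER_W + EXTRA_RAM_W)
    print(f"idle runtime, LPDDR5: {lpddr_hours:.1f} h")
    print(f"idle runtime, DDR5:   {ddr_hours:.1f} h ({1 - ddr_hours / lpddr_hours:.0%} shorter)")

Roughly half an hour of idle runtime lost in this scenario; whether that justifies soldering is the real question.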

yread · a year ago
If they soldered a decent amount, so you could be sure you'd never need to upgrade, it would be fine (seriously, 64GB of RAM costs like 100eur, a non-issue in a 1000eur laptop). 8GB is not enough already and 16GB will soon be limiting too.
brookst · a year ago
Is the goal to not have any computers that are limited to a single task? Tons of corporate IT purchases go to someone only using e.g. Word all day. Do we really care if they are provisioned with “enough” memory for you or me?
nuancebydefault · a year ago
10 percent is not negligible. Also, 64GB is a lot _today_ but most probably not 5 years from now. The alternative of buying a new laptop feels like a big waste.
orev · a year ago
No matter how much the specs increase, developers find a way to use it all up. This approach would just accelerate that process.
tombert · a year ago
Yeah, I was actually surprised to learn there was a reason other than "Apple wants you to buy a new Macbook or overspec your current one". It's annoying, but at least there's a plausible reason to why they do it.
seanp2k2 · a year ago
"...and they charge 4x what the retail of premium RAM would otherwise be per GB"

do storage next.

klausa · a year ago
Apple's RAM is not soldered to the _motherboard_, it's part of the SoC package.
mmastrac · a year ago
Ugh, finally. And it's not just a repurposed desktop memory standard either! The overall space requirements look to be similar to the BGA that you'd normally solder on (perhaps 2-3x as thick?). I'm sure they can reduce that overhead going forward.

I love the disclosure at the bottom:

Full Disclosure: iFixit has prior business relationships with both Micron and Lenovo, and we are hopelessly biased in favor of repairable products.

Aurornis · a year ago
> Ugh, finally.

FYI, the '2' at the end is because this isn't the first time this has been done. :)

LPCAMM spec has been out for a while. LPCAMM2 is the spec for next-generation parts.

Don't expect either to become mainstream. It's more expensive and space-consuming to build an LPCAMM motherboard than to drop the RAM chips directly onto the board.

nrp · a year ago
My recollection of this is that LPCAMM was a proposal from Dell that they put into the JEDEC standardization process, and LPCAMM2 is the resulting standard, named that way to avoid confusion with the non-standard LPCAMM that Dell trialed on a small number of commercial systems.
audunw · a year ago
Not to mention putting the RAM directly on a System-in-Package chip like Apple does now. That's going to be unbeatable in terms of space and possibly have an edge when it comes to power consumption too. I wouldn't be surprised if future standards will require on-package RAM.

I kind of wish we could establish a new level in the memory hierarchy. Like, just make a slot where you can add slower more power hungry DDR RAM that acts as a big cache for the NVM storage, or that the OS can offload some of the stuff in main memory if it's not used much. It could be unpopulated in base models, and then you can buy an upgrade to stick in there to get some extra performance later if needed.
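
Purely as a toy illustration of that kind of tiering (a hypothetical sketch, nothing to do with how an OS actually manages memory): a small fast tier that spills its least-recently-used entries into a bigger, slower tier and faults them back in on access.

    from collections import OrderedDict

    class TwoTierStore:
        """Toy model: small fast tier (on-package RAM) backed by a big slow tier (expansion slot)."""

        def __init__(self, fast_capacity: int):
            self.fast = OrderedDict()  # small, fast tier
            self.slow = {}             # large, slower tier
            self.fast_capacity = fast_capacity

        def put(self, key, value):
            self.fast[key] = value
            self.fast.move_to_end(key)
            if len(self.fast) > self.fast_capacity:
                lru_key, lru_val = self.fast.popitem(last=False)  # evict least recently used
                self.slow[lru_key] = lru_val                      # demote to the slow tier

        def get(self, key):
            if key in self.fast:
                self.fast.move_to_end(key)
                return self.fast[key]
            value = self.slow.pop(key)  # "fault" it back from the slow tier (KeyError if absent)
            self.put(key, value)        # promote into the fast tier on access
            return value

With nothing in the slot, everything has to fit in the fast tier; populating the slot is what gives the OS somewhere cheap to spill to.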

cjk2 · a year ago
Yeah, they even gloss over Lenovo's crappy soldered-on USB-C connectors, which are always the weak point on modern ThinkPads. Well, that and Digital River (Lenovo's distributor) carries almost no spare parts at all for any Lenovos in Europe, and if they do list any, they only rarely turn up, so you can't replace any of the replaceable bits because you can't get any.
sspiff · a year ago
Digital River is shit at everything. From spare parts, to delivery and tracking, to customer communications, to warranty claims. Every single interaction with them is a nightmare. It is the single reason I prefer to buy Lenovo from resellers rather than directly.
baby_souffle · a year ago
This is fantastic news. Hopefully the cost to manufacturers is only marginal and they find a suitable replacement for their current "each tier in RAM comes with a 5-20% price bump" pricing scheme.

Too bad Apple is almost guaranteed not to adopt the standard. I miss being able to upgrade the RAM in MacBooks.

Aurornis · a year ago
> Too bad apple is almost guaranteed to not adopt the standard.

Apple would require multiple LPCAMM2 modules to provide the bus width necessary for their chips. Up to 4 x LPCAMM2 modules depending on the processor.

Each LPCAMM2 module is almost as big as the entire Apple CPU package combined with its unified RAM chips, so putting 2-4 LPCAMM2 modules on the board is completely infeasible without significantly increasing the size of the laptop.

Remember, the Apple architecture is a combined CPU/GPU architecture and has memory bandwidth to match. It's closer to your GPU than to the CPU in your non-Mac machine. Asking for upgradeable RAM on Apple laptops is almost like asking for upgradeable RAM on your GPU (which would not be cheap or easy).

For every 1 person who thinks they'd want a bigger MacBook Pro if it enabled memory upgrades, there are many, many more people who would gladly take the smaller size of the integrated solution we have today.

coolspot · a year ago
> like asking for upgradeable RAM on your GPU

Can I please have upgradeable RAM on GPU? Pwetty pwease?

kokada · a year ago
> Up to 4 x LPCAMM2 modules depending on the processor.

The non-Pro/Max versions (e.g. the M3) use a 128-bit bus, and are arguably the kind of notebook that most needs to be upgraded later, since they commonly come with only 8GB of RAM.

Even the Pro versions (e.g. the M3 Pro) use up to 256 bits; that would be 2 x LPCAMM2 modules, which seems plausible.

For the M3 Max in the MacBook Pro, yes, 4 x LPCAMM2 would (probably) be impossible. But something like the Mac Studio could have them; that is arguably also the kind of device where you'd want to increase memory in the future.
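
Quick arithmetic on module counts, taking a 128-bit bus per LPCAMM2 module (as noted elsewhere in the thread) and the per-chip widths quoted here at face value, not verified against Apple's specs:

    import math

    LPCAMM2_WIDTH_BITS = 128  # per-module width discussed in this thread

    # Bus widths as quoted above (treated as given)
    bus_widths = {"base (M3)": 128, "Pro": 256, "Max": 512}

    for name, width in bus_widths.items():
        modules = math.ceil(width / LPCAMM2_WIDTH_BITS)
        print(f"{name:>10}: {width}-bit bus -> {modules} x LPCAMM2")

So one module for a base chip, two for a Pro, four for a Max, which is where the packaging problem comes from.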

sliken · a year ago
Apple ships 128 bit, 256 bit, and 512 bit wide memory interfaces on laptops (up to 1024 bit wide on desktops).

Is it feasible to fit memory bandwidth like the M3 Max (512 bits wide LPDDR5-6400) with LPCAMM2 in a thin/light laptop?

pja · a year ago
This PDF[1] suggests that an LPCAMM2 module has a 128 bit wide memory interface, so the epic memory bandwidth of the M3 max won’t be achievable with one of these memory modules. High end devices could potentially have two or more of them arranged around the CPU though?

[1] https://investors.micron.com/node/47186/pdf
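
Putting peak-bandwidth numbers on that (simple arithmetic, using the 7500 MT/s LPCAMM2 figure and the 512-bit LPDDR5-6400 M3 Max interface quoted in this thread):

    def peak_bandwidth_gb_s(bus_width_bits: int, transfer_rate_mt_s: float) -> float:
        # bytes per transfer times transfers per second, in GB/s
        return bus_width_bits / 8 * transfer_rate_mt_s / 1000

    one_module = peak_bandwidth_gb_s(128, 7500)  # one LPCAMM2 at LPDDR5X-7500
    m3_max = peak_bandwidth_gb_s(512, 6400)      # 512-bit LPDDR5-6400, as quoted above

    print(f"one LPCAMM2 module: ~{one_module:.0f} GB/s")     # ~120 GB/s
    print(f"M3 Max interface:   ~{m3_max:.0f} GB/s")         # ~410 GB/s
    print(f"modules to match:   {m3_max / one_module:.1f}")  # ~3.4, i.e. four modules

So two modules comfortably cover the non-Max parts, but a Max-class interface really does need four of them.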

wmf · a year ago
For 512 bits you would need four LPCAMM2s. I could imagine putting two on opposite sides of the SoC but four might require a huge motherboard.
AnthonyMouse · a year ago
Apple does this because their CPU and GPU use the same memory, and it's generally the GPU that benefits from more memory bandwidth. Whereas in a PC optimized for GPU work you'd have a discrete GPU that has its own memory which is even faster than that.
jauntywundrkind · a year ago
Hoping we see AMD Strix Halo with its 256-bit interface crammed into an aggressively cooled, fairly thin, fairly light laptop. But it's going to require heavy cooling to make full use of it.

Heck, make it only run full tilt when on an active cooling dock. Let it run at half power when unassisted.

j16sdiz · a year ago
Unified memory is basically L3 cache speed with zero copy between CPU and GPU.

There are engineering differences. Depending on who you ask, it may or may not be worth it.

enragedcacti · a year ago
Assuming you mean latency, Apple's unified memory isn't lower latency than other soldered or socketed solutions, e.g. the M1 Max with 111ns latency on a cache miss vs the 13900K with 93ns. Certainly not L3-level latency. Zero copy between CPU/GPU is great, but not unique to unified memory or soldered RAM.

As far as bandwidth goes, you would only need one or two LPCAMM2 modules to match or exceed the bandwidth of non-Max M series chips. Accommodating Max chips in a MacBook with LPCAMM2 would definitely be a difficult packaging problem.

https://www.anandtech.com/show/17024/apple-m1-max-performanc...

https://www.anandtech.com/show/17047/the-intel-12th-gen-core...

redeeman · a year ago
and they won't, so long as people buy regardless
cjk2 · a year ago
Given enough pressure ...
armarr · a year ago
You mean pressure from regulators, surely. Because 99% of consumers will not notice or know the difference in a spec sheet.
colinng · a year ago
They will maliciously comply. They might even have 4 sockets for the 512-bit wide systems. But then they’ll keep the SSD devices soldered - just like they’ve done for a long time. Or cover them with epoxy, or rig it with explosives. That’ll show you for trying to upgrade! How dare you ruin the beautiful fat profit margin that our MBAs worked so hard to design in?!?
zxcvgm · a year ago
I remember when Dell was the first to introduce [1] these Compression Attached Memory Modules in their laptops in an attempt to move away from soldered-on RAM. Glad this is now being more widely adopted and standardized.

[1] https://www.pcworld.com/article/693366/dell-defends-its-cont...

AlexDragusin · a year ago
> The first iteration, known as CAMM, was an in-house project at Dell, with the first DDR5-equipped CAMM modules installed in Dell Precision 7000 series laptops. And thankfully, after doing the initial R&D to make the tech a reality, Dell didn’t gatekeep. Their engineers believed that the project had such a good chance at becoming the next widespread memory standard that instead of keeping it proprietary, they went the other way and opened it up for standardization.
jimbobthrowawy · a year ago
Trying to make it a standard is one of the least surprising things about it. You want accessories/components in your product to be as commodity as possible to drive costs down.
doublextremevil · a year ago
Can't wait to see this in a Framework laptop
OJFord · a year ago
For the presumed improvement to battery life? Because Fw already uses SO-DIMMs.
universa1 · a year ago
That's also nice, but the memory speed is also higher, DDR5-7266 vs 5600 IIRC. The resulting higher bandwidth translates more or less directly into more performance for the iGPU.
wmf · a year ago
It's also faster (7500 vs. 5600).
userbinator · a year ago
A bit of a disingenuous argument intended to sell this as being more revolutionary than it really is --- BGA sockets already exist for LPDDR as well as other things like CPUs/SoCs, but they're very expensive due to low volumes. If the volume went up, they'd go down in price significantly just like LGA sockets for CPUs have.

https://www.ironwoodelectronics.com/products/lpddr/

zokier · a year ago
I wonder if this will bring a new widely available high-performance connector to the wider market. SO-DIMM connectors have been occasionally repurposed for other uses, most notably by the Raspberry Pi Compute Modules 1-3, among other similar SOM/COM boards. The RPi CM4 switched to 2x 100-pin mezzanine connectors; maybe some future module could use CAMM connectors, I'd imagine they are capable enough.
wmf · a year ago
The compression connector looks flimsier than a mezzanine so it should probably be a last resort for multi-gigahertz single-ended signaling.
kristianp · a year ago
So this is going into the ThinkPad P1 (Gen 7), which is too expensive and power hungry for my use cases. How long until it filters down into less expensive SKUs? Are we talking next year's generation?

Ifixit also links to a repair guide:

https://www.ifixit.com/Device/Lenovo_ThinkPad_P1_Gen_7

CoolCold · a year ago
My personal understanding: for ThinkPads, it's next year. I guess Lenovo is running real-life tests with the P1 here, gathering feedback before addressing other families like the T14/T14s.