At my former job at a FAANG, I did the math on allocating developers machines with 16GB vs 64GB of RAM, based on actual job tasks, with estimates of how much thumb-twiddling wait time the extra memory would save, multiplied out by the cost of the developer's time. The cost-benefit showed a reasonable ROI that was realized in weeks for senior dev salaries (months for juniors).
Based on this, I strongly believe that if you're providing hardware for software engineers, it rarely if ever makes sense to buy anything but the top spec Macbook Pro available, and to upgrade every 2-3 years. I can't comment on non desktop / non-mac scenarios or other job families. YMMV.
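To make the shape of that math concrete, here is a rough back-of-the-envelope sketch in Python; every number in it (hardware delta, wait time, loaded hourly cost) is an assumed placeholder, not a figure from the analysis above:

    # Back-of-the-envelope ROI for a RAM upgrade. Every number below is an
    # assumed placeholder; plug in your own measurements.

    hardware_delta = 800.0        # extra cost of 64GB over 16GB, USD
    minutes_saved_per_day = 15.0  # estimated wait time eliminated per day
    loaded_hourly_cost = 120.0    # fully loaded cost of a senior dev, USD/hour
    working_days_per_year = 230

    daily_saving = (minutes_saved_per_day / 60.0) * loaded_hourly_cost
    payback_days = hardware_delta / daily_saving
    annual_saving = daily_saving * working_days_per_year

    print(f"Saves ~${daily_saving:.0f}/day, pays back in ~{payback_days:.0f} working days")
    print(f"~${annual_saving:,.0f}/year vs a one-off ${hardware_delta:,.0f} upgrade")

With those made-up numbers the upgrade pays for itself in under a month and a half of working days, which is the general shape of the result described above.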
No doubt the math checks out, but I wonder if developer productivity can be quantified that easily. I believe there's a lot of research pointing to people having a somewhat fixed amount of cognitive capacity available per day, and that aligns well with my personal experience. A lot of times, waiting for the computer to finish feels like a micro-break that saves up energy for my next deep thought process.
Your brain tends to do better if you can stay focused on your task for consecutive, though not indefinite, periods of time. This varies from person to person, and depends on how long a build/run/test takes. But the challenge for many is that the 'break' often becomes a context switch, a potential loss of momentum, and, worse, may open me up to a distraction rather than a productive use of my time.
For me, personally, a better break is one I define on my calendar and helps me defragment my brain for a short period of time before re-engaging.
I recommend investigating the concept of 'deep work' and drawing your own conclusions.
>"A lot of times, waiting for the computer to finish feels like a micro-break that saves up energy for my next deep thought process."
As an ISV I buy my own hardware, so I do care about expenses. I can attest that, to me, waiting for the computer to finish feels like a big irritant that can spoil my programming flow. I take my breaks whenever I feel like it and do not need a computer to prompt them. So I pay for top-notch desktops (within reason of course).
There’s also the time-to-market and bureaucracy cost. I took over a place where there was a team of people devoted to making sure you had exactly the PC you needed.
Configuring devices more generously often lets you get some extra life out of them for people who don’t care about performance. If the beancounters make the choice, you’ll buy last year's hardware at a discount and get jammed up when there’s a Windows or application update. Saving money costs money because of the faster refresh cycle.
My standard for sizing this in huge orgs is: count how many distinct applications launch per day. If it’s greater than 5-7, go big. If it’s less, cost optimize with a cheaper config or get the function on RDS.
Simple estimates work surprisingly well for a lot of things because a lot of the 'unquantifiable' complexity being ignored behaves like noise. When you have dozens of factors pulling in different directions—some developers multitask better, some lose flow more easily, some codebases are more memory-hungry, and so on—it all tends to just average out, and the result is reasonably accurate. Accurate enough that it's useful data to make a decision with, at least.
For me the issue is that at work, with 16GB of RAM, I'm basically always running into swap and having things grind to a halt. My personal workstation has 64GB, and the only time I experience issues is when something's leaking memory.
Well, it depends what kind of time periods you're talking about. I've seen one in the past that was 60 minutes vs. 20 minutes (for a full clean compile, but often that is where you find yourself) - that is far more than a micro-break, that is a big chunk of time wasted.
You’re not waiting for the end of a thing though. You might hope you are, but the truth is there’s always one little thing you still have to take care of. So until the last build is green and the PR is filed, you’re being held hostage by the train of thought that’s tied to this unit of work. Thinking too much about the next one just ends up adding time to this one.
Tying this back to your point, those limited hours of focus time come in blocks, in my experience, and focus time is not easily "entered", either.
You’re a grownup. You should know when to take a break and that’ll be getting away from the keyboard, not just frittering time waiting for a slow task to complete.
The hours I sometimes spend waiting on a build are time that won't come back later. Sometimes I've done other tasks in the meantime, but I can only track so much, and often it isn't worth it.
a faster machine can get me to productive work faster.
Most of my friends at FAANG all do their work on servers remotely. Remote edit, remote build. The builds happen in giant networked cloud builders, 100s to 1000s per build. Giving them a faster local machine would do almost nothing because they don't do anything local.
On the laptop you need:
- low weight so you can easily take it with you to work elsewhere
- excellent screen/GPU
- multiple large connected screens
- plenty of memory
- great keyboard/pointer device
Also: great chair
Frankly, what would be really great is a Mac Vision Pro fully customised as a workstation.
When I worked at a FAANG, most developers could get a remote virtual machine for their development needs. They could pick the machine type and size. It was one of the first things you'd learn how to do in your emb^H^H^H onboarding :)
So it wasn't uncommon to see people with a measly old 13" MacBook Pro doing the hard work on a 64-CPU/256GB remote machine. Laptops were essentially machines used for reading/writing emails, writing documents and doing meetings. The IDEs had proprietary extensions to work with the remote machines and the custom tooling.
more than that, in the faang jobs I've had you could not even check code out onto your laptop. it had to live on the dev desktop or virtual machine, and be edited remotely.
I nearly went insane when I was forced to code using Citrix.
> it rarely if ever makes sense to buy anything but the top spec Macbook Pro available
God I wish my employers would stop buying me Macbook Pros and let me work on a proper Linux desktop. I'm sick of shitty thermally throttled slow-ass phone chips on serious work machines.
Just Friday I was dealing with a request from purchasing asking if a laptop with an ultra-low-power 15W TDP CPU and an iGPU with "8GB DDR4 graphics memory (shared)" was a suitable replacement for one with a 75W CPU (but also a Core i9) and an NVidia RTX 4000 mobile 130W GPU for one of our lead engineers' CAD workstations.
No, those are not the same. There's a reason one's the size of a pizza box and costs $5k and the other's the size of an iPad and costs $700.
And yes, I much prefer to build tower workstations with proper thermals and full-sized GPUs, that's the main machine at their desk, but sometimes they need a device they can take with them.
Curious perspective. Apple silicon is both performant and very power efficient. Of course there are applications where even a top spec MacBook would be unsuitable, but I imagine that would be a very small percentage of folks needing that kind of power.
Sadly, the choice is usually between Mac and Windows—not a Linux desktop. In that case, I’d much prefer a unix-like operating system like MacOS.
To be clear, I am not a “fanboy” and Apple continues to make plenty of missteps. Not all criticisms against Apple are well founded though.
Gave developers 16GB RAM and 512GB storage. Spent way too much time worrying about available disk space and needlessly redownloading docker images off the web.
But at least they saved money on hardware expenses!
FAANG manages the machines. Setting aside the ethics of this level of monitoring, I'd be curious to validate this by soft-limiting OS memory usage and tracking metrics like number of PRs and time someone is actively on the keyboard.
My personal experience using virtual desktops vs a MacBook aligns with your analysis. This despite the desktop virtual machines having better network connections. A VM with 16 GB of memory and 8 VCPUs can't compete with an M1 Max laptop.
best money ever spent. lasted years and years.
for cpus - I wonder how the economics work out when you get into say 32 or 64 core threadrippers? I think it still might be worth it.
To put a massive spanner in this, companies are going to be rolling out seemingly mandatory AI usage, which has huge compute requirements that are often fulfilled remotely, and varying, possibly negative, effects on productivity.
I think those working on user-facing apps could do well having a slow computer or phone, just so they can get a sense of what the actual user experience is like.
I've had the misfortune of being in a phone signal dead spot at times in my life.
On slow connections sites are not simply slow, but outright unusable.
https://danluu.com/slow-device/
No doubt you mean well. In some cases it's obvious - a low-memory machine can't handle some docker setup, etc.
In reality, you can’t even predict time to project completion accurately. Rarely is a fast computer a “time saver”.
Either it’s a binary “can this run that” or a work environment thing “will the dev get frustrated knowing he has to wait an extra 10 minutes a day when a measly $1k would make this go away”
This article skips a few important steps - how a faster CPU will have a demonstrable improvement on developer performance.
I would agree with the idea that faster compile times can have a significant improvement in performance. 30s is long enough for a developer to get distracted and go off and check their email, look at social media, etc. Basically turning 30s into 3s can keep a developer in flow.
The critical thing we’re missing here is how increasing the CPU speed will decrease the compile time. What if the compiler is IO bound? Or memory bound? Removing one bottleneck will get you to the next bottleneck, not necessarily get you all the performance gains you want
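One cheap way to find out which of those you are actually hitting is to compare wall-clock time against the CPU time the build's child processes consumed. A minimal sketch (POSIX-only; "make -j8" is just a placeholder for whatever the real build command is):

    # Rough "CPU bound or not" check for a build command (POSIX only).
    # If cpu/wall is close to your core count, you're CPU bound; if it's
    # much lower, the build is waiting on disk, memory, network, or
    # serial steps, and a faster CPU alone won't help much.
    import os, resource, subprocess, time

    cmd = ["make", "-j8"]  # placeholder: substitute your real build command

    before = resource.getrusage(resource.RUSAGE_CHILDREN)
    t0 = time.perf_counter()
    subprocess.run(cmd, check=True)
    wall = time.perf_counter() - t0
    after = resource.getrusage(resource.RUSAGE_CHILDREN)

    cpu = (after.ru_utime - before.ru_utime) + (after.ru_stime - before.ru_stime)
    cores = os.cpu_count() or 1
    print(f"wall {wall:.1f}s, child CPU {cpu:.1f}s, "
          f"effective parallelism {cpu / wall:.1f} of {cores} cores")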
I wish I was compiler bound. Nowadays, with everything being in the cloud or whatever I'm more likely to be waiting for Microsoft's MFA (forcing me to pick up my phone, the portal to distractions) or getting some time limited permission from PIM.
The days when a 30-second pause for the compiler was the slowest part are long over.
The circuit design software I use, Altium Designer, has a SaaS cloud for managing libraries of components, and version control of projects. I probably spend hours a year waiting for simple things like "load the next 100 parts in this list" or "save this tiny edit to the cloud" as it makes API call after call to do simple operations.
And don't get me started on the cloud ERP software the rest of the company uses...
I think just having LSP give you answers 2x faster would be great for staying in flow.
Applies to git operations as well.
The compiler is usually IO bound on Windows due to NTFS: lots of small files in the MFT plus lock contention. If you put everything on a ReFS volume it goes a lot faster.
I've seen a test environment which has most assets local but a few shared services and databases accessed over a VPN which is evidently a VIC-20 connected over dialup.
The dev environment can take 20 seconds to render a page that takes under 1 second on prod. Going to a newer machine with twice the RAM brought no meaningful improvement.
They need a rearchitecture of their dev system far more than faster laptops.
There’s your problem. If your expectation was double-digit milliseconds in prod, then non-prod and its VPN also wouldn’t be an issue.
You do need a good SSD though. There is a new generation of pcie5 SSDs that came out that seems like it might be quite a bit faster.
I got my boss to get me the most powerful server we could find, $15000 or so. In benchmarks there was minimal benefit, and sometimes a loss, going with more than 40 cores even though it has 56 (52? I can't check right now). Sometimes using more cores slows the build down. We have concluded that memory bandwidth is the limit, but are not sure how to prove it.
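One cheap experiment that at least shows where scaling stops (without proving the root cause) is timing the same clean build at several -j levels; if the curve flattens or reverses well below the core count while the CPUs aren't saturated, memory bandwidth or a serial stage is the usual suspect. A sketch, where "make clean"/"make" stand in for the real build commands:

    # Time a clean build at increasing -j levels to find where scaling stops.
    # "make clean" / "make" are placeholders for the real build commands.
    import subprocess, time

    def timed_build(jobs: int) -> float:
        subprocess.run(["make", "clean"], check=True, capture_output=True)
        t0 = time.perf_counter()
        subprocess.run(["make", f"-j{jobs}"], check=True, capture_output=True)
        return time.perf_counter() - t0

    baseline = None
    for jobs in (8, 16, 24, 32, 40, 48, 56):
        secs = timed_build(jobs)
        baseline = baseline or secs
        print(f"-j{jobs:>2}: {secs:7.1f}s  speedup vs first run: {baseline / secs:.2f}x")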
I don’t think that we live in an era where a hardware update can bring you down to 3s from 30s, unless the employer really cheaped out on the initial buy.
Now in the tfa they compare laptop to desktop so I guess the title should be “you should buy two computers”
Another thing to keep in mind when compiling is adding more cores doesn't help with link time, which is usually stuck to a single core and can be a bottleneck.
https://github.com/rui314/mold?tab=readme-ov-file#why-is-mol...
https://llvm.org/devmtg/2017-10/slides/Ueyama-lld.pdf
I still run a 6600 (65W peak) from 2016 as my daily driver. I have replaced the SSD once (the MLC drive lasted 5 years; hopefully the SLC drive from 2011 lasts forever?), the 2x 32GB DDR4 sticks (the Kingston/Micron ones lasted 8 years, replaced with aliexpress "samsung" sticks for $50 a pop) and the monitor (the Eizo FlexScan 1932 lasted 15! years, RIP, replaced with an Eizo RadiForce 191M - highly recommended with f.lux/redshift for exceptional image quality without blue light).
It's still powerful enough to play any game released this year that I throw at it at 60 FPS (with a low-profile 3050 from 2024), let alone compile any bloat.
Keep your old CPU until it breaks completely... or actually until the motherboard breaks; I have a Kaby Lake 35W replacement waiting for the 6600 to die.
Really depends on your use case. I personally still run my 17 year old 2.4GHz Core 2 Duo with 4GB of RAM as my daily runner. I simply do not hit the walls even on that thing. Most folks here simply would not accept that, and not because they are spoilt, but because their workloads demand more.
I love those as routers, firewalls, and other random devices on my mess of a home network where I just set things up for fun. Or as little NASes for friends and family that I can give to them for free or whatever.
Yes, you can do everything, but not without added complexity that will end up failing faster.
We have peaked in all tech. Nothing will ever get as good as the raw peak in longevity:
- SSDs ~2011 (pure SLC)
- RAM ~2013 (DDR3 fast low latency but low Hz = cooler = lasts longer)
- CPUs ~2018 (debatable but I think those will outlast everything else)
I bought a geforce RTX 3080 at launch and boy was I surprised at the power draw and heat/noise it pumps out. I wonder why anybody bothers with the 90 series at all.
I actually run it ~10% underclocked, barely affects performance, but greatly reduces heat/noise. These cards are configured to deliver maximum performance at any cost (besides system instability).
My next GPU I am probably going mid-range to be honest, these beefy GPUs are not worth it anymore cost and performance-wise. You are better off buying the cheaper models and upgrading more often.
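For what it's worth, on NVIDIA cards a similar quieter/cooler result can be had by capping board power with nvidia-smi; that is a power limit rather than the clock offset described above, and the wattage below is an arbitrary example (setting it needs root):

    # Cap an NVIDIA GPU's board power with nvidia-smi (run as root).
    # Power-capping is a different knob than a core-clock offset, but it is
    # an easy way to trade a few percent of performance for much less
    # heat and fan noise.
    import subprocess

    def query(field: str) -> str:
        out = subprocess.run(
            ["nvidia-smi", f"--query-gpu={field}", "--format=csv,noheader"],
            check=True, capture_output=True, text=True)
        return out.stdout.strip()

    print("default limit:", query("power.default_limit"))
    print("current limit:", query("power.limit"))

    target_watts = 270  # arbitrary example for a 320W-class card; pick your own
    subprocess.run(["nvidia-smi", "-pl", str(target_watts)], check=True)
    print("new limit:", query("power.limit"))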
When it comes to "is it better to throw away the 6600 and replace it with a 5600, or keep running the 6600?" I'm torn, but: you probably need to use the 5600 for maybe 20 years to compensate for its manufacturing energy cost (which is not directly linked to $ cost), and I think the 6600 might last that long with new RAM 10 years down the road. I'm not so sure the newer 5600 motherboard and the CPU itself will make it that long, because they were probably made to break to a larger extent.
Also, the 6600 can be passively cooled in the Streacom case I already have; the 5600 is too hot.
It very much depends on the games you play though. When I upgraded from a 7700k, it was struggling with my Factorio world. My AMD 5700X3D handles it completely smoothly. Though now in Path of Exile 2, my CPU can barely maintain 30 fps during big fights.
The CPU is now the bottleneck for games that struggle, which makes sense: GPU load is usually configurable through graphics settings, while the gameplay simulation is, well, the hardcoded gameplay.
See PUBG that has bloated Unreal so far past what any 4-core computer can handle because of anti-cheats and other incremental changes.
Factorio could add some "how many chunks to simulate" config then? If that does not break gameplay completely.
I upgraded two years ago to a Ryzen 5700 rather than a 5800 specifically for the lower TDP. I rarely max out the cores and the cooler system means the fan rarely spins up to audible levels.
E5-2650v2 in a Chinese mATX motherboard for me. Got the cpu years ago for like $50 as an eBay server pull. 970 Evo SSD. 24GB of mismatched DDR3. Runs all my home server junk and my dev environment (containerized with Incus). Every year I tell myself I should grab a newer Ryzen to replace it but it honestly just keeps chugging along and doesn't really slow me down.
I just "re-cycle" them.
Bought a 7700X two years ago. My 3600X went to my wife. The previous machine (forgot which one it was, but some Intel CPU) went to my mother-in-law. The machine three before that, my trusty old Core i7-6700K from 2015 (I think 2015): it's now a little Proxmox server at home.
I'll probably buy a 9900X or something now: don't want to wait late 2026/2027 for Zen 6 to come out. 7700X shall go to the wife, 3600X to the kid.
My machines typically work for a very long time: I carefully pick the components and assemble them myself and then test them. Usually when I pull the plug for the final time, it's still working fine.
But yet I like to be not too far behind: my 7700X from 2022 is okay. But I'll still upgrade. Doesn't mean it's not worth keeping: I'll keep it, just not for me.
Thinkpad X61s(45nm) DDR2 / D512MO(45nm) DDR2 / 3770S(22nm) DDR3 / 4430S(22nm) DDR3
All still in client use.
All got new RAM this year and when the SSDs break (all have SLC) I have new SLC SSDs and will install headless linux for server duty on 1Gb/s symmetric fiber until the motherboards break in a way I can't repair. Will probably resolder caps.
There are peaks in long-term CPU value. That is, CPUs that are 1) performant enough to handle general purpose computing for a decade and 2) outperform later chips for a long time.
The i7-4770 was one. It reliably outperformed later Intel CPUs until near 10th gen or so. I know shops that are still plugging away on them. The first comparable replacement for it is the i7-12700 (but the i5-12400 is a good buy).
At 13th gen, Intel swaps E for P cores. They have their place but I still prefer 12th gen for new desktops.
Past all that, the author is right about the AMD Ryzen 9950x. It's a phenomenal chip. I used one in a friend's custom build (biz, local llm) and it'll be in use in 2035.
> The i7-4770 was one. It reliably outperformed later Intel CPUs until near 10th gen or so.
Per which benchmarks?
> At 13th gen, Intel swaps E for P cores.
One nit: Intel started adding (not swapping) E-cores to desktop parts with 12th gen, but i3 parts and most i5 parts were spared. More desktop i5 parts got them starting with 13th gen.
What's wrong with E cores? They're the best bang for the buck for both baseline low-power usage (and real-world systems are idle a lot of the time) and heavy multicore workloads. An E-core cluster takes a tiny fraction of area and power compared to a P-core, so it's not just a one-on-one comparison.
Important caveat that the author neglects to mention since they are discussing laptop CPUs in the same breath:
The limiting factor on high-end laptops is their thermal envelope. Get the better CPU as long as it is more power efficient. Then get brands that design proper thermal solutions.
You simply cannot cram enough cooling and power into a laptop to have it equal a high-end desktop CPU of the same generation. There is physically not enough room. Just about the only way to even approach that would be to have liquid cooling loop ports out the back that you had to plug into an under-desk cooling loop, and I don't think anyone is doing that, because at that point just get a frickin desktop computer + all the other conveniences that come with it (discrete peripherals, multiple monitors, et cetera).
I honestly do not understand why so many devs seem to insist on doing work on a laptop. My best guess is this is mostly the apple crowd, because apple "desktops" are for the most part just the same hardware in a larger box instead of actually being a different class of machine. A little better on the thermals, but not the drastic jump you see between laptops and desktops from AMD and Intel.
> the only way to even approach that would be to have liquid cooling loop ports out the back that you had to plug into an under-desk cooling loop and I don't think anyone is doing that
It is (maybe was) done by XMG and Schenker. Called Oasis IIRC. Yep: https://www.xmg.gg/en/xmg-oasis/
If you have to do any travel for work, a lightweight but fast portable machine that is easy to lug around beats any productivity gains from two machines (one much faster) due to the challenge of keeping two devices in sync.
Carrying a desktop in my backpack is kind of hard for the work I do as a developer; not everyone is sitting at a desk the full day, or has an assigned desk at a specific office.
I work mostly remote, and also need to jump between office locations and customer sites as well.
A member of the Windows/UNIX crowd for several decades now.
Their employers made it the culture so that working from home/vacation would be easy.
> I honestly do not understand why so many devs seem to insist on doing work on a laptop.
I hate having more than one machine to keep track of and maintain. Keeping files in sync, configuration in sync, everything updated, even just things like the same browser windows with the same browser tabs, organized on my desktop in the same way. It's annoying enough to have to keep track of all that for one machine. I do have several machines at home (self-built NAS, media center box, home automation box), and I don't love dealing with them, but fortunately I mainly just have to ensure they remain updated, not keep anything in sync with other things.
(I'm also one of those people who gets yelled at by the IT security team when they find out I've been using my personal laptop for work... and then ignores them and continues to do it, because my personal laptop is way nicer than the laptop they've given me, I'm way more productive on it, and I guarantee I know more about securing a Linux laptop and responsibly handling company data than the Windows/Mac folks in the company's IT department. Yes, I know all the reasons, both real and compliance-y, why this is still problematic, but I simply do not care, and won't work for a company that won't quietly look the other way on this.)
I also rarely do my work at a desk; I'm usually on a couch or a comfy chair, or out working in a public place. If all I had was a desktop, I'd never do any work. If I had a desktop in addition to my laptop, I'd never use the desktop. (This is why I sold my personal home desktop computer back in the late '00s: I hadn't even powered it on in over a year.)
> ...why so many devs seem to insist...
I actually wonder if this was originally driven by devs. At my first real job (2001-2004) I was issued a desktop machine (and a Sun Ray terminal!), and only did work at the office. I wouldn't even check work email from home. At my second job (2004-2009), I was given a Windows laptop, and was expected to be available to answer the odd email in my off hours, but not really do much in the way of real work. I also had to travel here and there, so having the laptop was useful. I often left the laptop in the office overnight, though. When I was doing programming at that company, I was using a desktop machine running Linux, so I was definitely not coding at home for work.
At the following job, in 2009, I was given a MacBook Pro that I installed Linux on. I didn't have a choice in this, that's just what I was given. But now I was taking my work laptop home with me every day, and doing work on my off hours, even on weekends. Sneaky, right? I thought it was very cool that they gave me a really nice laptop to do work on, and in return, I "accidentally" started working when I wasn't even in the office!
So by giving my a laptop instead of a desktop, they turned me from a 9-5 worker, into something a lot more than that. Pretty good deal for the company! It wasn't all bad, though. By the end of the '10s I was working from home most days, enjoying a flexible work schedule where I could put in my hours whenever it was most convenient for me. As long as I was available for meetings, spent at least some time in the office, and produced solid work in a timely manner, no one cared specifically when I did it. For me, the pandemic just formalized what I'd already been doing work-wise. (Obviously it screwed up everything outside of work, but that's another story.)
> My best guess is this is mostly the apple crowd...
Linux user here, with a non-Apple laptop.
Employers, even the rich FANG types, are quite penny-wise and pound-foolish when it comes to developer hardware.
Limiting the number and size of monitors. Putting speedbumps (like assessments or doctor's notes) on ergo accessories. Requiring special approval for powerful hardware. Requiring special approval for travel, and setting hotel and airfare caps that haven't been adjusted for inflation.
To be fair, I know plenty of people that would order the highest spec MacBook just to do web development and open 500 chrome tabs. There is abuse. But that abuse is really capped out at a few thousand in laptops, monitors and workstations, even with high-end specs, which is just a small fraction of one year's salary for a developer.
Every well funded startup I’ve worked for went through a period where employees could get nearly anything they asked for: New computers, more monitors, special chairs, standing desks, SaaS software, DoorDash when working late. If engineers said they needed it, they got it.
Then some period of time later they start looking at spending in detail and can’t believe how much is being spent by the 25% or so who abuse the possibility. Then the controls come.
> There is abuse. But that abuse is really capped out at a few thousand in laptops, monitors and workstations, even with high-end specs,
You would think, but in the age of $6,000 fully specced MacBook Pros, $2,000 monitors, $3,000 standing desks, $1,500 iPads with $100 Apple Pencils and $300 keyboard cases, $1,000 chairs, SaaS licenses that add up, and (if allowed) food delivery for “special circumstances” that turns into a regular occurrence, it was common to see individuals incurring expenses in the tens-of-thousands range. It’s hard to believe if you’re a person who moderates their own expenditures.
Some people see a company policy as something meant to be exploited until a hidden limit is reached.
There also starts to be some soft fraud at scales higher than you’d imagine: When someone could get a new laptop without questions, old ones started “getting stolen” at a much higher rate. When we offered food delivery for staying late, a lot of people started staying just late enough for the food delivery to arrive while scrolling on their phones and then walking out the door with their meal.
Not an expert here, but from what I heard, that would be a bargain for a good office chair. And having a good chair or not - you literally feel the difference.
If $20k is misspent by 1 in 100 employees, that's still $200 per employee per year: peanuts, really.
Just like with "policing", I'd only focus on uncovering and dealing with abusers after the fact, not on everyone — giving most people "benefits" that instead makes them feel valued.
> It’s hard to believe if you’re a person who moderates their own expenditures.
Yeah, it's hard to convey to people who've never been responsible for setting (or proposing) policy that it's not a game of optimizing the average result, but of minimizing the worst-case result.
You and I and most people are not out to arbitrage the company's resources but you and I and most people are also not the reason policy exists.
It was depressing to run into that reality myself as policy controls really do interfere sometimes in allowing people to access benefits the organization wants them to have, but the alternative is that the entire budget for perks ends up in the hands of a very few people until the benefit goes away completely.
Is it “soft fraud” when a manager at an investment bank regularly demands unreasonable productivity from their junior analysts, causing them to work late and effectively reduce their compensation rate? Only if the word “abuse” isn’t ambiguous and loaded enough for you!
$3,000 standing desks?? It's some wood, metal and motors. I got one from IKEA in about 2018 for 500 gbp and it's still my desk today. You can get Chinese ones now for about 150 gbp.
1. My brothers (I have a number of them) mostly work in construction somehow. It feels like most of them drive a VW Transporter, a large pickup or something, each carrying at least $30,000 in equipment.
Seeing people I work with get laptops that take multiple minutes to connect to a postgres database that I connect to in seconds feels really stupid. (I'm old enough that I get what I need; they'd usually rather pay for a decent laptop than start a hiring process.)
2. My previous employer did something really smart:
They used to have a policy that you got a basic laptop and an inexpensive phone, but you could ask for more if you needed. Which of course meant some people got nothing and some people got custom keyboards and what not.
That was replaced with a $1,000 budget on your first day and $800 every year, meant to cover phones and everything else you needed. You could also borrow from next year. So if someone felt they needed the newest iPhone or Samsung? Fine, save up one year (or borrow from next year) and you have it.
Others like me who don't care that much about phones could get a reasonably priced one + a good monitor for my upstairs office at home + some more gear.
And now the rules are the same for everyone, so even I get what I need. (I feel I'm hopeless when it comes to arguing my case with IT, but now it's a simple: do you have budget for it? yes/no.)
Netflix, at least in the Open Connect org, was still open-ended on top of whatever NTech provided (your issued laptop and remote-working stuff). It was very easy to get "exotic" hardware. I really don't think anyone abused it. This is an existence proof for the parent comments: it's not a startup, and I haven't seen engineers screwing the wheels off the bus anywhere I've ever worked.
> There also starts to be some soft fraud at scales higher than you’d imagine: When someone could get a new laptop without questions, old ones started “getting stolen” at a much higher rate. When we offered food delivery for staying late, a lot of people started staying just late enough for the food delivery to arrive while scrolling on their phones and then walking out the door with their meal.
Ehh. Neither of these are soft fraud. The former is outright law-breaking, the latter…is fine. They stayed till they were supposed to.
Where do you even get a $3,000 standing desk? I don't even compare prices and I got mine from Amazon for $200-$300. Sure, the quality might not be the best, but I just can't see people buying $3,000 standing desks.
I know a FAANG company whose IT department, for the last few years, has been "out of stock" for SSD drives over 250GB. They claim it's a global market issue (it's not). There's constant complaining in the chats from folks who compile locally. The engineers make $300k+ so they just buy a second SSD from Amazon on their credit cards and self-install them without mentioning it to the IT dept. I've never heard a rational explanation for the "shortage" other than chronic incompetence from the team supplying engineers with laptops/desktops. Meanwhile, spinning up a 100TB cloud VM has no friction whatsoever there. It's a cushy place to work tho, so folks just accept the comically dumb aspects everyone knows about.
I've wondered if that's to make dealing with full disk backup/forensic collections/retention legal hold/etc easier: keep the official amount of end-user device storage to a minimum. And/or it forces the endpoint to depend on network/cloud storage, giving better business intelligence on what data is "hot".
Unfortunately, there isn’t much you can do other than fuss at some random director or feedback form. Or quit, I guess. But that seems a little extreme.
Anyway, your choices of what to do about idiocy like this are pretty limited.
I think you're maybe underestimating the aggregate cost of totally unconstrained hardware/travel spending across tens or hundreds of thousands of employees, and overestimating the benefits. There need to be some limits or speedbumps to spending, or a handful of careless employees will spend the moon.
Scaling cuts both ways. You may also be underestimating the aggregate benefits of slight improvements added up across hundreds or thousands of employees.
For a single person, slight improvements added up over regular, e.g., daily or weekly, intervals compound to enormous benefits over time.
> But that abuse is really capped out at a few thousand
That abuse easily goes into the tens of thousands of dollars, even several hundred thousand, even at a relatively small shop. I just took a quick look at Apple's store, and wow! The most expensive 14" MacBook Pro I could configure (minus extra software) tops out at a little over $7,000! The cheapest is at $1,600, and a more reasonably-specced, mid-range machine (that is probably perfectly sufficient for dev work), can be had for $2,600.
Let's even round that up to $3,000. That's $4,000 less than the high end. Even just one crazy-specced laptop purchase would max out your "capped out at a few thousand" figure.
And we're maybe not even talking about abuse all the time. An employee might fully earnestly believe that they will be significantly more productive with a spec list that costs $4,000, when in reality that $3,000 will be more or less identical for them.
Multiply these individual choices out to a 20 or 40 or 60 person team, and that's real money, especially for a small startup. And we haven't even started talking about monitors and fancy ergonomic chairs and stuff. 60 people spending on average $2,000 each more than they truly need to spend will cost $120k. (And I've worked at a place that didn't eliminate their "buy whatever you think you'll need" policies until they had more than 150 employees!)
Just to do web development? I regularly go into swap running everything I need on my laptop. Ideally I'd have VScode, webpack, and jest running continuously. I'd also occasionally need playwright. That's all before I open a chrome tab.
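If you want to know whether 16GB is genuinely the limit rather than a feeling, summing the resident memory of the dev-stack processes is a five-minute check. A sketch using psutil (pip install psutil); the process-name list is only an example, and RSS double-counts shared pages, so treat the total as an upper bound:

    # Sum resident memory of typical web-dev processes to see how close a
    # 16GB machine really is to swapping. RSS double-counts shared pages,
    # so this is an upper bound. Requires: pip install psutil
    import psutil

    DEV_PROCS = {"node", "code", "chrome", "java", "docker", "jest"}  # example names

    total = 0
    for proc in psutil.process_iter(["name", "memory_info"]):
        name = (proc.info.get("name") or "").lower()
        mem = proc.info.get("memory_info")
        if mem and any(key in name for key in DEV_PROCS):
            total += mem.rss

    print(f"dev-stack RSS: {total / 2**30:.1f} GiB "
          f"of {psutil.virtual_memory().total / 2**30:.0f} GiB installed")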
Always amuses me when I see someone use web development as an example like this. Web dev is very easily in the realm of game dev as far as required specs for your machine, otherwise you're probably not doing much actual web dev. If anything, engineers doing nothing but running little Java or Python servers don't need anything more than a PI and a two-color external display to do their job.
FANG is not monolithic. Amazon is famously cheap. So is Apple, in my opinion, based on what I have heard (you get whatever random refurbished hardware is available, not some standardized thing - sometimes with 8GB RAM, sometimes something nicer). Apple is also famously cheap on compensation. Back in the day they proudly said shit to the effect of "we deliberately don't pay you top of the market because you have to love Apple", to which the only valid answer is "go fuck yourself."
Google and Facebook I don't think are cheap for developers. I can speak firsthand for my past Google experience. You have to note that the company has like 200k employees and there needs to be some controls and not all of the company are engineers.
Hardware -> for the vast majority of stuff, you can build with blaze (think bazel) on a build cluster and cache, so local CPU is not as important. Nevertheless, you can easily order other stuff should you need to. Sure, if you go beyond the standard issue, your cost center will be charged and your manager gets an email. I don't think any decent manager would block you. If they do, change teams. Some powerful hardware that needs approval is blanket whitelisted for certain orgs that recognize such need.
Trips -> Google has this interesting model where you have a soft cap for trips, and if you don't hit the cap, you pocket half of the unspent trip credit in your account, which you can choose to spend later when you are over the cap or want something slightly nicer next time. Also, they have clear and sane policies on mixing personal and corporate travel. I encourage everyone to learn about and deploy things like that in their companies. The caps are usually not unreasonable, but if you do hit them, it is again an email to your management chain, not some big deal. Never seen it blocked. If your request is reasonable and your manager is shrugging about this stuff, that should reflect on them being cheap, not the company policy.
iOS development is still mostly local which is why most of the iOS developers at my previous Big Tech employer got Mac Studios as compiler machines in addition to their MacBook Pros. This requires director approval but is a formality.
I read Google is now issuing Chromebooks instead of proper computers to non-engineers, which has got to be corrosive to productivity and morale.
The soft cap thing seems like exactly this kind of penny-foolish behavior though. I’ve seen people spend hours trying to optimize their travel to hit the cap — or dealing with flight changes, etc that come from the “expense the flight later” model.
All this at my company would be a call or chat to the travel agent (which, sure, kind of a pain, but they also paid for dedicated agents so wait time was generally good).
> Back in the day they proudly said shit to the effect of "we deliberately don't pay you top of the market because you have to love Apple" to which the only valid answer is "go fuck yourself."
So people started slacking off, because "you have to love your employees"?
What would be a good incentivizing strategy to prevent overspending on hardware? I can think of giving a budget where the amount not spent is paid out to them (but when the salary is that high it might not make sense), or having an internal dashboard where everybody can see everybody's spending on hardware, so people feel bad when they order too much.
Probably better to just request an unreviewed but detailed justification, and then monitor spend and police the outliers after the fact (or when requesting above an invisible threshold, e.g. any fully-specced Apple products).
The outliers will likely be two kinds:
1) People with poor judgement or just an outright fraudulent or entitled attitude. These people should be watched for performance issues and managed out as needed. And their hardware reclaimed.
2) People that genuinely make use of high end hardware, and likely have a paper trail of trying to use lower-end hardware and showing that it is inefficient.
This doesn't stop the people that overspend slightly so that they are not outliers, but those people are probably not doing substantial damage.
It's straightforward to measure this; start a stopwatch every time your flow gets interrupted by waiting for compilation or your laptop is swapping to keep the IDE and browser running, and stop it once you reach flow state again.
We managed to just estimate the lost time and management (in a small startup) was happy to give the most affected developers (about 1/3) 48GB or 64GB MacBooks instead of the default 16GB.
At $100/hr minimum (assuming lost work doesn't block anyone else) it doesn't take long for the upgrades to pay off. The most affected devs were waiting an hour a day sometimes.
This applies to CI/CD pipelines too; it's almost always worth increasing worker CPU/RAM while the reduction in time is scaling anywhere close to linearly, especially because most workers are charged by the minute anyway.
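The per-minute billing point is easy to show with arithmetic: if a worker with twice the vCPUs costs twice as much per minute but the build time drops anywhere near in half, the bill barely moves and the wall-clock time comes back for free. A toy model with assumed prices and timings, not anyone's real numbers:

    # Toy cost model for per-minute-billed CI workers. All prices and
    # timings are assumptions; substitute your provider's numbers.
    small = {"price_per_min": 0.008, "build_min": 20}   # e.g. a 4 vCPU runner
    large = {"price_per_min": 0.016, "build_min": 11}   # e.g. an 8 vCPU runner

    for name, w in (("small", small), ("large", large)):
        cost = w["price_per_min"] * w["build_min"]
        print(f"{name}: {w['build_min']} min/build, ${cost:.3f}/build")

    dev_cost_per_min = 100 / 60  # ~$100/hour developer, if someone is waiting
    saved = (small["build_min"] - large["build_min"]) * dev_cost_per_min
    print(f"each build that blocks someone saves ~${saved:.2f} of their time")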
I think you wanted to say "especially". You're exchanging clearly measurable amounts of money for something extremely nebulous like "developer productivity". As long as the person responsible for spend has a clear line of view on what devs report, buying hardware is (relatively) easy to justify.
Once the hardware comes out of a completely different cost center - a 1% savings for that cost center is promotion-worthy, and you'll never be able to measure a 1% productivity drop in devs. It'll look like free money.
With compiler development work, a low end machine will do just fine, as long as it has a LARGE monitor. (Mine is 3840x2160, and I bought a satellite monitor to extend it.)
P.S. you can buy a satellite monitor often for $10 from the thrift store. The one I bought was $10.
I don't buy used keyboards because they are dirty and impossible to clean.
> highest spec MacBook just to do web development and open 500 chrome tabs. There is abuse.
Why is that abuse? Having many open browser tabs is perfectly legitimate.
Arguably they should switch from Chrome to Safari / lobby Google to care about client-side resource use, but getting as much RAM as possible also seems fine.
This is especially relevant now that docker has made it easy to maintain local builds of the entire app (fe+be). Factor in local AI flows and the RAM requirements explode.
I have a whisper transcription module running at all times on my Mac. Often, I'll have a local telemetry service (langfuse) to monitor the 100s of LLM calls being made by all these models. With AI development it isn't uncommon to have multiple background agents hogging compute. I want each of them to be able to independently build + host and test their changes. The compute load adds up quickly. And I would never push agent code to a cloud env (not even a preview env) because I don't trust them like that, and neither should you.
Anything below an M4 pro 64GB would be too weak for my workflow. On that point, Mac's unified VRAM is the right approach in 2025. I used windows/wsl devices for my entire life, but their time is up.
This workflow is the first time I have needed multiple screens. Pre-agentic coding, I was happy to work on a 14 inch single screen machine with standard thinkpad x1 specs. But, the world has changed.
Isn't it about equal treatment? You can't buy one person everything they want, just because they have high salary, otherwise the employee next door will get salty.
I previously worked at a company where everyone got a budget of ~$2000. The only requirement was you had to get a mac (to make it easier on IT I assume), the rest was up to you. Some people bought a $2000 macbook pro, some bought a $600 mac mini and used the rest on displays and other peripherals.
If we're talking about rich faang type companies, no, it's not about equal treatment. These companies can afford whatever hardware is requested. This is probably true of most companies.
Where did this idea about spiting your fellow worker come from?
That doesn’t matter. If I’m going to spend 40% of my time alive somewhere, you bet a requirement is that I’m not working on ridiculously outdated hardware. If you are paying me $200k a year to sit around waiting for my PC to boot up, simply because Joe Support that makes 50k would get upset, that’s just a massive waste of money.
Yes, just went from an i7-3770 (12 years old!) to a 9900X, as I tend to wait for a doubling of single-core performance before upgrading (got through a lot of PCs in the 386/486 era!). It's actually only 50% faster according to cpubenchmark [0] but is twice as fast in local usage (multithread is reported about 3 times faster).
Also got a Mac Mini M4 recently and that thing feels slow in comparison to both these systems - likely more of a UI/software thing (only use M4 for xcode) than being down to raw CPU performance.
M4 is amazing hardware held back by a sub-par OS. One of the biggest bottlenecks when compiling software on a Mac is notarization, where every executable you compile causes an HTTP call to Apple. In addition to being a privacy nightmare, this causes the configure step in autoconf-based packages to be excruciatingly slow.
I jumped ahead about 5 generations of Intel, when I got my new laptop and while the performance wasn't much better, the fact that I changed from a 10 pound workstation beast that sounded like a vacuum cleaner, to a svelte 13 inch laptop that works with a tiny USB C brick, and barely runs its fans while being just as fast made it worthwhile for me.
Whenever I've built a new desktop I've always gone near the top performance, with some consideration given to cache and power consumption (remember when peeps cared about that? lol).
From dual Pentium Pros to my current desktop - a Xeon E3-1245 v3 @ 3.40GHz built with 32 GB of top-end RAM in late 2012 - which has only recently started to feel a little pokey, I think largely due to CPU security mitigations added to Windows over the years.
So that extra few hundred up front gets me many years extra on the backend.
I think people overestimate the value of a little bump in performance. I recently built a gaming PC with a 9700X. The 9800X3D is drastically more popular, for an 18% performance bump on benchmarks but double the power draw. I rarely peg my CPU, but I am always drawing power.
Higher power draw means it runs hotter, and it stresses the power supply and cooling systems more. I'd rather go a little more modest for a system that's likely to wear out much, much slower.
Is it really 2x, or is it 2x at max load? Since, as you say, you're not pegging the CPU, it would be interesting to compare power usage on a per-task basis and the duration. Could be that the 3D cache is really adding that much overhead even to an idle CPU.
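On Linux you can actually answer the per-task question by reading the CPU package energy counter (RAPL) before and after the task. A sketch; it assumes the powercap interface is present (recent kernels expose AMD parts through it too) and it usually needs root to read:

    # Measure CPU package energy for one task via the Linux powercap (RAPL)
    # interface. Needs /sys/class/powercap/intel-rapl:0 and typically root.
    import subprocess, time

    ENERGY = "/sys/class/powercap/intel-rapl:0/energy_uj"
    MAX = "/sys/class/powercap/intel-rapl:0/max_energy_range_uj"

    def read_uj(path: str) -> int:
        with open(path) as f:
            return int(f.read())

    cmd = ["sleep", "5"]  # placeholder: substitute the workload to measure

    wrap = read_uj(MAX)
    e0, t0 = read_uj(ENERGY), time.perf_counter()
    subprocess.run(cmd, check=True)
    e1, t1 = read_uj(ENERGY), time.perf_counter()

    joules = ((e1 - e0) % wrap) / 1e6   # counter wraps at max_energy_range_uj
    secs = t1 - t0
    print(f"{joules:.1f} J over {secs:.1f} s (avg {joules / secs:.1f} W package power)")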
Anyway, I've never regretted buying a faster CPU (GPU is a different story - burned some money there on short-lived gains that were marginally relevant), but I did regret saving on it (going with the M4 Air vs the M4 Pro).
Based on this, I strongly believe that if you're providing hardware for software engineers, it rarely if ever makes sense to buy anything but the top spec Macbook Pro available, and to upgrade every 2-3 years. I can't comment on non desktop / non-mac scenarios or other job families. YMMV.
For me, personally, a better break is one I define on my calendar and helps me defragment my brain for a short period of time before re-engaging.
I recommend investigating the concept of 'deep work' and drawing your own conclusions.
As an ISV I buy my own hardware so I do care about expenses. I can attest that to me waiting for computer to finish feels like a big irritant that can spoil my programming flow. I take my breaks whenever I feel like and do not need a computer to help me. So I pay for top notch desktops (within reason of course).
Configuring devices more generously often lets you get some extra life out of it for people who don’t care about performance. If the beancounters make the choice, you’ll buy last years hardware at a discount and get jammed up when there’s a Windows or application update. Saving money costs money because of the faster refresh cycle.
My standard for sizing this in huge orgs is: count how many distinct applications launch per day. If it’s greater than 5-7, go big. If it’s less, cost optimize with a cheaper config or get the function on RDS.
Tying this back to your point, those limited hours of focus time come in blocks, in my experience, and focus time is not easily "entered", either.
You’re a grownup. You should know when to take a break and that’ll be getting away from the keyboard, not just frittering time waiting for a slow task to complete.
a faster machine can get me to productive work faster.
On the laptop you need: - low weight so you can easily take it with you to work elsewhere - excellent screen/GPU - multiple large connected screens - plenty of memory - great keyboard/pointer device
Also: great chair
Frankly, what would be really great is a Mac Vision Pro fully customised as a workstation.
So it wasn't uncommon to see people with a measly old 13" macbook pro doing the hard work on a 64cpu/256GB remote machine. Laptops were essentially machines used for reading/writing emails, writing documents and doing meetings. The IDEs had a proprietary extensions to work with remote machines and the custom tooling.
I nearly went insane when I was forced to code using Citrix.
God I wish my employers would stop buying me Macbook Pros and let me work on a proper Linux desktop. I'm sick of shitty thermally throttled slow-ass phone chips on serious work machines.
No, those are not the same. There's a reason one's the size of a pizza box and costs $5k and the other's the size of an iPad and costs $700.
And yes, I much prefer to build tower workstations with proper thermals and full-sized GPUs, that's the main machine at their desk, but sometimes they need a device they can take with them.
Sadly, the choice is usually between Mac and Windows—not a Linux desktop. In that case, I’d much prefer a unix-like operating system like MacOS.
To be clear, I am not a “fanboy” and Apple continues to make plenty of missteps. Not all criticisms against Apple are well founded though.
Gave developers 16GB RAM and 512MB storage. Spent way too much time worrying about available disk space and needlessly redownloading docker images off the web.
But at least they saved money on hardware expenses!
best money ever spent. lasted years and years.
for cpus - I wonder how the economics work out when you get into say 32 or 64 core threadrippers? I think it still might be worth it.
I've had the misfortune of being in a phone signal dead spot at times in my life.
On slow connections sites are not simply slow, but unusable whatsoever.
https://danluu.com/slow-device/
In reality, you can’t even predict time to project completion accurately. Rarely is a fast computer a “time saver”.
Either it’s a binary “can this run that” or a work environment thing “will the dev get frustrated knowing he has to wait an extra 10 minutes a day when a measly $1k would make this go away”
I would agree with the idea that faster compile times can have a significant improvement in performance. 30s is long enough for a developer to get distracted and go off and check their email, look at social media, etc. Basically turning 30s into 3s can keep a developer in flow.
The critical thing we’re missing here is how increasing the CPU speed will decrease the compile time. What if the compiler is IO bound? Or memory bound? Removing one bottleneck will get you to the next bottleneck, not necessarily get you all the performance gains you want
The days when 30 seconds pauses for the compiler was the slowest part are long over.
And don't get me started on the cloud ERP software the rest of the company uses...
I think just having LSP give you answers 2x faster would be great for staying in flow.
Applies to git operations as well.
I've seen a test environment which has most assets local but a few shared services and databases accessed over a VPN which is evidently a VIC-20 connected over dialup.
The dev environment can take 20 seconds to render a page that takes under 1 second on prod. Going to a newer machine with twice the RAM bought no meaningful improvement.
They need a rearchitecture of their dev system far more than faster laptops.
There’s your problem. If your expectation was double-digit milliseconds in prod, then non-prod and its VPN also wouldn’t be an issue.
You do need a good SSD though. There is a new generation of pcie5 SSDs that came out that seems like it might be quite a bit faster.
Now in the tfa they compare laptop to desktop so I guess the title should be “you should buy two computers”
https://github.com/rui314/mold?tab=readme-ov-file#why-is-mol...
https://llvm.org/devmtg/2017-10/slides/Ueyama-lld.pdf
I still run a 6600 (65W peak) from 2016 as my daily driver. I have replaced the SSD once (MLC lasted 5 years, hopefully forever with SLC drive from 2011?), 2x 32GB DDR4 sticks (Kingston Micron lasted 8 years, with aliexpress "samsung" sticks for $50 a pop) and Monitor (Eizo FlexScan 1932 lasted 15! years RIP with Eizo RadiForce 191M, highly recommend with f.lux/redshift for exceptional quality of image without blue light)
It's still powerful enough to play any games released this year I throw at it at 60 FPS (with a low profile 3050 from 2024) let alone compile any bloat.
Keep your old CPU until it breaks, completely... or actually until the motherboard breaks; I have a Kaby Lake 35W replacement waititng for the 6600 to die.
Yes you can do everything, but not without added complexity, that will end up failing faster.
We have peaked in all tech. Nothing will ever get as good as the raw peak in longevity:
- SSDs ~2011 (pure SLC)
- RAM ~2013 (DDR3 fast low latency but low Hz = cooler = lasts longer)
- CPUs ~2018 (debatable but I think those will outlast everything else)
I actually run it ~10% underclocked, barely affects performance, but greatly reduces heat/noise. These cards are configured to deliver maximum performance at any cost (besides system instability).
My next GPU I am probably going mid-range to be honest, these beefy GPUs are not worth it anymore cost and performance-wise. You are better off buying the cheaper models and upgrading more often.
Also the 6600 can be passively cooled in the Streacom case I allready have, the 5600 is to hot.
See PUBG that has bloated Unreal so far past what any 4-core computer can handle because of anti-cheats and other incremental changes.
Factorio could add some "how many chunks to simulate" config then? If that does not break gameplay completely.
I just "re-cycle" them.
Bought a 7700X two years ago. My 3600X went to my wife. Previous machine (forgot which one it was but some Intel CPU) went to my mother-in-law. Machine three machines before that, my trusty old Core i7-6700K from 2015 (I think 2015): it's now a little Proxmox server at home.
I'll probably buy a 9900X or something now: don't want to wait late 2026/2027 for Zen 6 to come out. 7700X shall go to the wife, 3600X to the kid.
My machines typically work for a very long time: I carefully pick the components and assemble them myself and then test them. Usually when I pull the plug for the final time, it's still working fine.
But yet I like to be not too far behind: my 7700X from 2022 is okay. But I'll still upgrade. Doesn't mean it's not worth keeping: I'll keep it, just not for me.
Thinkpad X61s(45nm) DDR2 / D512MO(45nm) DDR2 / 3770S(22nm) DDR3 / 4430S(22nm) DDR3
All still in client use.
All got new RAM this year and when the SSDs break (all have SLC) I have new SLC SSDs and will install headless linux for server duty on 1Gb/s symmertic fiber until the motherboards break in a way I can't repair. Will probably resolder caps.
The i7-4770 was one. It reliably outperformed later Intel CPUs until near 10th gen or so. I know shops that are still plugging away on them. The first comparable replacements for it is the i7-12700 (but the i5-12400 is a good buy).
At 13th gen, Intel swaps E for P cores. They have their place but I still prefer 12th gen for new desktops.
Past all that, the author is right about the AMD Ryzen 9950x. It's a phenomenal chip. I used one in a friend's custom build (biz, local llm) and it'll be in use in 2035.
Per which benchmarks?
> At 13th gen, Intel swaps E for P cores.
One nit, Intel started adding (not swapping) E-cores to desktop parts with 12th gen, but i3 parts and most i5 parts were spared. More desktop i5 parts got them as starting with 13th gen.
The limiting factor on high-end laptops is their thermal envelope. Get the better CPU as long as it is more power efficient. Then get brands that design proper thermal solutions.
It is (maybe was) done by XMG and Schenker. Called Oasis IIRC. Yep
https://www.xmg.gg/en/xmg-oasis/
I work mostly remote, and also need to jump between office locations and customer sites as well.
Member of Windows/UNIX crowd since several decades.
Their employers made it the culture so that working from home/vacation would be easy.
I hate having more than one machine to keep track of and maintain. Keeping files in sync, configuration in sync, everything updated, even just things like the same browser windows with the same browser tabs, organized on my desktop in the same way. It's annoying enough to have to keep track of all that for one machine. I do have several machines at home (self-built NAS, media center box, home automation box), and I don't love dealing with them, but fortunately I mainly just have to ensure they remain updated, not keep anything in sync with other things.
(I'm also one of those people who gets yelled at by the IT security team when they find out I've been using my personal laptop for work... and then ignores them and continues to do it, because my personal laptop is way nicer than the laptop they've given me, I'm way more productive on it, and I guarantee I know more about securing a Linux laptop and responsibly handling company data than the Windows/Mac folks in the company's IT department. Yes, I know all the reasons, both real and compliance-y, why this is still problematic, but I simply do not care, and won't work for a company that won't quietly look the other way on this.)
I also rarely do my work at a desk; I'm usually on a couch or a comfy chair, or out working in a public place. If all I had was a desktop, I'd never do any work. If I had a desktop in addition to my laptop, I'd never use the desktop. (This is why I sold my personal home desktop computer back in the late '00s: I hadn't even powered it on in over a year.)
> ...why so many devs seem to insist...
I actually wonder if this was originally driven by devs. At my first real job (2001-2004) I was issued a desktop machine (and a Sun Ray terminal!), and only did work at the office. I wouldn't even check work email from home. At my second job (2004-2009), I was given a Windows laptop, and was expected to be available to answer the odd email in my off hours, but not really do much in the way of real work. I also had to travel here and there, so having the laptop was useful. I often left the laptop in the office overnight, though. When I was doing programming at that company, I was using a desktop machine running Linux, so I was definitely not coding at home for work.
At the following job, in 2009, I was given a MacBook Pro that I installed Linux on. I didn't have a choice in this, that's just what I was given. But now I was taking my work laptop home with me every day, and doing work on my off hours, even on weekends. Sneaky, right? I thought it was very cool that they gave me a really nice laptop to do work on, and in return, I "accidentally" started working when I wasn't even in the office!
So by giving me a laptop instead of a desktop, they turned me from a 9-5 worker into something a lot more than that. Pretty good deal for the company! It wasn't all bad, though. By the end of the '10s I was working from home most days, enjoying a flexible work schedule where I could put in my hours whenever it was most convenient for me. As long as I was available for meetings, spent at least some time in the office, and produced solid work in a timely manner, no one cared specifically when I did it. For me, the pandemic just formalized what I'd already been doing work-wise. (Obviously it screwed up everything outside of work, but that's another story.)
> My best guess is this is mostly the apple crowd...
Linux user here, with a non-Apple laptop.
Limiting the number and size of monitors. Putting speedbumps (like assessments or doctor's notes) on ergo accessories. Requiring special approval for powerful hardware. Requiring special approval for travel, and setting hotel and airfare caps that haven't been adjusted for inflation.
To be fair, I know plenty of people that would order the highest spec MacBook just to do web development and open 500 chrome tabs. There is abuse. But that abuse is really capped out at a few thousand in laptops, monitors and workstations, even with high-end specs, which is just a small fraction of one year's salary for a developer.
Then some period of time later they start looking at spending in detail and can't believe how much is being spent by the 25% or so who abuse the possibility. Then the controls come.
> There is abuse. But that abuse is really capped out at a few thousand in laptops, monitors and workstations, even with high-end specs,
You would think, but in the age of $6,000 fully specced MacBook Pros, $2,000 monitors, $3,000 standing desks, $1,500 iPads with $100 Apple Pencils and $300 keyboard cases, $1,000 chairs, SaaS licenses that add up, and (if allowed) food delivery for "special circumstances" that turns into a regular occurrence, it was common to see individuals incurring expenses in the tens of thousands. It's hard to believe if you're a person who moderates their own expenditures.
Some people see a company policy as something meant to be exploited until a hidden limit is reached.
There also starts to be some soft fraud at scales higher than you’d imagine: When someone could get a new laptop without questions, old ones started “getting stolen” at a much higher rate. When we offered food delivery for staying late, a lot of people started staying just late enough for the food delivery to arrive while scrolling on their phones and then walking out the door with their meal.
Not an expert here, but from what I've heard, that would be a bargain for a good office chair. And whether you have a good chair or not is something you literally feel.
Just like with "policing", I'd focus only on uncovering and dealing with abusers after the fact, not on restricting everyone; giving most people these "benefits" up front instead makes them feel valued.
Yeah, it's hard to convey to people who've never been responsible for setting (or proposing) policy that it's not a game of optimizing the average result, but of minimizing the worst-case result.
You and I and most people are not out to arbitrage the company's resources but you and I and most people are also not the reason policy exists.
It was depressing to run into that reality myself: policy controls really do sometimes interfere with people accessing benefits the organization wants them to have. But the alternative is that the entire perks budget ends up in the hands of a very few people until the benefit goes away completely.
1. My brothers (I have a number of them) mostly work in construction in one way or another. It feels like most of them drive a VW Transporter, a large pickup or something similar, each carrying at least $30,000 in equipment.
Seeing people I work with get laptops that take multiple minutes to connect to a Postgres database that I connect to in seconds feels really stupid. (I'm old enough that I get what I need; they'd usually rather pay for a decent laptop than start a hiring process.)
2. My previous employer did something really smart:
They used to have a policy that you got a basic laptop and an inexpensive phone, but you could ask for more if you needed it. Which of course meant some people got nothing extra and some people got custom keyboards and whatnot.
That was replaced with a $1,000 budget on your first day and $800 every year, meant to cover phones and everything else you needed. You could also borrow from next year (rough sketch below). So if someone felt they needed the newest iPhone or Samsung? Fine, save up one year (or borrow from next year) and you have it.
Others like me who don't care that much about phones could get a reasonably priced one, plus a good monitor for the upstairs home office, plus some more gear.
And now the rules are the same for everyone, so even I get what I need. (I feel I'm hopeless when it comes to arguing my case with IT, but now it's a simple: do you have budget for it? Yes or no.)
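Not the actual policy wording, of course, but the mechanics are simple enough to sketch. All numbers and names below are just assumptions to show the idea (Python):

    # Toy model of a gear budget: a starting grant, a yearly top-up, and
    # the option to borrow against next year's allowance. Illustrative only.

    STARTING_GRANT = 1000   # granted on day one
    YEARLY_TOPUP = 800      # added every subsequent year

    def available_budget(years_employed: int, spent_so_far: float,
                         borrow_years: int = 1) -> float:
        """Budget someone may spend right now, including what they can
        borrow from up to `borrow_years` future years."""
        accrued = STARTING_GRANT + YEARLY_TOPUP * years_employed
        borrowable = YEARLY_TOPUP * borrow_years
        return accrued + borrowable - spent_so_far

    # One year in, $600 already spent: a ~$2,000 phone is still within
    # reach by borrowing next year's allowance.
    print(available_budget(years_employed=1, spent_so_far=600))  # 2000.0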
peanuts compared to their 500k TC
Ehh. Neither of these are soft fraud. The former is outright law-breaking, the latter…is fine. They stayed till they were supposed to.
Not to mention the return on investment you get from retaining talent and the value they add to your product and organization.
If you walk into a mechanic's shop, just the Snap-on primary tool kit is something like $50k.
It always amazes me that companies go cheap on basic tools for their employees, yet waste millions in pointless endeavors.
Anyway, your choices of what to do about idiocy like this are pretty limited.
You're underestimating the scope of time lost by losing a few percent in productivity per employee across hundreds of thousands of employees.
You want speed limits not speed bumps. And they should be pretty high limits...
For a single person, slight improvements added up over regular, e.g., daily or weekly, intervals compound to enormous benefits over time.
XKCD: https://xkcd.com/1205/
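To put rough numbers on it, here's a back-of-the-envelope calculation in the spirit of that xkcd; the inputs (30 seconds shaved off a build run 20 times a day, ~250 working days a year, a five-year horizon) are illustrative assumptions, not data from this thread:

    # How many hours a small, repeated saving adds up to over a
    # multi-year horizon (all inputs are illustrative).

    def hours_saved(seconds_per_occurrence: float,
                    occurrences_per_day: float,
                    working_days: int = 5 * 250) -> float:
        """Total hours saved over the horizon (default: ~5 working years)."""
        return seconds_per_occurrence * occurrences_per_day * working_days / 3600

    # Shaving 30 seconds off a build you run 20 times a day:
    print(f"{hours_saved(30, 20):.0f} hours")  # ~208 hours over 5 years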
I am 100x more expensive than the laptop. Anything the laptop can do instead of me is something the laptop should be doing instead of me.
> But that abuse is really capped out at a few thousand
That abuse easily goes into the tens of thousands of dollars, even several hundred thousand, even at a relatively small shop. I just took a quick look at Apple's store, and wow! The most expensive 14" MacBook Pro I could configure (minus extra software) tops out at a little over $7,000! The cheapest is at $1,600, and a more reasonably-specced, mid-range machine (that is probably perfectly sufficient for dev work), can be had for $2,600.
Let's even round that up to $3,000. That's $4,000 less than the high end. Even just one crazy-specced laptop purchase would max out your "capped out at a few thousand" figure.
And we're maybe not even talking about abuse all the time. An employee might fully earnestly believe that they will be significantly more productive with a spec list that costs $4,000 more, when in reality the $3,000 machine will be more or less identical for them.
Multiply these individual choices out to a 20 or 40 or 60 person team, and that's real money, especially for a small startup. And we haven't even started talking about monitors and fancy ergonomic chairs and stuff. 60 people spending on average $2,000 each more than they truly need to spend will cost $120k. (And I've worked at a place that didn't eliminate their "buy whatever you think you'll need" policies until they had more than 150 employees!)
I don't think Google and Facebook are cheap for developers. I can speak firsthand for my past Google experience. You have to note that the company has something like 200k employees, there need to be some controls, and not everyone at the company is an engineer.
Hardware -> for the vast majority of stuff, you can build with Blaze (think Bazel) on a build cluster with a shared cache, so local CPU is not as important. Nevertheless, you can easily order other stuff should you need to. Sure, if you go beyond the standard issue, your cost center will be charged and your manager gets an email. I don't think any decent manager would block you. If they do, change teams. Some powerful hardware that needs approval is blanket-whitelisted for certain orgs that recognize such a need.
Trips -> Google has this interesting model where you have a soft cap for trips, and if you don't hit the cap, you pocket half of the unused amount as credit in your account, which you can choose to spend later when you're over the cap or want something slightly nicer next time. Also, they have clear and sane policies on mixing personal and corporate travel. I encourage everyone to learn about and deploy things like that in their companies. The caps are usually not unreasonable, but if you do hit them, it's again an email to your management chain, not some big deal. I've never seen it blocked. If your request is reasonable and your manager is shrugging about this stuff, that reflects on them being cheap, not on company policy.
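For anyone who wants to copy the idea, a minimal sketch of the soft-cap mechanic with made-up numbers (my reconstruction, not Google's actual implementation):

    # Toy model of a soft-cap travel credit: come in under the cap and you
    # bank half the unspent amount; go over and your banked credit covers
    # the overage. Figures are illustrative, not real policy values.

    def update_credit(trip_cost: float, trip_cap: float, credit: float) -> float:
        """Return the traveler's credit balance after one trip."""
        if trip_cost <= trip_cap:
            credit += (trip_cap - trip_cost) / 2          # bank half the savings
        else:
            credit -= min(credit, trip_cost - trip_cap)   # spend credit on the overage
        return credit

    balance = 0.0
    balance = update_credit(trip_cost=800, trip_cap=1200, credit=balance)    # banks $200
    balance = update_credit(trip_cost=1300, trip_cap=1200, credit=balance)   # uses $100 of it
    print(balance)  # 100.0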
I read Google is now issuing Chromebooks instead of proper computers to non-engineers, which has got to be corrosive to productivity and morale.
They eventually became so cheap they blanket paused refreshing developer laptops...
All this at my company would be a call or chat to the travel agent (which, sure, kind of a pain, but they also paid for dedicated agents so wait time was generally good).
I have a pretty high end MacBook Pro, and that pales in comparison to the compute I have access to.
Apple has long thought that 8GB of RAM is good enough for anything, and will likely continue to think so for some time.
Don’t worry, they’ll tell you
So people started slacking off, because "you have to love your employees"?
The outliers will likely be of two kinds:
1) People with poor judgement or just an outright fraudulent or entitled attitude. These people should be watched for performance issues and managed out as needed. And their hardware reclaimed.
2) People that genuinely make use of high end hardware, and likely have a paper trail of trying to use lower-end hardware and showing that it is inefficient.
This doesn't stop the people that overspend slightly so that they are not outliers, but those people are probably not doing substantial damage.
We managed to just estimate the lost time and management (in a small startup) was happy to give the most affected developers (about 1/3) 48GB or 64GB MacBooks instead of the default 16GB.
At $100/hr minimum (assuming lost work doesn't block anyone else) it doesn't take long for the upgrades to pay off. The most affected devs were waiting an hour a day sometimes.
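To make the arithmetic explicit (the ~$800 upgrade delta is an assumption; the other figures match the estimate above):

    # Payback estimate for the RAM upgrade under the stated assumptions.

    HOURLY_RATE = 100          # $/hr, loaded cost of a developer
    HOURS_LOST_PER_DAY = 1.0   # time spent waiting on a memory-starved laptop
    UPGRADE_COST = 800         # assumed extra cost of the 48/64GB config

    payback_days = UPGRADE_COST / (HOURLY_RATE * HOURS_LOST_PER_DAY)
    print(f"Pays for itself in about {payback_days:.0f} working days")  # ~8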
This applies to CI/CD pipelines too; it's almost always worth increasing worker CPU/RAM while the reduction in time is scaling anywhere close to linearly, especially because most workers are charged by the minute anyway.
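Rough illustration of why the bigger worker tends to be close to cost-neutral while the speedup stays near-linear (the per-minute prices are made up):

    # Per-minute billing: if doubling the worker's price roughly halves
    # the job time, the bill stays flat and you only gain wall-clock time.

    def job_cost(minutes: float, price_per_minute: float) -> float:
        return minutes * price_per_minute

    small = job_cost(minutes=20, price_per_minute=0.008)   # $0.16 and a 20-minute wait
    large = job_cost(minutes=11, price_per_minute=0.016)   # $0.18 and an 11-minute wait
    print(f"small: ${small:.2f}, large: ${large:.2f}")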
I think you wanted to say "especially". You're exchanging clearly measurable amounts of money for something extremely nebulous like "developer productivity". As long as the person responsible for spend has a clear line of view on what devs report, buying hardware is (relatively) easy to justify.
Once the hardware comes out of a completely different cost center, a 1% savings for that cost center is promotion-worthy, and you'll never be able to measure a 1% productivity drop in devs. It'll look like free money.
P.S. You can often buy a satellite monitor for about $10 at a thrift store; that's what mine cost.
I don't buy used keyboards because they are dirty and impossible to clean.
Why is that abuse? Having many open browser tabs is perfectly legitimate.
Arguably they should switch from Chrome to Safari / lobby Google to care about client-side resource use, but getting as much RAM as possible also seems fine.
I have a whisper transcription module running at all times on my Mac. Often, I'll have a local telemetry service (Langfuse) to monitor the hundreds of LLM calls being made by all these models. With AI development it isn't uncommon to have multiple background agents hogging compute. I want each of them to be able to independently build, host, and test their changes. The compute load adds up quickly. And I would never push agent code to a cloud env (not even a preview env) because I don't trust them like that, and neither should you.
Anything below an M4 Pro with 64GB would be too weak for my workflow. On that point, the Mac's unified memory is the right approach in 2025. I've used Windows/WSL devices my entire life, but their time is up.
This workflow is the first time I have needed multiple screens. Pre-agentic coding, I was happy to work on a 14-inch single-screen machine with standard ThinkPad X1 specs. But the world has changed.
AMD's Strix Halo can have up to 128GB of unified RAM, I think. The memory bandwidth is less than half of the Mac's, but it's probably going to improve.
Windows doesn't inherently care about this part of the hardware architecture.
Equality doesn't have to mean uniformity.
Where did this idea about spiting your fellow worker come from?
Single-thread performance of the 16-core AMD Ryzen 9 9950X is only about 1.8x that of my poor old laptop's 4-core i5. https://www.cpubenchmark.net/compare/6211vs3830vs3947/AMD-Ry...
I'm waiting for >1024-core ARM desktops with >1TB of unified GPU memory, so I can run some large LLMs.
Ping me when someone builds this :)
Also got a Mac Mini M4 recently, and that thing feels slow in comparison to both of these systems; likely more of a UI/software thing (I only use the M4 for Xcode) than down to raw CPU performance.
[0] https://www.cpubenchmark.net/compare/Intel-i9-9900K-vs-Intel...
From dual Pentium Pros to my current desktop: a Xeon E3-1245 v3 @ 3.40GHz built with 32GB of top-end RAM in late 2012, which has only recently started to feel a little pokey, I think largely due to CPU security mitigations added to Windows over the years.
So that extra few hundred up front gets me many years extra on the backend.
Higher power draw means it runs hotter, and it stresses the power supply and cooling systems more. I'd rather go a little more modest for a system that's likely to wear out much, much slower.
Anyway, I've never regretted buying a faster CPU (GPUs are a different story; I burned some money there on short-lived gains that were marginally relevant), but I did regret saving on one (going with an M4 Air vs. an M4 Pro).