Development cost difference between AWS CDK / Terraform + GitHub Actions versus Docker / K8s / Ansible + any CI pipeline? Honestly, I don't see how "bare metal" saves any engineering time. I also don't see anything complicated about an IaC template for Fargate + RDS.
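To put that claim in concrete terms, here's roughly what the "Fargate + RDS" setup looks like in AWS CDK. This is a minimal sketch, not a production template: the stack name, image reference (`myorg/app:latest`), and Postgres version are placeholders I've chosen for illustration.

```typescript
import { App, Stack } from 'aws-cdk-lib';
import * as ec2 from 'aws-cdk-lib/aws-ec2';
import * as ecs from 'aws-cdk-lib/aws-ecs';
import * as ecs_patterns from 'aws-cdk-lib/aws-ecs-patterns';
import * as rds from 'aws-cdk-lib/aws-rds';

const app = new App();
const stack = new Stack(app, 'AppStack'); // hypothetical stack name

// Network + cluster: CDK fills in sane defaults (subnets, NAT, security groups).
const vpc = new ec2.Vpc(stack, 'Vpc', { maxAzs: 2 });
const cluster = new ecs.Cluster(stack, 'Cluster', { vpc });

// Load-balanced Fargate service from a container image.
new ecs_patterns.ApplicationLoadBalancedFargateService(stack, 'Service', {
  cluster,
  taskImageOptions: {
    image: ecs.ContainerImage.fromRegistry('myorg/app:latest'), // placeholder image
  },
});

// Managed Postgres instance in the same VPC.
new rds.DatabaseInstance(stack, 'Db', {
  engine: rds.DatabaseInstanceEngine.postgres({
    version: rds.PostgresEngineVersion.VER_15,
  }),
  vpc,
});
```

That's the whole "constrained" surface area: a VPC, a service, a database. The equivalent Docker/K8s/Ansible stack on dedicated hardware still has to solve load balancing, TLS, backups, and failover itself.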
Now, if you actually need to decouple your file storage and make it durable and scalable, or dynamically create subdomains, or any number of other things... the effort of learning and integrating separate dedicated services at the infrastructure level to run all this seems far more constraining.
I've been doing this since before the "Cloud", and I have to say: if you have a project that makes money, cloud costs are absolutely a worthwhile investment, and they will be the last thing that constrains your project. If cloud costs are too constraining, then perhaps your project is more of a hobby than a business.
Just thinking about maintaining multiple cluster filesystems and disk arrays -- it's not what I would want to be doing with most companies' resources, or my time. Maybe it's the difference between people who want to run Arch and set up emacs just right, and those who are happy with a MacBook. If the kernel scheduler felt like a constraint, I would probably also recommend Arch; otherwise, I recommend a MacBook :)
On the flip side, I have also tried to turn a startup idea into a profitable project, with no budget, and with raw throughput integral to the idea. In that situation, a dedicated server was absolutely the right choice, saving us thousands of dollars. But the idea did not pan out. If we had gotten more traction, I suspect we would have just scaled vertically for a while. But that's unusual.
This is because you are looking only at provisioning/deployment. And you are right -- node size does not impact DevOps all that much.
I am looking at the solution space available to the engineers who write the software that ultimately gets deployed on the nodes. And that solution space is different when the nodes have 10x the capability. Yes, cloud providers have tons of aggregate capability. But designing software to run on a fleet of small machines is very different from accomplishing the same tasks on a single large machine.
It would not be controversial to suggest that targeting code at an Apple Watch or Raspberry Pi imposes constraints on developers that do not exist when targeting desktops. I am saying the same dynamic now applies to targeting cloud providers.
This isn't to say there's a single best solution for everything. But there are tradeoffs that are not always apparent. The art is knowing when it makes sense to pay the Cloud Tax, and whether to go 100% Cloud vs some proportion of dedicated.