This is more of a thought experiment; I know a multitude of ways to do this which require loads of setup, jumping through AWS hoops, etc. Ideally I'd like a way to deploy to a hosted service with a single command. Imagine Heroku but even easier.
I've tried Github pages, Netlify, and Cloudflare Pages. Moved last night to the latter.
Github pages is fine, but the extra gh-pages branch is kind of annoying. It's fairly clear (for obvious reasons) that they mainly think about Jekyll users, though I had a Hugo blog there for a while.
Netlify sounds great, but in practice I ran into an annoying issue that I couldn't resolve where I had to rerun every automatic build. This probably isn't really their fault but the forums didn't work.
Cloudflare Pages kind of just worked out of the box, and since I was already using it for my DNS, changing where the domain pointed took a few seconds.
I suspect that, if you corrected for how many issues were particular to me, the ranking would be Cloudflare Pages ~ Netlify > Github Pages. I'm kind of surprised the latter charges you for custom domains; I do have to say there's more documentation for Github than any of the others.
Just as a clarification, much of what you mention has only changed in the last year or so. For a very long time gh-pages branch and Jekyll were assumed (you couldn't even use plugins with Jekyll). Custom domains weren't free. SSL wasn't free. Github is doing great stuff but it has taken time to evolve into the all free all you can eat buffet you see today.
I think we've both said correct things, but I think you've missed my meaning: yes, I understand that Github Pages is static hosting, I was pointing out that by default the build system uses Jekyll. This is nonobvious.
This is in the docs: "GitHub Pages will use Jekyll to build your site by default. If you want to use a static site generator other than Jekyll, disable the Jekyll build process by creating an empty file called .nojekyll in the root of your publishing source, then follow your static site generator's instructions to build your site locally." (https://docs.github.com/en/github/working-with-github-pages/...). Netlify and CF will both run some other static site generators (e.g. Hugo in my case) for you, while Github Pages will run Jekyll (or you can build locally).
And yes, custom domains are free for public repos. I made a mistake reading the page about custom domains, which mentioned that private repos need a Github Pro account. You're also right about configuring any branch.
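For anyone wanting to try the build-locally route the docs describe, here is a minimal sketch. Hugo as the generator, the repo name, and the branch are all assumptions; substitute your own.

```shell
# Build locally, mark the output as non-Jekyll, and force-push it to the
# publishing branch. All names below are placeholders for your own setup.
publish_to_pages() {
  hugo                      # writes the finished site into ./public
  touch public/.nojekyll    # tells GitHub Pages to serve the files as-is
  cd public || return 1
  git init -q && git add -A && git commit -qm "deploy"
  git push -f git@github.com:you/you.github.io.git HEAD:main
}
```

The `.nojekyll` marker is the whole trick: without it, Pages runs your already-built output through Jekyll again.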
I should warn that bike shedding about this distinction is absolutely not worth your time if you’re reading this and have less than like 100 blog posts written or are running a business on these sites (in which case I think you should pick Netlify). Go write something.
And also, who cares it's a static host and you can change to another one in an afternoon. There's zero lock-in when you're only rsyncing some files around. Just make sure you have total control over your DNS (buy a domain, put it on a good DNS host) and you really can't screw things up too badly.
I actually just went the other way. I found CF is a bit too aggressive with anti-DDOS measures, and the page would not work sometimes in Safari with default settings.
The default setup for GH Pages is not great. Jekyll is very dated. But if you use GH Actions, you can build yourself a pretty great setup. My site is open source at https://github.com/jacobp100/jacob-does-code - there's also a blog post about how the build system works.
> I found CF is a bit too aggressive with anti-DDOS measures
As someone who does a lot of browsing with unpopular web browsers, Cloudflare is basically a way to ensure that it is a coin toss as to whether or not I will see your site; it is not the worst CDN/DoS protection racket service out there though. I would like to see how much legitimate traffic Cloudflare or the others block, and how much website operators are spending on that "service."
Usually when I need static site hosting, I don't want to build myself a great setup - I want to click a few buttons and have a great setup.
Third party free hosting excels at letting you do a start-to-finish page inside an hour. I want to spend 55 minutes of that hour writing the text, not fiddling with oddities of how to make GitHub Actions show me the parse errors of my markdown...
I had cloudflare set up on a mediawiki site until last year when for some reason I disabled it and discovered that a lot of issues I'd had accessing the site in Safari went away (like it not letting me login except in private browsing).
I use both pages.dev and netlify.com: pages.dev, even in open beta, is a fantastic product if it suits all your needs, but the limitations [0], if you hit them, are a dead-end and there are no ETAs as to when some of those limitations would be addressed.
That is not the case with netlify.com, which boasts a myriad of features, is equally fast, production-ready, and competitively priced.
With pages.dev, I believe, you pay for every domain you host, whilst with netlify.com there's no cap on the number of domains per account, though they do charge for bandwidth (unlike pages.dev).
Workers Sites [1] is a viable alternative and I quite prefer it over pages.dev.
How do you know that Netlify is production ready? I found Netlify Identity to not be production ready* and now I'm not so sure about the rest of the platform.
* Three reasons. 1) The identity widget has people setting it up in such a way that tokens aren't refreshed and logins last only an hour. https://github.com/netlify/netlify-identity-widget/issues/11... 2) Netlify Identity keeps bumping me out despite having gone out of my way to get it configured properly. I don't know why but the issue has lasted for weeks. 3) In development mode, the token can't be verified without making a request to Netlify Functions, and the suggested path (in Redwood.js at least) is to just parse the JWT without verifying it.
There is a strong chance that when Pages comes out of beta, Workers Sites is going to be transitioned into Pages; at least that's kind of how I feel about some of the verbiage on this page:
> […] Github Pages. I'm kind of surprised the latter charges you for custom domains
What do you mean by that? I've been (and still am) hosting quite a few pages with their service using custom domains and have never been charged for that.
One pretty huge downside for GitHub Pages, is that they still don't seem to serve content to IPv6 addresses.
Saying that because I tried it out again a few weeks ago for a friend's website (https://neasair.org), and nope... it doesn't provide AAAA DNS records (IPv6), just A (IPv4).
So, had to put the website on some other infrastructure already hanging around, that does do AAAA records.
I used to host my site on GitHub but switched to Gitlab pages which doesn't require a special branch or extra finesse. I love it, it's really easy for me to add new posts.
GitLab Pages has an option to look at the /build folder for web pages to serve up. So as long as that folder is in the .gitignore, my GitLab CI just runs `hugo` to build the site, which then gets served up.
Notably, it's easy to make a review site that integrates with a merge request so you can see it before deploying and get visual reviews any time you have multiple stakeholders. https://docs.gitlab.com/ee/ci/review_apps/
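A minimal `.gitlab-ci.yml` for the setup described above might look like the following; the Alpine image and the `publish: build` keyword are assumptions (older GitLab versions require the output directory to be named `public`).

```yaml
# Sketch: a single "pages" job that builds with Hugo and publishes
# the build/ directory as the GitLab Pages site.
pages:
  image: alpine:latest
  before_script:
    - apk add --no-cache hugo
  script:
    - hugo --destination build
  artifacts:
    paths:
      - build
  publish: build
  rules:
    - if: $CI_COMMIT_BRANCH == $CI_DEFAULT_BRANCH
```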
Haven't tried Netlify or Cloudflare Pages, but I use GH Pages for my personal blog (running Jekyll) and I've never seen any reason to switch. You need to know how to use Git and Github, but since you're reading HN you probably know these things. It's really convenient to be able to just push to GH and have it auto-deploy - and it's free.
Granted, my blog is an insignificant site with a tiny readership and I rarely update it, so if you need something more heavy duty then I have no idea if GH Pages is up to the task.
Get your facts straight: they have had custom domains for a looong time, and there was absolutely no need for Jekyll either at any point to host a page; it just happened to be one of the most documented/blogged-about ways to use GitHub Pages with automatic builds instead of having static content.
Hosting a personal blog on old / cheap hardware at home and accepting some downtime now and then should be a trade-off worth making for most. An interconnected web of documents served from personal computers residing under a desk or in a basement – it is a beautiful thing.
We nerds often seem to fall into the trap of over-engineering things, like, spending a lot of time building personal stuff that can scale into the infinity – only to end up never having more than 10 concurrent visitors at any given time.
By accepting and embracing the intermittent and slightly unstable nature of serving websites from our homes, we can take back much of the in(ter)dependence and hacky, quirky fun that many of us fondly remember from the 90's and 2000's.
Setting up a webserver on a Pi and serving it from home can be a fun project, not necessarily trivial for everyone, but definitely within reach for most technically inclined people.
Might be among the cheapest, easiest AND fun ways to host a static site :)
Adding to this: If you want/need to mask your residential IP, still want access over clearnet, and have e.g. a VPS with a public IP that can front the traffic, the easiest while still robust way I found to do this is to set up a Wireguard VPN and use either microsocks+proxychains, or vopono.
Bonus that this works even if your server is behind some DMZ or bastion without having to open up the mothership.
(For the use-case of a public personal static site this probably doesn't make much sense; you could just publish to the VPS in the first place. But if you have server-side stuff you must or want to run locally for whatever reason, this is way more straightforward to get right than manually fiddling with iptables, using some SSH tunnel, or setting up redundant reverse proxies)
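A sketch of the tunnel itself, for the curious. All keys and addresses are placeholders, and the reverse-proxy half (nginx or similar on the VPS forwarding to the tunnel IP) is left out.

```ini
# /etc/wireguard/wg0.conf on the VPS (the public-IP end)
[Interface]
Address    = 10.0.0.1/24
ListenPort = 51820
PrivateKey = <vps-private-key>

[Peer]
# The machine at home; it dials out, so no port needs opening there.
PublicKey  = <home-public-key>
AllowedIPs = 10.0.0.2/32
```

The home side mirrors this with an `Endpoint = <vps-ip>:51820` line and a `PersistentKeepalive` setting so the NAT mapping stays open.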
I hosted all my websites on a server at home from 1993 until 2021. I never had any downtime or problems.
I ran an ISP and datacenter from my home for a decade as well.
Nowadays I have a $2 per month VPS to forward a static IP at 1 Gbps to both my home servers and a failover server at a friend's home.
Works fine up to 600 Mbps sustained web traffic. It's been cheaper than colocation of dedicated servers and much cheaper than hosting in the cloud.
I am doing this and it's fun! The only hassles I had were a. setting up DDNS (since I don't have a static IP) and b. getting a certificate from Let's Encrypt with only port 443 open (80 is blocked by my ISP). But other than that, it has been great! Hopefully my ISP won't have a problem with that ;)
I just use a script on a cron that runs every 15 minutes and updates my DNS provider. And for Let's Encrypt, I use the DNS verification method, so you don't even have to expose HTTP(S) ports if you don't need to.
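A sketch of such a cron script. The IP-echo service is a common public one, but the DNS endpoint, record name, and token are entirely hypothetical stand-ins for your provider's real API.

```shell
# Update the provider's A record only when the public IP has changed.
update_ddns() {
  cache=${1:-/var/tmp/last_ip}
  current=$(curl -fsS https://ifconfig.me) || return 1
  [ "$current" = "$(cat "$cache" 2>/dev/null)" ] && return 0  # no change
  curl -fsS -X PUT "https://dns.example.com/records/home" \
       -H "Authorization: Bearer $DDNS_TOKEN" \
       --data "{\"type\":\"A\",\"content\":\"$current\"}" \
    && echo "$current" > "$cache"
}
# crontab entry: */15 * * * * /usr/local/bin/update-ddns
```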
I'll go one step further: were it not for self-hosting since the early 2000s, I probably wouldn't have a career. My first "server" was a Pentium laptop with no screen. Now I run thousands of servers for a company you've probably heard of.
Same here! I'm working on a small site for pentesting / CTF commands. Like a cheat sheet, but interactive, so I have to copy and change less code during my day. It's entirely static, so any nginx server just caches all the responses. A simple NUC does the trick.
Put the free Cloudflare proxy in front just to speed it up a little.
That's how I do it. I absolutely agree with the overengineering. I'm pretty confident my low-end Celeron server could handle a large number of concurrent visitors over my fiber connection at home.
I do the same thing. The only problem I have is with my mail server. Any email that comes from my dynamic IP address automatically goes to junk for everyone. I send probably one every two weeks for password resets.
I wonder if 3rd parties could DOS the telephone (VoIP) this way?
Do you take precautions or just accept the theoretically lower reliability of the phone – which isn't anything reliable anymore anyway, thanks to VoIP?
GitHub Pages or Netlify for sure. With either of these options, your static site can be set up to deploy whenever there's new commits to the specified branch.
A friend and I mentor for Code Louisville, and we're able to get the beginner frontend students up and running very quickly using these options - they're free and much easier to use.
I see the "GitHub Pages" suggested a lot whenever this question comes up, but it's worth noting that it's against their ToS[0] to use it for commercial purposes:
"GitHub Pages is not intended for or allowed to be used as a free web hosting service to run your online business, e-commerce site, or any other website that is primarily directed at either facilitating commercial transactions or providing commercial software as a service (SaaS)."
So it's great if you want to host a personal site / blog or some other non-revenue-generating website, but for anything more than that you could run into issues.
In my opinion, there are some cases where it is a gray area: a professional portfolio, a case-studies website with an email collection form, blog posts where you mention that you are a freelancer, a site where you link to your YouTube videos, some simple JavaScript app that doesn't require a backend but might make some money with ads...
So I'm just wondering how often they crack down on people who operate in this gray area.
That's a great point. I actually looked at GitHub pages and my main issue with it is there's no way to password lock the site. I don't need serious security here, but I also don't exactly want the entire world to see my HTML playground.
I'm not sure why they're doing it this way, it should be a nominal task to restrict viewers to those with read access to your repo, but I think it underscores the use cases they have in mind here. They want you to use it as a blog and not as a serious web hosting solution.
Even for private stuff you're under the dictate of a 3rd party that decides what you're allowed to say to whom, and that eavesdrops. The US was once famous for how it valued free speech. Times past.
I discovered Netlify through Netlify CMS. I had a pretty bad opinion about Netlify CMS, so I was pretty sure Netlify was going to be at least as bad.
Damn, I have never been so wrong:
- the interface is so intuitive I never opened the documentation
- this is FAST (the bottleneck to see my website live is now my browser cache)
- no need to create a Github Actions pipeline to deploy, just say "hey, my repo is here"
- automatically integrated with Let's Encrypt for custom domains
- deploy previews!!!
My favorite feature is automatic deployment of pull requests. I love it, and I wish there was something as cheap and easy for backend services as well. Though I guess it's not too necessary if you implement testing properly :p
I swear by Netlify. It's fantastic, and I use it for several sites, not least of which is my own personal site.
One of their neat features is the ability to spin up "pull request sites" automatically, so you can share versions of your site with stakeholders before it's made live.
Your pricing page is confusing to me. There is a free tier, and it says "Pay as you grow", but I can't see what the limits on the free tier are or how growth is measured. How does it work?
The paid plans let you deploy more often, have longer timeouts, and let you have team members. Also, hobby plans are limited to personal, non-commercial use. https://vercel.com/docs/platform/limits
I tried out vercel for the first time this week after Github pages failed to deploy an already built static site 30 mins before a demo. I took a gamble on vercel about 10 mins before I had to demo and it was running with a few minutes to spare, from starting sign up to having a live site - all free. I'm upgrading us today, because that was an exceptionally good experience. Thanks for making a simple product.
Interesting that you chose to actually host it given the time constraints! I would probably cheat and run a server locally (say python -m http.server 80) and add the domain to /etc/hosts
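That cheat, spelled out as a sketch; the domain is a placeholder, and on a real machine both the `/etc/hosts` edit and binding port 80 need root.

```shell
# Point the demo domain at loopback, then serve the built site locally.
serve_demo() {
  domain=$1
  hostsfile=${2:-/etc/hosts}
  port=${3:-80}
  echo "127.0.0.1 $domain" >> "$hostsfile"
  python3 -m http.server "$port"   # serves the current directory
}
```

Run it from the directory containing the built `index.html`.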
I've been a happy user of Netlify, Github Pages and now Vercel. I'm really enjoying Vercel lately! It's been really great. Recently also tested out Cloudflare Pages, but I think they have a bit of work to do before they can match Vercel.
Best part of Vercel is that it supports web apps with back-ends that can make API calls (it's like "static site+"). It's my goto host these days. Love their framework next.js too.
Not to bring down the mood on how easy it is for static sites on Vercel, but I still get sad about it moving away from server side or Docker based deploys. I really miss that workflow.
Neocities must have an interesting story behind it. I seem to remember something about it being part of an effort to salvage Geocities. Has the story been written down somewhere?
I use it to host my blog; the free tier is good, but I really like them and they have really phenomenal limits, etc., so I just pay (even though my blog is not updated nearly as much as I'd like).
I personally use Firebase for all my static sites. Completely free and fast. If you need to run some backend function they have Firebase functions which is free to a certain degree too.
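For reference, the Firebase Hosting loop is similarly small once `firebase init hosting` has been run once in the project; the project ID below is a placeholder.

```shell
# Sketch of the recurring publish step with the Firebase CLI.
publish_firebase() {
  firebase deploy --only hosting --project "$1"
}
# publish_firebase my-blog-prod
```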
The fact that you cannot put a cap on your spending is infuriating (the official AWS answer is "we do not want to break your business" (even if I want to)).
To be fair, some complaining or crying in such cases usually gets your bill reversed.
You don't mean literally distributed denial of service attack, right? It's super easy to keep serving a static resource, so I think you'd need a huge amount of traffic to put a dent in Firebase's static hosting capacity.
I've had spikes of up to 300k visitors to my Firebase blog in a month, and the bill was like $150. It seems hard to get bitten by huge surprise costs from a blog on Firebase static hosting, even if someone's using a botnet to try to drive up your bill.
Edit: Thinking about it a little more, I guess you could find the largest resource on a blog and direct your botnet to download it nonstop repeatedly, and that would be orders of magnitude more expensive than even large organic traffic.
Check with your ISP; they often offer some 10 or sometimes even 50 MB of free space to host your homepage and images under an account URL, e.g. http://example.com/~you. Uploading is easy: just FTP your files to their server and you're done!
The in-between approach (after dial-up ISPs, but before "cloud" meant people forgot they could do things themselves) was to host the website on your own computer. For example, an old laptop, a Raspberry Pi, or an always-on-anyway desktop.
I host my personal site this way.
I also host about 100 mostly-static, mostly-low-use websites at work similarly: with Apache on a couple of VMs. They are more reliable than the free versions of Netlify etc., and the systems work needed to maintain 100 static websites is little different from maintaining 5 static websites.
AWS S3 + Cloudfront.
The initial setup can be a little finicky, but once you do get it set up, the rest of the time you only do an S3 sync from your local dir to the S3 bucket.
It will also be stupid fast (CloudFront is an edge network) + have SSL + it will not go down if you ever see a surge in traffic.
I have several of these and they work beautifully. As far as cost goes: $12/domain per year + $0.50/month for the Route 53 zone + $0.20/month for the actual traffic and hosting the content in S3. So it's under $2/month for "enterprise grade".
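Once the bucket and distribution exist, the recurring publish step described above is two commands; the bucket name and distribution ID below are placeholders.

```shell
# Sync the local build to S3, then invalidate the CDN cache so the
# new pages are visible immediately.
publish_s3() {
  aws s3 sync ./public "s3://my-site-bucket" --delete
  aws cloudfront create-invalidation \
      --distribution-id E123EXAMPLE --paths "/*"
}
```

Invalidations beyond the free monthly allowance cost extra, so versioned asset filenames plus long cache headers are the cheaper habit.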
Only the domain and, to some extent, the S3 costs are the "fixed" costs that you can even predict at first. A surge in traffic will surely increase the egress data, and that is easily $0.05-0.10 per GB.
CloudFront is one of the most expensive CDNs out there, and after 2-3 years of use I came to realize how much money I was wasting. CDNs often work quite flawlessly if you have a good cache-busting mechanism, and I was paying a huge premium for what was essentially a set-once operation.
Let's talk numbers: how much egress traffic would you have?
At 10 GB per month you pay $0.80/month.
Yes, the traffic costs extra, but a static website should not be GBs (or even MBs) in size.
Also, if you have a lot of traffic, you should learn how to control the caching aspect.
I would also challenge you to look at what egress traffic costs in other places.
I put my site on S3, with Gatsby deployed from GitHub Actions automatically after merge to master, and put Cloudflare in front of it. I remember reading that Cloudflare might throttle the free plan at some point, but I never noticed! And it's free, so I can't complain!
Yes, I have a similar setup on Azure - Azure Storage account with an Azure CDN fronting it. It's less than a single US cent per month for the hosting and bandwidth costs atm.
GitHub Pages is static hosting. If you don’t need Jekyll, don’t use it. There’s nothing special about it.
GitHub Pages does NOT charge for custom domains.
You can configure any branch for GitHub Pages, or even just the /docs folder on the main branch.
[0] https://developers.cloudflare.com/pages/platform/limits
[1] https://developers.cloudflare.com/workers/platform/sites
nope, we do unlimited sites on pages! here's the free plan [1]:
> 1 build at a time
> 500 builds per month
> Unlimited sites
> Unlimited requests
> Unlimited bandwidth
you can upgrade to the pro ($20/mo) or business ($200/mo) plans for more concurrent builds and builds per month. still not priced per domain or site!
[1]: https://pages.cloudflare.com/#pricing
https://developers.cloudflare.com/pages/migrations/migrating...
"With GitLab Pages, you can publish static websites directly from a repository in GitLab. Use for any personal or business website."
That is Netlify and Cloudflare Pages' defining feature, and GitHub has had that for a few years now.
You can configure github projects to source from a different branch (including your main/master branch if you check in build artifacts directly)
You don't need that anymore. You can run GitHub Pages out of a folder like /docs on a branch, which makes life much simpler. That's how I run most of my sites.
- I've got several sites on gh-pages with custom domains and pay exactly $0
- I don't use Jekyll, I literally just `git push` a static repo
For everything else it expects your SPA to handle it.
https://xkcd.com/908/
If I switched to a different ISP, I used to be able to get a fixed IP for a one-time $40 fee. I think that offer is no longer available though.
[0] https://docs.github.com/en/github/working-with-github-pages/...
Disclaimer: I work at Vercel, happy to answer any questions.
[1]: http://vercel.com/
Disclaimer: I don't work at Vercel and I'm just a regular guy
[1] - https://neocities.org/
Example site: https://jeremyshaw.co.nz
Other than that I'm sure Github Pages and Netlify are fine, both very popular options.
Worst case, you put it on Amazon S3 and get charged a fraction of a dollar every month.
You can set up billing alarms and use alarms as triggers for actions.
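One concrete shape of that, as a sketch; billing metrics only live in us-east-1, and the SNS topic ARN is a placeholder you would create first.

```shell
# Alarm when the month-to-date estimated bill crosses $10.
billing_alarm() {
  aws cloudwatch put-metric-alarm --region us-east-1 \
    --alarm-name monthly-bill-over-10usd \
    --namespace AWS/Billing --metric-name EstimatedCharges \
    --dimensions Name=Currency,Value=USD \
    --statistic Maximum --period 21600 --evaluation-periods 1 \
    --threshold 10 --comparison-operator GreaterThanThreshold \
    --alarm-actions arn:aws:sns:us-east-1:123456789012:billing-alerts
}
```

Note that the alarm can only notify, or trigger an action you wire up yourself; AWS still has no native hard spending cap.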
I miss the old days :(
> So it's under 2$/month for "enterprise grade"

Except that it isn't.