embedding-shape · a month ago
I don't know exactly what the website was, but if it's just HTML, CSS, some JS, and some images, why would you ever host that on a "pay per visit/bandwidth" platform like AWS? Not only is AWS traffic extra expensive compared to pretty much any alternative, paying for bandwidth in that manner never made much sense to me. Even shared hosting like we used in the early 00s would have been a better solution for hosting a typical website than using AWS.
throwup238 · a month ago
Especially when Cloudflare Pages is free with unlimited bandwidth, if you don't need any other backend. The only limits are 100 custom domains and 500 builds per month in their CI/CD; the latter you can bypass by building everything in GitHub Actions and pushing it to Pages.
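For example, a minimal GitHub Actions workflow along these lines should work (a sketch, assuming a wrangler-based direct upload; the project name, build step, and output directory are placeholders):

    name: deploy-pages
    on:
      push:
        branches: [main]
    jobs:
      deploy:
        runs-on: ubuntu-latest
        steps:
          - uses: actions/checkout@v4
          # Build the static site however you like; placeholder step.
          - run: make html
          # Direct upload to Cloudflare Pages, bypassing their CI/CD build quota.
          - run: npx wrangler pages deploy ./public --project-name=my-docs
            env:
              CLOUDFLARE_API_TOKEN: ${{ secrets.CLOUDFLARE_API_TOKEN }}
              CLOUDFLARE_ACCOUNT_ID: ${{ secrets.CLOUDFLARE_ACCOUNT_ID }}

Since the build happens on GitHub's runners, direct uploads like this don't count against the 500-builds-per-month limit.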
fragmede · a month ago
Because Cloudflare pages has this doofy deploy limit hanging over your head. Even if you won't reasonably run into it, it's still weird. R2's free limits are significantly higher.
anamexis · a month ago
AWS CloudFront pricing seems pretty competitive with other CDNs, at least for sites that are not very high traffic.

https://aws.amazon.com/cloudfront/pricing/

embedding-shape · a month ago
> AWS CloudFront pricing seems pretty competitive with other CDNs

Sure, but it's unlikely you actually need to place a CDN in front of a manual; it's mostly text with a few images. People default to using CDNs way too quickly today.

SatvikBeri · a month ago
Seems like the maintainer just tried AWS out of curiosity, and never needed to optimize hosting until scrapers suddenly slammed the site.
tarsius · a month ago
That's exactly what I did! Though I would call it morbid curiosity. :P

After the initial setup it was smooth sailing. Other, more reasonable setups would also have been smooth sailing, but... they weren't set up yet. I was uneasy about the possibility of a surprise bill happening, as it eventually did, but until the brain-dead LLM leeches came along, that just never happened. After a decade of it not happening, I wasn't that concerned anymore, but I guess when it comes to the AI bots, I had my head in the sand a bit. I still thought something like a 500% bill might happen, not 5000%.

Once it did happen, I immediately shut my sites down, and within the hour the account was no more. On the way out I saw that you can now actually set a "spending limit"; it still had a [new] next to it. I tried setting it up, but could only quickly figure out how to set up a notification. It might be possible to set an actual spending limit, but not in a few minutes -- you probably have to read some documentation for that.

But even if this were a one-click setting, it wouldn't have made a difference at this point. You do this once and I am gone. Also, I wanted to move away from Amazon anyway, so really, this was the kick in the pants that I needed.

For now I am using GitHub Pages for the very static parts, and the free hosting provided by my email provider for the slightly less static manuals generated with GitHub Actions. It would have made sense to use GitHub for both (not least so that Microsoft could cover the cost of the bots they have unleashed), but I wanted to avoid the complexity of committing to the same pages repository from the CI pipelines of multiple package repositories.

AceJohnny2 · a month ago
I love Magit; it's the only Git GUI I can stomach (it helps that I use Emacs already).

I donated a bit of money to help tarsius offset the cost of AWS LLM abuse, well deserved for the value I've gotten from his tools.

tarsius · a month ago
Thanks!

Yesterday evening I saw that I had a few new sponsors and was wondering where they had come from.

So in the end something good came of it. The one-time donations covered the bill, and I also got a few new monthly sponsors. (Well, unless you also take into account the hours it took me to move to new hosting; then it's way, way below minimum wage. But as a maintainer of free software, I am used to that by now.)

Sooo... I guess I should take the opportunity and do a bit of marketing. I am still making a living maintaining Magit et al., so please consider sponsoring my day-to-day work too. Thanks!

kace91 · a month ago
>I immediately burned down my account with that hosting provider, because they did not allow setting a spending limit.

Is this true? He mentions the provider being AWS, surely some sort of threshold can be set?

nijave · a month ago
If it's AWS, yes, it's true. All the billing is async, and some of it updates as slowly as daily (although it can be very granular/accurate).

In addition, it's a pay-per-use platform.

forgotpwd16 · a month ago
Unless something has changed recently, all you can do is set budget alerts on billing updates. Runaway costs for people simply testing AWS are common. (On the bright side, again unless something has changed recently, asking support to waive the charges works.)
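For reference, a budget alert can be created from the CLI with something like this (a sketch; the account ID, amount, and email address are placeholders):

    # Email an alert once actual spend passes 80% of a $10 monthly budget.
    # Note: this only notifies; it does not cap spending.
    aws budgets create-budget \
      --account-id 111111111111 \
      --budget '{"BudgetName":"monthly-cap","BudgetLimit":{"Amount":"10","Unit":"USD"},"TimeUnit":"MONTHLY","BudgetType":"COST"}' \
      --notifications-with-subscribers '[{"Notification":{"NotificationType":"ACTUAL","ComparisonOperator":"GREATER_THAN","Threshold":80,"ThresholdType":"PERCENTAGE"},"Subscribers":[{"SubscriptionType":"EMAIL","Address":"me@example.com"}]}]'

And because billing data lags, the alert can fire well after the damage is done.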
no_wizard · a month ago
As far as I am aware, there is not. It’s been a long standing complaint about the platform.
electroly · a month ago
There are two widely understood downsides of AWS:

1. High egress costs

2. No hard spending limits

Both of these were problems for the author. I don't mean to "blame the victim" but the choice of AWS here had a predictable outcome. Static documentation is the easiest content to host and AWS is the most expensive way to host it.

nijave · a month ago
Really high bandwidth costs in general. I've never worked anywhere large enough to hit them, but I've heard inter-AZ traffic in the same region can become quite expensive once you're big enough.
herewulf · a month ago
This is a good reminder of the real financial costs incurred by maintainers of your favorite Emacs packages.

Here's a nice repo with details on how to support them!

https://github.com/tarsius/elisp-maintainers

Also worth pointing out that the author of Magit has made the unusual choice to make a living off developing Emacs packages. I've been happy to pitch in my own hard-earned cash in return for the awesomeness that Magit is!

cratermoon · a month ago
"Thanks to LLM scrapers, hosting costs went up 5000% last month"
ssivark · a month ago
Uggghhhh! AI crawling is fast becoming a headache for self-hosted content. Is using a CDN the "lowest effort" solution? Or is there something better/simpler?
embedding-shape · a month ago
Nah, just add a rate limiter (which any public website should have anyway). Alternatively, add some honeypot URLs to robots.txt, then set up fail2ban to ban any IP accessing those URLs, and you'll get rid of 99% of the crawling in half a day.
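Roughly like this (a sketch, assuming nginx logging in the default combined format; the honeypot path, log location, and ban time are arbitrary):

    # robots.txt -- well-behaved crawlers will skip this path,
    # so anything requesting it is fair game.
    User-agent: *
    Disallow: /honeypot/

    # /etc/fail2ban/filter.d/honeypot.conf
    [Definition]
    failregex = ^<HOST> .* "(GET|POST) /honeypot/

    # /etc/fail2ban/jail.local
    [honeypot]
    enabled  = true
    port     = http,https
    filter   = honeypot
    logpath  = /var/log/nginx/access.log
    maxretry = 1
    bantime  = 86400

With maxretry = 1, a single request to the honeypot path gets the IP banned for a day.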
nijave · a month ago
Depending on the content and software stack, caching might be a fairly easy option. For instance, WordPress's W3 Total Cache used to be pretty easy to configure and could easily take a small VPS from 6-10 req/sec to 100-200 req/sec.

There are also solutions for generating static sites instead of a "dynamic" CMS that stores everything in a DB.

If it's new, I'd say the easiest option is to start with a content hosting system that has built-in caching (assuming that exists for what you're trying to deploy).

d4rkp4ttern · a month ago
Magit was my favorite terminal/TUI way to interact with Git, until I found GitUI.

phplovesong · a month ago
I quit Emacs 10 years ago. But I have fond memories of Magit. Why was the manual taken offline?
IceDane · a month ago
Why didn't you just follow the link and find out?
agumonkey · a month ago
he's waiting for the magit-transient shortcut :fff
phplovesong · a month ago
The link did not show any data on the why. I'm not reading a long blame game about the why on some random issue.