Readit News
TrueDuality commented on A blog does not need “analytics”   thisdaysportion.com/posts... · Posted by u/FromTheArchives
_heimdall · 11 hours ago
I have always thought of blogs as being written primarily for the author. Maybe they write because they enjoy writing, or to think through something, or to leave notes for later.

When someone does it for the audience I always consider it more of a publication. Maybe that's just semantics, but that's been the distinction for me.

TrueDuality · 10 hours ago
I write primarily as a means to collect my thoughts and outcomes around projects. I keep analytics on my site not to optimize for any particular audience, but because it feels validating and like I'm contributing in another form.

I still see high traffic on a post explaining oddities in some of Route53's unintuitive behaviors and hope I'm making someone's day a little better in giving them a solution.

That drives me to write more.

TrueDuality commented on About Containers and VMs   linuxcontainers.org/incus... · Posted by u/Bogdanp
jiggawatts · 3 days ago
Which just validates my point that a generic-sounding domain is the wrong place to host content that even within the Linux ecosystem is a relatively minor player.
TrueDuality · 3 days ago
LXC far predates Docker, regardless of size or impact. It's not disingenuous when you were literally the foundation Docker packaged into a shiny, accessible tool.
TrueDuality commented on Internet Access Providers Aren't Bound by DMCA Unmasking Subpoenas–In Re Cox   blog.ericgoldman.org/arch... · Posted by u/hn_acker
sumtechguy · 3 days ago
Would have to read through Title II of the Communications Act and see what portions they are under. Title I is POTS and Title II is ISPs and cellphone providers. Title I tends to be much more strict.
TrueDuality · 3 days ago
Remember that under the last reign of the current president, information services were removed from Title II regulation. The FCC under Biden did vote to restore net neutrality last year, but that was challenged in court and never went into effect. It was ultimately overturned in January and we're left without net neutrality protections.
TrueDuality commented on Proposal: AI Content Disclosure Header   ietf.org/archive/id/draft... · Posted by u/exprez135
yahoozoo · 4 days ago
It says in the first paragraph it’s for crawlers and bots. How many humans are inspecting the headers of every page they casually browse? An immediate problem that could potentially be addressed by this is the “AI training on AI content” loop.
TrueDuality · 4 days ago
How many of the makers of these trash SEO sites are going to voluntarily identify their content as AI generated?
TrueDuality commented on SSL certificate requirements are becoming obnoxious   chrislockard.net/posts/ss... · Posted by u/unl0ckd
mrgaro · 4 days ago
What dictates that a certificate update needs to have a manual change process? I'd bet that it's just the legal team saying "this is how it's always been" instead of adjusting their interpretation as the environment around them changes.
TrueDuality · 4 days ago
The references I'd direct you to are NIST 800-53r5 controls CM-3 (Configuration Change Control) and CM-4 (Impact Analyses). Along with their enhancements, they require that configuration changes go through documented approval, security impact analysis, and testing before implementation. A certificate change is unfortunately considered a configuration change to the services.

Each change needs a documented approval trail. While you can get pre-approval for automated rotations as a class of changes, many auditors interpret the controls conservatively and want to see individual change tickets for each cert rotation, even routine ones.

TrueDuality commented on SSL certificate requirements are becoming obnoxious   chrislockard.net/posts/ss... · Posted by u/unl0ckd
Intermernet · 4 days ago
Since the advent of Let's Encrypt, ACME, and Caddy I haven't thought about SSL/TLS for more than about an hour per year, and that's only because I forget the steps required to set up auto-renewal. I pay nothing, I spend a tiny amount of time dealing with it, and it works brilliantly.

I'm not sure why many people are still dealing with legacy manual certificate renewal. Maybe some regulatory requirements? I even have a wildcard cert that covers my entire local network, generated and deployed automatically by a cron job I wrote about 5 years ago. It's working perfectly, and it would probably take me longer to track down exactly what it's doing than to rewrite it from scratch.

For 99.something% of use cases, this is a solved problem.

TrueDuality · 4 days ago
Speaking as someone who has worked in tightly regulated environments: certificates are kind of a nasty problem, and there are a couple of requirements that conflict with fully automating certificates.

- All certificates and authentication material must be rotated at regular intervals (no conflict here; this is the goal)

- All infrastructure changes need to have the commands to be executed, and the contents of the files involved, inspected and approved in writing by the change control board before being applied to the environment

That explicit approval of any changes made within the environment goes against automating these in any way, shape, or form. These boards usually meet monthly, or ad hoc for time-sensitive security updates, and usually have very long lists of changes to review, causing the agenda to constantly overflow into the next meeting.

You could probably still make it work as a standing priority agenda item, but it's still going to involve a manual process and review every month. I wouldn't want to manually rotate and approve certificates every month, and many of these requirements have been signed into law (at least in the US).

Starting to see another round of modernization initiatives, so maybe in the next few years something could be done...

TrueDuality commented on Everything I know about good API design   seangoedecke.com/good-api... · Posted by u/ahamez
0x1ceb00da · 6 days ago
So a refresh token on its own isn't more secure than a simple api key. You need a lot of plumbing and abuse detection analytics around it as well.
TrueDuality · 6 days ago
Almost every one of those benefits _doesn't_ require anything else. You need one more API endpoint to exchange refresh tokens for bearer tokens (over a simple static API key) and you get those benefits.
TrueDuality commented on Everything I know about good API design   seangoedecke.com/good-api... · Posted by u/ahamez
0x1ceb00da · 6 days ago
> The refresh token/bearer token combo is pretty powerful and has MUCH stronger security properties than a bare API key

I never understood why.

TrueDuality · 6 days ago
The quick rundown of the refresh token flow I'm referring to is:

1. Generate your initial refresh token for the user just like you would a random API key. You really don't need to use a JWT, but you could.

2. The client sends the refresh token to an authentication endpoint. This endpoint validates the token, then expires the refresh token and any prior bearer tokens issued to it. The client gets back a new refresh token and a bearer token with an expiration window (let's call it five minutes).

3. The client uses the bearer token for all requests to your API until it expires.

4. If the client wants to continue using the API, go back to step 2.
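
A minimal sketch of those four steps, with hypothetical names and in-memory stores for illustration (a real service would persist these and run the exchange behind an HTTPS endpoint):

```python
import secrets
import time

# Hypothetical in-memory stores; a real service would use a database.
REFRESH_TOKENS = {}  # refresh token -> {"user": ..., "bearer": ...}
BEARER_TOKENS = {}   # bearer token -> {"user": ..., "expires_at": ...}
BEARER_TTL = 300     # the five-minute window from step 2

def issue_refresh_token(user):
    """Step 1: provision a random refresh token, just like a plain API key."""
    token = secrets.token_urlsafe(32)
    REFRESH_TOKENS[token] = {"user": user, "bearer": None}
    return token

def refresh(refresh_token):
    """Step 2: one-time exchange. Expire the old refresh token and any
    bearer token issued against it, then hand back a fresh pair."""
    record = REFRESH_TOKENS.pop(refresh_token, None)  # single use
    if record is None:
        raise PermissionError("unknown or already-used refresh token")
    BEARER_TOKENS.pop(record["bearer"], None)  # revoke the prior bearer token
    bearer = secrets.token_urlsafe(32)
    BEARER_TOKENS[bearer] = {"user": record["user"],
                             "expires_at": time.time() + BEARER_TTL}
    new_refresh = secrets.token_urlsafe(32)
    REFRESH_TOKENS[new_refresh] = {"user": record["user"], "bearer": bearer}
    return new_refresh, bearer

def authenticate(bearer_token):
    """Steps 3-4: the bearer token works until it expires, then the
    client goes back through refresh()."""
    record = BEARER_TOKENS.get(bearer_token)
    if record is None or record["expires_at"] < time.time():
        raise PermissionError("expired or invalid bearer token")
    return record["user"]
```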

The benefits of that minimal version:

Client restriction and user behavior steering. With bearer tokens expiring quickly and refresh tokens being one-time use, it is infeasible to share a single credential between multiple clients. With easy provisioning, this will get users to generate one credential per client.

Breach containment and blast radius reduction. If your bearer tokens leak (logs are a surprisingly common source for these), they automatically expire even when left in backups or deep in the objects of your git repo. If a bearer token is compromised, it's only valid for your expiration window. If a refresh token is compromised and used, the legitimate client will be knocked offline, increasing the likelihood of detection. This property also lets you know whether a leaked refresh token was used at all before it was revoked.

Audit and monitoring opportunities. Every refresh creates a logging checkpoint where you can track usage patterns, detect anomalies, and enforce policy changes. This gives you natural rate limiting and abuse detection points.

Most security frameworks (SOC 2, ISO 27001, etc.) prefer time-limited credentials as a basic security control.

Add an expiration time to refresh tokens to naturally clean up access from broken or no-longer-used clients. Example: a daily backup script, with a 90-day expiration window on the refresh token. The backups would have to not run for 90 days before the token became an issue. If it was still needed, the effort is low: just provision a new key. After 90 days of failure you either already needed to perform maintenance on your backup system, or you had moved to something else without revoking the access keys.
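
That cleanup behavior is just one extra timestamp on the refresh token. A small sketch, assuming an in-memory store and the 90-day window from the backup example (names are hypothetical):

```python
import secrets
import time

REFRESH_WINDOW = 90 * 24 * 3600  # the 90-day window from the backup example

def mint_refresh_token(store, user):
    """Issue a refresh token that ages out on its own if never used."""
    token = secrets.token_urlsafe(32)
    store[token] = {"user": user, "expires_at": time.time() + REFRESH_WINDOW}
    return token

def check_refresh_token(store, token, now=None):
    """One-time-use check: pop the token and reject it past its window."""
    now = time.time() if now is None else now
    record = store.pop(token, None)
    if record is None or now > record["expires_at"]:
        return None  # a broken or abandoned client simply loses access
    return record["user"]
```

A client that refreshes even occasionally keeps sliding the window forward; one that disappears for 90 days is cleaned up without anyone having to revoke anything.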

TrueDuality commented on Everything I know about good API design   seangoedecke.com/good-api... · Posted by u/ahamez
maxwellg · 6 days ago
Refresh tokens are only really required if a client is accessing an API on behalf of a user. The refresh token tracks the specific user grant, and there needs to be one refresh token per user of the client.

If a client is accessing an API on behalf of itself (which is a more natural fit for an API Key replacement) then we can use client_credentials with either client secret authentication or JWT bearer authentication instead.

TrueDuality · 6 days ago
That is a very specific form of refresh token, but not the only model. You can just as easily have your "API key" be the refresh token. You submit it to an authentication endpoint, get back a new refresh token and a bearer token, and invalidate the previous bearer token if it was still valid. The bearer token will naturally expire; if you're still using the API, use the refresh token immediately, and if it's days or weeks later you can use it then.

There doesn't need to be any OIDC or third party involved to get all the benefits of them. The keys can't be used by multiple simultaneous clients, they naturally expire and rotate over time, and you can easily audit their use (primarily due to the last two properties).
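
The client side of this model is correspondingly small. A hypothetical sketch (the endpoint URL, field names, and response shape are assumptions, not a real API):

```python
import json
import time
import urllib.request

AUTH_URL = "https://api.example.com/auth/refresh"  # hypothetical endpoint

class TokenClient:
    """Holds the current refresh/bearer pair and re-exchanges on expiry."""

    def __init__(self, refresh_token, exchange=None):
        self.refresh_token = refresh_token
        self.bearer = None
        self.expires_at = 0.0
        # exchange(refresh_token) -> (new_refresh, bearer, ttl_seconds);
        # injectable so the HTTP call below can be stubbed out
        self.exchange = exchange or self._http_exchange

    def _http_exchange(self, refresh_token):
        # Assumed JSON request/response shape for the auth endpoint.
        req = urllib.request.Request(
            AUTH_URL,
            data=json.dumps({"refresh_token": refresh_token}).encode(),
            headers={"Content-Type": "application/json"})
        with urllib.request.urlopen(req) as resp:
            body = json.load(resp)
        return body["refresh_token"], body["bearer_token"], body["expires_in"]

    def bearer_token(self):
        """Return a valid bearer token, re-exchanging only when needed."""
        if self.bearer is None or time.time() >= self.expires_at:
            self.refresh_token, self.bearer, ttl = self.exchange(self.refresh_token)
            self.expires_at = time.time() + ttl
        return self.bearer
```

Persist `self.refresh_token` after every exchange (a file with tight permissions is enough for a cron job), since the old token is dead the moment the new pair comes back.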

TrueDuality commented on Everything I know about good API design   seangoedecke.com/good-api... · Posted by u/ahamez
cyberax · 6 days ago
> You should let people use your APIs with a long-lived API key.

Sigh... I wish this were not true. It's a shame that no alternatives have emerged so far.

TrueDuality · 6 days ago
There are other options that allow long-lived access with naturally rotating keys without OAuth and only a tiny amount of complexity increase that can be managed by a bash script. The refresh token/bearer token combo is pretty powerful and has MUCH stronger security properties than a bare API key.

u/TrueDuality

Karma: 2178 · Cake day: July 21, 2016