ve55 · 7 years ago
Disagree with HSTS being 'dangerous' in 2019. There aren't really any good excuses left for having any part of your website (new/different subdomains included) unable to use https. On the other hand, HPKP is a lot easier to mess up and is more situational, but HSTS should be standard by now.

The author's recommendations are still good (If everyone tried to set up strict HPKP+CSP on their websites, I can imagine how many would break), but I view things like "If you've a sub-site that you never got round to securing (e.g. http://blog.example.com), then you've just forced yourself to upgrade with this policy." as a positive, not a negative (hence the word 'upgrade').
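
For illustration, here's a minimal sketch of what "always https plus HSTS" amounts to at the server, using Python's standard library. The hostname, port, and max-age below are placeholders, not recommendations:

    # Sketch: redirect plain HTTP to HTTPS and send HSTS on the secure side.
    # example.com, the port, and the max-age value are illustrative.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class RedirectHandler(BaseHTTPRequestHandler):
        """Plain-HTTP side: permanently redirect everything to HTTPS."""
        def do_GET(self):
            self.send_response(301)
            self.send_header("Location", "https://example.com" + self.path)
            self.end_headers()

    class SecureHandler(BaseHTTPRequestHandler):
        """HTTPS side (socket wrapped with ssl elsewhere): HSTS on every response."""
        def do_GET(self):
            self.send_response(200)
            # A year-long policy covering subdomains; browsers remember it and
            # refuse to load http:// versions of the site until it expires.
            self.send_header("Strict-Transport-Security",
                             "max-age=31536000; includeSubDomains")
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"served over https\n")

    if __name__ == "__main__":
        # Local test listener; production would bind port 80 for the redirect
        # and a TLS-wrapped listener on 443 for SecureHandler.
        HTTPServer(("127.0.0.1", 8080), RedirectHandler).serve_forever()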

ficklepickle · 7 years ago
I definitely agree there's no excuse not to use TLS anymore.

I recently ran into this on a City of Vancouver website for voting information. While any page that had a form used HTTPS, all other pages forced the user onto HTTP. For those pages, it redirected to the unencrypted version even if you typed in https://. Including the page with polling location information.

So any malicious actor in a privileged position, like a public WiFi network operator, could have prevented people from getting accurate polling location information, effectively DoSing prospective voters.

I tried to bring this to their attention, but I got a response telling me their IT guy says it's not a problem because no user data is submitted on those pages. Never mind that it was probably more work to selectively enable TLS per URL, and that it leaves important content vulnerable to manipulation. Incredible!

It's like the old "all of our download links use HTTPS", but the downloads page is served unencrypted. Frustrating.
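
If you want to spot that kind of downgrade yourself, here's a small sketch using only the Python standard library (the URL is just an example):

    # Sketch: detect an HTTPS -> HTTP downgrade by following redirects and
    # checking the scheme of the final URL. The URL below is illustrative.
    from urllib.request import urlopen
    from urllib.parse import urlsplit

    def downgrades_to_http(url: str) -> bool:
        """Return True if requesting `url` ends up on a plain-HTTP page."""
        with urlopen(url) as resp:          # urlopen follows redirects by default
            final_url = resp.geturl()
        return urlsplit(final_url).scheme == "http"

    if __name__ == "__main__":
        print(downgrades_to_http("https://example.com/"))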

sildur · 7 years ago
Maybe all they can afford is a really old or weak server machine, and they're afraid of the extra computational cost that enabling https on all pages would incur.
lisper · 7 years ago
> there's no excuse not to use TLS anymore

I have a personal site that consists entirely of low-value static assets:

http://flownet.com/ron/

Why should I use TLS for that?

Boulth · 7 years ago
HPKP as people know it is being withdrawn from browsers so there is no need to discuss it further.

Source: https://chromestatus.com/feature/5903385005916160

nijave · 7 years ago
Caching. If Netflix wants to let company or university X cache their shows, they can just distribute encrypted blocks of content over HTTP. Same with package distribution (although those are usually verified via signing).
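
As a sketch of what that verification amounts to, here's the idea reduced to a pinned SHA-256 digest checked after fetching over plain HTTP (a real package manager would verify a signature instead; the URL and digest are placeholders):

    # Sketch: fetch a content block over plain HTTP, but only accept it if it
    # matches a digest obtained out-of-band over a trusted channel.
    import hashlib
    from urllib.request import urlopen

    # Placeholder digest; in practice this comes from a signed manifest or
    # an HTTPS metadata endpoint.
    EXPECTED_SHA256 = "0" * 64

    def fetch_verified(url: str) -> bytes:
        with urlopen(url) as resp:
            data = resp.read()
        if hashlib.sha256(data).hexdigest() != EXPECTED_SHA256:
            raise ValueError("content does not match the pinned digest; discarding")
        return data
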
JoshTriplett · 7 years ago
SXG would allow that without breaking security: https://wicg.github.io/webpackage/draft-yasskin-http-origin-...

And Netflix and similar already handle caching in a much better way: they effectively hand CDN nodes to ISPs, and then direct requests specifically to the right CDN.

jeroenhd · 7 years ago
Netflix can't really be cached anyway because of DRM + rotating keys. You'd need to cache for every device (type/model) and get the bigwigs from Hollywood to accept allowing data storage in caches everywhere.

Also, I don't want my employer to know what movies I'm watching. I'd personally opt out (or not use Netflix) if this became an option.

geofft · 7 years ago
(2015), edited (2018).

I've had week-long HSTS on my personal website for a few years (which is short enough that most clients ignore it) out of an abundance of caution/FUD, and it hasn't really been a problem - I have had periods where my cert expired (for complicated reasons, I renew Let's Encrypt certs manually every three months, and sometimes I don't get around to it in time), but I didn't remove the regular HTTP 301 to HTTPS during that time. So I don't think permanent / preloaded HSTS would have been a problem.

On other sites I've set up since then, I've built them on top of hosting that assumes reliable HTTPS and renews it for me, e.g., Twisted with txacme or AWS Cloudfront with Amazon's CA. So I've been able to assume working HTTPS from day one.

In October 2017, Google announced plans to deprecate self-service HPKP for exactly the reasons outlined in this article, and the deprecation took effect in late 2018. See https://developers.google.com/web/updates/2018/04/chrome-67-... and the links provided there. If you're a major site, you really, really know what you're doing, and you're confident the risks outlined in this article don't apply to you, you can still get a hard-coded HPKP entry in the browser source code.
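
If you're in a similar manual-renewal situation, here's a sketch for keeping an eye on both the cert expiry and the HSTS policy, standard library only (the hostname is a placeholder):

    # Sketch: check a host's certificate expiry date and its HSTS header.
    import socket
    import ssl
    from http.client import HTTPSConnection

    HOST = "example.com"   # placeholder

    def cert_not_after(host: str) -> str:
        ctx = ssl.create_default_context()
        with socket.create_connection((host, 443)) as sock:
            with ctx.wrap_socket(sock, server_hostname=host) as tls:
                # e.g. 'Jun  1 12:00:00 2026 GMT'
                return tls.getpeercert()["notAfter"]

    def hsts_header(host: str):
        conn = HTTPSConnection(host)
        conn.request("HEAD", "/")
        return conn.getresponse().getheader("Strict-Transport-Security")

    if __name__ == "__main__":
        print("cert expires:", cert_not_after(HOST))
        print("HSTS policy:", hsts_header(HOST))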

tunetheweb · 7 years ago
As others have pointed out, this post is from 2015 and a lot has changed since then. I've added an updated section at the end, to clarify my thoughts since then (mostly unchanged to be honest, except for CSP): https://www.tunetheweb.com/blog/dangerous-web-security-featu...
Cpoll · 7 years ago
> The impact of an incorrect CSP policy, or browser issue could vary from a "Tweet This" button not loading (no big deal), to ads not loading (hurting your income), to stylesheets not loading (basically your whole website is broken).

I really don't like the sentiment that you shouldn't add a security feature because it might be difficult. Any change comes with a risk of regression, but CSP isn't even domain-level like the other ones; it only affects the resources it's attached to. It shouldn't be any scarier than making any other change to your site/app.

Replace with 'the impact of replacing your md5-ed passwords with properly hashed ones... basically your whole website is broken'. Of course, that's a reductio ad absurdum in some cases, but if it protects you from XSS it might be a good analogy.
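
For what that migration actually looks like in practice, here's a sketch that upgrades legacy hashes lazily at the next successful login (the storage format and function name are made up for illustration):

    # Sketch: upgrade legacy unsalted-MD5 password hashes to PBKDF2 on the next
    # successful login, with no flag day. Storage format is invented here.
    import hashlib
    import hmac
    import os

    ITERATIONS = 600_000

    def verify_and_upgrade(password: str, stored: str) -> tuple[bool, str]:
        """Return (password_ok, value_to_store_going_forward)."""
        if stored.startswith("pbkdf2$"):
            _, salt_hex, hash_hex = stored.split("$")
            derived = hashlib.pbkdf2_hmac("sha256", password.encode(),
                                          bytes.fromhex(salt_hex), ITERATIONS)
            return hmac.compare_digest(derived.hex(), hash_hex), stored
        # Legacy path: stored value is an unsalted MD5 hex digest.
        if hmac.compare_digest(hashlib.md5(password.encode()).hexdigest(), stored):
            salt = os.urandom(16)
            derived = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
            return True, f"pbkdf2${salt.hex()}${derived.hex()}"
        return False, stored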

tunetheweb · 7 years ago
OP here and I disagree. Implementing a CSP for a single page is hard (given the many different browsers); implementing it for a whole site is really hard! And yes, it does pretty much need to be "domain level" to be effective.

It's easy to test if a password algorithm change fails, not so much for CSP. And the reporting options are next to useless because they are so noisy.

That's not to say people shouldn't implement CSP - it's a great option (now - less so in 2015 when this post was written). But they shouldn't just copy and paste a CSP policy from a random blog post they found, get an A+ on a security scanning tool and feel proud, without realising that they may have broken part of their website or implemented a pointless CSP. That was the intention of this post, and apologies if it read as "don't use them because they are hard".
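
For reference, the usual way to trial a policy without breaking anything is report-only mode, then flipping to enforcement once the noise settles; a sketch as WSGI middleware (the policy string and report endpoint are placeholders):

    # Sketch: send the CSP in Report-Only mode so violations are reported but
    # nothing is blocked. Policy and report endpoint are illustrative.
    REPORT_ONLY_CSP = "default-src 'self'; report-uri /csp-report"

    def csp_report_only_middleware(app):
        """WSGI middleware that appends a report-only CSP header to responses."""
        def wrapped(environ, start_response):
            def start_with_csp(status, headers, exc_info=None):
                headers = list(headers) + [
                    ("Content-Security-Policy-Report-Only", REPORT_ONLY_CSP)]
                return start_response(status, headers, exc_info)
            return app(environ, start_with_csp)
        return wrapped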

Cpoll · 7 years ago
> It's easy to test if a password algorithm change fails, not so much for CSP.

Probably a bad example, because the former is server-side. But why is CSP harder to test than any other client-side change, like rewriting your login page/component?

> And yes it does pretty much need to be "domain level" to be effective.

I meant to say that you can add it as an XSS prevention to example.com/app/ and not worry about example.com/static/ or example.com/blog/.
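
A sketch of that path scoping, again as WSGI middleware (paths and policy are illustrative):

    # Sketch: attach a CSP only to responses under /app/, leaving /static/ and
    # /blog/ untouched. Paths and the policy string are illustrative.
    APP_CSP = "default-src 'self'"

    def path_scoped_csp_middleware(app):
        def wrapped(environ, start_response):
            in_scope = environ.get("PATH_INFO", "").startswith("/app/")
            def start_maybe_csp(status, headers, exc_info=None):
                if in_scope:
                    headers = list(headers) + [("Content-Security-Policy", APP_CSP)]
                return start_response(status, headers, exc_info)
            return app(environ, start_maybe_csp)
        return wrapped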

maletor · 7 years ago
Can we add minimum password complexity requirements to this list? There is nothing more annoying than having to adjust my password, which already has 128 bits of entropy, because the website feels I need a special character. Plus, now hackers have a guide for what the password looks like.
bqe · 7 years ago
NIST 800-63b actually recommends against character class requirements[1] in favor of minimum length requirement and blacklists of breached passwords and other obvious passwords. Sites that require special characters are not following the current best practice.

[1]: https://pages.nist.gov/800-63-3/sp800-63b.html
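
A sketch of that style of check: a minimum length plus a breached/common-password list, and deliberately no character-class rules (the list file is a placeholder):

    # Sketch of a NIST 800-63B style check: minimum length and a blocklist of
    # breached/obvious passwords, with no special-character requirement.
    def load_blocklist(path: str = "breached-passwords.txt") -> set:
        # Placeholder file: one password per line.
        with open(path, encoding="utf-8") as f:
            return {line.strip() for line in f}

    def password_acceptable(password: str, blocklist: set) -> bool:
        if len(password) < 8:        # 800-63B minimum for user-chosen secrets
            return False
        if password in blocklist:    # known-breached or obvious passwords
            return False
        return True                  # note: no character-class rules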

tpetry · 7 years ago
Isn't any obvious password already in the list of breached passwords? ;)

userbinator · 7 years ago
And then we come to the most dangerous item, because you have the least control over it: preloading HSTS right into the browser.

I can't say specifically why, but there's something about a browser that treats a certain list of sites specially, by default, that just doesn't sit well with me. I've had this feeling ever since I heard about the feature. Not exactly net neutrality, but somewhat reminds me of it.

rocqua · 7 years ago
Really, this solution exists because the inverse would be too extreme: we can't say "https by default, except for sites on this list". So instead we go with a list of https-only sites. Really though, most traffic should move to https.

Moreover, this solves a real problem. We want sites like paypal, facebook, and gmail to really demand HTTPS. There should not be a race to MitM fresh browser installs.
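
Concretely, getting onto the preload list means a site's HSTS header has to meet the submission requirements (roughly: a max-age of at least a year, includeSubDomains, and the preload token - the exact rules are defined by hstspreload.org). A sketch that checks a header string against them:

    # Sketch: check whether an HSTS header string roughly meets the usual
    # preload-list requirements. See hstspreload.org for the exact rules.
    def meets_preload_requirements(header: str) -> bool:
        directives = [d.strip().lower() for d in header.split(";")]
        max_age = 0
        for d in directives:
            if d.startswith("max-age="):
                max_age = int(d.split("=", 1)[1])
        return (max_age >= 31536000                 # at least one year
                and "includesubdomains" in directives
                and "preload" in directives)

    print(meets_preload_requirements("max-age=63072000; includeSubDomains; preload"))  # True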

defanor · 7 years ago
Sounds similar to web browsers coming with their own CA certificates instead of using system-wide ones, leading to poor integration and inconsistencies. Though a centralized database of rules for websites sounds awkward on its own.
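
The same split shows up outside browsers too; a sketch contrasting where Python's ssl module looks for the system CAs with a bundled store like certifi (a third-party package, used here only for contrast):

    # Sketch: system-wide trust store (what ssl uses by default) versus a
    # bundled CA store shipped as a package (certifi, Mozilla-derived).
    import ssl

    print(ssl.get_default_verify_paths())    # where the platform/OpenSSL CAs live

    try:
        import certifi                        # third-party bundled CA list
        print(certifi.where())                # path to the bundled cacert.pem
    except ImportError:
        print("certifi not installed; only the system store is available")
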
tialaramex · 7 years ago
> Sounds similar to web browsers coming with their own CA certificates instead of using system-wide ones, leading to poor integration and inconsistencies.

Well, the only notable browser which does this is Mozilla's Firefox, and not coincidentally the only root trust store where you can actually see how the sausage is made is Mozilla's. All the other big trust stores (Apple, Microsoft, Google) are black boxes. Presumably they have one or more employees dedicated to this stuff, but since we're not shown their working it might equally be the product of an intern throwing darts at a list.

Right now, for example, Mozilla is discussing Certinomis, a small CA which doesn't seem to be very good at the technical aspects of its job: issuing certificates for DNS names with spaces in them, typo'ing dates, filling parameters out incorrectly. Nothing that screams "evil", but certainly clumsier than we'd prefer. Are other trust stores thinking about Certinomis? You'll only find out if one of them announces a change.

nokya · 7 years ago
This article is probably much more harmful than the security features it describes as dangerous...