Responsible Disclosures and their consequences have been a disaster for the human race. Companies need to feel a lot more pain a lot more often in order to take the security of their customers a lot more seriously. If you just give them a month to fix an issue and spoon-feed them the solution, it's just another ticket in their backlog. But if every other security issue makes enough news online that their CEOs get involved and a solution must be found in hours, not months, they will become a lot more proactive. Of course it's the end users that would suffer most from this. But then again, they buy ASUS, so they suffer already...
I think ASUS' turnaround time on this was quite good; I don't see the problem here. ASUS didn't deny the bug, didn't threaten to prosecute anyone for reverse engineering their software, and quickly patched it. I have no doubt that before the days of responsible disclosure, this process would've taken months and might have involved the police.
Normal people don't care about vulnerabilities. They use phones that haven't received updates in three years to do their finances. If you spam the news with CVEs, people will just get tired of hearing about how every company sucks and become apathetic once there's a real threat.
The EU is working on a different solution. Stores are not permitted to sell products with known vulnerabilities under new cybersecurity regulations (the EU's Cyber Resilience Act). That means if ASUS keeps fucking up, their motherboards become dead stock and stores won't want to sell their hardware anymore. That's not just computer hardware, but also smart fridges and smart washing machines. Discover a vulnerability in your dishwasher and you may end up costing the dishwasher industry millions in unusable stock if the vendors haven't bothered to add a way to update the firmware.
>They say “This issue is limited to motherboards and does not affect laptops, desktop computers”, however this affects any computer that has DriverHub installed, including desktops and laptops
>instead of them saying it allows for arbitrary/remote code execution they say it “may allow untrusted sources to affect system behaviour”.
> Stores are not permitted to sell products with known vulnerabilities under new cybersecurity regulations.
What are the specifics on that? Does the vulnerability need to be public, or is it enough if just the vendor knows about it? Does everyone need to stop selling it right away when a new vulnerability is discovered, or do they get some time to patch it? Software like Windows almost certainly has unfixed vulnerabilities that Microsoft knows about and is in the process of fixing on every single day of the year. Currently, even if they do have a fix, they would end up postponing it until the next Patch Tuesday.
And what even is "vulnerability" in this context? Remote RCE? DRM bypass?
"Responsible" disclosure is paradoxically named because actually it is completely irresponsible. The vast majority of corporations handle disclosures badly in that they do not fix in time (i.e. a week), do not attribute properly, do not inform their users and do not learn from their mistakes. Irresponsibly delayed limited disclosure reinforces those behaviors.
The actually responsible thing to do is to disclose immediately, fully and publically (and maybe anonymously to protect yourself). Only after the affected company has repeatedly demonstrated that they do react properly, they might earn the right for a very time-limited heads-up of say 5 work days or something.
That irresponsibly delayed limited disclosure is even called "responsible disclosure" is an instance of newspeak.
I make software. If you discover a vulnerability, why would you put my tens of thousands of users at risk instead of emailing me and having the vulnerability fixed in an hour before disclosing?
I get that companies sit on vulnerabilities, but isn't fair warning... fair?
That's because nobody actually cares about security, nor do they want to pay for it. I'm a security champion at my company, and security-related work gets pushed off as much as possible to focus on feature work. If we actually wanted security to be a priority, we would employ security champions whose only job was to work on the security aspects of the system, instead of trying to balance security and feature work, because feature work will always prevail.
> "Responsible" disclosure is paradoxically named because actually it is completely irresponsible.
It's only paradoxical if you've never considered the inherent conflicts present in everything before.
The "responsible" in "responsible disclosure" relates to the researchers responsibility to the producer, not the companies responsibility to their customers. The philosophical implication is that the product does what it was designed to do, now you (the security researcher) is making it do something you don't think it should do, and so you should be responsible for how you get that out there. Otherwise you are damaging me, the corporation, and that's just irresponsible.
As software guys we probably consider security issues a design problem. The software has a defect, and it should be fixed. A breakdown in the responsibility of the corporation to their customer. "Responsible disclosure" considers it external to the software. My customers are perfectly happy, you have decided to tell them that they shouldn't be. You've made a product that destroys my product, you need to make sure you don't destroy my product before you release it.
The security researcher is not primarily responsible to the public, they are responsible to the corporation.
It's not a paradox, it's just a simple inversion of responsibility.
What about damage control? I would argue your "anonymous, immediate disclosure" to the public (filled with bad actors) would be rubbing salt in the wound, allowing more people to exploit the vulnerability before it's fixed. That's why nobody publishes writeups before the vuln is fixed. Even if corporations don't fix vulns in time, I can only see harm being done by not privately reporting them.
I mean, to be a bit more reasonable, there's a middle ground here. Maybe disclosing a massive RCE vulnerability in software used by a lot of companies on the 25th of December is not a good idea. And perhaps an open source dev with a security@project mail deserves a tad more help and patience than a megacorp with a record of shitty security management. And a company that takes security seriously and is responsive to security researchers' inquiries deserves at least the chance to fix it fast and before it becomes public.
It's just that there are some companies EVERYONE knows are shitty. ASUS is one of them.
The problem is just one of legislating liability. Car manufacturers are ordered to recall and fix their cars, but software/hardware companies face far too little pressure. I think customers should be able to get a full refund for broken devices (with unfixed CVEs, for example).
The devices and their core functionality (including security updates, which are fixes to broken core functionality) must survive the manufacturer and should not require ongoing payments of any type*. (*New updates being created? Maybe. Access to corrections of basic behavior? Bug/security fixes should remain free.)
Citing CGP Grey: solutions that are the first thing you can think of are terrible and ineffective.
Good safety/security culture encourages players to not hide their problems. Corporations are greedy bastards. They'll do everything to hide their security mistakes.
You are also making legitimate, fixable-in-a-month issues visible to everyone, which greatly increases their chances of being exploited.
> You are also making legitimate, fixable-in-a-month issues visible to everyone, which greatly increases their chances of being exploited.
I don't think you can fathom the number of people whose primary device is a phone that hasn't seen an Android update in roughly three years, and who use it for every digital service they have: banking, texting, doomscrolling, porn, ...
Users, especially those most likely to be exploited, are already vulnerable to so much shit, and even when there's a literal finished fix available, these vendors do shit about it. Only when their bottom line is threatened, because even my mom knows "don't buy anything with ASUS on it, or your bank account gets broken into", will we see change.
> Good safety/security culture encourages players to not hide their problems. Corporations are greedy bastards. They'll do everything to hide their security mistakes.
This is why I despise the Linux CNA for working against the single system that tries to hold vendors accountable. Their behavior is infantile.
Business idea. Maybe this already exists. A disclosure aggregator/middleman which:
- protects the privacy of folks submitting
- vets security vulns. Everything they disclose is exploitable.
- publishes disclosures publicly at a fixed cadence.
- allows companies to pay to subscribe to an "early feed" of disclosures which impact them. This money is used to reward those submitting disclosures, pay the bills, and take some profit.
A bug bounty marketplace, if you will. That is slightly hostile to corporations. Would that be legal, or extortion?
> I asked ASUS if they offered bug bounties. They responded saying they do not, but they would instead put my name in their “hall of fame”. This is understandable since ASUS is just a small startup and likely does not have the capital to pay a bounty.
I wonder how worried they would get if more people actually started selling exploits on the black market instead of reporting them for no bounty. Then again, since they don't offer a bug bounty program in the first place, my gut feeling is that they probably wouldn't care in that case either. Either way, this is a super good reason not to do business with such a company.
For me it's them lying about providing a way to unlock the bootloader of my soon-to-be 1000€ paperweight (2 Android updates only) called an Asus Zenfone 10.
>so I could see if anyone else had a domain with driverhub.asus.com.* registered. From looking at other websites certificate transparency logs, I could see that domains and subdomains would appear in the logs usually within a month. After a month of waiting I am happy to say that my test domain is the only website that fits the regex, meaning it is unlikely that this was being actively exploited prior to my reporting of it.
This only remains true insofar as no one directly registered a driverhub subdomain. Anyone with a wildcard could have exploited this, silent to certificate transparency?
A wildcard certificate only covers a single label level: '*.example.com.' would not allow 'test.test.example.com.', but would allow 'test.example.com.'. If someone issued a wildcard for '*.asus.com.example.com.', they could present a webserver under 'driverhub.asus.com.example.com.' and be seen as valid.
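A tiny sketch of that single-label rule in Python (a hypothetical helper, not any TLS library's API; real validation follows RFC 6125):

    # Sketch of the single-label wildcard rule (RFC 6125): '*' covers
    # exactly one DNS label, only in the leftmost position.
    def wildcard_matches(pattern: str, hostname: str) -> bool:
        p = pattern.rstrip(".").split(".")
        h = hostname.rstrip(".").split(".")
        if len(p) != len(h):
            return False  # a wildcard cannot span multiple labels
        return (p[0] == "*" or p[0] == h[0]) and p[1:] == h[1:]

    assert wildcard_matches("*.example.com.", "test.example.com.")
    assert not wildcard_matches("*.example.com.", "test.test.example.com.")
    assert wildcard_matches("*.asus.com.example.com.", "driverhub.asus.com.example.com.")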
You're right about the wildcard certificate blind spot. An attacker with a wildcard cert for *.example.com could have exploited this without appearing in CT logs specifically for driverhub.asus.com.* domains. This is why CT log monitoring alone isn't sufficient for detecting these types of subdomain takeover vulnerabilities.
It's 'driverhub.asus.com.example.com.', not 'driverhub.example.com.', and therefore entirely discoverable in CT logs by searching for the regex: (driverhub|\*)\.asus\.com\.
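To make that concrete, a rough sketch applying that regex to a few hypothetical SAN entries as they might appear in a CT log dump (the first two show up; the third is the wildcard from the blind-spot scenario, which wouldn't validate for the target name anyway):

    import re

    # The regex from above: catches both a direct 'driverhub.asus.com.'
    # prefix and a '*.asus.com.' wildcard one label up.
    pattern = re.compile(r"(driverhub|\*)\.asus\.com\.")

    sans = [
        "driverhub.asus.com.example.com",  # direct registration -> found
        "*.asus.com.example.com",          # wildcard cert -> also found
        "*.example.com",                   # never matches, but can't cover the target either
    ]
    for name in sans:
        print(f"{name}: {'found' if pattern.search(name) else 'not found'}")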
> When submitting the vulnerability report through ASUS’s Security Advisory form, Amazon CloudFront flagged the attached PoC as a malicious request and blocked the submission.
> This is understandable since ASUS is just a small startup.
A small startup with a market cap of only 15B. What should be more than understandable is that you give a shit not only about your crappy products but also about the researcher who did HUGE work for your customers.
I truly feel bad for researchers doing this kind of work only to be dismissed/trashed like this. So unfair.
The only thing that ought to be done is to not purchase ASUS products.
I asked ASUS if they offered bug bounties. They responded saying they do not, but they would instead put my name in their “hall of fame”. This is understandable since ASUS is just a small startup[1] and likely does not have the capital to pay a bounty.
I'm surprised to find that this is just a random person's blog. Was very prepared for an ad page, scalped domain, or some corporation trying to make money out of it. On the sadder side, it doesn't seem like this person makes any use of the domain's name at all; they could have had firstlast.cctld for their blog and given this to someone who wants to put a sarcastic joke on it. But better this than ad farms so I don't blame them for keeping it!
>instead of them saying it allows for arbitrary/remote code execution they say it “may allow untrusted sources to affect system behaviour”.
Sounds like Asus did in fact deny the bug.
On those EU rules: do stores have to patch known vulnerabilities before releasing the product to customers, or can customers install the patch themselves?
(and in the world of FOSS you might have "maintainer-coordinated" disclosure too)
I think there is serious potential for this disclosure-aggregator idea.
Most folks don't put up with faulty products unless it's a deliberate choice, like those 1-euro/dollar shops, so why should software get a pass?
This is a prime example where a hyperbole completely obliterates the point one is trying to make.
This is a prime example of someone not getting the joke everyone else got. [0]
[0] https://www.washingtonpost.com/wp-srv/national/longterm/unab...
:(
Cisco have gone even further, by forgetting about their security announcements page, so any recognition is now long lost into the void.
https://sec.cloudapps.cisco.com/security/center/resources/ci...
that or full public disclosure.
I'm not sure where they got that from; Asus has been making motherboards and other PC parts since at least the '90s...
https://www.techspot.com/news/95425-years-gigabyte-asus-moth...
https://www.reddit.com/r/ASUS/comments/tg3u2n/removing_bloat...
https://www.reddit.com/r/ASUS/comments/ojsq80/nahimic_servic...
https://cve.mitre.org/data/board/archives/2016-06/msg00006.h...
(my old blog is long gone from tumblr, but I archived it:)
https://gist.github.com/indrora/2ae05811a2625a6c5e69c677db6e...
> This only remains true insofar as no one directly registered a driverhub subdomain. Anyone with a wildcard could have exploited this, silent to certificate transparency?
- Would a self-signed cert work? Those aren’t in transparency logs.
- Does it have to be HTTPS?
All this, for literally nought
Reminder that WAFs are an anti-pattern: https://thedailywtf.com/articles/Injection_Rejection
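For the CloudFront-blocked-PoC case above, here's a toy illustration of the failure mode (a made-up rule, nothing to do with CloudFront's actual rules): a naive pattern match can't tell a vulnerability report describing an attack from the attack itself.

    import re

    # Hypothetical, deliberately naive WAF rule: block anything that
    # "looks like" an exploit payload, even inside a report's text body.
    BLOCK = re.compile(r"<script|cmd\.exe|\.\./", re.IGNORECASE)

    def waf_allows(body: str) -> bool:
        return BLOCK.search(body) is None

    report = "PoC: payload runs cmd.exe after ../ traversal"
    print(waf_allows(report))  # False: the report itself gets blocked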
[1]: https://companiesmarketcap.com/asus/marketcap/