parliament32 · 4 months ago
> First of all, a linear “score” like CVSS just cannot work in cybersecurity. Instead, we should have a system on the attributes of a vulnerability.

This is exactly what CVSS is: a scoring system based on attributes.

> In the first category, we might have attributes such as: Needs physical access to the machine, Needs to have software running on the same machine, even if in a VM, Needs to run in the same VM.

This is exactly what the AV vector in CVSS is.

> In the second category, we might have attributes such as: Arbitrary execution, Data corruption (loss of integrity), Data exfiltration (loss of confidentiality).

This is exactly what impact metrics in CVSS are.

I fear the author has a severe misunderstanding of what CVSS is and where the scores come from. There's even an entire CVSS adjustment section for how to modify a score based on your specific environment. I'd recommend playing around with the calculator a little to understand how the scores work better: https://nvd.nist.gov/vuln-metrics/cvss/v3-calculator
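To make the point concrete: every CVSS vector string already spells out these attributes explicitly. Here's a minimal, unofficial sketch (standard v3.1 metric abbreviations assumed) showing that "needs physical access" and "data corruption" are just the `AV` and `I` fields:

```python
# Minimal sketch (not an official implementation): parse a CVSS v3.1
# vector string into named attributes, showing that CVSS already encodes
# access requirements (AV) and impact (C/I/A) as discrete attributes.

CVSS31_METRICS = {
    "AV": "Attack Vector",        # N=Network, A=Adjacent, L=Local, P=Physical
    "AC": "Attack Complexity",
    "PR": "Privileges Required",
    "UI": "User Interaction",
    "S":  "Scope",
    "C":  "Confidentiality",      # impact metrics
    "I":  "Integrity",
    "A":  "Availability",
}

def parse_cvss_vector(vector: str) -> dict:
    """Split 'CVSS:3.1/AV:P/...' into {metric: value} pairs."""
    parts = vector.split("/")
    if not parts[0].startswith("CVSS:"):
        raise ValueError("not a CVSS vector string")
    return dict(p.split(":", 1) for p in parts[1:])

# Example: a bug that needs physical access and only corrupts data.
metrics = parse_cvss_vector("CVSS:3.1/AV:P/AC:L/PR:N/UI:N/S:U/C:N/I:H/A:N")
print(metrics["AV"])  # 'P' -> needs physical access
print(metrics["I"])   # 'H' -> high integrity impact (data corruption)
```

The numeric score is just a lossy summary computed from these attributes; the vector string itself is the attribute system the article asks for.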

meander_water · 4 months ago
And CVSS4 has added more metrics - including an AT (Attack Requirements) field:

> Are there any conditions necessary for an attack which the attacker cannot influence?

https://nvd.nist.gov/vuln-metrics/cvss/v4-calculator

aja12 · 4 months ago
As a pentester who does not love CVSS[0], I found the article explaining how to replace CVSS with CVSS very amusing.

[0] CVSS is often poorly understood and misused by internal teams, so for our internal engagements we prefer words like "minor", "medium", "major", and "critical" to describe criticality and impact, and "easy", "medium", "hard" to describe exploitation difficulty (which loosely translates to likelihood). The reasoning behind all this is very similar to what CVSS does.

shagie · 4 months ago
Have you ever stumbled across the PEF/REV method for classifying bugs?

https://www.fincher.org/tips/General/SoftwareDevelopment/Bug...

The essence of it is that "PEF" is from the user's point of view: pain, effort (to work around), frequency. "REV" is from the developer's point of view: risk, effort (to fix), verifiability.

Something with a low PEF score and a high REV score would not be practical to fix, while something with a high PEF and a low REV should be given high priority.
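A hypothetical sketch of the triage rule (the 1-9 per-factor scale and simple summing are my assumptions from the linked description, not the official method):

```python
# PEF/REV triage sketch. Each factor is assumed to be rated 1-9;
# higher PEF = more user pain, higher REV = costlier/riskier fix.

def pef(pain: int, workaround_effort: int, frequency: int) -> int:
    """User's view: how badly the bug hurts (higher = worse)."""
    return pain + workaround_effort + frequency

def rev(risk: int, fix_effort: int, verifiability: int) -> int:
    """Developer's view: how expensive the fix is (higher = costlier)."""
    return risk + fix_effort + verifiability

def should_prioritize(p: int, r: int) -> bool:
    # High user pain with a cheap, safe, verifiable fix floats to the top.
    return p > r

# A painful, frequent bug with an easy fix: prioritize it.
print(should_prioritize(pef(9, 7, 8), rev(2, 3, 2)))  # True
# A barely-noticed bug with a risky, laborious fix: defer it.
print(should_prioritize(pef(2, 1, 1), rev(8, 9, 7)))  # False
```

The appeal is the same as CVSS's attribute idea: the decision comes from named factors rather than a single opaque number.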

1970-01-01 · 4 months ago
Less than 24 hrs into it, and now we have two problems from this defunding fiasco:

All the original problems that exist within CVE.

"Let's just reinvent the wheel!"

Yes, you have a dev background, which entitles you to an opinion, and you also have good intentions. This road is noble. However, the crux of this disaster is not technical, it's political. Maybe reinventing the wheel will be a huge success. Maybe it can wear the crown of free and open source for a while. But it's much more likely this fails as things become difficult to maintain, and you become tired, or poor, and are forced to stop, with nobody, or even worse, an enemy (this is an internationally critical database) controlling the database. So let's focus on solving the original funding disaster without jumping to forking and fracturing as a knee-jerk solution.

saulpw · 4 months ago
Okay, so what's your proposal (or any proposal) to fix this funding disaster? As a technical person I don't know how to affect the political process, and am thus disinclined to participate meaningfully. People with political ability and/or power have bigger problems to deal with right now, and also seem unable to affect the current political process. So what options do we have but to try our best, even if those ways are ultimately doomed?
freeone3000 · 4 months ago
https://euvd.enisa.europa.eu/

This one is funded by the EU and accepts direct submissions. It’s probably the best replacement: state-backed, long-running, and reliable.

ElevenLathe · 4 months ago
Just because there is a problem doesn't necessarily mean you personally (or anybody, or any group of people) can solve it. Go ahead and try if you want, but also be clear-eyed about what success looks like, what the chances of it are, and what the costs will be.
iAMkenough · 4 months ago
Moving to more stable financial markets, away from the United States, would be a start.
1970-01-01 · 4 months ago
>As a technical person I don't know how to affect the political process

Vote.

xeromal · 4 months ago
I mean, everyone could just pitch in to support it rather than the US government? I'm sure a few big corps could do it.
jMyles · 4 months ago
> let's focus on solving the original funding disaster

The _original_ funding disaster is that this problem was delegated to the economic machinery of a nation-state, and humanity is presently in the process of evolving beyond nation-states.

Innovation in communication and information archival is an extremely long evolutionary process, persisting across aeons in the case of media like DNA and language, while the trivial shuffling through different varieties of state happens on comparatively extremely short (sometimes century or shorter) scales.

So, any solution that truly addresses the _original_ funding disaster must be future-compatible with an internet in which we've overcome the burden of nation-states.

ddejohn · 4 months ago
> humanity is presently in the process of evolving beyond nation-states

No, it isn't. This is some Curtis Yarvin BS.

> So, any solution that truly addresses the _original_ funding disaster must be future-compatible with an internet in which we've overcome the burden of nation-states.

What does this even mean? How do you envision a "future-compatible" CVE database? And what does it have to do with nation states?

jabiko · 4 months ago
So if I understand it correctly, the blog author proposes to create a professional certification, require companies that produce software to have at least one of these certified individuals be responsible for reporting vulnerabilities in the company's software, and create the authorities that issue such certifications, complete with training and compliance enforcement.

And all this to fix a broken CVE system? I assume that the friction this generates has a bigger negative impact on the overall ecosystem than the non-optimal CVE system that exists right now.

gavinhoward · 4 months ago
Not just to fix the broken CVE system, but to fix a lot of things that are broken in our industry.
rstuart4133 · 4 months ago
Getting agreement on a better scoring system for CVE's will be hard enough, assuming it's possible at all given the competing interests.

It makes a top-down imposed set of technical fixes for a lot of things broken in our industry look at best like an impossible dream. Anyone claiming they have an oracle that tells you how much effort should be put into QA for any given piece of software is a bullshitter. If you let the bullshitters loose, they will create a quagmire of rules leading to a huge amount of busy work that mainly benefits them.

A huge amount of experimentation is required to figure out what approaches work. Granted, that experimentation isn't happening now. That's why the EU's approach looks like the right one to me. Prevent vendors from shrugging off all liability for defects in their products in their licences, which gives bugs (of all sorts) the potential for a serious financial bite. The severity of the bite is determined largely by the customer: did it hurt so badly that pursuing the vendor in the courts (perhaps via a class action) is worth it? That, IMO, is where severity should be determined. Vendors and bug hunters have their own agendas, and numerous examples have shown these seriously compromise their ability to grade bugs. Finally, it leaves software developers free to experiment and invent their own responses. That's far better than handing that responsibility to bureaucrats. There are far more computer engineers out there, and their solutions will be much better at making their products reliable than forcing them to follow some universal set of rules, no matter how well intentioned those rules may be.

frumplestlatz · 4 months ago
Paternalistic interventionism wrapped up in the usual engineering propensity to overestimate our ability to understand and solve political and human problems well outside our immediate expertise.

What could possibly go wrong?

VyseofArcadia · 4 months ago
I feel like requiring software "engineers" to be actual capital E Engineers would fix a lot of problems in our industry. You can't build even a small bridge without a PE, because what if a handful of people get hurt? But on the other hand your software that could cause harm to millions by leaking their private info, sure, whatever, some dork fresh out of college is good enough for that.

And in the current economic climate, even principled and diligent SEs might be knowingly putting out broken software because the bossman said the deadline is the end of the month, and if I object, he'll find someone who won't. But if SEs were PEs, they suddenly have standing, and indeed obligation, to push back on insecure software and practices.

While requiring SEs to be PEs would fix some problems, I'm sure it would also cause some new ones. But to me, a world where engineers have the teeth to push back against unreasonable or bad requirements sounds fairly utopian.

zamalek · 4 months ago
I agree completely with you, in principle. The problem is that Engineers don't struggle with a mountain appearing in the middle of the river partway through construction.

It is a significantly broader problem. Processes are nearly always to blame for failure, not disciplines or people. For example, the sales team would need to come on board (don't sell anything that isn't planned or - better - completed), product would have to commit to features well in advance, the c-suite would need to learn how to say "no."

With all of that you would lose the ability to pivot. Software projects would take years before any results could be shown. Just how things used to be. Maybe this can be done without that trade-off, but I'm not aware of any means.

Wowfunhappy · 4 months ago
I'm a (relatively new) math teacher. I realized I don't like writing on the whiteboard, so I bought myself a cheap Wacom tablet off eBay. But then I couldn't find any existing Wacom-compatible software designed for my usecase: teaching in front of a live class of ten-year-olds. So last weekend I "vibe-coded" an app for myself. I used the app for the first time while teaching today, and it was great.

This codebase is probably terrible, because it was mostly written by AI. I manually edited certain bits, but there are large sections of the codebase I literally haven't looked at.

Is this a problem? The app works well for me!

My point here is, I'd really hate to gatekeep software development to a small group of "licensed" engineers. In fact, I want the opposite: to empower more people to make software for themselves, so they can control their own computers instead of being at the whims of tech giants. (This is also why I dislike iOS so much.)

I do also take your point about safety, but I think we need to acknowledge that not all software is security critical and it doesn't need to be treated in the same way!

VyseofArcadia · 4 months ago
> My point here is, I'd really hate to gatekeep software development to a small group of "licensed" engineers. If anything, I want the opposite: to empower more people to make software for themselves, so they can make their computers work for them. (This is why I dislike iOS so much.)

I 100% agree. I wouldn't want to gatekeep software development in general. I would only put the PE requirement on companies that are running a service connected to the internet that collects user data.

Want to make an application that never phones home at all? Go nuts. Want to run a service that never collects any sensitive data? Sure thing! Want to run a service that needs sensitive data to function? Names, addresses, credit card info? Yeah, you're going to need a PE to sign off on that.

Side note, I was a math teacher in a previous life. Congrats on the relatively new career, and thanks for your service.

parliament32 · 4 months ago
Good job.

What's the plan for when one of your vibecoded app's vulnerabilities is exploited and a stranger's penis appears in front of your class of ten-year-olds? Is "AI did it" going to save your job / keep you off the sex offender registry?

dylan604 · 4 months ago
The same company that hires a bossman to push deadlines would just stop hiring "licensed" SEs. Problem solved: no more mouthy SEs pushing back.
daveguy · 4 months ago
You would, of course, have to have similar enforcement that goes along with PE.

Then it would be a matter of criminal negligence on the part of the bossman.

danaris · 4 months ago
I think part of the problem with that is that physical engineering has clear, well-understood, deterministic, and enumerable requirements such that, as long as you, the engineer, understand them and take them properly into account, your bridges and buildings won't fall down.

With software engineering, yes, there are best practices you can follow, and we can certainly do much better than we've been doing...but the actual dangers of programming aren't based on physical laws that remain the same everywhere; they're based on the code that you personally write, and how it interacts with every other system out there. The requirements and pitfalls are not (guaranteed to be) knowable and enumerable ahead of time.

Frankly, what would make a much greater difference, IMNSHO, would be an actual industry-wide push for ethics and codes of conduct. I know that such a thing would be pretty unpopular in a place like Y Combinator (and thus HackerNews), because it would, fundamentally, be saying "put these principles ahead of making the most money the fastest"—but if we could start a movement to actually require this, and some sort of certification for people who join in, which can then be revoked from those who violate it...

If we could get such a cultural shift to take place, it would (eventually) make it much harder for unscrupulous managers and executives to say "you'll ship with these security holes (or without doing proper QA), because if you don't we make less money" and actually have it stick.

VyseofArcadia · 4 months ago
I think we're basically describing the same thing. Asking a software engineering process to be the same as a physical engineering process is not realistic. A PE for SEs would look more like a code of ethics and conduct than a PE for say civil engineering.

The key thing to borrow from physical engineering is the concept of a sign off. A PE would have to sign off on a piece of software, declaring that it follows best practices and has no known security holes. More importantly, a PE would have the authority and indeed obligation to refuse to sign off on bad software.

But expecting software to have clear, well-understood, deterministic requirements and follow a physical engineering requirements-based process? Nah. Maybe someday, I doubt in my lifetime.

vsgherzi · 4 months ago
I think about this a lot and I tend to agree. There's so much misinformation and ghost-in-the-machine thinking these days. I wish SWEs went to seek out the truth more. I'm not saying it doesn't happen; I just wish we had more engineering in this field.
ziddoap · 4 months ago
> So yes, I get it: we shouldn’t trust companies, or even FOSS projects, to self-report.

> Unless…what if we made penalties so large for not reporting, and for getting it wrong, that they would fall over themselves to do so?

We know this doesn't work, and author admits as much.

However, the proposed solution is to add another cert into the mix. But it's not clear how this designation would be applied globally, with agreement across the globe on the requirements, punishments, etc. Not to be rude to the author, but it sort of seems like they forgot that not all software is developed in the US. (Not to mention, I really don't want another cert)

bruce511 · 4 months ago
Yeah, it seemed to jump the shark a bit here. Professional bodies, certified engineers, taking on liability for Open Source code... there's a LOT going on here...

Here's what'll really happen: no one cares about or wants to be a certified professional. Companies don't care about it. We carry on as is...

This is a classic over-engineered solution that nobody wants to a problem that barely exists. Just add bureaucracy, what could possibly go wrong...??

gavinhoward · 4 months ago
I do want to be certified. I think it would be a great way to make money building Open Source Software.

> Companies don't care about it. We carry on as is...

If required by law, companies would care.

> This is a classic over-engineered solution that nobody wants to a problem that barely exists.

The sorry state of our industry means the opposite: the problem is big, but lack of teeth means companies can ignore it and externalize the costs.

> Just add bureaucracy, what could possibly go wrong...??

I'd prefer to create our own bureaucracy, not have governments push one on us, like the Cyber Resilience Act does in the EU.

gavinhoward · 4 months ago
> We know this doesn't work, and author admits as much.

Where do I admit this? About fines? Yes, fines don't work.

The difference with my proposal is that companies wouldn't lose a few days' worth of revenue to a fine, they would lose 100% of revenue. That goes from being a "cost of doing business" to an existential threat.

> Not to be rude to the author, but it sort of seems like they forgot that not all software is developed in the US.

I didn't forget. In fact, it's because of worldwide things that I keep pushing this here in the US. The EU already passed the Cyber Resilience Act [1].

Sure, we may not have things apply globally, but we don't need agreement on the punishments globally. We just need agreement on the certification globally.

We have done global agreements before: ICANN, the International Telecommunication Union, etc. ICANN is interesting because it started as US-only and expanded.

[1]: https://en.wikipedia.org/wiki/Cyber_Resilience_Act

ziddoap · 4 months ago
>Where do I admit this? About fines? Yes, fines don't work.

Yes, about fines. From your post: "Ah, yes, fines for companies are not enough. I agree."

>they would lose 100% of revenue.

We can't get the government to enforce this when tens of millions of records are leaked publicly; it absolutely will not happen for failure to report a vulnerability. If you have any idea of how to make it happen, please, let's immediately apply it to breaches and then figure out how to apply it to failure to report vulnerabilities.

>We just need agreement on the certification globally.

As far as I am aware, there is no certification (one which is legally required to obtain a job) on the planet that is globally recognized. But I would be happy to be proven wrong here.

>but we don't need agreement on the punishments globally.

Which will end up with some countries not willing to charge 100% loss of revenue, causing a mass exodus of companies from any country which does charge 100%, thus making the solution untenable.

ICANN is an interesting example, but it's not a certification. The scale (and thus administration, compliance, etc.) is very different.

smu · 4 months ago
To provide some additional context to OP.

In the CRA, there’s (among others):

- reporting of actively exploited vulns or severe incidents to a national CERT

- reporting obligation of vulns to the provider of that vulnerable code

- mandatory vulnerability disclosure policy (to receive vuln reports)

- obligation to provide security updates and alert customers when a vuln has become known

We’ll see how well this is all followed, but from a security perspective these are all good ideas.

smu · 4 months ago
About the fines, there’s a second option: make them more frequent, so there’s less chance of getting away with (minor) transgressions.

This would require well staffed regulatory bodies. At least for GDPR, I don’t think we have that.

grayhatter · 4 months ago
> This idea I had months ago will surely fix all the problems I just started thinking about today.

I very rarely find myself agreeing with a take the author has made, to the point where I almost said I never agree. But I always read through, because even though the suggestion is always surface level, it's also always well written and well expressed. I like the help in reasoning through my own thoughts, and his musings always give a good place to start explaining and correcting from.

I hate, with a passion, CVE farmers, because so much of it is noise these days. But everyone complaining^1 so far has completely missed the forest for the trees. The reason everyone still uses CVEs is that the value of having a CVE was never knowing the severity. (The difference between unauthenticated remote arbitrary code execution, and might create a partial denial of service in some rare and crafted cases, is 9.9 versus 9.3.) The value has always been the complete lack of ambiguity when discussing some defect with a security implication. You don't really understand something if you can't explain it, and you can't explain it if you don't have the words or names for it. CVE farming is a problem, but everyone uses CVEs because they make defects easier to understand and talk about without misunderstandings or miscommunication.

I'd love to see whatever replaces CVEs include a superset, where CVEs also have a CRE, with Vulnerability replaced by Risk and only when [handwavey answer about project owner agreement], which would ideally preserve the value we get from the current system but allow the incremental improvement suggested by the original comment this essay is responding to. I would like my CVEs to be exclusively vulns that are significant. But even more than I want that, I don't want to have to argue about where the bar for "significant" belongs!

No company wants to manage CVEs, and nothing is going to meaningfully change that in the short term, which means no one is looking for a better CVE system. Everybody wants the devil they know. I have complaints about the CVE system, but I don't want to try to replace it without accounting for how it's used, in addition to how it works (and breaks).

^1: it's still early, and the people rushing to post are often only looking at the surface level. I'm excited to hear deeper, more reasoned thoughts, but that's likely to take more than just 24h.

dang · 4 months ago
Related ongoing threads:

CVE Foundation - https://news.ycombinator.com/item?id=43704430

CVE program faces swift end after DHS fails to renew contract [fixed] - https://news.ycombinator.com/item?id=43700607

beambot · 4 months ago
Funding to Mitre's CVE was just reinstated:

https://www.forbes.com/sites/kateoflahertyuk/2025/04/16/cve-...