donatj · 2 years ago
It's kind of bizarre to me that people nowadays don't think of their address as public information. In the US at least, until about a decade ago, we would get a physical book in the mail every year, for free, whether we wanted it or not, listing the address and phone number of everyone in our metro area.

You enter your abode from public space. It's really not hard for someone like a stalker, private investigator, or paparazzo to find you and your address just by watching where you go when you're out in public. It's readily available to anyone who actually wants to seek it out, so it's not particularly private information in any practical sense.

Every time my address is leaked online, I am not worried in the least. I don't really need to do anything. It's readily available on any number of information sites online anyway.

When my credit card gets leaked, it's a major headache regardless of whether I'm liable for the charges. Now I have to cancel my card, change all my automatic subscriptions, and re-enter it on every single shopping site. It takes literally months for all of that to resolve.

IshKebab · 2 years ago
That's a naive, black-and-white way to think about security. If it's possible to learn something with a bit of work, then why bother keeping it secret?

The answer is that the bit of work - "really not hard" as you put it - actually can be quite hard, and it is a real deterrent.

You don't really care about whether something bad can happen... you care about whether it is likely to happen. It's a probability, and making it hard to find your address reduces that probability.

Also I would suggest that HN members are unlikely to be stalked. You might feel differently if you were a Twitch girl or whatever.

donatj · 2 years ago
I mean, isn't security by obscurity generally accepted as bad practice?

If everyone treated it as easily available data and stopped using the mere act of having it as "proof" of anything, we could be much more secure. E.g. merely having someone's address should not be enough to get their house swatted.

acdha · 2 years ago
Think around the edges: you could opt out of phone books because some people really needed that ability, and that was in the era before bulk operations were easy. For a specific target, yes, you could look them up and even get phone books from other areas, but the kind of data mining and cross-referencing that is now trivial was cost-prohibitive.
bastawhiz · 2 years ago
At least in the US, property ownership has long been public record. You can look up who owns any parcel of land; you can't opt out of your address being public. It's just somewhat easier to search this data today.
bradleyjg · 2 years ago
Every time one of these garbage companies sends me a letter telling me they lost my information in a breach but don’t worry they are giving me FREE CREDIT MONITORING!!1! they should have to put $50 inside.

I think that would go a long way towards solving the problem.

acdha · 2 years ago
Credit monitoring is really an indictment of the entire system: “if our negligence combines with someone else’s negligence, you might find out sooner”
threecheese · 2 years ago
A large group of Americans instead view this as proof that capitalism is working. (I'm not one of them.) Checks and balances, never mind that we lack the leverage to extract more than "credit monitoring and lawyers get paid", or that it requires a civil judiciary that we pay for.
nonrandomstring · 2 years ago
No. That would just put a market price on your dignity. Fifty bucks is so cheap that they'd happily pay it.

Please forgive me... I don't mean this as a personal insult, but a better system would be one where you get fined $50 for being stupid enough to give them the data in the first place.

acdha · 2 years ago
> where you get fined $50 for being stupid enough to give them the data in the first place.

So … you don’t use banks, utilities, phone companies, healthcare, etc. and don’t apply for or accept non-anonymous jobs? This isn’t optional in many cases, which is why it really needs to be covered by legislation which shifts the cost to the company collecting that data.

batch12 · 2 years ago
A better system would be one where the company paid a fine starting at $500 per record. Also, not all consumers who have had their data stolen gave it to the organization willingly.
bradleyjg · 2 years ago
I’m no more insulted than if you told me I’m a fool because I don’t live in a cave in the Himalayas chanting mantras. Our visions of a life well lived are too different for you to be able to insult me.
giantg2 · 2 years ago
It depends. Some industries would be able to happily pay that. Others have margins tight enough that they'd feel a real impact. Even 1 million users x $50 is $50 million, a huge sum for most companies.
vegetablepotpie · 2 years ago
Data breach insurance is not helping.

Using weak passwords, leaving credentials where others can see them and downloading infected files can all lead to compromised data. Data breach insurance is specifically designed to protect a company in the aftermath of such an unexpected event.

Businesses correctly recognize data breaches as a risk, and the insurance industry allows companies to export that risk to it. Data breach insurance pays for the financial impact on the business of a data breach. This does not protect customers and in fact creates misaligned incentives between a business and its customers.

One solution would be to legislate that insurance is not acceptable for an organization to mitigate cyber risk. States and the federal government could do this by passing a law. I don’t see something like that getting passed though. The insurance industry and every business, both large and small, will lobby hard against it. You’d really need a strong grassroots consumer advocacy group to push hard for this, something that tells people’s personal stories to the media.

lll-o-lll · 2 years ago
We don’t need legislation, just dramatically increased penalties for a breach. Now insurance premiums drastically increase unless you’ve done x, y, z.

I’m not a “free market solves it all” type, but it definitely works well at balancing money through the system. We just haven’t correctly priced a data breach.

sopooneo · 2 years ago
Exactly. At $500 or $1000 per record, PII starts to look highly radioactive. Companies would avoid collecting it with a passion, and those that had to hold it would be compelled to do so less stupidly.
yjftsjthsd-h · 2 years ago
My first thought would be that insurance could be workable, but it might be too cheap right now. I would expect that when a company signs up for such a thing, they get audited and charged according to risk - a well run organization might find it a tiny cost, while say a company storing passwords in plaintext might find that their monthly bill is ruinous because they're practically guaranteed to be breached and subsequently fined into oblivion. Insurance shouldn't so much remove risk as amortize it.
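
As a back-of-the-envelope sketch of what risk-priced premiums could look like (all numbers invented, Python just for illustration):

    # Hypothetical risk-priced premium: expected annual loss times a loading
    # factor. The audit's job is basically to estimate breach_probability.
    def annual_premium(records_held, breach_probability, fine_per_record=500.0,
                       loading=1.3):
        expected_loss = breach_probability * records_held * fine_per_record
        return expected_loss * loading

    # Well-run shop vs. plaintext-password shop, both holding 1M records:
    annual_premium(1_000_000, 0.01)  # 6,500,000   -> a manageable line item
    annual_premium(1_000_000, 0.50)  # 325,000,000 -> effectively uninsurable
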
artichokeheart · 2 years ago
Yep. Seen it in action.

If we store credit card information, we need to be PCI compliant. So the reaction is: let's outsource that, then.

All stored personal information should be subject to the same kind of compliance requirement.

1970-01-01 · 2 years ago
You have no obligation to anyone (except the government) when they ask for your information. Lying is a valid defense to this infosec ineptitude. Virtual credit card numbers are also a leap in control. Use them.
acdha · 2 years ago
You do have a legal obligation for anything financial where KYC laws apply, and I wouldn’t try that with airline tickets, either. You might also have issues around breach of contract if you’re lying about something which might have affected their willingness to allow service – this is likely not to go beyond cancelling your account but you should think about the magnitude. Nobody’s going to care about your grocery store loyalty program, but if there’s money or copyrighted material involved you might want to weigh your willingness to be a legal test case.
1970-01-01 · 2 years ago
Airfare can be bought with virtual CCs. REAL ID requirement is another story altogether.
lofaszvanitt · 2 years ago
Exactly. Just give false info, even when buying products, so when a breach occurs you won't have any headaches about what was leaked and how.
6510 · 2 years ago
You can argue for fines and prison sentences, but on HN that won't accomplish anything. Here the only workable solution is a technical one: there's plenty of expertise available and plenty of people able to implement things.

I do security by not having things I don't need: printing documents and deleting the data. It's not perfect by itself, but it is something we could model in hardware quite well.

One-way tubes seem pretty easy.

For access, one could give each employee a query quota and, if they exceed it, have someone else increase it temporarily or permanently (rough sketch below).

One could also make a dumb console that displays data on a screen, db tables, pdf files, images.

Some business logic could be built into hardware. More often than not the need for access is triggered by something: if the customer calls you, some of their information can be displayed, and accessing it in the days after that isn't dubious.

It takes a lot and makes things more complicated but in the end you do get nice small data sets to work with.
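
A rough software sketch of the quota-plus-trigger idea (names and numbers invented; a hardware version would enforce the same checks outside the application):

    import time
    from collections import defaultdict

    DAILY_QUOTA = 50            # records an employee may pull per day by default
    CALL_WINDOW = 3 * 86400     # seconds a customer call keeps their record open

    usage = defaultdict(int)                    # employee_id -> records viewed today
    quota = defaultdict(lambda: DAILY_QUOTA)    # employee_id -> daily limit
    last_call = {}                              # customer_id -> time of last call

    def raise_quota(approver_id, employee_id, new_limit):
        # Exceeding the default needs a second person's sign-off.
        assert approver_id != employee_id
        quota[employee_id] = new_limit

    def can_view(employee_id, customer_id, now=None):
        now = now or time.time()
        # Access is triggered by something: a recent call from this customer.
        if now - last_call.get(customer_id, 0) > CALL_WINDOW:
            return False
        # And it is rationed per employee.
        if usage[employee_id] >= quota[employee_id]:
            return False
        usage[employee_id] += 1
        return True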

9dev · 2 years ago
The GDPR may be a pain in the ass to implement properly, and certain parts of it are a bureaucrat's wet dream, but it sets the right incentives. If you read the general rules, it's just common sense: only keep what you need, take as many steps to secure data as you can, tell users proactively what you'd like to do with it, ask for their consent, and delete whenever they request it (rough sketch of what that looks like below).

It all sounds like lots of additional IT work, and it is (I spend a lot of time at our company trying to improve this). But it only seems like a hassle because we went for so long without doing it right.

There must be a way to let human dignity be the lowest common denominator for shareholder value…
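
For the "only keep what you need, delete on request" part, a minimal sketch (sqlite-style db handle, invented table and column names):

    REQUIRED_FIELDS = {"email", "country"}   # data minimisation: store nothing else

    def collect_signup(form):
        # Drop everything we never decided we actually need.
        return {k: v for k, v in form.items() if k in REQUIRED_FIELDS}

    def handle_erasure_request(db, user_id):
        # Delete personal data; keep only what a legal retention duty (e.g.
        # invoices) forces us to keep, in anonymised form; log the action.
        db.execute("DELETE FROM profiles WHERE user_id = ?", (user_id,))
        db.execute("UPDATE invoices SET name = 'erased', email = NULL "
                   "WHERE user_id = ?", (user_id,))
        db.execute("INSERT INTO erasure_log (user_id, ts) "
                   "VALUES (?, strftime('%s','now'))", (user_id,))
        db.commit()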

gravescale · 2 years ago
I like this description of major problems with the GDPR as usually being either because you actually are abusing people's data, or because you've run up a huge pile of technical debt around data handling: https://reddragdiva.dreamwidth.org/606812.html
Chronoyes · 2 years ago
No, one cannot just comply with the "general rules" of the GDPR; you have to comply with every last letter of a considerable body of legislation. The fact that the rules can be generalised to a reasonable few paragraphs is meaningless.
9dev · 2 years ago
That’s just not true. I’ve consulted with a few privacy legal agencies and spent a lot of time evaluating the law. Some sections are even worded in a way that leaves wiggle room for prosecutors, or requires good will on your part. What would even be supposed to happen if you weren’t "compliant"? In the end it’s always about specific kinds of misconduct, and that means fines. The amount of a fine depends on the severity of the misconduct. The GDPR isn’t different at all from other laws in that regard.

If you’re found to be in breach of the GDPR, the severity of the breach as well as the degree of negligence or malevolence on your part is taken into consideration when deciding on the fine. The prosecuting attorney also doesn’t have to actually fine you if it’s clear you put in effort and acted in good faith.

For a concrete example, a startup usually isn’t required to provide a fully fledged data deletion policy, but if you cannot roughly outline how you intend to handle people’s requests to delete their data, that doesn’t look good. If you don’t even have some sort of privacy policy on your website, that looks worse.

Nobody can implement the GDPR 100%. But you can try to handle data responsibly, and if someone discovers you don’t and you try your best to fix the error (which is on your part, mind you), nothing draconian is going to happen.

And we’re still talking about basic respect towards your users or customers here, it’s not like someone asks something ridiculous of you.

denton-scratch · 2 years ago
> you have to comply with every last letter

Cite, please.

Perhaps regulators in different countries take different attitudes; in the UK, it's very soft-touch. Only the most egregious, repeated flouting of the regs attracts a penalty.

As far as I can see, the Irish regulator is even softer; you could be mistaken for thinking that the Irish regulator's job is to make sure that US tech companies don't move their server sheds away from Ireland.

nonrandomstring · 2 years ago
> Information Security: "We Can Do It, We Just Choose Not To"

Maybe not.

It's convenient to think that misaligned incentives [0] or insufficient motives [1] explain failures of infosec. These are popular explanations amongst tech people, because we want to believe infosec can work. Our jobs depend on it.

Now there are gargantuan fines, shelves of regulation, auditing and compliance, even jail time for executives. Has it fixed anything? No. If anything, the pace of breaches is accelerating. And things like Microsoft Recall and cloud "AI" services are only going to amplify it. Even if we had a "corporate death penalty" that simply shut down companies on their first breach, it would fix nothing. We'd just get fly-by-night tech companies with an average lifespan of 18 months.

What if the people who said "Data wants to be free" are right? What if data containment is impossible in principle?

Once we put aside wishful thinking, how can a technological society survive? It requires a radical and brutal re-thinking of cybersecurity. How we define it. How we teach it. How we legislate it. How we address harms.

[0] Bob secures Alice's data while Alice pays the price for Bob's failure

[1] Many people don't care. Not everyone has a security mindset, not because they lack intrinsic self-respect but because they are unable to comprehend the harms.

bradleyjg · 2 years ago
> Now there are gargantuan fines, shelves of regulation, auditing and compliance, even jail time for executives.

What companies paid a big enough fine to have an unprofitable year? Which executives are sitting in prison?

These things only exist in theory, not practice.

I’ll give you endless reams of pointless box checking exercises in the name of auditing and compliance.

gravescale · 2 years ago
They're not pointless if you're in the Guild of Box Ticking Consultancies, and guess who gets to say what the boxes are?
amelius · 2 years ago
Maybe we need a government pentesting agency that fines companies without waiting for the first breach.
nonrandomstring · 2 years ago
> a government pentesting agency that fines companies without waiting for the first breach.

I've heard serious suggestions floated for a tax and contribution funded pentesting agency that helps companies without waiting for the first breach. But I think the scale of it all is just a bit much.