Reminds me of the time I bought lunch at work, and a colleague told me exactly what I bought and how much I paid for it. I called him out and said it was a lucky guess, and then he proceeded to tell me my entire payment history for the past 2 days.
Turns out when I was buying lunch, he was on the phone with a friend who worked at Paytm and that guy gave away my transaction history for shits and giggles.
My trust in private companies has been at its lowest since then and I absolutely do not trust startups to keep my data safe.
What you must understand is that this is human nature not "private companies."
When I was in high school I had a friend who worked at one of those 1-hr photo processing places. People would bring their film in to have prints made. And there was no small number of "intimate" photos on those rolls of film. Yes, even in the days of film cameras, people took photos of themselves in sexual situations.
Of course my friend thought it was hilarious and the shop would make extra prints of these photos to pass around among the staff. They had separate categories similar to what you'd see on any porn site. Of course it was in violation of policy but people do this stuff. If you're building something that handles photo/video images you must expect it and build in privacy from the ground up. You cannot rely on your staff to always be on their best behavior.
When I was working for [blank] cellphone carrier we had a competing carrier fire their entire phone repair/support staff in the area because they were keeping a USB hard drive stash of nudes. Tech support staff would pass it from store to store and dump whatever nudes they'd collected from customer phone repairs that week. I don't remember how they got caught.
I had to come down on multiple tech staff at our own store for digging around in photos anytime a hot woman came in with a phone.
On rare occasions we had this one older woman who would ask us to transfer photos every year to her new phone and to "verify personally that every photo had been moved." Of course the majority of the photos would be her naked selfies or what seemed to be swinger parties. I've got a 65yo woman wearing only a cowboy hat seared into my brain because I was the first tech to deal with her kink of having people look through the photos.
Right, so the problem is still private companies not putting in the effort to build in privacy to prevent this.
For physical film it’s hard, but for software you should at the very least record access to personal data and audit it to make sure people actually need it and aren’t abusing whatever permissions they are using to get the data.
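To make that concrete, here is a minimal sketch (Python against SQLite) of the kind of access logging being described. Everything in it -- the access_log table, the get_transactions lookup, the ticket-style reason strings -- is hypothetical and for illustration only; the point is just that every read of personal data is attributable and requires a stated justification.

    # Minimal sketch of per-access audit logging for personal data.
    # All names here (access_log, get_transactions) are hypothetical.
    import sqlite3
    import time
    from functools import wraps

    db = sqlite3.connect("audit.db")
    db.execute("""CREATE TABLE IF NOT EXISTS access_log (
        ts REAL, employee TEXT, action TEXT, subject TEXT, reason TEXT)""")

    def audited(action):
        """Refuse access without a stated reason, and log every call."""
        def wrap(fn):
            @wraps(fn)
            def inner(employee, subject_id, *, reason, **kwargs):
                if not reason or not reason.strip():
                    raise PermissionError("access to personal data requires a justification")
                db.execute("INSERT INTO access_log VALUES (?, ?, ?, ?, ?)",
                           (time.time(), employee, action, subject_id, reason))
                db.commit()
                return fn(employee, subject_id, reason=reason, **kwargs)
            return inner
        return wrap

    @audited("read_transactions")
    def get_transactions(employee, subject_id, *, reason):
        return []  # the real lookup would go here

    # e.g. get_transactions("emp42", "customer-123", reason="ticket #9876: refund dispute")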
I assume they don’t mean that they are baffled by human nature. The thing that’s sort of unexpected is that, knowing human nature, companies don’t build in safeguards for this kind of sh.
> If you're building something that handles photo/video images you must expect it and build in privacy from the ground up.
I agree with this as part of a bigger solution. There should also be privacy regulations with serious consequences for negligence or abuse. If private customer data were a liability due to the risk of huge fines from misbehaving employees, companies would collect a heck of a lot less of it.
Right now data collection is almost all upside for the company; there are many ways to use or sell it to make more money. But users bear the costs, many of whom don't realize just how much they are being spied on.
> What you must understand is that this is human nature
Seems no one is speaking up in the defense of humanity at large. What you describe is possibly even "common" but it is not ingrained in all of "human nature". There are many people who are simply incapable of certain transgressions - for some even the thought doesn't occur. These are possibly rare but they do exist. What you are describing is the fundamental problem of humanity: we are not a smooth and uniform distribution and practically every political thought ultimately boils down to this foundational problem of our collective but morally and ethically disjoint coexistence.
Human nature? Sure, but it's also very much a lack of regulation and of good corporate privacy policies. First, customer data should live in a special, heavily logged environment, and anyone accessing it for whatever reason needs to state a justification. Then, at the next audit, periodically, or whenever something goes wrong, those logs get checked and people are confronted about why they were accessing random person X's data for no work-related reason.
Won't make those incidents disappear completely, but it will sure kill off fetching data from a friend of a friend for shits n giggles.
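In the same spirit, here is a rough sketch of what that periodic audit pass could look like over such an access log. The table and column names and the thresholds are assumptions carried over from the sketch above, not any particular company's system.

    # Rough sketch of a periodic audit over an access_log table like the one above.
    # Table/column names and thresholds are illustrative assumptions.
    import sqlite3
    from collections import Counter

    def audit(db_path="audit.db", max_distinct_subjects=50):
        db = sqlite3.connect(db_path)
        rows = db.execute("SELECT employee, subject, reason FROM access_log").fetchall()
        findings = []

        # 1. Accesses whose justification doesn't reference a ticket at all.
        for employee, subject, reason in rows:
            if "ticket" not in reason.lower():
                findings.append(f"{employee} read {subject} with a weak reason: {reason!r}")

        # 2. Employees touching an unusually broad set of customers.
        distinct = Counter()
        for employee, subject in {(e, s) for e, s, _ in rows}:
            distinct[employee] += 1
        for employee, n in distinct.items():
            if n > max_distinct_subjects:
                findings.append(f"{employee} accessed {n} distinct customers -- review")

        return findings  # hand these to a human for follow-up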
I was at a hackathon once, and they had a startup fair, where various companies had their booths.
One startup had created a student-management system for schools. And the rep was demoing the system. Except with live data. Showing pages of real students with their pictures, home addresses, etc!
So, principles aside, the actions of companies are such that there can be no trust.
I once had a classified ad software hustle, and I needed customer data to debug things. But one of my customers was Canada's largest gay newspaper, and its classifieds were highly sensitive.
So I wrote a routine that obfuscated the database, changing all phone numbers to 555-xxxx, changing all names to random names of fruit (So a customer might become Banana Grapes, 416-555-1234), and a few other changes to hide other possibly identifiable information.
I had a menu item to do that to the database, it was under a "developer" menu that only appeared when I was personally signed in as the only superuser. I am embarrassed to say the menu item was called "mixed fruit."
One day, I was signed in at the client's office, and the manager came by and wanted to do something or other. I gave him the mouse and keyboard without logging out and having him log back in as himself. He did his job, then noticed the developer menu. "What's this," he murmured, and selected "mixed fruit."
No confirmation dialog, no warning: it began munging the live, production database as I watched in horror. I managed to get everything sorted and the production data restored, but I learned a few lessons that day about building super-features for myself that were extremely sharp and difficult to undo.
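For what it's worth, a "mixed fruit"-style scramble is easy to sketch, and so are the two guard rails this story argues for (an explicit confirmation and a refusal to run against anything that looks like production). The schema, fruit list, and "prod" check below are all invented for illustration; this is not the original routine.

    # Sketch of a "mixed fruit"-style anonymization pass with guard rails.
    # Schema, fruit names, and the production check are invented.
    import random
    import sqlite3

    FRUIT = ["Banana", "Grapes", "Mango", "Papaya", "Kiwi", "Plum", "Cherry", "Fig"]

    def scramble_customers(conn, *, i_know_this_destroys_data=False):
        # Guard 1: an explicit flag, so a stray menu click can't trigger it.
        if not i_know_this_destroys_data:
            raise RuntimeError("refusing to scramble without explicit confirmation")
        # Guard 2: never let it near anything that looks like production.
        db_file = conn.execute("PRAGMA database_list").fetchone()[2] or ""
        if "prod" in db_file.lower():
            raise RuntimeError("refusing to scramble a production database")

        for (cid,) in conn.execute("SELECT id FROM customers").fetchall():
            fake_name = f"{random.choice(FRUIT)} {random.choice(FRUIT)}"
            fake_phone = f"416-555-{random.randint(0, 9999):04d}"
            conn.execute("UPDATE customers SET name = ?, phone = ? WHERE id = ?",
                         (fake_name, fake_phone, cid))
        conn.commit()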
Someone should have walked up with their phone camera and shouted "ooh hey, nice addresses!" And just started snapping away (or at least pretending to)
My awakening was the multiple times I've been talking with a startup and said, "I was surprised there's no self-guided demo or video tour available on your website; can you show me how the product works?" and had them reply something like, "Oh, sure. The reason we don't have a demo yet is that we haven't gotten around to making fake data, but let me pull up one of our customers' accounts and give you a tour using their data. Try not to read anything."
If you build robust privacy mechanisms into the fabric of your startup from the beginning, your risk management around these kinds of cases can be resourced to scale with your customers' expectations of the system you build.
Unfortunately, if you do that, you are going to be outcompeted by the teams that are working to get their first 10,000 paying customers by any means necessary, because privacy planning is less capital efficient.
The companies that do get big enough to overcome their immediate survival constraints often have a harder time identifying and resourcing privacy assurance, because it's less on the minds of the people making resource decisions, who have other operational scaling issues front of mind.
Your engineers and support staff doing dumb things with your data is a risk you can have resources allocated to. But it's not on the critical path to market dominance so it shouldn't be expected to be a priority.
Sounds like it should be illegal to share data by default, and that individuals shouldn't be able to sign away their expectation of privacy as part of a EULA.
I'm so glad I've never heard of that service and have no idea what it is.
Thanks very much for calling them out by name, BTW. Presumably someone from that company is reading this as we speak - and soon enough, will be reporting back to us that that employee has been identified, and of course, duly fired.
Indian peer to peer payments app. Generally considered one of the higher-quality made in India applications and very very widely used. Nobody will be IDed or fired until the same thing happens to a celeb or gvmt official.
It's because you don't live in India. Not sure if this is still the case, but Paytm was the Venmo of India, and actually had more penetration than credit cards when I visited ~5-6 years ago.
I don't think it's just private companies. It's really anything that holds data, I think.
I used to work at the largest telco in the country on a software project (as a consultant) that involved some integration with existing services. With some playing around it soon became clear that all services were wide open as long as you were on the internal network, you just had to know what they were and how to call them. No authentication required, no audit logs as far as I can determine.
I didn't poke at it too much, but I was able to at least read an arbitrary cell phone's text messages and call logs.
> My trust in private companies has been at its lowest since then and I absolutely do not trust startups to keep my data safe.
Hate to be the bearer of bad news, but governments aren’t any better. Local government in particular is usually an IT security nightmare.
There was a local government in a state I used to reside in that required folks to have an “alarm license” for their home and fined people for false alarm police callouts. The form to apply required you to give an alarm code for the police, and of course your name, address, and phone number.
Predictably, the database of information was ineffectively secured and basically public on the Internet for years before it was fixed. I don’t recall any burglaries or home invasions happening due to it, but still rather asinine.
At this point in my life I have basically no faith in any institution in society and treat all information I give out as effectively compromised immediately.
You are right, I’ve worked at a bunch of companies with millions of users and access to data was almost out of control within the company. Management doesn’t really care, including the data protection officers and the likes. I only hope Google has better policies, with all the data we store in emails and in docs.
Frankly, everyone is a risk to some extent, which is why it’s handy to just not give someone things that can be a problem if exposed. What they don’t know/don’t have, can’t be used against you.
But someone with minimal direct criminal/financial risks of exposing something is definitely higher risk than others, and that is most startups.
That said Amazon reps have clearly been bought out before, and individuals within most large corps have always been viable targets of blackmail, bribery, coercion, etc.
It’s why some societies are so resistant to phasing out Cash. Anything else gives leverage to folks that historically it’s been a bad idea to give leverage to.
At a fertility clinic they had paperwork that asked my employer, social security number etc. I asked if this was necessary, as I was paying out of pocket. They firmly said yes and asked me to fill it out.
Guess who works for $State Fertility Group, with a social 111-11-1111 and makes 100M/year.
I believe that your assessment of startups is valid. I also think your views are true of big multinationals as well. It seems that by the time a company gets large enough, consumers start receiving some protections, but workers, not so much.
> Tesla is neither a private company, nor a startup.
The parent implies 'private' in the sense of non-governmental entity (the Indian terminology), and by that metric both Tesla and Paytm are 'private' (and publicly traded)
That's horrific. I work for a fairly large payment processor and it takes multiple levels of approval and oversight from several levels of management to get time limited access to a production database or client interface - all access being logged and ticketed along the way. The idea that someone could just start looking up transactions for shits and giggles would be unheard of.
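For readers who haven't seen this kind of setup, time-limited, ticketed access is not complicated to model. The sketch below is only an illustration of the shape of it; the in-memory grant store, the two-approver rule, and the four-hour default are assumptions, not a description of any specific processor's system.

    # Illustrative sketch of time-limited, ticketed production access.
    # The grant store, approver rule, and expiry default are assumptions.
    from dataclasses import dataclass
    from datetime import datetime, timedelta, timezone

    @dataclass
    class AccessGrant:
        employee: str
        ticket: str          # the support/change ticket justifying access
        approved_by: tuple   # multiple approvers required
        expires_at: datetime

    GRANTS = {}  # employee -> AccessGrant

    def request_access(employee, ticket, approvers, hours=4):
        if len(approvers) < 2:
            raise PermissionError("production access needs at least two approvals")
        GRANTS[employee] = AccessGrant(
            employee, ticket, tuple(approvers),
            datetime.now(timezone.utc) + timedelta(hours=hours))

    def check_access(employee):
        grant = GRANTS.get(employee)
        if grant is None or datetime.now(timezone.utc) >= grant.expires_at:
            raise PermissionError("no active, approved grant -- request one first")
        return grant  # callers log grant.ticket alongside every query they run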
> The idea that someone could just start looking up transactions for shits and giggles would be unheard of.
What about the DBA maintaining the database? Do they not have query access to the data? How about the devs who are responsible for reporting; do they develop reports using generated test data? It’s naive to believe that data is entirely secure and private. There’s always a level of trust required from employees to not share private data that they may see on the job.
I fully expect this in India. There is no privacy in India and no education or awareness of it either. There are no laws and so no expectations either. I don't know if it's still prevalent, but I know friends who had folks from some company knock on their doors saying they would take a sample, do free blood tests, and add the results to their website to track as a service. This was pre-COVID.
This is really disappointing. HDFC wouldn't ever do this to you if you used your debit card for transactions (I realize this isn't feasible for every vendor, I just mean conceptually). Now I have to wonder why Amex was banned for so long for not localizing data. The other payment apps with localized data aren't really doing that much to protect it!
Well, that's India, though. Whether it's a private or a public company there's no real notion of privacy of data. If you want it and you have a friend with access, he'll give it to you. Your job is to then not get caught.
When it comes to security you should always assume the worst intentions: if you can imagine it being possible, somebody is probably doing it. This is why nobody trusted the NSA, and they were proven right by the Snowden revelations.
I don’t recall the source, it may have just been anecdotes online that aren’t easily verifiable, but even after Facebook went legit as a company I’m pretty sure access to data (who’s looking at whose profile, stuff like that) was marketed as an employee perk.
I think Zuckerberg is a sociopath but this chat log often gets dredged up and it’s never been verified. It could be real but it could be an urban legend or an internet echo or the reincarnation of tubgirl.
I feel conflicted whenever I see a comment like this.
On the one hand, let's assume it's true: a Paytm employee acted negligently.
But on the other hand, what if it's not true? What if you happen to have a friend or family member who works for a Paytm competitor, or you have some grudge against Paytm for whatever reason, and are instead spreading low-key FUD about the company to make it seem like they have lax data controls and staff disregard for sensitive data?
The issue is that there doesn't really seem to be a way to substantiate your anecdote.
Let's assume it's true: a Paytm employee acted negligently.
Not negligently - maliciously.
The employee knew exactly what they were doing, and that it was "wrong" in every conventional sense -- and most likely a huge liability to their career and reputation if it got found out.
From personal experience, people will do anything they are physically capable of doing and think they can get away with. Almost nobody I know has the slightest amount of respect for any private data to which they have access. This extends from people in healthcare breaking HIPAA to tell me about how Jane Doe is an idiot who got a mayo jar stuck in her vagina to IT workers showing me John Doe's cringey nude selfies. Trust absolutely no one. If it's possible, it's happening. The goal should be to make it not possible to the extent you can, and where you can't, to create accountability.
Oh please. The comment was less about PayTM and more about tech companies being blasé about data privacy in general.
If I had a friend or family member who was an employee of such a publicly facing tech company, I’d be grilling them about their data security and privacy practices. I’ve been burned enough times by Indian companies so ridiculously free with their data sharing that I’ve stopped giving out my contact info to everything but the most essential of services.
Most Indians will lean towards believing the GP because they know how aggressively their personal data is being abused, unless Paytm comes out with concrete details of how they protect privacy inside and outside the firm.
I didn’t even realize Paytm was a real company when reading OP. It sounded like a generic name made up for purpose, like “Jane Doe” for payment companies.
This is why rule of law is important. India has weak rule of law... there's no confidence from anyone that wrongdoing will be punished and there's no confidence that making up stuff to hurt a competitor will be punished.
Given what I've seen I have absolutely no problem believing this. If you don't then that's fine but that simply means you've been living a sheltered life. Have a look at the GDPR enforcement tables for some choice violations.
So you trust a for-profit more than an anecdote from one of its customers? I am sure you'd also forcefully vaccinate your loved ones if $authority told you to do so, right?
In my experience, everything bad you can imagine, a for-profit has already done.
Wanted to get Model Y in the past. Nice comfy car, huge backseat and… a camera filming inside the car. Thanks no. Apparently for drowsiness detection, but I am not buying it. Time of flight camera staring at the driver’s face is kinda normal in car industry. Not a normal color camera filming everything and streaming somewhere like Tesla does.
> Time of flight camera staring at the driver’s face is kinda normal in car industry.
Is this really becoming standard? If so, that really bums me out.
I considered buying a Subaru for my latest car but one of the big negatives was an internal camera pointed at the driver that couldn't be disabled. It was, at least in part, for driver attention monitoring but performed horribly. It would ding constantly, even when I was staring straight ahead.
I have no faith that these video feeds will be kept private and really, really don't want to have to worry that any awkward or embarrassing thing I've ever done in a car could be released to the world.
The most recent FSD beta update from a few days ago finally started using the interior camera to track driver attentiveness, according to the release notes.
Likely attempting to refer to a camera that can see depth, like an Xbox Kinect sensor, but no car I know of has such tech. Some manufacturers like GM use IR cameras looking at the driver because they can see through sunglasses.
Why not buy laptops that don't have cameras? They exist, and doing that would at least help to add economic incentive for the continued production of them.
From purely academic perspective, self-driving cars could probably benefit from eye-tracking data as humans drive cars. Although I agree with you on the privacy aspect. I would need proof that the data is only being used for certain things to feel comfortable with it.
The camera feeds are processed entirely locally in the car and not streamed anywhere. You can optionally opt in to share certain data with Tesla if you want to, but it's not required.
Get a German car. Privacy is taken extremely seriously here in Germany. In all walks of life. Companies, schools, doctor's offices, etc.
Edit: I wouldn’t want to touch a Tesla even if it was gifted to me. Seeing the founder behave the way he does (latest incident: changing Twitter’s logo to a shitcoin image) makes him extremely unsympathetic to me.
Is that true for cars made for markets that are not Germany? It's not that German car makers are inherently more ethical so I'd assume they do whatever they can get away with in whatever jurisdictions they sell their cars.
Or buy an old car. Of course there are tradeoffs with efficiency and safety, but I like having a car that's easy to work on and features no telemetry or smart features.
Can you elaborate why German products would be more privacy focused?
As a counterpoint, BMW started putting features behind premium subscriptions a couple of years ago, such as steering wheel heating. Such microtransactions are really hard to combine with privacy.
Except that during COVID you had to register in an app at every place you went. Nice social profiling that data could provide… say, if another type of party ended up as the ruling one.
Yikes. So turning private images into memes and posting them internally isn't a fireable offence but union talk is? Such impeccable standards for conduct.
The kind of nonsense described in this story is exactly what I'd expect from an early stage software startup back in the early '00s, when I was first getting started (not in SV, mind). It was common to mock people and create memes for any reason (the Basecamp "foreign sounding name" kerfuffle very much reminds me of this).
This kind of culture was unacceptable back then, of course, but the founders and owners were little more than children themselves. It was at least understandable if not excusable.
What completely boggles my mind is that, some 20 years later, this kind of culture is still happening. And it's not happening at some small tech startup founded and run by young men barely out of school - it's happening at a wildly successful car company which has been around for over a decade run by the richest man in the world.
The culture at Tesla must be rotten to the very core for this to persist so long.
Amazon audits the use of the order database. As do banks for deposit information. As does Google for looking up the contents of someone's Drive, etc, etc.
We've come to expect privacy as a first class right. That's why turning a customer's potentially embarrassing action into an internally shared meme feels like an extreme violation.
> Tesla managers sometimes "would crack down on inappropriate sharing of images on public Mattermost channels since they claimed the practice violated company policy," Reuters wrote.
somewhat implies that this has been a known issue for a while, already. (This shouldn't be triggered exclusively by a public scandal.)
It's not like the memes were posted internally just today; firing people now is just a reactive response. The problem is that they let it go on the entire time before now.
I work in the industry. It's not a "vehicle" any more - it's a "platform" to "engage" with customers, i.e. sell your data and lock features behind ridiculous monthly fees. That is now the mantra of the automotive industry. It really kills any enthusiasm I have for new cars.
> It really kills any enthusiasm I have for new cars.
... and televisions, and phone apps, and solar inverters (with wifi!) and air purifiers (with wifi!) and drones (with wifi to activate) and scooters (to activate), and on and on...
Wishing I still had my '89 Caprice with physical buttons and switches. Parts were insanely cheap too (except the transmission, which is part of why I got rid of it).
This is the most depressing thing I’ve read all day. I have two trucks. A 2010, and 1998. I was going to sell the 2010 for redundancy, but I don’t think I can. The 2010 is in pristine condition, and should last another decade if I’m careful with it. I’ll have to acquiesce to the modern trend with cars eventually, but I hope to hold out as long as I can.
Current owner of a '14 i3. Sorry, but the i3 has a telematics module and a cell modem. It talks to BMW. The first gen ones only have a 3g modem though, so at least they are muzzled in areas that have dropped that network.
I think we'll start to see ICE-to-EV conversion kits become more popular over the next decade or so. So you may get your wish, just not with a brand-new car.
> Wake me up when there's an EV that is just a car and not an always-connected surveillance device on wheels.
This is an easy way to excuse Tesla's behavior. You're implying everyone does it. <shrug> What can ya do, right?
There's 0 evidence other car companies are allowing PII like this to be spread internally, and employees are so brazen as to use them as the basis for memes.
I'm saying exactly the opposite. I will never consider getting an EV as long as it's full-time tethered to the mothership and my car can be deactivated/bricked/broken remotely either on purpose or by accident. I want my car to be MY property, not a device obeying inputs from someone somewhere else.
And absolutely no surveillance built in. Until then I'll stick to gasoline cars.
Wake up. I'm told Tesla has an undocumented factory option to buy a car without radios, for heads of state and their security and suchlike.
I'm not sure how difficult or possible it is to actually get a car so configured, but you can always rip out the GSM radio yourself if you wish (which voids your warranty, natch).
When I was working at Uber, we had a case of a guy stalking his ex-girlfriend's every movement through the backend. They eventually found out, fired him, and stepped up auditing.
Didn't this happen at Google a few years ago? I recall something about a guy that was accessing his ex's Google data, like email, chats, contacts, and so on.
It may seem paranoid, but it is sane to expect that anything that is closed, and consequently not auditable for security and privacy, and that goes online over a network we have no control over (even a non-public one), will sooner or later be used for spying.
It's not paranoid, it's simple observation. Anything that can be abused by someone, will be abused by someone. The question is whether the benefits overshadow the potential for abuse. In this case, they don't.
The more places where networks capable of surveillance exist, the less privacy you have, and the less autonomy you have. This has been obvious for decades. It is not surprising. No one on HN should be surprised by either the surveillance capabilities of a smart car, or the assdouchery capabilities of people who work for... ahem. I, for one, took both as a given, and I don't even work in tech.
> Tesla staffed its San Mateo office with mostly young workers, in their 20s and early 30s, who brought with them a culture that prized entertaining memes and viral online content. Former staffers described a free-wheeling atmosphere in chat rooms with workers exchanging jokes about images they viewed while labeling.
> According to several ex-employees, some labelers shared screenshots, sometimes marked up using Adobe Photoshop, in private group chats on Mattermost, Tesla’s internal messaging system. There they would attract responses from other workers and managers. Participants would also add their own marked-up images, jokes or emojis to keep the conversation going. Some of the emojis were custom-created to reference office inside jokes, several ex-employees said.
Work environments should ideally be fun and collegial. But I just would've assumed that it's incredibly obvious that managers should not encourage a culture of memeing and shitposting when the job is so centered on private user images.
This is absolutely disgusting to hear. Can you imagine if Apple or Google employees were caught doing something like this with photos/videos taken with their products?
It's not linked in there, but the Wired writeup adds that (0) there were at least two SREs fired for something like this and (1) Google chose not to report this to law enforcement, seemingly prioritizing its public image over its users' safety.
Absolutely. Just like Amazon, Facebook, Uber, NSA, and other employees have done as well. The only reason Apple or Google employees might not do that is encryption and decent access controls, not because they are better people. It would not shock me in the least if a Google or Apple employee did something similar.
It might shock me a little just because of the risk/reward if they were just snooping on random people. But not if it was exes/crushes/friends/etc.
Facebook has had access auditing tools for at least 15 years at this point, and it only got stricter with time. It was already a fireable offence back then, too, to improperly access someone's data without a business justification.
From what I heard they did indeed have some LOVEINT problems before that i.e. an employee couple with a nasty breakup - but they were also orders of magnitude smaller.
Have you heard of Ring? Alexa? There are countless stories of the data from these devices being harvested by Amazon, and even sent to law enforcement to use against their owners.
Everything is human nature; that's why we have laws and regulations.
You shouldn't even be able to access prod data as a simple employee, especially at a payment company.
Right, PayTM?
(BTW - that's some "colleague" you have).
I still remember when Blackberry had to pull out of India because the government wouldn’t let them operate a secure, encrypted messaging service.
I wouldn't be willing to bet on it.
- https://en.wikipedia.org/wiki/Foxconn#Controversies
- https://en.wikipedia.org/wiki/Hyundai_Motor_Manufacturing_Al...
etc.
Tesla is neither a private company, nor a startup.
But, your point is still valid.
And you shouldn't.
You also shouldn't trust established companies to keep your data safe. The track record of that sort of thing is absolutely dire.
I'm guessing it's the former, but I'm just too paranoid to not ask...
Zuck: Yeah so if you ever need info about anyone at Harvard
Zuck: Just ask
Zuck: I have over 4,000 emails, pictures, addresses, SNS
[Redacted Friend's Name]: What? How'd you manage that one?
Zuck: People just submitted it.
Zuck: I don't know why.
Zuck: They "trust me"
Zuck: Dumb fucks
How about not willingly providing information to people that actually don't NEED it?
Edit: I'm responding specifically to this: "The issue is that there doesn't really seem to be a way to substantiate your anecdote."
No, that's not “negligent”. Or even “reckless”; the violation of privacy is deliberate.
No, that's not just “negligent” or even “reckless”, it's intentional wrongdoing.
I'd be pretty surprised if any of the cameras routinely stream. Too much of a bandwidth hog.
What?
https://devblogs.microsoft.com/azure-depth-platform/time-of-...
Is this true? That's yet another reason to avoid buying newer cars.
On paper. But there are plenty of examples of companies doing things that they shouldn't be doing.
Given that point, doge is an avatar of stupidity personified (caninified?) and so is a good fit for Twitter.
> Privacy is taken extremely seriously here in Germany.
Yes, to the point where nothing can get done anymore and we have, e.g., an outdated, non-digitalized healthcare system.
Tesla is doing it wrong. But Germany is also doing it wrong - just on the other extreme.
I find this hard to believe considering their numerous lies about their car emissions and the extent they went to cover it up.
Why? Do you feel human nature has evolved in the span of 20 years? Or more money in the industry begets professionalism?
Well, Amazon doesn't like unions either.
Which is terrifying to me; they have to be a top company in that aspect, and I don't even fully trust them to get it right.
The little guy, unfortunately, has no chance :/
Reminds me of the Uber Christmas party story where they had things like this on a projector
Yes, OnStar, but that's 1) not specific to EVs, and 2) trivial to disable.
So it has the capability to connect? No thanks. I don't think that's the distinction OP was trying to make.
Dacia Spring if you live in Europe. Doesn't get any more barebones than that.
Just a normal car, but electric.
We have an Audi etron now and I can’t stand it. I miss the bolt.
What is wrong with the Audi?
I have a European-made EV that has no "intelligent" (surveillance) capability at all.
A bare bones, maximum efficiency, rugged EV.
A man can dream.
The absence of evidence isn't evidence of absence. Even so, there are documented security issues with most connected vehicles.
https://www.youtube.com/watch?v=YKDqhWwzzgo
https://www.theverge.com/2014/11/19/7245447/uber-allegedly-t...
You might say the problem was in the founding DNA: https://www.bbc.com/news/technology-40352868
a fish rots from the head down.
update: Found one of them,
https://news.ycombinator.com/item?id=1692754 ("Google fired engineer for breaking internal privacy policies" (2010))
https://www.wired.com/2010/09/google-spy/ ("Ex-Googler Allegedly Spied on User E-Mails, Chats" (2010))