I'm ashamed to say that up until now, when I saw Facebook (or similar) acting evil I thought about the quote, "Never attribute to malice that which is adequately explained by stupidity".
Well, fuck that and fuck me, those people are not idiots, they're criminals.
Not to contradict your overall sentiment, but people often focus on intention with these issues. I think that's wrong.
Like, if someone treads dog poo into my house on their shoes, it doesn't really matter if they did it by mistake, or spent ages walking around town trying to find some dog poo to step in before coming to my house; the effect is that there is now dog poo on my carpet.
We need to be more dispassionate when discussing these issues because otherwise threads like this descend into analysis of whether Zuckerberg/Bezos/whoever is a moral person. Which is (a) probably unknowable and (b) beside the point.
There is a problem here with a very big company that has more power than it knows how to handle, which can probably only be mitigated by breaking it up. That's all there is to it, really.
[edit] Just as an addendum, that's not to say that if the company has done something illegal, the people responsible shouldn't be prosecuted; they should.
Sorry, but I think you're being too hard on yourself, while also being inappropriately diplomatic toward Facebook. I mean, crimes happen all the time. I'm a criminal: I speed sometimes, I often jaywalk. I would say they are malicious, and have forfeited whatever position as a trustee of data that people ever granted them. What do you do with a trustee who puts their interest before the beneficiary's? You fire them, of course. But who is the beneficiary, really? At least in their public vernacular it's the user, even though better-read individuals know the beneficiary is the trustee - one and the same.
Facebook is perhaps not out of the ordinary really, it's just another of a variety of businesses who have come to realize they hate having users. They just need their data. Vampires don't want human friends either, they just need their blood.
That gets me every time I read in the media about a 'bug' that enabled some complex data transfer. People, somebody has to decide, plan, code, test and deploy the behaviour, and make sure it works. And yet, people seem to accept the explanation without much thought.
Except the quote is generally wrong when it comes to issues of profit. It's simply backwards. People are fucking evil when it comes to money, profits, power, status, pleasure, and generally getting ahead of others. Considering those are the main motivators for almost all human actions, Hanlon's razor is clearly a bunch of bullshit that people spread around to end actual debate and discussions they don't like.
I think that is the perception that needs to change; some actions simply can't be explained with "move fast and break things".
Some actions cross into criminal territory, like tricking users into handing over passwords and phone numbers and then using them for other purposes to further their own agenda. That could indeed be seen as criminal, I think.
I’d prefer to say negligence than stupidity. Being stupid isn’t something you can avoid, but being negligent is something within their power to avoid and is a legitimate target of legal sanction.
Come on, you knew facebook was evil. Most people know facebook is evil. We choose to ignore it because otherwise we would feel obligated to change. Change would come at the expense of "likes" and other modes of social validation.
People are realizing that validation from social media is worth less and less. (Let's remove the likes on IG!!) The people coming around now are sheep looking for the next wave of validation.
If you're a small business, I get it: it's good to explicitly whitelist whatever information you're logging, but for smaller teams a hard-to-diagnose issue might lead the team to "log everything so we can sort it out later". Facebook is Facebook; whether this decision was the product of the corporation as a whole, a small dev team, or a highly paid consultant/third party, Facebook is a big enough company that they don't have the "I didn't realize..." excuse anymore.
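An explicit whitelist of loggable fields is cheap to enforce even on a small team. As a rough sketch (the field names and event shape here are hypothetical, not any real schema):

```python
import json
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("signup")

# Explicit allow-list: only these fields may ever reach the logs.
# The names are illustrative assumptions, not a real schema.
LOGGABLE_FIELDS = {"user_id", "action", "status_code"}

def log_event(event: dict) -> dict:
    """Log only allow-listed fields; everything else is silently dropped."""
    safe = {k: v for k, v in event.items() if k in LOGGABLE_FIELDS}
    log.info(json.dumps(safe, sort_keys=True))
    return safe

# A payload that accidentally carries a credential never leaks it:
log_event({"user_id": 42, "action": "login", "password": "hunter2"})
```

The "log everything, scrub secrets later" alternative fails open whenever a new credential field appears; an allow-list fails closed, which is the property you want for confidential data.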
Which is the NUMBER ONE MOST BASIC rule of handling users confidential data. It reflects incredibly poorly on the engineering practices of Facebook that this managed to get through. It should be a criminal liability.
That's still a pretty elementary error for a company that gets off on using CS-y riddles in interviews like Google used to. Move fast and break things, and then get a $5B fine.
The first thing I did after setting up Jenkins for our small dev shop was to find, install, and test a plugin that removes all kinds of credentials from the logs. And I'm just a developer; I never claimed, nor think, that I have any devops or administration qualifications - just common sense.
If Facebook devops engineers, who are probably among the best trained and highest paid on the planet, lack the same common sense for their users passwords... I can't even come up with an ending to this sentence that would properly express my emotions right now.
So, it's not plaintext if it's in logs? Storing passwords in logs is even worse than in a database, since access is likely less restrictive. Every programmer worth their salt knows about this problem (I've certainly written code to prevent it).
Sorry, but it is like saying "there is no SQL injection, only bad input validation".
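For illustration, a scrubber in the spirit of the credential-removing Jenkins plugin mentioned above could be a logging filter that masks anything credential-shaped before it is written. This is a minimal sketch under assumed field names, not the behavior of any particular plugin:

```python
import logging
import re

# Deny-list of credential-looking keys; the names matched here are
# illustrative assumptions, not an exhaustive or authoritative list.
SECRET_PATTERN = re.compile(
    r'("?(?:password|passwd|token|secret)"?\s*[=:]\s*)("[^"]*"|\S+)',
    re.IGNORECASE,
)

class RedactingFilter(logging.Filter):
    """Masks credential-looking values before a record is emitted."""
    def filter(self, record: logging.LogRecord) -> bool:
        record.msg = SECRET_PATTERN.sub(r"\1[REDACTED]", str(record.msg))
        return True  # keep the record, just scrubbed

handler = logging.StreamHandler()
handler.addFilter(RedactingFilter())
log = logging.getLogger("app")
log.addHandler(handler)

log.warning('login failed for bob, password="hunter2"')
# the emitted line reads: login failed for bob, password=[REDACTED]
```

Note the weakness the parent comments hint at: a deny-list only catches the patterns someone thought to write down, which is exactly why "log everything" plus scrubbing is riskier than not capturing secrets in the first place.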
> 2) Using that email access to "inadvertently" upload the information of their email contacts.[1]
If they wanted to show proper contrition, they'd not only delete that information from their systems, they'd also remove all links in the social graph that can in any way be a result of that information. They'd also take 100% of the revenue that was even slightly influenced/generated by the inappropriately gathered data and send it out to the users whose data was copied without permission.
Also they should probably put together a legal team who will be ready to start handling the $150,000 per copied item that they didn't have the right to copy. :lol
In 2008-2010 companies had the chance to harvest millions of emails and other personal data. The opportunity is not likely to come back this easily.
Shutting down these options, even if it is the right thing to do for the users, effectively raises barriers to entry and slows the growth of new companies. The result is that the incumbents will maintain a dominant position in the market.
If I understand this right, #3 happened and was (allegedly) resolved before #1 and #2 happened, so that's not a fair shake. #1 and #2 are still valid points.
Is there anything left I can do to escape Facebook? This is just depressing for someone who was in college when FB came out and never signed up. Nor any other social network (still a favorite oxymoron) since.
What kind of a profile of me would they be able to glean from email? Whole thing feels so fucking invasive.
>> “When we looked into the steps people were going through to verify their accounts we found that in some cases people’s email contacts were also unintentionally uploaded to Facebook when they created their account,”
I'm struggling with the "unintentionally" part and, to play devil's advocate, trying to figure out if it could be possible. Perhaps they already had a service that performed the login to upload people's contacts on request, and to save time they reused this service to also verify the account?
> Every insider I've spoken to said any FTC fine on Facebook under $10 billion would be seen as a massive win, showing firm won't face serious consequences for privacy violations. Wall St. seems to agree in response to news of $3-$5b settlement [$FB up 4% after hours]
This is a very strange way to put it. More likely, the fine was priced in with some level of uncertainty, and then, with the fine having been decided, the market corrected relative to its estimates.
Is this actually meaningful to any company that isn't IPOing or releasing new shares? I am not pro-facebook by any means but it's not clear this means much aside from how much their investors like them, again of questionable value as the majority of votes are privately held by Zuckerberg.
This is why we need GDPR in the US. Make the fine like 5% of global annual profit (not sales). Something that will hurt and make them actually think it's not worth it and actively avoid it rather than just the cost of doing business and a snicker as they walk away.
> will not acknowledge any wrongdoing as part of the settlement
This bugs the heck out of me (in general, not specific to this case). What is the point of letting them claim innocence? How does this benefit the consumer?
I can see occasional exceptions where it's clearly a case of misunderstandings so you don't want to bring down the full hammer...but I honestly can't remember more than one such case where someone DID acknowledge wrongdoing.
Those insiders may just be lowering expectations, preparing their audience to consider rather bad news as a “massive win”.
After all, the insiders most familiar with the matter are those deciding how much to set aside for its eventual resolution. There are rules for how to account for the inherent uncertainty, and massively underestimating the loss would just set them up for new trouble, i.e. a shareholder lawsuit.
A $10 billion fine for the Cambridge Analytica "scandal" would be so far beyond reasonable as to merit criminal charges against Joseph Simons. $3 billion is already insanely ridiculously high.
Before this thread becomes a Facebook bashing session, please keep in mind that Equifax leaked all your SSN data along with names and addresses and got away with no fines.
I am not a lawyer, but intent appears to play a big role. A company that is negligent or incompetent will always face lesser repercussions than one that acts deliberately.
Now of course this is not to exonerate Equifax whose entire premise rests on safeguarding sensitive information. From the consumer side, the 2 incidents are equally bad.
Not that I completely disagree with you, but let's not pretend that leaking SSNs is equivalent to possibly leaking emails and passwords. One of those is far more important than the other. There was also tons of evidence of insider trading by Equifax execs. If anything, the Equifax affair proved to me that the American public doesn't care about privacy, and so neither should investors.
> whose entire premise rests on safeguarding sensitive information
Is this a joke? You are not Equifax's customer and they do not need your trust. Their entire premise is selling information about you, to people who do not trust you. Securing your data is something they have to do for compliance, not a core part of their business.
Equifax did not violate the terms of a prior FTC settlement agreement in order to avoid punishment for earlier bad behavior. It's a lot easier to expedite penalties when a written settlement is being violated than to punish the original bad acts.
The Equifax matter is far from over, they are being investigated by: 48 state Attorneys General offices, the District of Columbia, the FTC, the CFPB, the SEC, the Department of Justice, other U.S. state regulators, certain Congressional committees of both the Senate and House of Representatives, the Office of the Privacy Commissioner of Canada, and the U.K.’s Financial Conduct Authority.
Not to mention they made a recent SEC filing acknowledging they expect fines from FTC and CFPB.
Precedent matters - especially for government inquiries and fines.
Also, people use FB by choice today while something like Equifax is forced upon us given institutional structures. So it is unclear why Facebook is more wrong than Equifax.
It seems that, besides "the bug bounty is far less than it would have sold for on the dark market", another meme has taken hold of HN. No matter how significant the fine is, there will always be people clamoring about how trivial it is compared to the company's revenues. It could have been $10B and you would hear the same thing. The only fine worth slapping, then, is one that bankrupts the entire company.
Facebook made ~$7B net income in 2018. Am I supposed to believe that $5B fine isn't going to affect them at all?
This fine is not for everything unethical Facebook ever did. This fine is for Facebook giving data to Cambridge Analytica. I don’t think Facebook got paid for that at all. When trying to google around for what people paid CA for the use of the data, I found numbers in the hundreds of thousands.
So in fact, Facebook is paying billions because they made a bad decision when choosing to grant access to data for research. That’s a lot of money. I think if you’d asked what a likely fine was when the story broke, people would have guessed a few million dollars.
The relevant privacy issues (Cambridge Analytica) did almost nothing for facebook. The core product, sharing information with apps so they could be social, was a bad idea and never took off. It was shuttered in 2014 (a little bit after Kogan made his app and extracted a bunch of data). So of that 172B in revenue, basically none of it came about as a result of the product that Cambridge Analytica (and others) used.
> Facebook earned about $172 billion since 2013. Do you think facebook would have earned $167 billion while behaving legally?
Considering the fine seems to be primarily based on something they stopped doing in 2014, yeah, I think they'd have easily earned $167 billion without doing it.
>Facebook earned about $172 billion since 2013. Do you think facebook would have earned $167 billion while behaving legally?
>5 billion is ... an order of magnitude off
So you're saying 30% of their income came as the result of illegal activity? That's a pretty strong statement to make. What proof do you have of that?
Do you want to punish Facebook for doing things that are legal (but arguably not ethical) or do you want to punish them for doing things that are illegal? The former means you're essentially advocating for an ex post facto law and it's easy to understand why some people would see that as not fair.
The fine should be high enough to nullify any profit they made from their malfeasance, plus some amount as punitive damages to discourage the behavior in the future.
Finding the right fine for actions like this is hard.
On one hand, we (as a society) want it to hurt, to cause the company real pain. On the other hand, we don't want to actually destroy the company. This leads to a "cost of doing business" problem, where evil practices become a preferred path for management, if evil is sufficiently profitable.
The underlying problem is that corporations are NOT people, whatever the law says, and the corporation itself has no inherent ethics or morality. The fire doesn't choose to burn the forest, it just does.
edit: I suppose the problem in part is society. We see a beauty and symmetry in capitalism, so we assume beautiful == good, a philosophical failing going all the way back to the ancient Greeks. Just because it's beautiful and useful doesn't mean it's "good" in a moral sense.
The size of the forest fire matters. Facebook is probably the first company in history that can cause planet wide forest fires at scale and speed never possible before.
That said, I think they finally get that, after a long period of total denial. Not because of the fines and regulatory action, which have come too late, but just from all the unintended, unignorable consequences that have piled up.
Yeah. The loss of long-term customers over privacy issues, addiction issues, and general creepiness is probably a greater concern to them than a fine.
I quit using Facebook because I realized it's actively unhealthy for me, and I've been "healing" since. They don't want the trickle of middle-class tech nerds leaving the platform to become a flood.
It would be bad for investors, but... would it be that bad overall? I guess a replacement could theoretically be worse but they would have to commit to that ideal and not be dissuaded by the fate of their predecessors
It's a fair question, something we should be asking ourselves as a society.
But as I suggested earlier, we're still stuck with a society where a lot of people, perhaps even a majority, find corporate capitalism to be morally righteous, not merely elegant and effective. Corporate life is more sacred than human life in our world, sadly.
"The Silicon Valley company and the F.T.C.'s consumer protection and enforcement staff have been in negotiations for months over a financial penalty"
Honest question: What gets negotiated in this kind of settlement? What leverage does Facebook have in this situation to say, "no, that fine is too high"?
> Now the FTC is facing a potentially big delay and a risk of no fine
The risk is much higher than that. There's a risk that a court rejects the entire justification for the fine, greatly reducing both the FTC's ability to impose future fines and its power in general.
If Facebook is being charged this much for their (pretty serious) privacy issues, Equifax deserves to be shut down and have their lead executives face jail time.
I'm not defending Facebook, but I feel like whenever a huge financial institution does something, they're treated as a protected class, while other companies get hit pretty hard (justifiably).
1) Prompting users to give Facebook their email passwords.[0]
2) Using that email access to "inadvertently" upload the information of their email contacts.[1]
3) Storing said passwords and others in plaintext. [2]
It's pretty impressive that a company could do something so brazenly malevolent and be confident that they will escape with no more than a fine.
[0] https://www.thedailybeast.com/beyond-sketchy-facebook-demand...
[1] https://www.theguardian.com/technology/2019/apr/18/facebook-...
[2] https://krebsonsecurity.com/2019/03/facebook-stored-hundreds...
My response to that quote is, to paraphrase Arthur C. Clarke: "Sufficiently advanced stupidity is indistinguishable from malice."
And someone has to be held accountable.
I find that the people who use that quote are most often both.
in logs.
... for seven years.
You're under the impression that regulators are in the business of bankrupting companies.
All the instances you cited of Facebook's wrongdoing are not worth a $50 billion (!!) fine. They just aren't.
[I meant this as a serious observation; LinkedIn was sued but I believe they only got a mild slap on the wrist]
https://twitter.com/lhfang/status/1121148735818358784
If the settlement happens, I imagine this would provide some weight behind any class action lawsuit.
https://www.ftc.gov/news-events/press-releases/2011/11/faceb...
What about ethically?
How much do you think Facebook would have earned if it acted responsibly?
Once you add punitive damages, I don't understand how you could possibly think $5 billion is anything less than an order of magnitude off.
No, they didn't, it's around $56.3 billion [0].
[0] https://www.macrotrends.net/stocks/charts/FB/facebook/net-in...
https://finance.yahoo.com/quote/fb?ltr=1
"No, that fine is too high. We'll settle this in court(s) since we think we can do better."
Now the FTC is facing a potentially big delay and a risk of no fine if the courts eventually hand Facebook a favorable decision.