mbid · 8 years ago
> Protecting people’s information is at the heart of everything we do

Thanks for striking down the bad guys, Facebook, our ever vigilant guardian of personal information.

nathanaldensr · 8 years ago
That's exactly what I wanted to say. Talk about the fox guarding the hen house... none of these parties--Facebook included--can be trusted with any data. They're all out to make a buck with it!
Gibbon1 · 8 years ago
> They're all out to make a buck with it!

It's not even that, though. If someone is reselling your company's data to a third party without authorization, you put both the company and its key people on the persona non grata list. Large, well-run corporations and a lot of smaller companies keep blacklists like that, and they use them.

paulddraper · 8 years ago
I 100% believe in the fox to guard the hen house...from the other foxes.

No point in sharing dinner.

netsharc · 8 years ago
It's a fucking joke. Their way of protecting is to say, "app developer, you're not allowed to do these things." It's like a bank that leaves people's money in a field and "protects" it by putting up a sign that says, "dear visitors, you are not allowed to steal this money."
driverdan · 8 years ago
The best way for them to maintain their walled garden is by restricting 3rd party access to the data. This isn't to protect users, it's to maintain and improve their wall.
r00fus · 8 years ago
You don't remember all their silly breaches and lapses of security from about a decade ago, do you?

Or are they reformed now?

mirimir · 8 years ago
They do want to protect its profitability for themselves ;)

And politics aside, this is analogous to Craigslist vs 3Taps, Radpad and so on.

almostApatriot1 · 8 years ago
well, you need to have it in order to protect it
oh-kumudo · 8 years ago
It's as weird as debating which way of killing a lobster is more humane.
Simon_says · 8 years ago
"humane" does not mean "human".
killjoywashere · 8 years ago
Excuse me, I think you dropped this -> "</s>"
Stratoscope · 8 years ago
Just as a great comedian doesn't laugh at their own joke, great sarcasm is best served without a sarcasm tag.
noobermin · 8 years ago
I think Poe's law isn't that absolute. Given HN style, I could tell his post was sarcastic.
neuronexmachina · 8 years ago
Some additional context on Cambridge Analytica. I'm guessing Dr. Aleksandr Kogan (who also had his FB account suspended) was Cambridge Analytica's (and the Trump campaign's) source of data on individual Facebook profiles:

https://www.theguardian.com/technology/2017/oct/26/cambridge...

> Cambridge Analytica used its own database and voter information collected from Facebook and news publishers in its effort to help elect Donald Trump, despite a claim by a top campaign official who has downplayed the company’s role in the election. ... In another case, in the late stages of the November election, Schweickert said the company acquired data on voters who voted early – data it collected from local counties and states – and linked the information to individual Facebook profiles.

kbaker · 8 years ago
I urge everyone to watch this video from Cambridge Analytica about their techniques with big data and psychographic profiling leading up to the 2016 election. Very dystopian, it almost feels like a seminar with a Bond villain. Quote from the video:

> "we were able to form a model to predict the personality of every single adult in the United States of America."

https://youtu.be/n8Dd5aVXLCc

Edit: also this article is a good read about how they used Facebook likes to build up profiles. Also contains a summary of the video: https://motherboard.vice.com/en_us/article/mg9vvn/how-our-li...

noobermin · 8 years ago
So, I'm far to the left of most people and no fan of Trump, but I watched that video and got the sense I usually get from reading about ML/big data: people seem to be selling these tools as more than they actually are. It's almost intuitive, and in any case it's what people who study market research already know: you tailor your message to your audience. Even on a personal level, anyone who has navigated real life with real people knows that you have to communicate with people in the way they'll be most receptive to. Nowhere are they "planting" ideas in people's heads, nowhere are they brainwashing people; they are merely delivering the same message (for example, defend the 2nd Amendment) to people who might already agree, in a way that they'll receive it. The Big Five personality traits are an already well-known and documented idea, and even without that, in day-to-day life you learn what people's personalities are and tailor how you communicate with them anyway.

What they do seem to bring to the table, however, is the massive amount of personalized data. That is the issue here, because that data can be abused by nefarious actors if it falls into the wrong hands. What usually takes a much more personal touch (going out to voters and talking to them, or having teams on the ground who talk to voters and tell you before your speech what they care about and why) can be done en masse by aggregating private data that users "agree to" obliviously or never consent to at all. That is the issue at hand, not that Cambridge somehow warped Wisconsin voters' brains to mush and marched them to the polls.

YeGoblynQueenne · 8 years ago
>> "we were able to form a model to predict the personality of every single adult in the United States of America."

That would be dystopian if it weren't so ridiculous. They can do nothing of the sort: predict the personality of any given adult in the USA. At best they might be able to make predictions on a population basis, but anything more precise than that is out of the question. Behaviour prediction on an individual scale is pure, unadulterated fantasy.

This is just typical overselling of a service by people who can use maths to obfuscate the fact that they're making it up as they go along, targeted at people who wouldn't understand the maths if a five-year-old explained it to them.

mietek · 8 years ago
> a seminar with a Bond villain

Incidentally, the psychologist mentioned in the FB statement appears to have changed his surname to Spectre.

https://www.psychol.cam.ac.uk/people/ak823%40cam.ac.uk

alexdong · 8 years ago
The video above is from seven weeks before the election; this one is from after the result, a sort of review of the "success":

https://www.youtube.com/watch?v=6bG5ps5KdDo

bitL · 8 years ago
> form a model to predict the personality

I can build a model of personality using ML/DL anytime as well; it's a piece of cake. How good the model is is another question. I'd be surprised if they were significantly better than random noise, given how pathetic the "science" of psychology is.
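For what it's worth, the fitting step really is trivial. Here's a minimal sketch in NumPy on purely synthetic data (all numbers invented: the "trait" is generated from the likes by construction, so the good fit says nothing about real-world predictive power — real studies regressed actual Big Five survey scores on millions of real like records):

```python
# Toy sketch: "predicting personality from likes" is, mechanically,
# just a regression from a user x page-like matrix to a trait score.
import numpy as np

rng = np.random.default_rng(0)
n_users, n_pages = 1000, 50
likes = rng.integers(0, 2, size=(n_users, n_pages)).astype(float)

# Synthetic "openness" score: a linear function of likes plus noise.
# (Invented for illustration; real trait scores come from surveys.)
true_w = rng.normal(size=n_pages)
openness = likes @ true_w + rng.normal(scale=0.5, size=n_users)

# Ordinary least squares: recover weights mapping likes -> trait score
w_hat, *_ = np.linalg.lstsq(likes, openness, rcond=None)
pred = likes @ w_hat
r2 = 1 - ((openness - pred) ** 2).sum() / ((openness - openness.mean()) ** 2).sum()
print(f"R^2 on synthetic data: {r2:.2f}")
```

The fit is near-perfect here only because the data was generated to be linear in the likes; the hard part of the real work is whether such a model generalizes to actual people.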

natestemen · 8 years ago
Wow everything about this guy just gives me the chills.
abakker · 8 years ago
Honest question: is this just masked preparation for GDPR?

It looks to me that CA may have gathered and shared FB data, and FB has suddenly realized that they have a GDPR violation on their hands.

IANAL, but I've seen articles suggesting that people will need to re-consent to the use of their data. I could see that being a problem in this case, where consent was never given in the first place; FB would also have to be able to report on how that data was being used.

Specifically, it looks likely that CA's data violates GDPR Article 9 section 1 - https://gdpr-info.eu/art-9-gdpr/ - completely.

GDPR fines for a company of Facebook's size would be substantial. If a violation were deemed to fall in the upper tier, the absolute maximum fine would be 4% of FB's annual worldwide revenue; on 2017 revenue of roughly $40 billion, that's about $1.6 billion USD.
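The statutory cap is simple arithmetic; a quick sketch (the revenue figure is illustrative, and currency conversion between EUR and USD is ignored):

```python
def max_gdpr_fine(annual_turnover: float) -> float:
    """Upper-tier GDPR cap (Art. 83(5)): the greater of EUR 20M
    or 4% of total worldwide annual turnover of the preceding year."""
    FLAT_CAP = 20_000_000  # EUR 20 million floor for the upper tier
    return max(FLAT_CAP, 0.04 * annual_turnover)

# Illustrative: Facebook's 2017 revenue was roughly $40.65B
print(max_gdpr_fine(40_650_000_000))  # 1626000000.0, i.e. ~$1.6B

# A firm with $100M turnover still faces the EUR 20M floor
print(max_gdpr_fine(100_000_000))     # 20000000
```

Note the "whichever is higher" structure: for small companies the flat EUR 20M cap dominates, while for Facebook-scale revenue the 4% term does.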

ngcazz · 8 years ago
Fast-track GDPR then inject the fine money into the NHS with compliments from the EU.
tener · 8 years ago
Is GDPR retroactive? I doubt it.
jjp · 8 years ago
Yes, in the sense that it takes no account of when the data was captured or acquired. And there is a trickle-down to anyone you have passed the data on to.
harry8 · 8 years ago
Dump facebook. Just do it.

Make your final posts informing your friends how to contact you by email. Follow up with a reminder or two over the coming weeks and delete it.

It will be the best feeling you ever had from doing something with your facebook account.

matt_wulfeck · 8 years ago
And after you dump facebook, invest in creating a strong social network in real life.
rybosome · 8 years ago
Since early 2017, I've stopped posting, stopped scrolling mindlessly through the news feed, stopped giving a shit how many likes the things I say or appear in garner, and stopped subjecting myself to the collective fear and anger of my bubble. It's been great! On the rare occasion that I need to go on (a friend has invited me to something and the details are on Facebook), I can get in and out without getting sucked in. Given that, I have no motivation to actually delete my profile, but I also don't want to be a regular user ever again.
jf · 8 years ago
> It will be the best feeling you ever had from doing something with your facebook account.

I second this, deleting my Facebook account was the best decision I made in 2017

fuzzyoneuk · 8 years ago
I've not been on FB for the past 3-4 years, the absolute best decision I have ever made: no more trying to decrypt cryptic statuses from people you have no interest in, and 100% fewer pictures you couldn't care less about.
YeGoblynQueenne · 8 years ago
>> In 2015, we learned that a psychology professor at the University of Cambridge named Dr. Aleksandr Kogan lied to us and violated our Platform Policies by passing data from an app that was using Facebook Login to SCL/Cambridge Analytica, a firm that does political, government and military work around the globe. He also passed that data to Christopher Wylie of Eunoia Technologies, Inc.

So basically FB's problem is that Kogan passed the data to third parties without FB's knowledge and, I guess, without FB being in on the deal. Because FB's whole business model is to hoover up its users' data and sell it to "third parties".

Third parties who may then do with it whatever they like, without users having any control over it. You know- like Kogan just did.

FB is trying to pretend they're the responsible party here: "protecting people's information" is what they do, they say. Well, no it isn't. Trading people's information is at the heart of everything they do. And this is just one more example of why that is so harmful.

rogerb · 8 years ago
Someone please correct me, but I don't think FB sells data. They use it for their own purposes, and customers of their advertising platform can use it to target groups, but I don't think they sell their data to third parties. Am I wrong?
jimsmart · 8 years ago
My understanding is that Facebook act as the 'ad broker' themselves, as opposed to selling the data on to third-party ad brokers.
SheinhardtWigCo · 8 years ago
Nobody outside of Facebook has any way of knowing. That’s the problem.
aslkdjaslkdj · 8 years ago
Story from March 2017 about this:

https://theintercept.com/2017/03/30/facebook-failed-to-prote...

>In late 2015, the turkers began reporting that the Global Science Research survey had abruptly shut down. The Guardian had published a report that exposed exactly who the turkers were working for. Their data was being collected by Aleksandr Kogan, a young lecturer at Cambridge University. Kogan founded Global Science Research in 2014, after the university’s psychology department refused to allow him to use its own pool of data for commercial purposes. The data collection that Kogan undertook independent of the university was done on behalf of a military contractor called Strategic Communication Laboratories, or SCL. The company’s election division claims to use “data-driven messaging” as part of “delivering electoral success.”

>Shortly after The Guardian published its 2015 article, Facebook contacted Global Science Research and requested that it delete the data it had taken from Facebook users. Facebook’s policies give Facebook the right to delete data gathered by any app deemed to be “negatively impacting the Platform.” The company believes that Kogan and SCL complied with the request, which was made during the Republican primary, before Cambridge Analytica switched over from Ted Cruz’s campaign to Donald Trump’s. It remains unclear what was ultimately done with the Facebook data, or whether any models or algorithms derived from it wound up being used by the Trump campaign.

>In public, Facebook continues to maintain that whatever happened during the run-up to the election was business as usual. “Our investigation to date has not uncovered anything that suggests wrongdoing,” a Facebook spokesperson told The Intercept.

>Facebook appears not to have considered Global Science Research’s data collection to have been a serious ethical lapse. Joseph Chancellor, Kogan’s main collaborator on the SCL project and a former co-owner of Global Science Research, is now employed by Facebook Research. “The work that he did previously has no bearing on the work that he does at Facebook,” a Facebook spokesperson told The Intercept.

astronautjones · 8 years ago
The Intercept is so wildly underrated (and unfairly disparaged). Vital reporting.
username223 · 8 years ago
Wow. From the laughable shell companies, to the use of underpaid contractors for questionable deeds, to the "we can do whatever we want" terms and conditions, to hiring one of the chief perps, these people never cease to amaze.
NotSammyHagar · 8 years ago
It's too easy for this information to get out. There's basically no penalty: you just use it as you want, and years later Facebook cuts you off? So you start a new company, owned by a new LLC, get the lawyers to sign the 'i am not evil' doc, and repeat.

There's no way this will have any impact until there are criminal charges against the people involved. In this case, because it's the US, there's not going to be anything criminal. Facebook, you need to sue companies that do this into oblivion. You are rich and can stand up.

In Europe there's GDPR; the US has no defense. Remember, the Chinese and Russian governments (probably) were the ones that hacked the agency holding all the security-clearance applications. Basically, everyone but average Americans has access to US government employees' private information. There must be some blackmailing going on.

jimsmart · 8 years ago
GDPR will most certainly reach into US businesses as well as EU businesses. Any business collecting data on EU users will be subject to the GDPR.

https://www.forbes.com/sites/forbestechcouncil/2017/12/04/ye...

Though as the article points out: "There are still questions about how the EU will enforce these actions against U.S. and other multinational companies [...]"

bklaasen · 8 years ago
European companies that exchange PII with US-based companies in order to provide features to their customers and value to themselves are terminating those contracts and turning to EU-based companies.

Even though many US-based companies are attempting to comply with GDPR, European companies which use their services aren't prepared to take the risk of being in breach of GDPR.

ams6110 · 8 years ago
Here's the thing. When you give your data to Facebook or anybody else you no longer control it. Laws, policies, user agreements, none of that really matters. You don't control your data once it is no longer in your possession.
noobermin · 8 years ago
I don't know if it really is in FB's best interest to sue them. It might in fact be in their interest to let these players come back, because that would benefit their bottom line.
NotSammyHagar · 8 years ago
Eventually fb will suffer enough about this kind of stuff that it will hurt their rep. I hope.
NotSammyHagar · 8 years ago
Why am I getting downvoted? Isn't it true that it's too easy to do this? The US government wasn't hacked? See https://www.washingtonpost.com/news/federal-eye/wp/2015/07/0....
NotSammyHagar · 8 years ago
Now my downvote question is getting downvoted. It's very meta. Let's see if my downvote-downvote question gets downvoted.
kevingadd · 8 years ago
This is an especially egregious case because it was well-known a long time ago that Kogan extracted a lot of really sensitive user data via the Facebook APIs that could be used for precision ad targeting. It says a lot about FB that they took this long to take any significant action other than getting a pinky promise from multiple companies that they deleted the data.

Even if they deleted the data, they could have retained information generated from that data, which would let them effectively abuse the original data for ad targeting, which they did years ago. FB had the opportunity to know this long before now.

It's bewildering that they knew this huge amount of sensitive data got into the hands of unauthorized third parties and were willing to treat assurances as a sufficient remedy. At the very least, all of their ad targeting should have been carefully vetted, but it seems ridiculous to let these third parties continue operating on the service when they had already demonstrated a willingness to blatantly violate the FB ToS and use unethical tactics for ad targeting.

In practice FB users aren't informed enough to know what this means or care about it, but missteps like this really demolish any argument that FB cares about user privacy or ToS enforcement. They had a huge amount of time to realize this was happening and take action on it. At this point it seems unlikely that CA and SCL are the only companies doing things like this - it's not exactly a secret that these techniques are effective. If they wanted to make it clear to third parties that this wouldn't be tolerated, they should have cracked down years ago.

maxerickson · 8 years ago
Facebook employees worked with the campaign to help with their digital strategy. Directly.

https://www.politico.com/story/2017/10/26/facebook-google-tw...

I guess there would still be an advantage to targeting ads even more directly (using RNC voter data combined with the FB profiles).

xienze · 8 years ago
> It's bewildering that they knew this huge amount of sensitive data got into the hands of unauthorized third parties and were willing to treat assurances as a sufficient remedy.

What exactly should they have done? It’s data, once someone has it, game over. It’s like trying to unring a bell.

thawkins · 8 years ago
Mandatory warning notices on all FB pages, detailing how your privacy can be compromised and your democratic rights infringed by using FB, in the same vein as the warning notices on cigarette packs or medication packaging.
thawkins · 8 years ago
They could at least have made a cursory attempt to bolt the stable door nonetheless.

I see no evidence that they have made any efforts to prevent a recurrence.