sayhar · 4 years ago
Oh hi! That’s me! (And a bunch of other folks). Happy to answer any questions.

I also really like this founders note that we wrote. It’s in our own words, etc: https://integrityinstitute.org/founders-letter

We are recruiting new members! If you work on integrity / trust and safety / antispam / content quality / etc, let’s talk.

dogman144 · 4 years ago
Many PMs, engineers, and laymen knew FB was a rotten product for a long time. I also assume a lot of people at this institute didn’t take the Alex Stamos (the ex-, short-term FB CISO) U-turn either. So…

How is this effort not virtue signaling by people who made their fortunes on the back of this generation’s cigarettes, and who are now hoping to get traction on being listened to about fixing the mess they created?

shug2k · 4 years ago
Integrity folks across the industry work on solving these problems, not creating them. It is more a question of balance: whether to tackle the problems inside the organization or outside, depending on which is more effective.

What is not in question is that we should be spending time on tackling these problems.

homo_ergaster · 4 years ago
>now hoping to get traction

Not even just that, they also want us to donate to cover the costs for the good work they’re surely going to be doing

rajin444 · 4 years ago
Are there any examples of sticking to your principles by standing up for socially and morally reprehensible groups? My initial impression is this is just another group pushing left leaning elite American values as "integrity".

Things like this: https://www.aclu.org/other/aclu-history-taking-stand-free-sp... go a long way.

zestyping · 4 years ago
I have a hypothesis. It goes like this.

Today all the major text-based social platforms work in about the same way: we type text into an empty little box that's threaded under another box. There are variations in ranking and flagging, but the basic mechanism is unchanged.

It is striking to me that, in all these years, we have explored only a tiny little corner of the design space. There are no mechanisms to lower the temperature when arguments get intense, for example. Nothing to help keep us from misconstruing comments out of context. Nothing to assist us in feeling compassion for the people we're talking to, or understanding their intent as they mean it to be understood. And so on. It's almost as though we sold millions of cars without brakes, everyone is crashing into things and hurting each other, and our response as a society is to throw up our hands and say "Welp, guess humans are just too stupid to drive cars safely."

Many people have suggested eliminating engagement as a metric and going back to purely chronological feeds. That sounds pretty reasonable given where "engagement" has gotten us so far.

But what if there were such a thing as "healthy engagement"?

The hypothesis is that healthy engagement is achievable. I don't know whether it is, but there's a huge range of design possibilities that we have yet to explore.

Do you know of anyone working on healthy engagement? Is it something you want to work on, or do you have any recommendations on starting an effort in this direction?
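Purely as an illustration of one unexplored corner of that design space, here is a toy "slow mode" whose reply cooldown grows as a thread heats up. Every number and name here is invented for the sketch; nothing like this is attributed to any existing platform.

```python
def reply_cooldown_seconds(replies_in_last_minute):
    """Return how long a user must wait before posting another reply.

    A calm thread has no delay; past 5 replies/minute, each additional
    reply per minute adds 30 seconds of cooldown, capped at 10 minutes.
    """
    if replies_in_last_minute <= 5:
        return 0
    return min((replies_in_last_minute - 5) * 30, 600)
```

The point is not these particular numbers, but that a temperature-lowering mechanism can be expressed in a few lines once someone decides to build one.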

rhizome · 4 years ago
>Do you know of anyone working on healthy engagement?

Wikipedia is one, by not "working on" it.

>It's almost as though we sold millions of cars without brakes, everyone is crashing into things and hurting each other, and our response as a society is to throw up our hands and say "Welp, guess humans are just too stupid to drive cars safely."

No, they have brakes, it's just that half of the car owners are punching people who work at Jiffy Lube and screaming that brakers should be killed.

>But what if there were such a thing as "healthy engagement"?

"Engagement" cannot ever be healthy, because "working on" it relies on manipulation. Curiosity implies an absence of manipulation and a maximum of arbitrary connections.

You want healthy engagement? Create something well-loved that doesn't depend on or require any additional action from the appreciator for them to experience its complete impact.

Oh shit, that's hard! "Well then how about I scare them into clicking on a link that pays me money when they do so, then use an image and/or words to call them inadequate in some aspect of their lives so that they send me more money?" Tomato-tomahto?

sayhar · 4 years ago
Hey this is a really good question. And I agree with you 100%! We have only explored a tiny corner of the design space.

Since you asked for links, I think you'd like this talk I gave at Berkman a little while ago: https://cyber.harvard.edu/events/governing-social-media-city

It's about: if we think of social media as a new city, what are the alternatives to hiring tons of cops/censors? What about urban planning, bike lanes, etc?

As for healthy engagement: I think there are a few people in this space. I honestly don't know as many as I'd like. This will be a learning experience for everyone. I think New Public might be doing good work, but I'm not sure!

https://newpublic.org/

Hope that helps!

SrslyJosh · 4 years ago
> There are no mechanisms to lower the temperature when arguments get intense, for example.

This is a second, third, or fourth-tier concern after stuff like white supremacists using platforms for organizing/harassment, governments using them to facilitate genocide, scammers using them to profit off of the pandemic, etc., etc.

"When arguments get intense" completely ignores the incredibly low-hanging fruit: banning networks of bad actors. (They have the data to do this, they just don't want to.)

> Do you know of anyone working on healthy engagement?

Twitter is trying to warn users about "intense" conversations. Unsurprisingly, they suck at it:

https://twitter.com/angryblacklady/status/144754672404668416...

https://twitter.com/chadloder/status/1446879163307028481

neural_thing · 4 years ago
It's understandable that people go to work for Sauron to get paid big money. But starting a "human-orc ethics think tank" because you have relevant experience in the field is a bit rich.
labster · 4 years ago
We need to be able to forgive those who have done evil, if they truly repent. If they’re trying to do good now, we shouldn’t hold it against them that they worked for a monopoly spreading lies that incited violence, created online addicts, and took the pieces of silver happily. No, all that matters is that they do good now.
LurkingPenguin · 4 years ago
It's also a bit rich to tell people "from the start, the Institute has been funded out-of-pocket by its founders. But that can’t last forever." -- even though you were only incorporated a month ago on September 29 -- and start asking for donations before you have received approval for tax-exempt status[1].

[1] https://integrityinstitute.org/donate

smsm42 · 4 years ago
How do you ensure that "integrity" does not morph into "ban everything our groupthink says is wrong"? How do you ensure you're not coopted into ideological or partisan propaganda/speech-control efforts - a trap that so many "fact checkers" have gladly fallen into?
sayhar · 4 years ago
One thing we focus on is not really looking at content at all, but instead, behavior. Mostly -- spam.

One way to see what we're doing is glorified spamfighting. Except the spam sometimes isn't fake ray-bans, but is instead doctored videos that call for lynchings in India.

Here's more on the subject. (My ideas, not the full institute): https://cyber.harvard.edu/events/governing-social-media-city

wolfram74 · 4 years ago
One content-agnostic approach to improving signal-to-noise exploits the tendency of deeply chain-shared content to be mostly false/misleading. Introducing a forced copy-paste hurdle after the chain gets, say, 4 deep would remain content agnostic, but would probably improve the quality of what gets spread.
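
A hypothetical sketch of that friction mechanism: each share carries a depth counter, and once the chain passes a threshold, one-tap forwarding is disabled and the user must copy-paste manually. The field names and threshold are illustrative, not any platform's actual API.

```python
FORWARD_DEPTH_LIMIT = 4  # invented threshold, per the "4 deep" suggestion above

def on_share(message):
    """Decide how the client should handle a share request."""
    depth = message.get("forward_depth", 0)
    if depth >= FORWARD_DEPTH_LIMIT:
        # Deep chain: disable one-tap forwarding; force a manual copy-paste.
        return {"allowed": False, "require_manual_copy": True}
    # Shallow chain: permit forwarding, incrementing the depth counter.
    forwarded = dict(message, forward_depth=depth + 1)
    return {"allowed": True, "message": forwarded}
```

Note that this never inspects the message text at all - only how far it has already traveled.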
shug2k · 4 years ago
Transparency goes a long way towards addressing this concern. This is one area we are focused on - asking companies to be more transparent, both in their overall metrics and samples of public content, so we can have those debates as a society rather than behind closed doors.
btown · 4 years ago
What are the institute's thoughts, if you've formed them, on end-to-end encryption, especially as it applies to social media where the line between group and group-text blurs? I feel it's an incredibly nuanced topic that's become incredibly polarizing in recent days with some of Haugen's comments.

On the one hand, in favor of E2EE, companies can and will use the content of messages, if they have access to them, to micro-target suggested content to users, and this can lead to increased levels of misinformation being promoted to people who have engaged with misinformation. And of course there's the government surveillance angle, which is an entirely separate story!

But if you remove the signals in that content by encrypting in a way that is opaque to the platform, do you substantially reduce the ability to microtarget? Very possibly not, given the amount of graph data the social media company has anyways about group members independent from the content itself. And encryption gives the social media company the ability to wash its hands of any responsibility or awareness of content.

Assuming it were easy to technically achieve (which is a huge leap, to be fair!) do you think it better serves the definition of integrity you've adopted, that a social media platform have the majority of its content end-to-end encrypted, or not?

sayhar · 4 years ago
Hi! Thanks for the constructive question. I agree with you that it's nuanced, and also I'm sad that it's getting polarized/simplified in some venues.

I don't think we have gotten an institute stance on very much relating to E2EE. Our community advisory board (composed of integrity workers) is the moral core of the organization. So far, when we say "the institute has a stance on X", that has meant "the advisory board signs off on X, and that X represents a good faith consensus view of workers in the industry".

We don't yet have a doc that lays out why we think integrity and privacy can coexist nicely. Speaking only for myself, I think the answer lies in careful design. As an example, you could see, via research and experimentation on FB Messenger, that messages forwarded in chains of > N are just empirically overwhelmingly likely to be bad faith, spammy, etc. You could then take that finding to WhatsApp, Signal, etc, and bake in changes to the UX that make it slightly more annoying to forward messages if they've been on a reshare chain of N/M. That kind of stuff.

There's also some consideration of group size -- if a group on, say, Telegram has 5,000 members, it might be encrypted, but it's no longer really private. Maybe it should be treated differently? Unclear; let's think and research about it.

I think a rough consensus we might move towards is treating messaging differently than broadcast, and also treating broadcast features inside of messaging apps differently than straight-up messaging itself.

But again, those are just some of my more idle thoughts. There are members and fellows who are better experts on this particular subject than I am.

Does that make sense? Is that helpful?
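
One way to picture the messaging-vs-broadcast distinction sketched above is a toy policy function that maps a surface to a moderation treatment. To be clear, the tier names and the 1,000-member threshold are invented for this illustration; they are not an institute position.

```python
LARGE_GROUP_THRESHOLD = 1000  # invented: past this, "encrypted" is no longer private in practice

def moderation_tier(audience_size, is_broadcast_feature):
    """Map a messaging surface to a hypothetical moderation treatment."""
    if is_broadcast_feature:
        return "broadcast"        # feed-like: eligible for ranking, labels, demotion
    if audience_size > LARGE_GROUP_THRESHOLD:
        return "semi-public"      # huge "private" group: friction + metadata-level checks
    return "private-messaging"    # small chats: content stays opaque to the platform
```

The design choice here is that content inspection only ever applies to the broadcast-like tiers, so small private conversations stay end-to-end opaque.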

piva00 · 4 years ago
Not working on integrity, but really, really interested in seeing this come alive. I've had a very deep curiosity about this topic for a couple of years; Facebook's products allegedly influenced the elections in my home country (Brazil), which has very directly impacted the quality of life of my family still living there.

Looking forward to seeing what comes out of this, and wishing you and the team the best of luck. Thank you.

ChrisMarshallNY · 4 years ago
I am not working on integrity (I have a feeling that the integrity offices of many companies have brooms and buckets in them), but I do write software that serves a constituency with a very vested interest in the matter, and I wish you well.

I also have a personal code of ethics, and hold myself to a very high standard of Personal Integrity.

In my experience, talking about that in the tech community does not end well.

Ethics and Integrity do not seem to be popular topics for discussion in SV.

sayhar · 4 years ago
Thank you! I helped set up the Brazil Election War Room -- the first election war room inside of FB. It was intense! There is a special place in my heart for your country <3
gwittel · 4 years ago
(1) Thank you so much for doing this. Facebook has been after me for years (> 15 years in anti-spam/email abuse). I keep putting it off, but it's time for a clear "no thanks". As an outsider saying "no" - is there anything I can convey to them to drive the message home?

(2) Minor - your "Join Us" link from the Founders Letter page is 404 -- https://integrityinstitute.org/join-us

sayhar · 4 years ago
Thank you!

(1) - I'm not sure. I don't think I'm an expert here. But if you've got 15 years of anti-spam experience, we'd love to have you join us as a member :-)

(2) - Thank you. Fixed!

tmule · 4 years ago
What’s the agenda here? (Fit the description, but am very skeptical of your motives :))
sayhar · 4 years ago
I think this should answer your question: https://integrityinstitute.org/founders-letter

But also, our values: https://integrityinstitute.org/our-values

Also, succinctly -- this is a real, grassroots thing. We gathered a bunch of friends and coworkers for this big idea of "what if we had best practices and a professional association for integrity work, just like we do for cybersecurity" and then worked for 10 months to do it.

We've tried hard to put all kinds of pieces into place. We're making it a place that is both independent of companies and also safe for current employees of those companies to join.

It's also cool that we can draw on this community to give expert advice to stakeholders (policymakers, journalists, companies, academics, etc). Mad that Congress doesn't understand how Instagram works, or whatever? We can explain things -- as people whose training was in looking at the total information ecosystem of a platform.

A goal is to make integrity work at least as prestigious, valued, and essential as cybersecurity or software engineering: a field where quality matters, and where if you do shoddy work you will be called out on it.

Does that help?

spywaregorilla · 4 years ago
How do you define integrity in this context. (you, personally)
sayhar · 4 years ago
This is a great question! I'm still trying to find a top-down definition instead of a "I know it when I see it" one.

To me it's probably something like this: we can think about an information ecosystem or social platform as a system. "Normal" hacking of the system happens through finding loopholes in code. (That's cybersecurity.) "Integrity-related" hacking of the system happens through finding loopholes in design and rules.

For some easy examples, that covers things like realizing you can post to 1000 groups in an hour, or using sockpuppets to give artificial boosts to posts. The attackers aren't hacking code; they are hacking a system of rules, norms, and defaults. (And often, finding the holes where one part of the system was soldered onto another.)
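
A hypothetical behavior-based check in the spirit of those examples: flag any account that posts to an implausible number of distinct groups within a sliding time window, without ever looking at content. The class name, window, and threshold are all invented for the sketch.

```python
from collections import defaultdict, deque

WINDOW_SECONDS = 3600        # invented: one-hour sliding window
MAX_GROUPS_PER_WINDOW = 50   # invented: distinct-group threshold

class CrossPostDetector:
    """Toy behavioral detector: purely metadata-based, content-blind."""

    def __init__(self):
        # account -> deque of (timestamp, group) post events
        self._events = defaultdict(deque)

    def record_post(self, account, group, timestamp):
        """Record one post; return True if the account should be flagged."""
        events = self._events[account]
        events.append((timestamp, group))
        # Evict events that have aged out of the sliding window.
        while events and events[0][0] <= timestamp - WINDOW_SECONDS:
            events.popleft()
        distinct_groups = {g for _, g in events}
        return len(distinct_groups) > MAX_GROUPS_PER_WINDOW
```

Whether the posts are fake Ray-Bans or doctored videos is irrelevant to the detector; only the posting pattern is.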

That's the technical part. Integrity also has a sort of ethical component. I think that's meaningful too.

_bfhp · 4 years ago
Have you spoken or do you plan to speak out about the most normalized form of state-sponsored propaganda in the U.S., namely hasbara [1]? For example, Hasbara Fellowships and Israel on Campus Coalition [2]

[1] https://en.wikipedia.org/wiki/Hasbara

[2] https://en.wikipedia.org/wiki/Israel_on_Campus_Coalition#Mis...

agd · 4 years ago
My understanding, from speaking with someone at Facebook who has worked in this area, is that they are primarily empowered to do individual fixes. I.e. they do root cause analysis on why content X was correctly/incorrectly handled, update processes, and repeat.

The problem instead seems to be a systemic one. i.e. What kinds of posts does the platform incentivise and promote as a whole? However, changing the system would require significant product updates and harm the bottom line, as it would likely result in lower engagement.

We're also in this weird position where politicians seem to want to hold platforms accountable for content which is legal, but objectionable. This is also exacerbated by employee activists wanting to do the same.

mzs · 4 years ago
>The problem instead seems to be a systemic one. i.e. What kinds of posts does the platform incentivise and promote as a whole? However, changing the system would require significant product updates and harm the bottom line, as it would likely result in lower engagement.

It took me an hour, but I made FB enjoyable for me again by making it a feed of just my friends in chronological order. As a bonus, there's no good way to game the system that decides what to show me. I just bookmark this:

https://facebook.com/?sk=h_chr

Preventing all the groups and pages I had interacted with before from creeping into that feed was the time-consuming but worthwhile part. I had to go into my groups settings and unfollow them all individually, though:

https://facebook.com/groups/feed/

same for pages:

https://www.facebook.com/pages/?category=liked&ref=bookmarks

lanerobertlane · 4 years ago
> https://facebook.com/?sk=h_chr

That's just the "Most recent" option from the left hand menu (or the burger menu) on the Facebook website?

https://i.imgur.com/NQlh7lZ.png

sayhar · 4 years ago
> The problem instead seems to be a systemic one. i.e. What kinds of posts does the platform incentivise and promote as a whole?

These are the big questions we want to be tackling.

debacle · 4 years ago
> Allen left Facebook soon after that for a data scientist job at the Democratic National Committee.

So the guy on the election integrity team then takes a job with the DNC. This is called a revolving door.

And now, curiously, he is part of the effort to push for more censorship of Facebook. This concerted effort to cow Facebook into censoring their platform more is coming from the RNC, DNC, Congress, and the mainstream media.

This founders' letter from their website is dripping with hubris:

https://integrityinstitute.org/founders-letter

They talk about keeping the Internet "safe" and "good" but not free.

baumy · 4 years ago
I had the same takeaway from that founders letter as you did - full of arrogance, as well as very chilling and creepy in my opinion.

Somewhere in this thread one of them is commenting, and made a point to link to that same letter saying how proud he was of it.

Just a complete disconnect.

guscost · 4 years ago
“Of all tyrannies, a tyranny sincerely exercised for the good of its victims may be the most oppressive. It would be better to live under robber barons than under omnipotent moral busybodies. The robber baron's cruelty may sometimes sleep, his cupidity may at some point be satiated; but those who torment us for our own good will torment us without end for they do so with the approval of their own conscience.” - C. S. Lewis
brodouevencode · 4 years ago
Forget the military industrial complex, welcome to the social media industrial complex.
debacle · 4 years ago
In our current descent into technofascism, I think every industry is going to be gamed.
wil421 · 4 years ago
Not surprising. I know a few people who have family members who are ex US intelligence apparatus or FBI turned Facebook investigators. They investigate the bad bad stuff like CP, human trafficking, and animal torture. I’m sure they are specifically hired for connections.
debacle · 4 years ago
The current conversation isn't about the real bad stuff (which I thankfully have never seen on FB), but rather about how much control the government should have over Facebook as a platform, and the direction seems to be universally in favor of more restriction of speech, not less.
camgunz · 4 years ago
Do you really disagree that there's way, way too much bad stuff on Facebook and it's had a hugely harmful impact on our society? Or, do you just think we shouldn't do anything about it?
colpabar · 4 years ago
I don't think the comment was about "too much bad stuff on facebook" at all, rather it was about how an actual facebook employee, who now works for the dnc, is forming a group that will advocate for policy that will affect facebook. When this happens in the banking industry, we call it regulatory capture. When it happens in the insurance/healthcare industry, we call it regulatory capture. When a board member of an oil company is appointed head of the epa, we call it regulatory capture. Why is this case any different?

(here's another way of looking at it: if the guy was hired as a "data scientist" by the republicans, wouldn't you question his motives?)

baumy · 4 years ago
Yes, I disagree with that. Even if I agreed, I'd still think we shouldn't do anything about it.

How much bad stuff is "too much?" Which stuff is "bad"? Who gets to decide? Why should I trust them? Who and what else is influencing them? What happens when the people making those decisions change? Why should I cede the authority to decide which "stuff" I get to see to anyone other than myself?

There are no good answers to any of these questions, and there never will be.

refurb · 4 years ago
That’s exactly what this and all the efforts to control Facebook are - a power grab.

Slap some “feel good” language around protecting democracy (ha!) and they can nicely make sure it’s their message that’s approved (of course! They are protecting democracy) and not their opponents (they are harming democracy!).

TechBro8615 · 4 years ago
This is either a Trojan horse or an exercise in self-aggrandizing pomposity.

What is this about “not breaking your NDAs” with Facebook? If you had integrity the NDAs wouldn’t matter, because no moral framework is constrained by respecting an arbitrary document drafted by a megacorporation to keep you silent. When your first move is to assuage any fears from your former employer that you might do something as drastic as break your NDA, it’s obvious your priorities are nowhere near where you claim them to be. So the question becomes, is this intentional dishonesty, or lack of self-awareness? I suspect a bit of both.

Don’t tell me what integrity means.

Edit: Apparently I read this wrong, and they haven’t actually quit their jobs? So this is just a cartel of self-righteous employees at tech corps with plans to maximize efficiency of collaborative deplatforming? Lol!!!

colpabar · 4 years ago
The line about the one founder taking a job with the DNC was enough for me, but I'm glad I read until this part because I think it's even more illuminating:

>"Frances is exposing a lot of the knobs in the machine that is a modern social media platform," Massachi said. "Now that people know that those knobs exist, we can start having a conversation about what is the science of how these work, what these knobs do and why you would want to turn them in which direction."

Which I translated to:

>Now that someone else has broken their NDA, we can use our jobs that we are not quitting to make more money.

thr0wawayf00 · 4 years ago
> If you had integrity the NDAs wouldn’t matter, because no moral framework is constrained by respecting an arbitrary document drafted by a megacorporation to keep you silent.

The desire to not spend the next several years mired in lawsuits brought by a company that can hire an army of lawyers doesn't need to be explained by a moral framework. Wouldn't you rather spend that time being an activist instead of sitting in the courtroom, wondering if you're going to survive the financial fallout?

stainforth · 4 years ago
It has also been suggested that the recent whistleblower is a tactic to control and set the terms of the concern about Facebook - ultimately not challenging its reach as a monopoly, but allowing for certain censorship actions that Facebook would be more than happy to carry out.
samlevine · 4 years ago
Facebook should focus on giving tools to individuals and communities to communicate with each other and speak freely with one another. Communities and individuals should determine their rules, not Facebook.

Platform wide content moderation is inevitable (illegal content exists), but mass censorship is bad. It will come for you, if it hasn't already.

zht · 4 years ago
you mean like how r/physical_removal, r/jailbait, r/creepshots, etc etc were allowed to proliferate on reddit
akyu · 4 years ago
>We respect your privacy. We will not share your data with anyone for marketing purposes.

Yet the homepage has Google analytics and Squarespace cross site trackers. Not a great first impression.

jeffbee · 4 years ago
Not everyone equates privacy maximalism with integrity. In fact, in the eyes of many people they are unrelated topics.
e-clinton · 4 years ago
They explicitly say “marketing purposes”… doesn’t mean they won’t share for other purposes.
smsm42 · 4 years ago
That's a common trick - "overly-specific lie". "I never stole your wallet from your purse" (I stole it while it was lying on the table), "I never told your wife any rumors about you" (I told them to her best friend to whom she talks daily), "we never sell your information to the highest bidder" (second highest is ok though), etc.
ch4s3 · 4 years ago
This has a real Silicon Valley (the show) tethics vibe to it.
birdyrooster · 4 years ago
I came wading through the comments specifically for this, totally agree.
ch4s3 · 4 years ago
You're welcome. But it is uncanny.
metters · 4 years ago
Ditto
aww_dang · 4 years ago
Once again I'm skeptical. From where I stand the entire promo reads like FB demanding regulatory capture. Of course it is done in an underhanded way, "Oh, we've been so naughty! We need the firm hand of regulation"

Shades of Castle Anthrax from Monty Python's "Holy Grail".

Maybe others here would like to argue the minutiae, but I see that as largely irrelevant.

smsm42 · 4 years ago
That sounds a bit like "former Enron staffers launch Ethical Business Institute". I mean I'm sure some nice people worked for Enron too, but...

Also I can't point at the exact place but it definitely has a whiff of "how we get DNC/government to control all major information platforms without them explicitly controlling all major information platforms". I mean, I have nothing against banning troll farms and such, but when I read about platforms' "integrity teams" banning people that are definitely not troll farms but somebody who dares to say something contradicting the Currently Approved Truth (TM) - and it's not one or two cases, but dozens going into hundreds - I start doubting "integrity" has anything to do with it.

rhizome · 4 years ago
This is how you create a regime of regulatory capture.

https://en.wikipedia.org/wiki/Regulatory_capture

>banning people that are definitely not troll farms but somebody who dares to say something contradicting the Currently Approved Truth (TM)

Hundreds? I can't think of a single one, honestly.

smsm42 · 4 years ago
If you can't think of one person who was banned by Facebook and isn't a troll farm, you're either very ignorant of the matter - which is ok, nobody knows everything, and it may not be a subject of interest to everybody - or pretending to be. I am not sure if there are hundreds - I personally know of cases well into two figures, but probably fewer than a hundred. But I don't do any systematic research on the subject; it's just what I hear from my acquaintances (none of whom work for any troll farms) or read in the press (usually when somebody high-profile enough gets banned). So I estimate there are many more cases than I personally know of, which could easily take it into the hundreds.
shug2k · 4 years ago
If a bunch of Enron staffers left Enron due to frustration with the company and built an Ethical Business Institute to improve industry practices, I would be interested to see what they had to say.

In any industry, it's a difficult balance between expertise and bias - but we have tackled these problems and understand the nuances in detail.

We also have folks from a mix of political backgrounds.

hpoe · 4 years ago
What do you mean by mix of political backgrounds? Do you have people that are registered Republicans, are any of them in leadership positions? Or registered libertarians for that matter?

Anyone that has similar involvement in politics like working directly for a major political party?

EDIT: Checked out the site and darn, I'll give them credit they do have a Republican and someone who worked with the RNC, so that is something. So I'll give you that one, I'm still suspicious but can't knock you for being completely lacking in diversity.

I mean, I still don't love the idea, and I suspect it will be used to ensure that "misinformation" is stamped out, but I'll give credit where credit is due.

smsm42 · 4 years ago
> from a mix of political backgrounds.

What kind of mix? There are different mixes. A mix of Socialist, Communist, Antifa, Democratic Socialist, Democrat and Left Anarchist certainly qualify as "a mix of political backgrounds", but doesn't exactly represent a broad spectrum of opinions on many important questions.