You think that only listening to people with similar interests will filter out misinformation? What?
On the other hand, social media platforms inject popular content, ranked by the number of "likes" and "reposts", into your newsfeed. That means the content has gone viral and has already spread through several clusters around you, most of them centered on some other topic and therefore unable to evaluate it properly.
To answer your question, I don't think it will completely filter out misinformation, but I do think this strategy is better than anything Facebook or Twitter can come up with.
Just look at this[1] if you want proof that this is not sustainable. I wonder who benefited from and supported this...
1. https://www.nytimes.com/2018/10/11/technology/fake-news-onli...
The problem is the automated newsfeed, which pushes users toward "recommended" content and ads instead of letting them choose which sources they trust.
The real solution is to switch to Mastodon and similar federated networks, where everyone can choose whom to follow and receives nothing else. It is easy to subscribe to actual people with similar interests who "boost" (repost) the kind of content you like, and they filter out misinformation far better than any of those Facebooks and Twitters ever could.
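The difference between the two feed models can be sketched in a few lines. This is purely illustrative (the names, data, and ranking rule are mine, not any platform's actual code): one feed ranks everything by engagement, the other shows only what you explicitly subscribed to, newest first.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    timestamp: int
    likes: int = 0

def engagement_feed(posts, limit=5):
    # Platform-style feed: ranked by "likes" alone, so viral content
    # from accounts you never chose to follow gets injected.
    return sorted(posts, key=lambda p: p.likes, reverse=True)[:limit]

def subscription_feed(posts, follows, limit=5):
    # Federation-style feed: only posts (or boosts) from accounts you
    # explicitly follow, in reverse-chronological order.
    chosen = [p for p in posts if p.author in follows]
    return sorted(chosen, key=lambda p: p.timestamp, reverse=True)[:limit]

posts = [
    Post("friend", "niche but relevant", timestamp=2, likes=3),
    Post("stranger", "viral junk", timestamp=1, likes=9000),
]
# engagement_feed(posts) leads with the stranger's viral post;
# subscription_feed(posts, {"friend"}) contains only the friend's post.
```

The point is not the sorting itself but who controls the input set: in the second function, the `follows` set is chosen by the reader, not the operator.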
Once this was noticed, we started weighing the terminals, because we could not open the devices (once opened, they become useless).
They learned of this, so they started scraping non-essential plastic from inside the device to offset the weight of the added board.
We ended up measuring the moment of inertia on a special fixture. There are very expensive laboratory tables for measuring moments of inertia. I built a fixture where the device could be placed in two separate positions. The theory is that if the weight and all possible moments of inertia match, the devices have to be identical. We could not measure every possible moment of inertia, but it was possible to measure one or two about axes that would not be known to the attacker.
You only need to measure the moments of inertia about six known axes: the inertia matrix is symmetric, so it has six independent components, and once those are fixed the moment about any other axis can be calculated. See https://en.wikipedia.org/wiki/Moment_of_inertia#Motion_in_sp...
"This shows that the inertia matrix can be used to calculate the moment of inertia of a body around any specified rotation axis in the body."
On the attacker side, they only need to make sure those six moments of inertia match.
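The relation the quoted article describes is the quadratic form I_n = nᵀ J n: given the inertia matrix J, the scalar moment of inertia about any axis direction n follows directly. A minimal sketch (function name is mine):

```python
import numpy as np

def moment_about_axis(J, axis):
    """Moment of inertia about an arbitrary axis through the origin.

    J    : 3x3 symmetric inertia matrix of the body.
    axis : direction vector of the rotation axis (any length; normalized here).
    """
    n = np.asarray(axis, dtype=float)
    n = n / np.linalg.norm(n)
    return float(n @ J @ n)

# For a body with principal moments (2, 3, 4):
J = np.diag([2.0, 3.0, 4.0])
moment_about_axis(J, [1, 0, 0])  # 2.0, the first principal moment
moment_about_axis(J, [1, 1, 0])  # 2.5, midway between the first two
```

This is why measuring about a secret axis only helps until the attacker reconstructs the full matrix: once J is known, every axis is predictable.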
The linked article clearly says that the videos were deleted.
FTA: "Navalny’s aide, Leonid Volkov, said in a social media post that Google deleted the videos after the Central Election Commission had sent a letter of complaint to Google about the advertisement — a demand Volkov called illegal. "
The letter sent to Google is here: https://navalny.com/p/5949/ The video mentioned in the letter (you can see it in the scan) is still available: https://www.youtube.com/watch?v=fGhGMhSd99w
The real solution to censorship is federation. We are bad at making decentralized search engines, and projects like YaCy [1] have never been successful, partly because it is hard to decentralize ranking algorithms. But projects like Mastodon [2], Pleroma [3] and PeerTube [4] have already built federated social networks where you decide what you subscribe to and receive it regardless of anyone else's decisions. The technical part is simple and transparent, and if your provider tries to censor the sources you like, you can always switch to another one or host one yourself.
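The delivery model behind this can be sketched as a toy (this is not the actual ActivityPub protocol, just the idea): each user lives on a home server, and when someone they follow publishes, the author's server pushes the post straight to the follower's server, so no single operator curates what you receive.

```python
# Toy federation: posts flow server-to-server along follow relationships.
class Server:
    def __init__(self, name, network):
        self.name = name
        self.network = network      # shared dict: name -> Server (the "fediverse")
        self.inboxes = {}           # local user -> list of received posts
        self.followers = {}         # local user -> set of "user@server" strings
        network[name] = self

    def register(self, user):
        self.inboxes[user] = []
        self.followers[user] = set()

    def follow(self, local_user, remote):
        # remote is "user@server"; record the follow on the remote server.
        user, server = remote.split("@")
        self.network[server].followers[user].add(f"{local_user}@{self.name}")

    def publish(self, author, text):
        # Deliver the post to every follower's inbox on their own server.
        post = (f"{author}@{self.name}", text)
        for follower in self.followers[author]:
            user, server = follower.split("@")
            self.network[server].inboxes[user].append(post)

fediverse = {}
a = Server("mastodon.social", fediverse)
b = Server("pleroma.example", fediverse)
a.register("alice")
b.register("bob")
b.follow("bob", "alice@mastodon.social")
a.publish("alice", "hello fediverse")
# bob's inbox, held on his own server, now contains alice's post.
```

If `pleroma.example` dislikes alice's content, bob can move to another server (or run his own) and keep the same follow, which is the censorship-resistance claim above.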
Surely you have to back up your data, but this case has nothing to do with information access.