Perhaps: Pay/reward for data sharing, decentralized training, differential privacy, local training and submitting the weights, local fine-tuning of pre-trained model, marketplace of third-party (open source) models, ...
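To make "local training and submitting the weights" concrete, here is a minimal sketch of federated averaging with a crude differential-privacy-style perturbation. All names, the toy model, and the noise scale are illustrative assumptions, not a production recipe: each client fits a one-parameter model on data that never leaves the device, adds noise to the weight, and the server only ever sees the averaged submissions.

```python
import random

random.seed(0)

def local_train(data, lr=0.5, epochs=200):
    # Gradient descent for y ~ w * x on this client's private data only.
    w = 0.0
    for _ in range(epochs):
        grad = sum((w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w

def privatize(w, noise=0.01):
    # Crude DP-style step (illustrative): perturb the weight with Gaussian
    # noise before it leaves the device, so the raw fit is never shared.
    return w + random.gauss(0, noise)

# Three clients whose data comes from the same ground truth, w = 3.0.
true_w = 3.0
clients = []
for _ in range(3):
    xs = [random.uniform(-1, 1) for _ in range(50)]
    clients.append([(x, true_w * x) for x in xs])

# Each client trains locally and submits only a noised weight.
submitted = [privatize(local_train(c)) for c in clients]

# Server-side "federated average": no raw data was ever collected.
global_w = sum(submitted) / len(submitted)
```

The averaged `global_w` lands near the true value even though the server never touches any client's data; the pay/reward idea would then compensate clients per submitted update rather than per raw record.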
Also, the latest "taboo" is discrimination based on intelligence. You cannot help that you were born below the IQ bell curve. "You just gotta work harder for it" is like telling an overweight person to "just stop eating so much". Difficulty learning new material puts you at a permanent disadvantage, with no way to catch up to what smart people learn in their teens and early twenties.
The big five openly discriminate on intelligence. Intelligence and aptitude are also correlated with your upbringing (did you have smart parents to guide your academic career and interests? Were they rich enough to help put you through a good college?).
For all the renewed interest in identity politics and combating discrimination, intelligence is really the odd one out. High school teachers berate you for something that isn't your fault, and calling out someone's low IQ is commonly accepted. No matter how much you want to work for a tech company -- even one that favors women, poor people, minorities, war veterans, and the disabled -- you are not going to get hired to fill some neurodiversity quota.
The ease with which the left ridicules, stigmatizes, and marginalizes right-wing politicians (Bush and Trump) and those who vote for them astounds me. "Those idiot low-educated racists ruined it all! They are not smart enough to vote rationally!" Replace "low-educated" with any other protected status and such language becomes vile and primitive.
> A computing professional has an additional obligation to report any signs of system risks that might result in harm. If leaders do not act to curtail or mitigate such risks, it may be necessary to "blow the whistle" to reduce potential harm. However, capricious or misguided reporting of risks can itself be harmful. Before reporting risks, a computing professional should carefully assess relevant aspects of the situation.
The irony is that the previous leak (which detailed a harmful IT system) was the cause of this very meeting: Googlers wanted to know more about the project they had read about on HackerNews/The Intercept. Without that earlier leak, upper management wouldn't have been forced to organize this open meeting. Still, I can fully understand getting angry and wanting to signal support.
What this new leak made clear is that the supposed culture of openness at Google is a farce: "Brin reportedly denied having knowledge of the program until after news leaked and what Brin described as 'this kerfuffle' erupted." So even one of the top executives was left in the dark.
It's right there in the US Constitution Bill of Rights.
A censored search engine is a gray PSYOP weapon.
> Psychological operations (PSYOP) are planned operations to convey selected information and indicators to audiences to influence their emotions, motives, and objective reasoning, and ultimately the behavior of governments, organizations, groups, and individuals.
These people get banned for what their followers do. But there is no consistency, and there is clear political preference. When someone posts the phone number of an employer (after he is deemed racist by a one-sided video), over 200 people call, and the guy loses his job... Twitter does not act. When a verified journalist publicly shames, with photo, a teen alt-right protestor (in an effort to scare people into staying home next time)... Twitter does not act. When the president of the USA threatens nuclear war or reposts incorrect far-right propaganda... Twitter does not act. When people proudly state in their bio that they will punch anyone they perceive as a violent nazi, and post videos and incitements to promote more punching... Twitter does not act.
First they came for Alex Jones, but I wasn't a conspiracy theorist, so I did not care.
Twitter is now a platform where a single tweet can deny you entry to the US. Where sending a 10-year-old .gif to someone with epilepsy can land you in jail with a felony. Where terrorist attack announcements are broadcast live for all to see. Where people dig up 5-year-old tweets, take them out of context, and send them to your employer. Where preference is given to a select few verified people and their voices are amplified (who cares what a spoiled, out-of-touch movie/music star thinks of politics?). It may be too late to salvage.
I don't know all the details yet, but Facebook and Youtube presumably have connections with US intelligence. If Alex Jones, like WikiLeaks, was used by foreign agencies to promote conspiracy theories harmful to US political and democratic coherence (9/11, the birther movement, division over gun rights, pizzagate), these companies should be aware of that, which casts their decision in a different light (though of course, they cannot come clean about this yet).
Hence changing it to opt out - don't want to donate organs? Fine - opt out.
Let's take this as a fact. You don't solve the problem by flipping this around, because then you only solve for social gain. You just make it worse for the individual:
> It's opt-out now, and easy to opt out, yet despite many people saying they don't support it, few opt out.
Now you have people who don't support donorship (or want to remain blissfully unaware of it) registered as donors. You bank on apathy and ignorance, because a simple and deliberate opt-in was not convincing enough, and you turn everyone still on the fence -- whether leaning for or against, for whatever reason -- into fair game.
Having your cake and eating it: if it is very easy to go through the opt-in process, yet there are problems with people registering a deliberate choice, then it will also be very easy to go through the opt-out process, with the same problems in registering a deliberate choice. The ease cancels out, and the problems remain. But taking away a human right until a protest is registered is far worse than asking for a deliberate waiver of it.
There seems to be a dumbing down of America, where we expect corporations to be the ones to decide what is right or wrong, rather than letting individuals make up their own minds and spot satire, bad will, or exaggeration on their own. Anyone watching Alex Jones could laugh at some of his more ridiculous claims. That some people believed every word he said does not mean everyone did.
How is anyone better off now that he's banned? People who didn't watch him still don't watch him. People who did watch him still watch him (maybe more so, because they downloaded his app), and people who were on the fence maybe think he was right.
Would you believe that the author of this research was harping on blockchain adoption by large companies less than a year ago? A lot has changed in 11 months, such as a slow price drop from the December 2017 insanity.
Isn't it a bit rich to first make companies anxious to join the bandwagon, then to turn around and say they are investing in something on the very brink of destruction? All while pretending that Bitcoin was invented at its December 2017 height?
Heck, I did not sell any of my market research; I gave it away for free (silly me). Sure, sure, I made a few million here and there, but nothing like charging 0.5 BTC to tell you the financial world is collapsing because John McAfee is so irrationally bullish.
This time it is real though. I can feel it too. Bitcoin is over! I hope you made out like a bandit, because soon the only way to make money will be to sell research about the impending bounce-back of crypto.
Of course the early adopters of Bitcoin are critical of proclamations of its demise. They have a vested financial interest in seeing it succeed. Unlike these researchers, who only stumbled upon Bitcoin when they realized that people are willing to pay for their objective expert opinion on something they themselves missed the boat on.