Readit News
sambuccid commented on Meta shuts down global accounts linked to abortion advice and queer content   theguardian.com/global-de... · Posted by u/ta988
quantummagic · 9 days ago
As long as you support social media companies censoring people you don't like, you're in a weaker position arguing against their censorship of people you do like. There should be a strong social objection to all such censorship, but I don't know how we get there from here. All the justifications for censorship during Covid were corrosive: "The 1st amendment only protects you from _government_ censorship", etc.

At this point, nobody trusts the other side to "play fair" and reciprocate, which makes standing on principle feel like a loss. If all sides stood up just a little bit for the principle of "I don't agree with that person, but I defend his right to voice himself", we'd all be better off.

sambuccid · 9 days ago
I agree, we should play fair and form opinions using principles more, but I think there is a caveat to that. If what you are defending actually causes a considerable amount of harm or violence, then I think you need to start thinking in a more nuanced way and weighing the pros and cons.
sambuccid commented on The US polluters that are rewriting the EU's human rights and climate law   somo.nl/the-secretive-cab... · Posted by u/saubeidl
jack_tripper · 15 days ago
What social and environmental policies are you currently lacking? Be specific please.

And we all want many things in life. For example, I would want my bus to run every 5 minutes instead of every 30 minutes, but everything nice in life has a hefty price, and if you make a large part of the economy go bankrupt or leave, and workers unemployed or broke from rising costs, in exchange for financially unrealistic environmental targets that only a small part of the population can tolerate ("let them eat cake"), then that might not sit well with a large part of the democratic voting population who has to bear the brunt of your wishes.

A balance has to be found between what's nice and desirable and what's economically feasible without causing economic hardship on others; otherwise something breaks and you get rising extremism.

sambuccid · 15 days ago
That's true, but now everything depends on "what is economically feasible", and unless we are experts ourselves we can't really know. We need to rely on experts to tell us what is economically feasible, but those experts are the ones under pressure from lobbyists to say one thing or the other. Some parties say that it's economically feasible and will actually save money; other parties say that it's not feasible and would cost too much.

Oil companies and countries that sell oil will say it's not feasible, and companies that produce panels will say that it is.

We cannot rely on "what is economically feasible", because unless you are an expert you will have to get that information from one side or the other, and even independent bodies will be under lobbying pressure.

sambuccid commented on It’s been a very hard year   bell.bz/its-been-a-very-h... · Posted by u/surprisetalk
order-matters · 19 days ago
Yes, actually - being right and out of business is much better than being wrong and in business when it comes to ethics and morals. I am sure you could find a lot of moral values you would simply refuse to compromise on for the sake of business. The line between moral value and heavy preference, however, is blurry - and is probably where most people have AI placed on the moral spectrum right now. Being out of business shouldn't be a death sentence, and if it is then maybe we are overlooking something more significant.

I am in a different camp altogether on AI, though, and would happily continue to do business with it. I genuinely do not see the difference between it and the computer in general. I could even argue it's the same as the printing press.

What exactly is the moral dilemma with AI? We are all reading this message on devices built off of far more ethically questionable operations. That's not to say two things can't both be bad, but it just looks to me like people are using the moral argument as a means to avoid learning something new while being able to virtue signal how ethical they are about it, while at the same time they refuse to sacrifice things they are already accustomed to for ethical reasons when they learn more about it. It just all seems rather convenient.

The main issue I see talked about with it is unethical model training, but let me know of others. Personally, I think you can separate the process from the product. A product isn't unethical just because unethical processes were used to create it. The creator/perpetrator of the unethical process should be held accountable and all benefits taken back as to kill any perceived incentive to perform the actions, but once the damage is done why let it happen in vain? For example, should we let people die rather than use medical knowledge gained unethically?

Maybe we should be targeting these AI companies if they are unethical and stop them from training any new models using the same unethical practices, hold them accountable for their actions, and distribute the intellectual property and profits gained from existing models to the public, but models that are already trained can actually be used for good and I personally see it as unethical not to use them.

Sorry for the ramble, but it is a very interesting topic that should probably have as much discussion around it as we can get.

sambuccid · 18 days ago
>> The creator/perpetrator of the unethical process should be held accountable and all benefits taken back as to kill any perceived incentive to perform the actions, but once the damage is done why let it happen in vain?

That's very similar to other unethical processes (for example, child labour), and we see that governments are often either too slow to move or just not interested, and that's why people try to influence the market by changing what they buy.

It's similar for AI: some people don't use it so that they don't pay the creators (in money or in personal data) to train the next model, and at the same time they signal to the companies that they wouldn't be future customers of the next model.

(I'm not necessarily in the group of people avoiding AI, but I can see their point)

sambuccid commented on The EU made Apple adopt new Wi-Fi standards, and now Android can support AirDrop   arstechnica.com/gadgets/2... · Posted by u/cyclecount
Longhanks · 23 days ago
> "essential public infrastructure"

If people wanted these devices and services to be public infrastructure, they should be developed and maintained using public funds.

sambuccid · 23 days ago
And the huge revenue would also be public
sambuccid commented on I don't care how well your "AI" works   fokus.cool/2025/11/25/i-d... · Posted by u/todsacerdoti
TrackerFF · 24 days ago
I get that some people want to be intellectually "pure". Artisans crafting high-quality software, made with love, and all that stuff.

But one emerging reality for everyone should be that businesses are swallowing the AI hype raw. You really need a competent and understanding boss to not be labeled a luddite, because let's be real - LLMs have made everyone more "productive" on paper. Non-coders are churning out small apps at record pace, juniors are looking like savants with the amount of code and tasks they finish, where probably 90% of the code is done by Claude or whatever.

If your org is blindly data/metric driven, it is probably just a matter of time until managers start asking why everyone else is producing so much while you're slow.

sambuccid · 23 days ago
In my experience I saw the complete opposite of "juniors looking like savants". There are a few pieces of code made by some junior and mid-level engineers in my company (one also involving a senior) that were clearly made with AI, and they are such a mess that they haven't been touched since, because they are just impossible to understand. This wasn't caught in the PR because it was so large that people didn't actually bother reading it.

I did see a few good senior engineers using AI and producing good code, but for junior and mid-level engineers I have witnessed the complete opposite.

sambuccid commented on Meta buried 'causal' evidence of social media harm, US court filings allege   reuters.com/sustainabilit... · Posted by u/pseudolus
asim · a month ago
We all know this. As people in the tech industry. As people on this website. We know this. The question is, what are we going to do about it? We spend enough time complaining or saying "I'm going to quit facebook" but there's Instagram and Threads and whatever else. And you alone quitting isn't enough. We have to help the people really suffering. We can sometimes equate social media to cigarettes or alcohol and relate the addictive parts of that but we have to acknowledge tools for communication and community are useful, if not even vital in this day and age. We have to find a way to separate the good from the bad and actively create alternatives. It does not mean you create a better cigarette or ban alcohol for minors. It means you use things for their intended purpose.

We can strip systems like X, Instagram, Facebook, Youtube, TikTok, etc. of their addictive parts and get back to utility and value. We can have systems not owned by US corporations that are fundamentally valuable to society. But it requires us, the tech savvy engineering folk, to make those leaps. Because the rest of society can't do it. We are in the position of power. We have the ability.

We can do something about it.

I wrote something to that effect two days ago on a platform I'm building. https://mu.xyz/post?id=1763732217570513817

sambuccid · a month ago
Platforms that have the useful stuff from social media without the addictive part already exist: Forums, micro-blogging, blogs, news aggregators, messaging apps, platforms for image portfolios, video sharing platforms.

And most of them existed before the boom of social media, but they just don't get as huge because they are not addictive.

The useful part of a social media platform is so small that if you put it on its own you don't get a famous app; you have something that people use for a small part of their day and otherwise carry on with their life.

A social media platform essentially leverages the huge and constant need that humans have to socialize, claiming that you can do it better and more through the platform than in real life, and it does so by making sure that enough people in your social circle prioritise the platform over getting together in real life. And I believe this is also the main harmful part: people not getting actual real social time with their peers and then struggling with mental health.

sambuccid commented on AI is a front for consolidation of resources and power   chrbutler.com/what-ai-is-... · Posted by u/delaugust
sambuccid · a month ago
Independently of the hypothesis/conspiracy that the big investors and big tech don't actually believe in AI, the measurable outcome the OP describes remains the same: very few people will end up owning a big chunk of the natural resources, and it will not matter whether those resources are used to power AI or for anything else.

Perhaps governments should add clauses to the contracts they make to prevent this big power imbalance from happening.

sambuccid commented on What we talk about when we talk about sideloading   f-droid.org/2025/10/28/si... · Posted by u/rom1v
terminalshort · 2 months ago
I think this misses the forest for the trees here. The platforms' behavior here is a symptom and not the core problem. I think the following are pretty clearly correct:

1. It's your damn phone and you should be able to install whatever the hell you want on it

2. Having an approved channel for verified app loading is a valuable security tool and greatly reduces the number of malicious apps installed on users' devices

Given that both of these things are obviously true, it seems like a pretty obvious solution is to just have a pop-up with an "install at your own risk" warning whenever you install something outside of the official app store. 99.9% of users would never see the warning, because almost all developers would register their apps through the official store.

But there is a reason why Apple/Google won't do that, and it's because they take a vig on all transactions done through those apps (a step so bold for an OS that even MSFT never dared to try it in its worst Windows monopoly days). In a normal market there would be no incentive to side load because legitimate app owners would have no incentive not to have users load apps outside of the secure channel of the official app store, and users would have no incentive to go outside of it. But with the platforms taxing everything inside the app, now every developer has every incentive to say "sideload the unofficial version and get 10% off everything in the app". So the platforms have to make sideloading nearly impossible to keep everything in their controlled channel. Solve the platform tax, solve the side loading issue.

sambuccid · 2 months ago
> In a normal market there would be no incentive to side load because legitimate app owners would have no incentive not to have users load apps outside of the secure channel of the official app store, and users would have no incentive to go outside of it.

> Solve the platform tax, solve the side loading issue.

I think maybe for a large part of legitimate app owners there would be no incentive, but there are other reasons/incentives for legitimate app owners to go outside the official app store even in the case of no tax; a few that come to mind are:

- open source devs might have the preference to publish their app on a community-led store.

- users trying to keep an old phone functioning using an unofficial custom Android, with no support for the store.

- developers creating apps for themselves and their friends not needing to publish the app publicly.

- companies creating apps just for work phones wanting to keep them private outside of any store.

- a company providing a "build-your-app-with-AI" service preferring to just provide a final apk file.

I think it's important to remember that there are loads of other reasons outside the financial one to keep the ability to install what you want on your phone. If Google dropped any tax they put on their store now, the problem with these new changes would still be there.

(edits: formatting issues)

sambuccid commented on US axes website for reporting human rights abuses by US-armed foreign forces   bbc.com/news/articles/cqx... · Posted by u/tartoran
nla · 2 months ago
The Leahy Law requires the U.S. government to facilitate receipt of information about alleged abuses by U.S. supported forces.

The State Department confirms it no longer operates the HRG, but says it is still receiving reports through other direct channels.

I couldn't find any requirement in the law that requires a public website.

NGOs can still submit information through established contacts or by email.

I would think email is a lot easier than a webform.

sambuccid · 2 months ago
I might be wrong, but I guess it might also be easier for leadership to put pressure on and influence personal communications than to avoid processing official reports made through their own website. An article reading "they ignored emails from Amnesty International" sounds different from "they are not acting on this report made on their official website".
sambuccid commented on ChatGPT Atlas   chatgpt.com/atlas... · Posted by u/easton
sambuccid · 2 months ago
I wonder if this could also cause the ability to understand long, detailed text to atrophy among a large part of the population, probably all those who don't work in an "information heavy" field.

u/sambuccid

Karma: 10 · Cake day: July 28, 2025