Readit News
vectorEQ commented on Google resists demands from states in digital-ad probe   wsj.com/articles/google-r... · Posted by u/aty268
Frost1x · 6 years ago
Personally, I don't believe businesses should be legally treated as people, though I understand some of the complexity involved and why this is currently, for the most part, the case.

Businesses are entities and should be treated significantly differently: as abstract, artificial human constructs, they have no real notion of life, death, hardship, hunger, disease, etc., and can act accordingly under those relaxed constraints, whereas humans do have to deal with these things.

Any notion of these human concerns reflected in a business exists only because businesses are composed of and controlled by humans. Business decisions don't have to, and often don't, reflect ordinary human concerns. That relaxed constraint gives them certain competitive advantages over humans.

Those competitive advantages are then exploited purely as a proxy for some arbitrarily privileged humans, allowing them to push their personal desires on the world with losses minimally affecting their primary human concerns.

That common proxy relationship needs more accountability that leads back to the humans pulling the business's strings; otherwise, the punishments are nowhere near equivalent in their impact on the life of a business entity versus an individual.

vectorEQ · 6 years ago
businesses aren't on trial. it's their owners / directors / responsible parties who are on trial, and those are 'private individuals' - they might be held accountable for their business or business practices, but the idea that a business is on trial is silly. and once we recognize that 'people' are actually on trial, it's more logical to say they should be treated as individuals....
vectorEQ commented on Boeing finds debris left in new 737 MAXes, now in storage   leehamnews.com/2020/02/18... · Posted by u/robin_reala
vectorEQ · 6 years ago
nowhere does it state whether this is actually hazardous, or whether it's a common thing that might happen to other vendors too, or anything. just 'found some stuff which didn't pass the checks.' ok, that's what the checks are for..., good job... such reporting. just post some random piece of information about some buzzword or google trend without any background or context to put it in.
vectorEQ commented on Almost everything on computers is perceptually slower than it was in 1983 (2017)   twitter.com/gravislizard/... · Posted by u/Bender
vectorEQ · 6 years ago
people went with HTTP and other shitty models. now we live with the pain. good for selling new hardware tho, so who really cares!

There could be tons done to keep our modern looks but get old-school performance. the issue is mainly in how we store and subsequently use data.

tons of people still use DOS-era software for this very reason. with their old programs they can perform the same work 1000 times faster because of how the data is stored / presented. it generally doesn't go through 1000 layers of processing on each click, and a lot of data is stored as a 'view', not some raw binary blob to be parsed out again on demand..

vectorEQ commented on 0day vulnerability in firmware for HiSilicon-based DVRs, NVRs and IP cameras   habr.com/en/post/486856/... · Posted by u/mcsoft
LatteLazy · 6 years ago
Isn't this just telnet? Like last time people claimed huawei "injected back doors", nothing is being injected by them, and these are not backdoors, they are front doors, standard features etc.? But dressed up in a way to make it look scary to someone non-technical? Sorry if I'm missing something here...
vectorEQ · 6 years ago
it's telnet, but you need to activate it first. backdoors are often simple shells like telnet or similar services, but they usually require some 'magic packets' or the like to open the port or start the service. if you look at the PoC you'll see it's not simply making a telnet connection to a port; it does some other stuff first to prepare for it.
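the 'activate first, then connect' pattern can be sketched like this (a minimal Python illustration of the idea; the port number, magic bytes, and framing here are made-up assumptions, not the actual PoC values):

```python
import socket

# Hypothetical sketch of "magic packet" backdoor activation: a crafted
# request to a management port tells the firmware to start telnetd, and
# only then does a normal telnet connection succeed. All constants below
# are illustrative, not taken from the real firmware.

MGMT_PORT = 9530    # assumed control-service port
TELNET_PORT = 23

def build_activation_packet(command: bytes) -> bytes:
    """Frame a command the way such services often expect it:
    fixed magic header + big-endian 2-byte length + payload."""
    magic = b"\xff\x00\x5a\xa5"  # hypothetical magic bytes
    return magic + len(command).to_bytes(2, "big") + command

def activate_telnet(host: str) -> None:
    pkt = build_activation_packet(b"OpenTelnet")
    with socket.create_connection((host, MGMT_PORT), timeout=5) as s:
        s.sendall(pkt)  # the service starts telnetd in response
    # only after this does a plain connection to TELNET_PORT succeed:
    # socket.create_connection((host, TELNET_PORT), timeout=5)
```

so a naive port scan for 23 finds nothing until the activation step has run, which is what makes it a backdoor rather than an ordinary open service.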
vectorEQ commented on Twitter to label deepfakes and other deceptive media   reuters.com/article/us-tw... · Posted by u/jonbaer
ducttape12 · 6 years ago
You don't need advanced AI to create hyper realistic deep fake news. You just need to take a picture of X candidate, stick a made up quote beneath them, and put it on Facebook. There's enough people on social media who will share that without verifying it.
vectorEQ · 6 years ago
agree. it's more a problem of a lack of critical thinking & reading (because this is generally not taught in school) than anything else.

The problem with these kinds of censorship actions is that there is literally no end to them, and they will never address the real root cause of the problem. They will just suck up resources, and people/institutions will keep complaining that they got into trouble because someone took a tweet as truth...

vectorEQ commented on Is it ok to be a jerk to jerks at office?    · Posted by u/mightymosquito
vectorEQ · 6 years ago
:')... don't adapt yourself to other peoples standards. be yourself regardless of people around you.
vectorEQ commented on We should take moral advice from computers and not the other way around   bioethics.georgetown.edu/... · Posted by u/Gormisdomai
vectorEQ · 6 years ago
i think the fact that all humans carry imperfect information, and that their opinions on good/bad/ethical vary, ensures that no system can ever be made which 'behaves 100% ethically'. it might behave that way with respect to the opinion of the person claiming it, but to another person it can seem completely unreasonable/unethical...

since you can't create an AI which takes into account all the flaws present in every human's knowledge and consciousness, a system with such perfection is impossible to make. (even these flaws are often just perceived flaws, and whether something is a flaw is a subjective matter based upon other subjective matters.)

it's not about having all the data for an AI; it's more about understanding what a lack of data means to humans and how it affects their decisions and interpretations, and how the same data can be interpreted in many ways.

Even if all humans were exposed to exactly the same data, they would likely still hold different opinions and interpret that data differently, leading to completely different decision-making processes... I think this is, at the moment, inherently impossible to create within or account for in current computers or programming.

If you reversed it, and had humans take all their morals and ethics from computers, what would be left of humans? Isn't that what makes a human: the ability and/or inability to do this themselves? i think no one is looking for, or working towards, a world where only one human exists in multitude. i think the work should be focused on preserving the uniqueness of identity while maximising its potential within that uniqueness. That also makes me of the opinion that AI should be specialised within domains of operation, and not implemented in a general fashion.

perhaps an AI system could exist which comprises many specific AI systems, which would make it more generally applicable based upon inputs from many specialised AI systems, who knows. But one system and one data set will never be able to cover the inherent uniqueness within humans.

you can argue about some rotten-apple humans who have 'bad behaviour', but even the good people you know are wildly different from you. admit it. you are not them, and they are not you, and that's how it should be.

vectorEQ commented on Congrats! Web scraping is legal! (US precedent)   parsers.me/us-court-fully... · Posted by u/ehurynovich
powrtoch · 6 years ago
"HiQ only takes information from public LinkedIn profiles. By definition, any member of the public has the right to access this information. Most importantly, the appeals court also upheld a lower court ruling that prohibits LinkedIn from interfering with hiQ’s web scraping of its site."

Surely I'm not reading this correctly. This would seem to suggest that websites are not legally allowed to prevent bots from crawling their sites. Lots of sites have ToS preventing such things, are those legally void now? Are captchas on public pages illegal, even if you request the page 8000 times in a second?

"In this case, hiQ argued that LinkedIn's technical measures to block web scraping interfere with hiQ's contracts with its own customers who rely on this data. In legal jargon, this is called 'malicious interference with a contract', which is prohibited by American law."

This is almost weirder. If LinkedIn wanted to force users to sign in to view profile info, would they be not allowed to do that because some company had signed a contract that implicitly assumed access to that data? If someone writes a web scraper for my site, and I unknowingly change my site in a way that breaks that scraper, can a court force me to revert the change?

Seems to imply that every business is somehow beholden to every contract signed by anyone.

vectorEQ · 6 years ago
i'm wondering if a robots.txt might then get you sued for blocking scrapers / bots?
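for reference, the robots.txt in question is just a plain-text file at the site root that asks crawlers to stay away; a blanket rule looks like this (the path here is illustrative):

```
User-agent: *
Disallow: /profiles/
```

worth noting that robots.txt is purely advisory and doesn't technically block anything, which is part of why the legal question around honoring it is interesting.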
vectorEQ commented on Wuhan seafood market pneumonia virus isolate Wuhan-Hu-1, complete genome   ncbi.nlm.nih.gov/nuccore/... · Posted by u/matt2000
vectorEQ · 6 years ago
aaaaaaaaaa aaaaaaaaaa aaaaaaaaaa aaa , looks more like exploit than virus code :'D
vectorEQ commented on Genetically engineered moth is released into an open field   technologynetworks.com/ge... · Posted by u/sigmaprimus
vectorEQ · 6 years ago
soon the first genetically engineered human is released onto the streets. to reproduce and spread their self-limiting genes. everyone will always be happy, and happily consume :O :D

u/vectorEQ
Karma: 396 · Cake day: July 28, 2014
About: config