Oh boy, year after year, after everything we've been through, there are still people in the world who believe that Big Tech invests billions of dollars into AI (including data centers and infra) with only socially good intent, and that we should not touch anything Big Tech does …
I find that any article that goes against the popular "tech" narrative of the day gets flagged.
I once read a great article on Hacker News where Bill Gates and Warren Buffet explained tangible problems with Bitcoin and other cryptocurrencies. It ended up being flagged, because the article went against the "Bitcoin is going to take over the world" narrative.
I can't stand the "OMG there's a new thing, we need to regulate it" way of thinking. No doubt in my mind the woman in the article doesn't understand much more about so-called "AI" than Biden.
It seems like she has a background in EE and physics, but almost all of her work is policy oriented and not actually hands-on building things, at which point, "when all you have is a hammer" seems relevant. From the article:
> [Interviewer:] It’s almost predestined that you’d be in this job. As soon as you got your physics degree at Caltech, you went to DC and got enmeshed in policy.
> Born in India and raised in Texas, Prabhakar has a PhD in applied physics from Caltech and previously ran two US agencies: the National Institute of Standards and Technology and the Department of Defense’s Advanced Research Projects Agency. She also spent 15 years in Silicon Valley as a venture capitalist, including as president of Interval Research, Paul Allen’s legendary tech incubator, and has served as vice president or chief technology officer at several companies.
The government has the legal authority to kill, imprison, and tax citizens. It can destroy entire industries with the stroke of a pen. The military can wipe other nations off the face of the Earth.
And you want to wield that awesome power for some trivial problem like spam on the Internet? This is exactly the kind of overreach the founding fathers intended to prevent.
> seems only good at enabling generative spam and disinformation.
That seems overly pessimistic. They're pretty good at generating summaries of text, simple machine translation, generating alternative prose, turning unstructured text into table-like data, providing a text interface over a corpus of facts, and probably a bunch of things I'm not thinking of.
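To make the "unstructured text into table-like data" point concrete, here is a minimal sketch using the OpenAI Python client (>= 1.0). The model name, prompt, and output schema are illustrative assumptions on my part, not something from the article; any chat-capable LLM API would do the same job.

    # Sketch: turn free-form text into table-like rows (JSON) with a chat LLM.
    # Assumes the OpenAI Python client; model/prompt/schema are assumptions.
    import json
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    def extract_rows(free_text: str) -> list[dict]:
        """Ask the model for one JSON object per person mentioned in the text."""
        resp = client.chat.completions.create(
            model="gpt-4o-mini",  # assumed model; substitute whatever you use
            messages=[
                {"role": "system",
                 "content": "Extract every person mentioned, as JSON only: "
                            '[{"name": ..., "role": ..., "organization": ...}]'},
                {"role": "user", "content": free_text},
            ],
        )
        return json.loads(resp.choices[0].message.content)

    rows = extract_rows(
        "Prabhakar previously ran NIST and DARPA, and spent 15 years "
        "in Silicon Valley, including as president of Interval Research."
    )
    print(rows)  # e.g. a list of dicts you can load straight into a table

Nothing fancy, but it is exactly the kind of mundane, useful task that "only good for spam" overlooks.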
It's in the nature of the beast: the boomer generation, in the retreat-and-defeat of retirement, wants things stable, conservative, collapsing inwards at a snail's pace. Your point of view has no party running on it.
The thing is, we do need regulation, but not the kind that's being proposed, which looks more like regulatory capture by OpenAI & friends: enshrining deep learning at scale as something only verified multinationals with NSA backdoors can do, and putting up even larger barriers to entry for everyone else.
What's needed is practically the exact opposite: a set of rules that prevents work automation from being owned by a select few, who would then manipulate the resulting techno-feudalist world with impunity because they need no one else. Things are changing too quickly, faster than they ever have in the history of man. Some people are probably preparing to enter a college programme right now to learn something that will be fully automated before they even get their degree.
Everyone but psychopaths wants to live in a stable society, conservatism is apparently a trend among the young, and the boomer generation is all but dead at this point. But go off.
And the article says it as if that's some God-given truth...
The sad part is that AI won't "break out" and kill all humans, i.e. stop this insanity. Life isn't a fairy^W cyberpunk tale.
I think regulations are exactly what's required.
This doesn't seem to make sense. Although it uses the word "disinformation", so perhaps an LLM wrote it.