Buddhist meditation exists only in the context of the Four Noble Truths and the rest of the Buddha's Dhamma. Throwing them away means it stops being Buddhist.
This is literally what they are.
The intersection of the two seems to be quite hard to find.
At the state we're in, the AIs we're building are just really useful input/output devices that respond to a stimulus (e.g., a "prompt"). No stimulus, no output.
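To make that concrete, here's a minimal sketch of the stimulus/response loop using the OpenAI Python client (the model name and prompt are purely illustrative): the model produces output only as a function of the prompt we send it.

    # Minimal sketch of the stimulus/response loop: no call, no output.
    # Assumes the OpenAI Python client with OPENAI_API_KEY set in the
    # environment; the model name and prompt are illustrative.
    from openai import OpenAI

    client = OpenAI()

    response = client.chat.completions.create(
        model="gpt-4o",  # illustrative model name
        messages=[{"role": "user", "content": "Explain the cotton gin in one sentence."}],
    )
    print(response.choices[0].message.content)
    # Without the create() call above -- the "stimulus" -- nothing happens.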
This isn't a nuclear weapon. We're not going to accidentally create Skynet. The only thing it's going to go nuclear on is the market for jobs that are going to get automated in an economy that may not be ready for it.
If anything, the "danger" here is that AGI is going to be a printing press. A cotton gin. A horseless carriage -- all at the same time and then some, into a world that may not be ready for it economically.
Progress of technology should not be arbitrarily held back to protect automatable jobs, though. We need to adapt.
Any of the signatories here match your criteria? https://safe.ai/work/statement-on-ai-risk#signatories
Or if you’re talking more about everyday engineers working in the field, I suspect the people soldering vacuum tubes to the ENIAC would not necessarily have been the same people with the clearest vision for the future of the computer.
(I could throw in lust, but that's last century's problem since various forms of birth control have negated its influence on housing demand)
The US government and state governments are openly hostile to our residents and are currently implementing massive mechanisms to track and control our population, including our immigrant communities, women who need access to birth control, and LGBTQ communities.
The obvious goal here is a system that requires GPS and speed information so that law enforcement can remotely control the movement of anyone engaged in activities it deems undesirable.
For Eliezer to really claim novelty here, he'd have had to predict the reason this happens at all: training data. Instead he played the Chomsky card and insisted on deeper patterns that don't exist (as well as solutions that don't work). Name-dropping Eliezer's research as a refutation is weak, bordering on disingenuous.