Readit News
robbedpeter commented on Graphene may have found its killer app   economist.com/science-and... · Posted by u/jkuria
etrautmann · 4 years ago
I read this as meaning less concrete is used for any one application. It's also not clear to me that this would reduce net costs, since producing bulk graphene is likely not as cheap as other conceivable additives. From an environmental perspective, if the required volume is reduced but the cost is the same, then worldwide consumption could still be reduced.
robbedpeter · 4 years ago
It's cheaper than many other additives.

Graphene flakes are all that's needed for concrete reinforcement, and they're trivial to produce, even at scale. Carbon chunks are thrown into industrial blenders with water and detergent, which shears off graphene flakes and leaves them in suspension; the flakes are then separated, washed, and dried.

Large graphene sheets are hard to make. Tiny flakes are trivial: grade-school kitchen science.

robbedpeter commented on Engineer turns plastic into bricks that are reportedly stronger than concrete   peopleofcolorintech.com/b... · Posted by u/laurex
shadowtree · 4 years ago
This is a great way to get microplastics, through abrasion, into even more places in the food chain.

Wonder if it burns nicely too; plastic, after all, is "frozen gasoline", here mixed with sand.

/debbiedowner

robbedpeter · 4 years ago
This is where you might use a variation of that "healing cracks" bacteria in conjunction with the "eats plastic" bacteria. If you could get the plastic-digesting bacteria to excrete rigid oxides, then most of the plastic would be replaced with an interlocking 3D mesh of concrete and rock-like bacterial waste.

https://www.sciencedirect.com/science/article/abs/pii/S09500...

It'd eliminate the plastic waste and leave much more environmentally friendly remains, and it'd be self-healing.

robbedpeter commented on Cats learn the names of their friend cats in their daily lives   nature.com/articles/s4159... · Posted by u/michaelwm
lngnmn2 · 4 years ago
No. They just associate repeated sounds, without any semantics whatsoever, just like bird songs.

Cats have not yet evolved the brain circuitry for semantic networks (built on language concepts). So-called language areas are required.

How did this crap even get through peer review? Rhetorical question, I know.

robbedpeter · 4 years ago
https://youtu.be/uFhBd5mMkU8

This is one of thousands of videos of cats and dogs using buttons to talk.

Cats, and all mammals, have a neocortex. Theirs is not as deeply layered or as large as a human's, but they most definitely have the ability to reason abstractly, are aware of themselves, think emotionally, and engage in complex, time-aware planning over long periods.

Your views are wrong. Language areas like Broca's region in the human brain are a consequence of physical position relative to the connectome and sensory endpoints. If you were to rewire the millions of connections from the lips, tongue, mouth, ears, and other body parts to different locations on the neocortex, Broca's region would end up somewhere else. You have roughly a quarter of a square meter of neocortex responsible for all of your perception and cognition, and almost all of it is structurally uniform. Cortical neurons aren't rigidly differentiated by function, and animal experiments show that plasticity allows for extensive rewiring.

The literature in the field suggests that human cognition likely outstrips that of other species because of deeper cortical layering and the sheer size of the organ. It's likely that the only reason elephants, whales, and other animals with larger brains can't compete with humans is the absence of hands and vocal organs. Our color vision and hearing range are important, but many animals surpass us on both.

Give an orca hands and human speech, and nothing we know about neuroscience implies the animal wouldn't be smarter and more capable than humans. There's a lot of evidence that the killer whale would be more intelligent than humans in many ways.

The cortical layering and columnar architecture of neuron clusters differ between species, and seem to dictate the depth of abstract reasoning. There may be different algorithmic constructions in neural connections that favor human-level cognition.

In principle, however, human brains aren't terribly different from those of many other large mammals, and elephants certainly display complex, emotional, symbolic, and abstract reasoning well within a range comparable to human experience.

Your notion of animal cognition is unscientific and biased toward an assumption of human superiority that isn't grounded in fact. Neuroscience is slowly and tirelessly marching toward reverse engineering the brain. The more we learn, the more we find similarity in the basic functions of mammalian brains, from mice to humans to blue whales.

robbedpeter commented on The lab-leak theory is looking stronger by the day   theintercept.com/2022/05/... · Posted by u/jessaustin
kromem · 4 years ago
The Intercept is one of the last news outlets that does actual investigative journalism, and the podcast opens discussing how one of the people on it found out they were a finalist for a Pulitzer.

The tribalism around brand loyalty infesting even people's heuristics around commenting on news they don't even bother reading is something else.

Startup idea for anyone out there: a news commenting site like Reddit or HN with a built-in browser that doesn't let you comment until you've had the article open for at least half its expected reading time.

I'd happily give that site my attention, ad dollars, and data instead.

robbedpeter · 4 years ago
Was. They've sold out to the ad churn and laid off journalists, and some genius in middle management is steering the company toward profits over quality journalism.

They're a dead outlet, unless they restructure their management to preserve journalistic integrity. That's expensive and they seem more interested in cashing out their reputation.

robbedpeter commented on Monet: The Water Lily Pond   artsandculture.google.com... · Posted by u/theawesomekhan
iamevn · 4 years ago
> Follow OP’s link here, zoom in, and see for yourself.

I can't seem to zoom in because the page hijacks my scroll and has disabled zoom.

robbedpeter · 4 years ago
Yeah. It might as well be in Flash.

robbedpeter commented on Forget personalisation, it’s impossible and it doesn’t work   marketingweek.com/peter-w... · Posted by u/lando2319
shkkmo · 4 years ago
None of those problems make advertising any better of a solution for getting information.
robbedpeter · 4 years ago
Advertising incentivizes lying. The best presentation wins, in this current ad market.

Metrics that track good-faith interactions are needed, like eBay reputation: if someone isn't at 98% or higher, they're going to be overlooked or bypassed in favor of someone with a higher score.

Product reviews and ratings get gamed because current systems don't reward good-faith transactions - Amazon and Google customers purchase attention and shuffle facts around to maximize purchases. If quality reviews and curation were incentivized, there would be a thriving class of reviewers and experts playing a role in the marketplace. Their absence is glaring, and the horde of product influencers and professional reviewers underscores the deep corruption of adtech. Those people leech money from the market by selling the ability to lie. The lies are sanctioned by adtech firms and often laundered through otherwise reliable data sources.

Any legitimate attempt to compete threatens the entire adtech ecosystem, so a majority of all consumer marketplaces are incentivized to cultivate the corruption and prevent any changes or reform that threaten the sanctified lies.

Things like Angie's List, product review vlogs, and expert podcasts are stuck within the system, regardless of their intent or functionality when they start. They eventually converge into niches that support the system as a whole. Even Reddit, which requires individual human dialog and interaction, has been infested by professional reviewers shilling crappy products.

You can't trust the data sources because trustworthy sources are incompatible with adtech. Google has sufficient data to fix it, but they'd lose money by allowing reform, so they ferociously maintain the ethically gray areas. Their business is not quality search; it's maximizing advertising profits, and it's more profitable to have 50 people paying a premium for scraps than 5 high-quality vendors with vetted products earning those spots through quality and service.

The system is working as intended.

robbedpeter commented on Facebook Deliberately Caused Havoc in Australia to Influence New Law   wsj.com/articles/facebook... · Posted by u/marban
robbedpeter · 4 years ago
They shouldn't be doing stupid things like hosting anything important on Facebook.

I have zero sympathy for Australia's inability to elect competent government.

They signed up for free services like every other chump on the planet, with the same terms of service, with the same naive, zero-fucks-given attitude toward privacy, service continuity, or responsibility, and it bit them in the ass while highlighting their stupidity. This story is going to get repeated over and over until western governments catch up to social media.

robbedpeter commented on Why I’m skeptical of “steelmanning”   statmodeling.stat.columbi... · Posted by u/luu
twelve40 · 4 years ago
Yeah, I don't get why 'strawmanning' Y is necessarily bad in this case. You also automatically strawman your own position, so what? That's kind of the point: to entertain another theory, possibly one by one.
robbedpeter · 4 years ago
Steelmanning is your own good faith attempt to understand the opposing argument so well that you can articulate it as coherently as your own position.

If you're simply repeating what the other person said, you're not using the concept to full effect.

If you're framing your own argument as a strawman instead of clarifying the opposing argument, you've missed the plot entirely (unless your opponent is arguing for the use of strawmen in debate?)

The utility of steelmanning is to minimize assumptions. Everyone has to demonstrate their comprehension. You can take it further: to 'pass' the steelman stage, you have to agree that your opponent's steelman of your argument is accurate, or keep the dialog going to refine it until you're both satisfied that each side understands the other's argument.

robbedpeter commented on OPT: Open Pre-trained Transformer Language Models   arxiv.org/abs/2205.01068... · Posted by u/MasterScrat
etaioinshrdlu · 4 years ago
What type of hardware would you need to run it?
robbedpeter · 4 years ago
A cluster of many $8000+ GPUs. You're looking at around 350GB of VRAM, so roughly fifteen 24GB consumer cards; a 3090 costs around $1800, so about $27k on the GPUs, probably another $15k in power, cooling, and infrastructure, $5k in networking, and probably another $20k in other costs to bootstrap it.

Or wait 10 years: if GPU capacity scales with Moore's law, consumer hardware should be able to run a ~400GB model locally.
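
A rough back-of-the-envelope sketch of that arithmetic, assuming fp16 weights and the ~$1800/24GB card figures above (activations, KV cache, and framework overhead are ignored, so the real requirement is higher):

    import math

    # fp16 weights only: 175B parameters * 2 bytes per parameter
    params = 175e9
    weights_gb = params * 2 / 1e9
    print(f"weights alone: ~{weights_gb:.0f} GB")        # ~350 GB

    # Assumed consumer-card figures (roughly a 2022-era RTX 3090)
    gpu_vram_gb, gpu_price = 24, 1800
    n_gpus = math.ceil(weights_gb / gpu_vram_gb)
    print(f"~{n_gpus} GPUs, ~${n_gpus * gpu_price:,}")   # ~15 GPUs, ~$27,000

    # If consumer VRAM kept doubling every ~2 years, a 24GB card today
    # would be ~768GB in a decade, comfortably past the ~400GB mark.
    print(f"VRAM in 10 years at 2x/2yr: ~{gpu_vram_gb * 2**5} GB")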

robbedpeter commented on OPT: Open Pre-trained Transformer Language Models   arxiv.org/abs/2205.01068... · Posted by u/MasterScrat
f311a · 4 years ago
Just curious, will I be able to use it using my Nvidia card with 10GB of memory? Does it require multiple graphic cards?
robbedpeter · 4 years ago
The smaller models, yes. I'd bet dollars to donuts that GPT-Neo and the other EleutherAI models outperform most, if not all, of Facebook's.

Check out Hugging Face; you'll be able to run a 2.7B model or smaller.

https://huggingface.co/EleutherAI/gpt-neo-2.7B/tree/main
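
For what it's worth, a minimal sketch of running that 2.7B checkpoint with the Hugging Face transformers library, loading the weights in fp16 so the ~5.4GB of parameters fit in a 10GB card (actual memory use depends on prompt and generation length):

    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    # Load EleutherAI's GPT-Neo 2.7B in half precision to fit a 10GB GPU.
    model_name = "EleutherAI/gpt-neo-2.7B"
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(
        model_name, torch_dtype=torch.float16
    ).to("cuda")

    # Generate a short continuation from a prompt.
    inputs = tokenizer("Cats can learn the names of", return_tensors="pt").to("cuda")
    outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, temperature=0.8)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))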
