Readit News
miltondts commented on The Long History of Nobody Wants to Work Anymore   mstdn.ca/@paulisci/110798... · Posted by u/latexr
mandelbrotwurst · 2 years ago
What is "Ancient Time"? Is that capitalization intentional?

Edit: Oh, maybe you meant "Ancient Rome"?

miltondts · 2 years ago
https://en.m.wikipedia.org/wiki/Ancient_history

"Ancient history covers all continents inhabited by humans in the period 3000 BC – AD 500."

miltondts commented on OpenWorm – A computational model of C. elegans worm   github.com/openworm/OpenW... · Posted by u/dvrp
quickthrower2 · 2 years ago
Is it simulating cells? atoms? or treating the neurons as black boxes?
miltondts · 2 years ago
From a quick read of the model used [0], it seems to simulate neurons and muscles at a functional level (not atoms or cell internals).

[0] - https://github.com/openworm/c302 (linked in the original github page)
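To make "functional level" concrete: a classic example of this style of model is a leaky integrate-and-fire neuron, which tracks only membrane voltage and spikes rather than ion channels or cell internals. This is a hypothetical illustration of the idea, not actual c302 code (c302 itself generates NeuroML2 network descriptions):

```python
# Minimal leaky integrate-and-fire neuron: a "functional level" model that
# tracks membrane voltage and spike times, not atoms or cell internals.
# Illustrative sketch only; parameter values are arbitrary but typical.

def simulate_lif(current, dt=1.0, tau=10.0, v_rest=-65.0,
                 v_thresh=-50.0, v_reset=-70.0, r_m=10.0):
    """Return spike times (ms) for an input current trace, one value per step."""
    v = v_rest
    spikes = []
    for step, i_in in enumerate(current):
        # Voltage decays toward rest while integrating the input current.
        dv = (-(v - v_rest) + r_m * i_in) * (dt / tau)
        v += dv
        if v >= v_thresh:           # threshold crossed: record a spike
            spikes.append(step * dt)
            v = v_reset             # reset after firing
    return spikes

# A constant drive above threshold makes the unit fire periodically.
spike_times = simulate_lif([2.0] * 100)
```

Muscles in such models are treated similarly, as units that integrate incoming neural activity into a contraction signal, which is why the whole worm can be simulated without modelling any cell's internals.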

miltondts commented on Ask HN: How does ChatGPT work?    · Posted by u/funerr
bryan0 · 3 years ago
I found this description of the GPT-3 transformer architecture useful: https://dugas.ch/artificial_curiosity/GPT_architecture.html

Not eli5 but close enough.

miltondts · 3 years ago
What I don't understand is where is the memory? How does GPT-3 or ChatGPT remember so much information with just that architecture? It would seem that the maximum it could remember is 2048 words.

EDIT: Maybe it's 2048 x 96? Still seems low for what it can do.
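Part of the answer is that the 2048-token context is only short-term working memory; the long-term "memory" lives in the model weights. A back-of-the-envelope count from the published GPT-3 175B architecture figures (96 layers, hidden size 12288) shows how much storage that is. This sketch ignores biases and LayerNorm parameters, which are negligible at this scale:

```python
# Rough GPT-3 (175B) parameter count from the published architecture figures:
# 96 transformer layers, d_model = 12288, vocab ~50257, context 2048 tokens.
n_layers = 96
d_model = 12288
vocab = 50257
ctx = 2048

# Per layer: attention Q, K, V and output projections (4 * d^2)
# plus the feed-forward block, two d x 4d matrices (8 * d^2).
per_layer = 4 * d_model**2 + 8 * d_model**2

# Token embeddings plus learned position embeddings.
embedding = vocab * d_model + ctx * d_model

total = n_layers * per_layer + embedding
print(f"{total / 1e9:.0f}B parameters")  # prints "175B parameters"
```

So the knowledge is encoded in roughly 175 billion weights; the context window just determines how much of the current conversation the model can attend to at once.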

miltondts commented on Artist feeds childhood diary into GPT-3 to have a chat with herself   nwn.blogs.com/nwn/2022/11... · Posted by u/SLHamlet
tartoran · 3 years ago
Of course we are (or have) some kind of sophisticated pattern-matching machine; what is language, anyway? But we understand context in conversation, at least in whatever sense we mean by "understanding". GPT-3 doesn't do any of that.
miltondts · 3 years ago
I'm fairly sure GPT-3 can't do any of that, because it was trained only on text. That is, it was trained without the context we are exposed to. Now imagine if it were trained with a body (sensors similar to ours) in the real world. I'm not so sure we wouldn't get something indistinguishable from a human.
miltondts commented on Net-zero commitments could limit warming to below 2 °C   nature.com/articles/d4158... · Posted by u/doener
captainmuon · 3 years ago
Is that under "realistic" economical assumptions? I.e. phasing out fossil fuel while keeping our political system and economy reasonably intact? Or is it even impossible if we stop emitting all CO2 tomorrow, because the warming is lagging behind?

Last I heard, we could still physically achieve 1.5 degrees, but we would have to switch completely to a command economy immediately and take a big reduction in economic activity (a couple of times larger than during the peak of COVID). Which I believe is still better than the alternative.

miltondts · 3 years ago
Depends on the model used. Some models say we already have 2 °C locked in: https://www.ecowatch.com/greenhouse-gases-paris-agreement-26...

u/miltondts

Karma: 353 · Cake day: July 27, 2015