Readit News

danrocks commented on Perennial rice: Plant once, harvest again and again   npr.org/2023/03/23/116568... · Posted by u/colinprince
smnrchrds · 2 years ago
probably better than a race condition sandwich
danrocks · 2 years ago
probably sandwich race a than better
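
The scrambled reply is the race-condition gag acted out: when unsynchronized threads each emit one word, nothing guarantees arrival order. A minimal Python sketch of that idea (illustrative only; in practice the threads here finish so quickly that the order usually matches start order, and real scrambling needs contention):

```python
import threading

words = ["probably", "better", "than", "a", "race", "condition", "sandwich"]
result = []

def emit(word):
    # list.append is atomic under CPython's GIL, but the *order* in which
    # threads get scheduled to run is not guaranteed.
    result.append(word)

threads = [threading.Thread(target=emit, args=(w,)) for w in words]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(" ".join(result))  # word order depends on thread scheduling
```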

danrocks commented on Do Kwon arrested in Montenegro: Interior Minister   coindesk.com/business/202... · Posted by u/janmo
JumpCrisscross · 2 years ago
> curious to know where he'll be deported to

It looks like Montenegro can extradite to Korea [1][2].

[1] https://www.coe.int/en/web/transnational-criminal-justice-pc...

[2] https://en.wikipedia.org/wiki/European_Convention_on_Extradi...

danrocks · 2 years ago
I'm sure Singapore and the US would like to have a word as well.
danrocks commented on Do Kwon arrested in Montenegro: Interior Minister   coindesk.com/business/202... · Posted by u/janmo
danrocks · 2 years ago
He's been charged in so many places that I am curious to know where he'll be deported to.

danrocks commented on The case for slowing down AI   vox.com/the-highlight/236... · Posted by u/mfiguiere
13years · 2 years ago
I expect AI will take the lead among societal concerns about unintended consequences. The pace of advancement is far ahead of our ability to reason about the potential effects of what we are building.

We will have completed many iterations before we have even a moment to review the feedback loop. Thus, we will likely compound many mistakes before we realize it.

I have no idea how it will slow down. Someone just figured out how to reduce the cost of building a multi-million-dollar model to around $600. That was supposed to take another decade.

I've spent a lot of time thinking about what the picture looks like in this advancement. I think we are going to trip over many landmines in this new tech gold rush. I've written my own perspectives on that here - https://dakara.substack.com/p/ai-and-the-end-to-all-things

danrocks · 2 years ago
> I have no idea how it will slow down. Someone just figured out how to reduce the cost of building a multi-million-dollar model to around $600. That was supposed to take another decade.

I don't think this is accurate. The Stanford team used LLaMA as the base model and fine-tuned it on data generated from ChatGPT; that fine-tuning step is what cost about $600. Nobody trained a GPT-like model from scratch for $600 - the experiment built on the millions of dollars already spent training the larger base models.
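
The setup described above is essentially distillation: an expensive pretrained "teacher" labels cheap synthetic data, and a small "student" is fit to the teacher's outputs. A toy sketch of that pattern (hypothetical linear models, not the actual Alpaca pipeline):

```python
import random

random.seed(0)

def teacher(x):
    # Stands in for an expensive pretrained model
    # (e.g. an instruction-following LLM).
    return 3.0 * x + 1.0

# Step 1: generate training data from the teacher (the cheap part).
data = [(x, teacher(x)) for x in (random.uniform(-1, 1) for _ in range(200))]

# Step 2: fit a small student model (y = w*x + b) to the teacher's outputs
# with plain SGD on squared error.
w, b, lr = 0.0, 0.0, 0.1
for _ in range(500):
    for x, y in data:
        err = (w * x + b) - y
        w -= lr * err * x
        b -= lr * err

print(w, b)  # student converges toward the teacher's parameters
```

The point, as the comment notes, is that step 2 is cheap only because the teacher already exists: the student inherits, rather than replaces, the original training investment.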

u/danrocks · Karma: 2014 · Cake day: August 12, 2021