Readit News
mk67 commented on Rost – Rust Programming in German   github.com/michidk/rost... · Posted by u/miniBill
tiffanyh · 5 months ago
Are you based in Germany?

SAP is also a “German tech company”, and I bet most of their non-Germany based employees don’t speak German either.

mk67 · 5 months ago
I work at SAP and the working language is English, even in Germany.
mk67 commented on Show HN: Trolling SMS spammers with Ollama   evan.widloski.com/softwar... · Posted by u/Evidlo
zcw100 · 7 months ago
Enforceable but not necessarily enforced.
mk67 · 7 months ago
It definitely will be if you go to court. As soon as you have any witnesses, there is little chance of getting out of a verbal contract.
mk67 commented on Show HN: Trolling SMS spammers with Ollama   evan.widloski.com/softwar... · Posted by u/Evidlo
sega_sai · 7 months ago
I would have thought that any kind of contract would require a signature or something rather than agreement by text (but obviously I'm not a lawyer)
mk67 · 7 months ago
No, in basically all countries even verbal contracts are valid and enforceable.
mk67 commented on Tesla Sales Are Tanking in Europe   insideevs.com/news/745119... · Posted by u/belter
CamperBob2 · 7 months ago
Trusting the other driver's turn signal at a 4-way stop is fine.

Trusting the other driver's turn signal when one or both is in motion is a good way to get killed.

Signaling to enter or exit a roundabout communicates no useful information.

mk67 · 7 months ago
That's a nonsensical statement; signaling has helped for decades and allows faster traffic flow in roundabouts.
mk67 commented on Former Google CEO Eric Schmidt's Leaked Stanford Talk   github.com/ociubotaru/tra... · Posted by u/gregzeng95
fer · a year ago
Unsure Meta now, but FB was indeed absurdly leetcodish 10 years back.
mk67 · a year ago
Same when I interviewed ~1-2 years ago.
mk67 commented on PcTattletale leaks victims' screen recordings to entire Internet   ericdaigle.ca/pctattletal... · Posted by u/nneonneo
ziddoap · a year ago
Can you back this up somehow?

I haven't heard of or seen anything indicating that the US, or any other country, is trying to make car theft or petty theft legal.

mk67 · a year ago
From what I've read, it's no longer prosecuted in San Francisco, for example.
mk67 commented on Big data is dead (2023)   motherduck.com/blog/big-d... · Posted by u/armanke13
CoastalCoder · a year ago
Could you point to something explaining that eigenvalue / dimensions topic?

It sounds interesting, but it's totally new to me.

mk67 commented on GPUs Go Brrr   hazyresearch.stanford.edu... · Posted by u/nmstoker
Tarrosion · a year ago
Why don't gradients vanish in large scale LLMs?
mk67 · a year ago
Not easy to give a concise answer here, but let me try:

The problem mainly occurs in networks with recurrent connections or very deep architectures. In recurrent architectures it was mitigated by LSTMs and their gating mechanism. In very deep networks, e.g. ResNet, it was solved via residual connections, i.e. skip connections that bypass layers. There were also other advances, such as replacing sigmoid activations with the simpler ReLU.

Transformers, which are the main architecture of modern LLMs, are highly parallel without any recurrence, i.e. at any layer you still have access to all the input tokens, whereas in an RNN you process one token at a time. To address the potential problem arising from depth, they also use skip connections.
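A toy numerical sketch (my own illustration, not from the thread) of why skip connections help: in a plain stack, the input gradient is a product of per-layer derivatives, each below 1, so it shrinks exponentially with depth; a residual stack adds an identity path, so each factor is at least 1 and the gradient survives.

```python
import numpy as np

def layer(x, w):
    # a single tanh layer whose local derivative is at most w
    return np.tanh(w * x)

def deep_plain(x, ws):
    # plain stack: the input gradient is a product of per-layer derivatives
    for w in ws:
        x = layer(x, w)
    return x

def deep_residual(x, ws):
    # residual stack: each layer adds its output to its input (skip connection),
    # so each gradient factor is 1 + (local derivative) >= 1
    for w in ws:
        x = x + layer(x, w)
    return x

def input_grad(f, x, ws, eps=1e-6):
    # numerical derivative of the network output w.r.t. the scalar input
    return (f(x + eps, ws) - f(x - eps, ws)) / (2 * eps)

ws = [0.5] * 50  # 50 layers with modest weights

g_plain = input_grad(deep_plain, 1.0, ws)
g_res = input_grad(deep_residual, 1.0, ws)

# the plain gradient collapses toward zero; the residual one stays >= 1
print(g_plain, g_res)
```

The factor-of-one identity path is the same reason transformer blocks (which wrap attention and MLP sublayers in `x + sublayer(x)`) can be stacked dozens of layers deep without the signal dying.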

mk67 commented on GPUs Go Brrr   hazyresearch.stanford.edu... · Posted by u/nmstoker
heavenlyblue · a year ago
They don't do global optimisation of all layers at the same time, instead training all layers independently of each other.
mk67 · a year ago
I'm in the industry, and nobody has done that for over ten years. There was a brief phase after Hinton published "Greedy layer-wise training of deep networks" in 2007 when people did it, for a few years at most. But already with the rise of LSTMs in the 2010s this wasn't done anymore, and with transformers it isn't either. Would you care to share how you reached your conclusion? It matches none of my experience over the last 15 years, and we also train large-scale LLMs at our company. There's just not much point to it when gradients don't vanish.
mk67 commented on New capacitor with 19-times energy density   livescience.com/technolog... · Posted by u/tromp
BenjiWiebe · a year ago
I think the idea is that vampires don't die unless you put a stake through their heart. Most things have many other ways of killing them.
mk67 · a year ago
I thought sunlight and water also kill them.

u/mk67 · Karma: 65 · Cake day: January 20, 2023