Readit News
wmeddie commented on Destruction of nuclear bombs using ultra-high energy neutrino beam (2003) [pdf]   arxiv.org/pdf/hep-ph/0305... · Posted by u/mvkel
wmeddie · 2 years ago
I don't think I've ever seen a sci-fi rendition of this concept. It means that an advanced enough alien species could come to a planet and, from orbit, disable all nuclear weapons and power plants, then start their invasion. Not as cinematic as a Death Star planet explosion, but it sounds like a good tactical move for a galactic empire.
wmeddie commented on Common Lisp: An Interactive Approach (1992)   cse.buffalo.edu/~shapiro/... · Posted by u/nanna
sillysaurusx · 2 years ago
Even today it amazes me that python devs don’t live in the repl the same way that lispers do. The interactive approach is underrated. Especially in the era of test-first development, which I think is a fad long term. (That’s not to say no tests, just not tests first.)
wmeddie · 2 years ago
I think you can argue that the pervasive use of notebooks is close enough, for learning at least, but it's not as good for real development. The edit-and-continue feature in Visual Studio for C# (and the similar feature in Java) is the closest non-Lisp thing we have these days. The languages aren't built for it the way Lisp is, though, so you have to do full restarts all the time.

I still wish there was an environment more like Smalltalk for Python.
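For what it's worth, the closest built-in approximation in Python is reloading a module in a live session. A minimal sketch (the `greeter` module here is a throwaway created just for illustration):

```python
import importlib
import pathlib
import sys
import tempfile

# Create a throwaway module to stand in for code you are editing.
workdir = pathlib.Path(tempfile.mkdtemp())
module_path = workdir / "greeter.py"
module_path.write_text("def greet():\n    return 'hello'\n")

sys.path.insert(0, str(workdir))
import greeter

print(greeter.greet())  # 'hello'

# Edit the source on disk, as you would in an editor...
module_path.write_text("def greet():\n    return 'hello, again'\n")

# ...then pick up the change without restarting the interpreter.
importlib.reload(greeter)
print(greeter.greet())  # 'hello, again'
```

Unlike a Lisp image, this re-executes the whole module, and stale references to the old objects survive elsewhere in the process, which is exactly where the full-restart pain comes from.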

wmeddie commented on Ask HN: How to handle Asian-style “Family name first” when designing interfaces?    · Posted by u/evolve2k
andreareina · 2 years ago
Dual citizenship frequently results in different names in different passports.
wmeddie · 2 years ago
Exactly. This is the case for my children.
wmeddie commented on Ask HN: How to handle Asian-style “Family name first” when designing interfaces?    · Posted by u/evolve2k
wmeddie · 2 years ago
Definitely read https://www.kalzumeus.com/2010/06/17/falsehoods-programmers-... if you haven't yet.

Then think about what are the requirements your system needs when it comes to names.

Does the app need to know what a user's name is at all or is a username enough? Does it need to distinguish the family part of their name for anything?

The most general approach, I think, is to just have a Full Name field (min length 1, with either "John Doe" or something cute as the default), plus a Nickname or Display Name field if your app needs to show something on screen.
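A minimal sketch of that shape (field names and defaults are illustrative, not from any particular framework):

```python
from dataclasses import dataclass


@dataclass
class UserProfile:
    # Free-form: no first/last split, no character restrictions.
    full_name: str = "John Doe"
    # What the UI actually renders; falls back to full_name if unset.
    display_name: str = ""

    def __post_init__(self):
        if len(self.full_name) < 1:
            raise ValueError("full_name must be at least 1 character")
        if not self.display_name:
            self.display_name = self.full_name


# Family-name-first input works unchanged, because the app
# never tries to guess which part of the name is which.
user = UserProfile(full_name="山田 太郎", display_name="Taro")
print(user.display_name)  # 'Taro'
```

The point of the design is that there is nothing to get wrong: the app stores what the user typed and shows what the user asked it to show.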

wmeddie commented on Ask HN: Is it just me or GPT-4's quality has significantly deteriorated lately?    · Posted by u/behnamoh
chx · 3 years ago
Asking facts from a generative AI is folly.
wmeddie · 3 years ago
Yes, people really need to know that unless you are using the browser plugin, you really shouldn't ask it questions like this. (A good rule of thumb I think is if you can't expect a random person on the street to get the question right without looking it up, you shouldn't expect GPT-4 to get it right either.)

Unfortunately for this question, even using the browser plugin it wasn't able to get the answer: https://chat.openai.com/share/6344f09e-4ba0-45c7-b455-7be59d...

wmeddie commented on NEC’s Forgotten FPUs   cpushack.com/2021/09/01/n... · Posted by u/protomyth
fulafel · 4 years ago
Cool. A few questions, don't feel obliged to answer all of them: Is it a custom instruction set? What are similarities / differences to desktop vector instructions like sse/avx (or tpu etc "neural processors")? What's the sw/compiler stack like, how easy is it to port software, or is sw more commonly custom written for the platform?
wmeddie · 4 years ago
All good questions.

1) It is a custom instruction set; you can read the ISA guide over at https://www.hpc.nec/documentation

2) The main difference, in simple terms, is that AVX instructions have a fixed vector length (4, 8, 16, etc.). With the SX, the vector length is flexible: it can be 10, 4, or anything up to max_vlen (up to 256 on the latest ones). Essentially, the idea is that a single instruction can replace a whole for loop. Without a good compiler, though, that means rewriting your nested loops yourself.
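The difference can be sketched in plain Python (the widths are illustrative, and real AVX/SX code is of course compiled, not interpreted):

```python
def scale_fixed_width(a, xs, width=4):
    """Fixed-width SIMD style: full chunks of `width`, then a scalar remainder loop."""
    out = []
    n = len(xs)
    i = 0
    while i + width <= n:
        # One fixed-width "instruction" per full chunk.
        out.extend(a * x for x in xs[i:i + width])
        i += width
    for x in xs[i:]:  # leftover elements handled one at a time
        out.append(a * x)
    return out


def scale_flexible(a, xs, max_vlen=256):
    """SX style: the vector length covers any count up to max_vlen."""
    out = []
    for i in range(0, len(xs), max_vlen):
        # One "instruction" per up-to-max_vlen elements; no remainder loop.
        out.extend(a * x for x in xs[i:i + max_vlen])
    return out


xs = list(range(10))  # 10 elements: awkward for width 4, trivial for the SX
print(scale_fixed_width(2, xs) == scale_flexible(2, xs))  # True
```

The fixed-width version needs the strip-mined main loop plus a scalar tail; the flexible version issues one operation for the whole trip count, which is why a vectorizing compiler matters so much on this kind of hardware.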

3) There are currently two options when it comes to the compiler: the proprietary NCC, or the open-source LLVM fork NEC maintains. NCC is less compatible than GCC/Clang (modern C++17 in particular is problematic) but has a lot of advanced algorithms for rewriting and auto-vectorizing your loops. The LLVM fork currently supports assembly instruction intrinsics, but they are still working on contributing better loop auto-vectorization to LLVM.

4) Porting software is not terribly difficult in terms of getting it working, but it can be quite a bit harder to get it performing well, depending on the type of workload. Since the scalar core is pretty standard, you can almost always take regular CPU code and get it running (unlike GPU code in general). If you don't leverage the vector processor, though, the performance will be nothing special, especially at 1.6 GHz. Most software made for it starts off as CPU code and is then modified with pragmas or some refactoring to get good performance on the VE. In almost all cases the resulting code still runs fine on a CPU. One example of a project that supports both in a single code base is the Frovedis framework [1].

I think the chip deserves more interest than it gets. It's one of the few accelerators that you can 1) buy today, right now, 2) use with open-source drivers [2], and 3) run TensorFlow on [3]. The lack of fp16 support really hurt it for deep learning, but it's like having a 1080 with 48 GB of RAM; there are still lots of interesting things you can do with that.

[1]: https://github.com/frovedis/frovedis
[2]: https://github.com/veos-sxarr-NEC/ve_drv-kmod
[3]: https://github.com/sx-aurora-dev/tensorflow

wmeddie commented on NEC’s Forgotten FPUs   cpushack.com/2021/09/01/n... · Posted by u/protomyth
fulafel · 4 years ago
I thought this was going to be about NEC's vector supercomputer[1] processors. Anybody know about writeups regarding these?

[1] https://en.wikipedia.org/wiki/NEC_SX

wmeddie · 4 years ago
I am currently working with these. Anything you'd like to know more about?
wmeddie commented on Japan Turns to Coal After Closing Nuclear Power Plants   bloomberg.com/opinion/art... · Posted by u/jseliger
wmeddie · 6 years ago
One thing this article doesn't cover, which was a big part of this decision, is nuclear waste. Nobody wants nuclear waste sites in their prefectures. There was a plan to use breeder reactors, but that hasn't gone anywhere, and the waste is just piling up at the reactor sites with absolutely no place to dispose of it. When the mayor of Osaka so much as hinted at building an underwater waste site, the backlash was huge. I doubt any other politicians will step up after that (nor will their parties allow it). So with waste being in this deadlocked state for decades, there's just no way forward for nuclear here.
wmeddie commented on Ask HN: What is your ML stack like?    · Posted by u/imagiko
itronitron · 6 years ago
why not write the ML model in Java?
wmeddie · 6 years ago
As someone slightly involved in the Deeplearning4J[1] project it always surprises me that more people don't consider this option.

[1]: https://github.com/eclipse/deeplearning4j

wmeddie commented on How to Use User Mode Linux   christine.website/blog/ho... · Posted by u/xena
wmeddie · 6 years ago
Brings back memories. My first VPS back in 2003 used UML and it worked great. You could run it inside Windows as well: with CoLinux you could even get GUI apps going via Xming on your Windows desktop (http://www.colinux.org/?section=screenshots).
