Readit News
rbn3 commented on Strudel: A live coding platform to write dynamic music pieces in the browser   strudel.cc... · Posted by u/todsacerdoti
rbn3 · a year ago
https://strudel.cc/?lxYj8fRH-Fb0 sounds amazing, can't wait for this to get MIDI output so I can use it with other music software
rbn3 commented on Lenovo ThinkPad X1 Carbon G12 laptop review: First major refresh in three years   notebookcheck.net/Lenovo-... · Posted by u/neverrroot
rcarmo · a year ago
The X1s have been very much hit and miss over the years. I used several iterations and flavours, and can summarise things like so (apologies for not knowing the right numbers for some anymore, I had to return them):

- 2014 X1 Carbon: Amazing screen, so-so keyboard, great battery life. Heavy. Don't remember the fan noise.

- 2015 X1 Carbon: Horrible dim screen (yes, I know you can BTO an upgrade, but we got the defaults), good keyboard, too heavy, horrid battery life that was especially painful as I spent 2015/16 mostly on the road. Fan noise was a frequent nuisance when I was in my home office.

- 2022 X1 Yoga Gen 6 (https://taoofmac.com/space/blog/2022/12/03/1600): Amazing display, good keyboard, nice weight, battery life seems OK for Windows but pretty short compared to my MacBook (as usual these days). Had to gimp it to have a quieter fan, and I've had days when my IdeaPad Flex 5 (which runs Fedora) lasts longer, which is just... odd.

As to the Carbon carrying an Intel Ultra... I really like Lenovo, but I bought an IdeaPad Flex 5 with my own money because it used an AMD CPU. Still haven't regretted the multi-core performance, or the iGPU (which was pretty great for that time).

rbn3 · a year ago
>Had to gimp it to have a quieter fan

Had to do the same with my IdeaPad Flex 5, which always felt super infuriating.

I ended up so frustrated with it that I've ultimately caved and bought a MacBook. I guess I'll have to stick with desktop machines for x86 until someone finally figures out how to make a laptop that runs Linux and stays silent under heavy load.

rbn3 commented on Attention deficits linked with proclivity to explore while foraging   royalsocietypublishing.or... · Posted by u/Tomte
blowski · 2 years ago
Is berry-picking the only job where ADHD actively helps?
rbn3 · 2 years ago
From my own experience and that of many peers I've talked to it really seems to benefit DJing as a skill. It's an activity where you get to be hyperfocused and freely associating/improvising at the same time. Not really a career-path I'd recommend to anyone though.
rbn3 commented on Riffusion – Stable Diffusion fine-tuned to generate music   riffusion.com/about... · Posted by u/MitPitt
gedy · 3 years ago
This is so good that I wondered if it's fake. Really impressive results from generated spectrograms! Also really interesting that it's not exactly trained on the audio files themselves - wonder if the usual copyright-based objections would even apply here.
rbn3 · 3 years ago
Regarding those usual objections, I'd argue that a spectrogram representation of a given piece of audio is just a different (lossy) encoding of the same content/information, so any hypothetical objections would still apply here.
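A minimal sketch of that "lossy encoding" point: a magnitude spectrogram is just the |FFT| of overlapping windowed frames, with the phase discarded. All names and parameters below (`stft_mag`, frame length, hop size, the 440 Hz test tone) are illustrative choices, not taken from Riffusion itself.

```python
# Sketch: a magnitude spectrogram as a lossy re-encoding of audio.
# Pure numpy; frame_len/hop values are arbitrary illustrative defaults.
import numpy as np

def stft_mag(x, frame_len=256, hop=64):
    """Magnitude spectrogram: |FFT| of overlapping Hann-windowed frames."""
    win = np.hanning(frame_len)
    n_frames = 1 + (len(x) - frame_len) // hop
    frames = np.stack([x[i * hop : i * hop + frame_len] * win
                       for i in range(n_frames)])
    return np.abs(np.fft.rfft(frames, axis=1))

sr = 8000
t = np.arange(sr) / sr
x = np.sin(2 * np.pi * 440 * t)     # 1 s of a 440 Hz tone

S = stft_mag(x)                     # phase is gone here -> lossy mapping,
                                    # exactly like the images the model sees
freqs = np.fft.rfftfreq(256, d=1 / sr)
peak_hz = freqs[S.mean(axis=0).argmax()]
print(peak_hz)                      # ~440 Hz, quantized to the FFT bin grid
```

The pitch content survives the round trip, but the exact waveform can only be approximated when inverting (which is why phase-reconstruction artifacts show up in the generated audio).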
rbn3 commented on Riffusion – Stable Diffusion fine-tuned to generate music   riffusion.com/about... · Posted by u/MitPitt
rbn3 · 3 years ago
Great stuff. While it comes with the usual smeary iFFT artifacts that AI-generated audio tends to have, the results are surprisingly good. I especially love the nonsense vocals it generates in the last example, which remind me of what singing along to foreign songs felt like in my childhood.
rbn3 commented on Dhall: A Gateway Drug to Haskell   saurabhnanda.in/2022/03/2... · Posted by u/todsacerdoti
picozeta · 3 years ago
In my opinion Haskell's capabilities to help implement general-purpose SW (services, compilers, GUI, game-dev, systems) are way over-hyped.

Sure, it's nice to implement parsers/compilers in it (see N̶i̶x̶, Elm, Dhall) and it's to some extent popular for backend web development, but I don't think the strong static type system is such an advantage as advocates want you to believe.

Of course it's safer (and faster) than dynamic languages like Python (Ruby, JavaScript), but in my personal opinion, you don't gain that much if you compare it to something like Rust, Java or even Go.

In particular, the common abstractions (Monad, Applicative, Functor) are used very differently across libraries (everyone seems to want to write her own DSL), so it's hard to get comfortable with third-party libraries.

Furthermore, it's still the case that there's not much really interesting SW made in it. Some years ago you could argue it was a matter of familiarity, but (statically typed) functional programming has been part of CS curricula for at least 10 to 20 years now.

Lastly, it's just incredibly hard to estimate runtime characteristics (wall-clock time, memory usage), because you never know which compiler optimizations will kick in.

It's a nice language, but again, I don't think it's such a panacea as proponents often tell you.

And to be snarky: if Haskell is such an advanced language that it makes implementing reliable SW a breeze, it should be part (maybe as a dependency) of more SW.

EDIT: Nix is implemented in C++ as correctly pointed out in a comment.

rbn3 · 3 years ago
>Furthermore it's still the case, that there's not much really interesting SW made in it.

TidalCycles would like to have a word with you. Easily one of the most interesting and unusual pieces of SW that currently exists.

rbn3 commented on Physicists are building neural networks out of vibrations, voltages and lasers   quantamagazine.org/how-to... · Posted by u/pseudolus
grahamrow · 3 years ago
In this PNN approach you are solving for what additional stimuli, when applied to the system alongside the inputs, produce the desired result for a given input. In reservoir computing (RC) you don’t bother to provide any additional stimuli, and find the linear combination of reservoir outputs that gives the desired result. Training the former is more demanding and analogous to a NN (thus the name), but directly produces your answer from the system. The latter is very easy to train (one regression) but requires post processing for inference.
rbn3 · 3 years ago
aaah - got it, thanks!
rbn3 commented on Physicists are building neural networks out of vibrations, voltages and lasers   quantamagazine.org/how-to... · Posted by u/pseudolus
rbn3 · 3 years ago
This instantly reminded me of the paper "Pattern Recognition in a Bucket"[0], which I saw referenced a lot when I first started reading about AI in general. I only have surface-level knowledge of the field, but how exactly does what's described in the article differ from reservoir computing? (The article doesn't mention that term, so I assume there must be a difference.)

[0] https://www.researchgate.net/publication/221531443_Pattern_R...
