Readit News
craftit commented on The hidden compile-time cost of C++26 reflection   vittorioromeo.com/index/b... · Posted by u/SuperV1234
leni536 · 2 days ago
libstdc++'s <print> is very heavy, reflection or not. AFAIK there is no inherent reason for it to be that heavy; fmtlib compiles faster.

<meta> is another question, it depends on string_view, vector, and possibly other parts. Maybe it's possible to make it leaner with more selective internal deps.

craftit · 2 days ago
I don't know the exact details, but I have heard (on C++ Weekly, I believe) that it offers some advantages when linking code compiled with different compiler versions. That said, I normally avoid it and use fmtlib to avoid the extra compile time, so it isn't clear to me whether it's a win. Header-only libraries are great on small projects, but on large codebases with thousands of files, the compile-time cost really hits you.
craftit commented on YouTube is a mysterious monopoly   anderegg.ca/2025/09/08/yo... · Posted by u/geerlingguy
craftit · 6 months ago
My personal experience is that the increase in ads has encouraged me to subscribe to creators I like via Patreon and view content on there. If many people are doing this, I wonder if it skews the view statistics and, therefore, lowers the number of recommendations for the best channels. In turn, this makes it less likely for good channels to be discovered. The increase in YouTube ads also makes me much less interested in browsing there, and I am finding other things to do instead.
craftit commented on Ask HN: How do you organize your electronic components?    · Posted by u/Acetylcholine
craftit · a year ago
I opted for sorting by project. We live in a world where you can get most things the next day. Then, I keep a few part-sample books of common components like capacitors and resistors for modding. Even if I need a specialist component, I know which projects I have used it in before.
craftit commented on Ask HN: Why don't VCs just "suck it up" and pay founders a competitive salary?    · Posted by u/burtonator
burtonator · 2 years ago
A feature that benefits who? The VCs? I think that's what you're trying to say (with sarcasm).

Correct me if I'm wrong though.

craftit · 2 years ago
My take is that it means founders share the risk by investing in the company and by taking a smaller salary than they could otherwise. This means there is a clear motivation for them, as well as the VCs, to get a successful exit.
craftit commented on The C++20 Naughty and Nice List for Game Devs   jeremyong.com/c++/2023/12... · Posted by u/todsacerdoti
mysterydip · 2 years ago
This was my first encounter with the three-way comparison operator (<=>). Can someone give a practical use case? There must be one for it to be included in the spec, but I'm not seeing it.
craftit · 2 years ago
It saves writing a lot of boilerplate. If you default it for a type, the compiler generates <, >, <=, and >= for you, and a defaulted operator<=> also implicitly declares a defaulted operator==, so == and != work as well.
craftit commented on Pretraining data enables narrow selection capabilities in transformer models   arxiv.org/abs/2311.00871... · Posted by u/hislaziness
ffwd · 2 years ago
I have a question which I don't know the answer to:

With those structured numbers, will the LLMs be 100% accurate on new prompts, or will they just be better than chance (even significantly better than chance)?

Because this is the thing: it has to learn the structure and then create probabilities based on the data, but does that mean it's actually learning the underlying algorithm for addition, for example, or is it just getting better probabilities because they've been narrowed? If it can indeed learn underlying algorithms like this, that's super interesting. The reason this is an issue is that if it _can't_ learn those, you can never trust the answer unless you check it, but that's sort of a side point.

craftit · 2 years ago
From what I understand, it can learn and execute the algorithm fairly reliably, though it won't be 100%. When an LLM generates text, the sampling is slightly randomised, and there are tricks to prevent repetition, which would likely cause problems with numbers whose digits are all the same.
craftit commented on Pretraining data enables narrow selection capabilities in transformer models   arxiv.org/abs/2311.00871... · Posted by u/hislaziness
simbolit · 2 years ago
Why are transformer models so bad at math? They often fail at simple addition.
craftit · 2 years ago
Because the tokenizer often breaks numbers up into semi-random groups of digits. For example, if 1984 appears quite a few times in the source text, it will become a single token. Given all these different, semi-random groupings of digits, it is hard for the LLM to learn any consistent rules. I believe there are papers showing that if you structure numbers more consistently, LLMs have no problem with this kind of arithmetic.
craftit commented on Is A.I. Art Stealing from Artists?   newyorker.com/culture/inf... · Posted by u/giuliomagnifico
a_bonobo · 3 years ago
Two potential counter-arguments:

1) When artists 'steal' from others, they generally build something 'new'. AI can't really create anything 'new'. Case in point: Umberto Eco's The Name Of The Rose steals the plot and outline from crime stories, steals the library idea from Borges, and steals the murderer's motive from Eco's medieval manuscript (forgot which one). Yet the outcome is something completely new. Same goes for hip-hop music; sampling is at its core, but the final music that comes out is nothing like what it samples, the samples are just a part of something new.

2) when artists steal from each other, it's generally poor artists ripping off other poor artists. No money can flow. When a million/billion-dollar company rips off poor artists to make more money via generative AI it's a different story. Money could flow but it doesn't.

craftit · 3 years ago
To your first point, I am not sure what constitutes new or novel here. I've seen AI do exactly the type of borrowing from different sources you describe and produce something that seems very new to me. Though I admit, these are often under human direction with some carefully chosen sentences. Is that enough to generate something really new?

To your second point, most of the work I've seen generated from these models was done for free. It could be argued that these tools add to an artist's toolbox rather than take something away. I can see, for example, one poor artist creating a computer game of the same quality that it takes a AAA game company to produce today. Is that good or bad?

craftit commented on Is A.I. Art Stealing from Artists?   newyorker.com/culture/inf... · Posted by u/giuliomagnifico
craftit · 3 years ago
From what I've seen in the art world, it is common practice to build on the work of others without attribution. Is this really all that different? That said, I think these tools still implicitly depend on people to pick out good images that resonate, since if people don't like the images a tool creates, they won't get shared.
craftit commented on Building a Cloud Database from Scratch: Why We Moved from C++ to Rust (2022)   risingwave-labs.com/blog/... · Posted by u/mountainview
hamilyon2 · 3 years ago
Is Rust code heavily seasoned with the unsafe keyword really that hard to prototype in?

Is it meaningfully harder than C++ in this regard?

craftit · 3 years ago
From what I've experimented with so far, the biggest barrier to using Rust has been the lack of a mature ecosystem of libraries. This is changing, though!
