Including a header that is not in the program, and not in ISO C, is undefined behavior. So is calling a function that is not in ISO C and not in the program. (If the function is not anywhere, the program won't link. But if it is somewhere, then ISO C has nothing to say about its behavior.)
Correct, portable POSIX C programs have undefined behavior in ISO C; only if we interpret them via IEEE 1003 are they defined by that document.
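A tiny illustration (my own sketch): the following is a perfectly well-defined POSIX program, yet viewed strictly as ISO C it includes a header and calls a function the standard knows nothing about, so ISO C places no requirements on it:

```c
/* Defined behavior under POSIX (IEEE 1003); from a strict ISO C
 * viewpoint, <unistd.h> and getpid() are simply unknown. */
#include <stdio.h>
#include <unistd.h>   /* not an ISO C header */

int main(void)
{
    printf("pid: %ld\n", (long)getpid());  /* not an ISO C function */
    return 0;
}
```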
If you invent a new platform with a C compiler, you can have it such that #include <windows.h> reformats all the attached storage devices. ISO C allows this because it doesn't specify what happens if #include <windows.h> successfully resolves to a file and includes its contents. Those contents could be anything, including some compile-time instruction to do harm.
Even if a compiler's documentation doesn't grant that a certain instance of undefined behavior is a documented extension, the existence of a de facto extension can be inferred empirically through numerous experiments: compiling test code and reverse engineering the object code.
Moreover, the source code for a compiler may be available; the behavior of something can be inferred from studying the code. The code could change in the next version. But so could the documentation; documentation can take away a documented extension the same way as a compiler code change can take away a de facto extension.
Speaking of object code: if you follow a programming paradigm of verifying the object code, then undefined behavior becomes moot, to an extent. You don't trust the compiler anyway. If the machine code has the behavior that implements the requirements your project expects of the source code, then the necessary thing has been obtained, however it got there.
True, most compilers have sane defaults in many cases for things that are technically undefined (like taking sizeof(void) or doing pointer arithmetic on a void pointer). But not all of these cases can be saved by sane defaults.
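For instance (a sketch, assuming GCC or Clang in their default GNU dialect), both constructs below are diagnosed under -std=c11 -pedantic-errors but behave predictably as documented/de facto extensions otherwise:

```c
/* Sketch: constructs that are not ISO C but have "sane" behavior as
 * GNU extensions in gcc/clang (default dialect). With
 * -std=c11 -pedantic-errors they are diagnosed instead. */
#include <stdio.h>

int main(void)
{
    void *p = (void *)"hello";

    printf("sizeof(void) = %zu\n", sizeof(void)); /* 1 as a GNU extension */

    p = p + 1;                    /* void* arithmetic, treated like char* */
    printf("%s\n", (char *)p);    /* prints "ello" */
    return 0;
}
```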
Undefined behavior means the compiler can replace the code with whatever it likes. So if you e.g. compile optimizing for size, the compiler may rip out the offending code entirely, as replacing it with nothing yields the greatest size optimization.
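A common concrete case (a sketch; whether the check is actually removed depends on the compiler and flags): because dereferencing a null pointer is undefined, an optimizer may assume the pointer is non-null and delete a later check:

```c
/* Sketch: at -O2, gcc/clang may delete the null check below, because
 * the earlier dereference lets them assume p cannot be NULL. Whether
 * this happens depends on compiler version and flags. */
int get_plus_one(int *p)
{
    int n = *p;          /* undefined behavior if p == NULL */
    if (p == NULL)       /* optimizer may treat this as always false */
        return -1;
    return n + 1;
}
```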
See also John Regehr's collection of UB-Canaries: https://github.com/regehr/ub-canaries
Snippets of software exhibiting undefined behavior that end up, e.g., executing both the true and the false branch of an if-statement, or neither. UB should not be taken lightly IMO...
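In the spirit of those canaries (my own sketch, not taken from that repository): reading an uninitialized variable is undefined, so an optimizer is free to evaluate it inconsistently, and this program may print both lines, one, or neither:

```c
/* Sketch: x is read uninitialized, which is undefined behavior here.
 * Under optimization the two conditions may be evaluated
 * inconsistently, so both, either, or neither message may appear. */
#include <stdio.h>

int main(void)
{
    int x;                               /* never initialized */
    if (x == 0) printf("x is zero\n");
    if (x != 0) printf("x is nonzero\n");
    return 0;
}
```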
I think you're making some extraordinary claims. I'd love to see some receipts. :-)
I have had trouble compiling older C++ code bases with newer compilers, even when specifying C++98 as the source standard. I gave up trying to get Scott McPeak's Elkhound C++ parser to compile, the last time I attempted it.
C is a bit more forgiving on that topic (it hasn't changed as much, for better or worse).
Heck, I learned to program in C++ back when DR-DOS 5 was the latest version, and all I had available was 640 KB to play around with, leaving aside MEMMAX.
Nowadays the only reason many embedded developers keep using C is religious.
There are quite a few besides various PICs, AFAIK; how modern they are is subjective, I guess, and it IS mostly the weaker chips. Keil, Renesas, NXP, and STMicro (STM8 MCUs used to be C-only, not sure today) all sell parts where C++ is unsupported.
> Nowadays the only reason many embedded developers keep using C is religious.
I don’t completely agree, but I see where you are coming from.
The simplest tool that gets the job done is often the best IMO.
In my experience, it is much more difficult to learn C++ well enough to code safely, compared to C.
Many embedded developers are more EE than CS, so simpler software is often preferred.
I know you don’t have to use all the C++ features, all at once, but still :)
Horses for courses. I prefer C for anything embedded.
…I struggle to comprehend how an odd quantization like 5-bit, which doesn't align well with 8-bit boundaries, would not slow things down for inference: on one hand, the hardware doing the multiplications doesn't support vectors of 5-bit values, so they need repacking to 8-bit before multiplication; on the other hand, the weights can't be bulk-repacked to 8-bit once and for all in advance (otherwise they wouldn't fit in RAM, and besides, in that case one would just use an 8-bit quantization anyway).

Ad-hoc repacking of the 5-bit values into 8-bit vectors would require quite a lot of instructions per multiplication (way more than for 4-bit quantization, where the alignment match simplifies things). So I kinda wonder how much, percentage-wise, that would impact inference performance.
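To make the extra work concrete, here is a rough sketch (loosely modeled on a llama.cpp-style Q5 block layout, but simplified and not its actual code): the low 4 bits of each weight sit in packed nibbles and the fifth bit lives in a separate 32-bit mask, so every weight needs additional shifts, masks and an OR before it becomes an 8-bit value the multiply hardware can consume, whereas a 4-bit format only needs the nibble extraction:

```c
/* Sketch (simplified, llama.cpp-like layout, not its actual code):
 * unpack 32 five-bit quantized values into 8-bit integers.
 * ql holds the low nibbles, qh holds the fifth bit of each value. */
#include <stdint.h>

void unpack_q5(const uint8_t ql[16], uint32_t qh, int8_t out[32])
{
    for (int i = 0; i < 16; ++i) {
        uint8_t lo0 = ql[i] & 0x0F;          /* low 4 bits of value i      */
        uint8_t lo1 = ql[i] >> 4;            /* low 4 bits of value i + 16 */
        uint8_t hi0 = (qh >> i) & 1;         /* fifth bit of value i       */
        uint8_t hi1 = (qh >> (i + 16)) & 1;  /* fifth bit of value i + 16  */

        /* the extra shift/OR per value is the cost a 4-bit format avoids */
        out[i]      = (int8_t)((lo0 | (hi0 << 4)) - 16);  /* range -16..15 */
        out[i + 16] = (int8_t)((lo1 | (hi1 << 4)) - 16);
    }
}
```

A pure 4-bit unpack drops everything involving qh, which hints at why the slowdown can exceed what the raw bit count alone would suggest.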
Who says it doesn’t :)?
At least in my tests there is a big penalty to using an “odd” bit stride.
Testing 4-bit quantization vs 5-bit in llama.cpp, I see quite a bit more than the “naively expected” 25% slowdown from 4 to 5 bits.