> There is little of needless code in standard library. And what there is, believe me, over the decades was really polished.
Wrong. Counter-example: locales [1]
Also: String handling, which is responsible for so, so, so many security vulnerabilities. The underlying cause is zero-terminated strings (instead of using start/end pairs or start/length pairs). You can't even tokenize a string without either copying or modifying it!
Also also: just a single, apologetic mention of undefined behavior. Responsible – in cooperation with over-zealous compiler writers – for so many more bugs not already caused by improper string handling.
I love C but I think most of the C standard library is useless and outdated rubbish (pretty much everything above the mem...() and math functions). The entire IO and memory-management areas should have been either thrown out of the standard library, or updated 20 years ago to keep up with low-level operating system features.
But one of the best features of C is that you can mostly ignore the standard library and still enjoy "C the language", e.g. nobody ever chose C for its standard library ;)
(also re UB etc...: use the mighty trio UBSAN, ASAN and TSAN!)
> But one of the best features of C is that you can mostly ignore the standard library and still enjoy "C the language", e.g. nobody ever chose C for its standard library ;)
C is one of the hardest languages to do this with given its anaemic dependency management.
> (also re UB etc...: use the mighty trio UBSAN, ASAN and TSAN!)
All of which will miss some cases, even in combination.
There's a lot of subtly broken stuff in C stdlib. For example, time.h calls that may touch the TZ env var aren't thread-safe, because getenv is unsafe (it gives a pointer to its internal globally mutable data structure without locks).
Because C is so bare-bones, it usually leans on POSIX as its extended standard library, and that is also full of old cruft.
Exhibit B: yes, the command that repeatedly prints `y` to stdout. Exhibit C: all handling of time, with time_t, struct timespec, timeval etc. We should just have a 64 bit uint representing nanoseconds from epoch, which would give us 585 years and get rid of much nonsense.
In my firmware I keep time as a 64 bit signed number that counts 32.768 kHz ticks since the unit powered up. I think it'll roll over in 8 million years or something like that.
If you want to know what's broken? Most real time clock modules. They almost all want to store time as HH:MM:SS MM:DD:YY and sometimes 1/256 of a second but sometimes not.
Probably something like how msgpack (or even variable-length Unicode encoding) works, where the low 127 values are a single uint8 char stored in place, the next 121 values signal that a string of length 1-121 bytes follows, and then after that are a handful of sentinel values for "length follows and it's 1/2/4/8 bytes", plus maybe some special cases like a zero-length string.
I can understand that this might have been too much implementation complexity/risk to contemplate 40 years ago, but this kind of pattern is very well established at this point, especially in scripting languages with loose typing.
> Also: String handling, which is responsible for so, so, so many security vulnerabilities.
That's not really a counter-example. Sure, the string handling was an unfortunate choice, but it is the standard. The standard library must implement it as defined. Being standard-compliant is not lack of polish.
It takes more code to use it correctly than to hand-code what you think it is supposed to be doing for you. strtok is Cursed.
strlcpy is similar. If you don't write that much more code, you are not using it correctly, and it is not giving you the value you thought you were getting from it.
> There is little of needless code in standard library. And what there is, believe me, over the decades was really polished.
Ha ha ha ha ha ha ha ha ha ha.
Locales are a massive clusterfuck, basically too simple to handle localization if you actually care about it, but supports enough of it to screw you over if you don't care about it. The "wide character" support is also a nightmare. The time library support is also quite a bit wonky (years are measured as years since 1900 because Y2K is definitely not a pressing issue in 1989!).
> inline Assembly
Fun fact, here is the C specification's entire mention of inline assembly:
> The asm keyword may be used to insert assembly language directly into the translator output (6.8). The most common implementation is via a statement of the form: asm (character-string-literal);
There is no discussion of what inline assembly can and cannot do, how it interacts with the rest of the code in terms of semantics, how to pass arguments to and from inline assembly, etc. You might get some of this information from the manuals of compiler implementations, but even that can be surprisingly free of necessary information. Compare this to Rust's inline assembly documentation: https://rust-lang.github.io/rfcs/2873-inline-asm.html (which is more detailed than even gcc's or LLVM's inline assembly documentation).
I have been programming in C for the last 25 years, and every year or so someone comes along with a new shiny thing that will "replace" C. First it was C++ (we know how well that did...), then Objective-C, then Java, then C#, then Go, and now Rust.
Every one of these languages brought new ideas, but they don't stand a chance because their designers don't understand the point of C. The C language didn't win because it was the "best" language or had the best set of features. Far from it. Even in the mid 70s it was a backward language compared to other cool languages of the day like Algol and Lisp.
C won the competition because it just gives programmers the bare minimum functionality to put an operating system and a compiler in place! It is flexible, you can provide your own library if you want, and that gives you easy portability. OS writers will choose C any hour of the day or night because it makes their job easier.
By comparison, other languages will require a huge library to be available, and sometimes a complex runtime system, just for you to write a simple "hello world"! Imagine if you need to write a new OS, a compiler, a linker, or a shell interpreter... you get the idea.
My conclusion is that language designers still didn't get what made C so successful and therefore keep coming up with shiny complex things that don't stand a chance to become the next C.
> C won the competition because it just gives programmers the bare minimum functionality to put an operating system and a compiler in place!
Not really. C won because it was the standard language of Unix, and came with a free compiler for a free operating system at a time when both were highly unusual.
But in the past few decades, C has lost a lot of its market share to other programming languages. In the realm of desktop (and mobile) applications, C is basically unused for new projects--its standard library is truly anemic here, and major support libraries (e.g., GUI toolkits) are often in C++ and not C. Where C is still the dominant language is the land of embedded applications, and it's not had a lot of competition here since most languages don't bother trying to define a freestanding implementation.
Rust is really the first language to try to contend for this space. And there are signs that it may supplant C: Intel is apparently looking to move its firmware to Rust; Linux is allowing Rust for device drivers and kernel modules. Hell, even some OS programming courses (e.g., Stanford, Georgia Tech) have moved their curriculum to use Rust instead of C.
C won because it was the only low-level language that was competently implemented for DOS for years. DOS was where 90% of the programming action was in the '80s, and C was a perfect fit for DOS. C also was easily extended to handle segmented memory.
I'm enrolled in a masters at Georgia Tech right now, and last I checked the OS course is in C. I would kinda prefer to learn Rust anyway, so I hope you are right.
You probably don't realise it, but you're cheating.
Operating systems are currently written in C, and therefore have a C interface. Which language is best at interfacing with C? (No trick here, just a rhetorical question.) C of course. Other languages would need some sort of FFI, which is generally unwieldy enough that the designers hid it behind a comprehensive standard library.
C doesn't need a huge library to be available, you say? Oh but it does. It's called the kernel. Comes with a freaking huge runtime too.
> Imagine if you need to write a new OS, a compiler, a linker, or a shell interpreter... you get the idea.
I think I do, but I'm afraid you don't. Writing an OS (and the rest) in Pascal (I'm thinking of Oberon specifically) is no harder than writing it in C. If you write your whole OS in Blub, interfacing with Blub will be easiest in Blub, you won't need an extensive Blub standard library because you already have the kernel, all the tools (debuggers, editors…) will be Blub-friendly…
Lisp machines used to be a thing, you know.
> My conclusion is that language designers still didn't get what made C so successful […]
Language designers can't even address what makes C so successful: network effects.
> Writing an OS (and the rest) in Pascal (I'm thinking of Oberon specifically) is no harder than writing it in C.
You can't, because it is impossible to escape Pascal's type system. Without the ability to cast pointers you can't write a memory allocator. You can't write a function like dlopen()...
It did well enough that C stdlibs (at least MSVC's and LLVM's) and compilers (...pretty much all the big ones) are implemented in C++ and just export C symbols nowadays; likewise for newer OSes like Fuchsia.
SerenityOS (https://github.com/SerenityOS/serenity) was written from scratch in two years in C++ and goes as far as having a custom web browser & JS engine. Where is the equivalent in C? Where are the C web browsers, C office suites, C Godot/Unity/Unreal-like game engines? Why is Arduino being programmed in C++ and not C?
Language designers are pretty clever people and understand a great deal about the successes and failures of languages. I think if you look at a lot of what Rust is doing today (and C++ yesterday, for better or worse) you'll see a lot of inspiration and influence from the successes and failures of C. Particularly when it comes to safety and generic programming.
Objective-C, Java, Swift, and C# have become massively successful as application programming languages because C is/was terrible at it. They learned a lot about how painful it was to do basic higher level programming tasks when you are restricted to C's semantics and memory model.
C is great but I don't think it's worth romanticizing since history has shown that C isn't that great for writing anything but systems code. Which is a restricted domain to begin with, and isn't even that attractive for it anymore.
The one thing C has over anything else is interop. The language of FFI is C. There's no inherent reason for that other than history, and it's not super broken so we're not going to fix it.
Games (many in C++ with one or two features beyond basic C). All kind of solvers. GPU programming.
C sucks when you need the convenience of a big standard library or safety above performance. There is nothing like C when you care about speed, memory footprint and efficient memory management. It's great that other languages took over in areas C is terrible at, but it's not like the areas where it's the best and often the only option have disappeared.
C won over because of the mystique of the syntax in which you can load multiple side effects into expressions. The intuition was that this leads to faster code, and in fact, with naive compilers, it did lead to faster code. C's terseness won over programmers who hated typing things like BEGIN and END for delimiting blocks. Unlike Pascal or Modula, C came with something very useful: a macro preprocessor. This is such an advantage, that it's better to have a crappy one than none at all. Those programmers who did not hate BEGIN and END could have them, thanks to that preprocessor. The preprocessor also ticked off a checkbox for those programmers who were used to doing systems programming using a macro assembler.
C started to be popular in the microcomputer world at a time when systems programming was being done in assembly languages. For instance, most arcade games for 8 bit microcomputers, were written in assembler. Some applications for the IBM PC were written in assembler, such as WordPerfect.
The freedom with pointers thanks to arithmetic would instantly make sense and appeal to assembly people, who would find a systems language without pointer flexibility to be too strait-jacketed.
I see C as a minimally-viable HLL. Just the essential high level stuff – expression-oriented syntax (a huge thing over assembly), structured programming support (conditionals, loops), automatic storage management (no worries about what goes in registers and what goes at certain offsets in the stack frame), abstraction over calling conventions, and a rudimentary type system around scalars with a certain size and pointers pointing at them.
I'm in a similar position (only 10 years professionally though). I agree with everything you said, except I think Rust genuinely has a shot here. It doesn't have a runtime, and you can choose to use just the core libraries or even no standard libraries. It exports to the C ABI, so it's compatible with existing software and libraries. It isn't a garbage pile like C++. And it solves a real problem C has, code safety. It's silly to say any language will completely replace any other language, but I think Rust fits into the slot pretty well for the vast majority of cases where C is currently the best choice.
People proposing every other language that tried to replace C thought the same. Only time will tell, of course, but I wouldn't bet on it. Nowadays C++ is seen by many as a "garbage pile"; the same can happen to Rust.
Rust has a terrible, verbose syntax and solves a problem that is a non issue in many applications where C shines while making ordinary things difficult (like recursive data structures).
I am sure it has its place but I think it's just too ugly to be attractive. There is something pleasing about writing C, Python or even Javascript which you will never get with Rust. It will never be a language a lot of people enjoy writing imo.
I would argue that Java and C# completely succeeded. Who is out there writing Desktop applications or web services in pure C anymore?
C++ too to a lesser extent. I work on spacecraft flight software, and there's a significant push to move from pure C to modern C++.
No single language is going to (or wants to IMO) replace C in every single use case, but replacing C in specific use cases has been a huge boon for productivity.
Rust, I'd argue, is the language trying to really take on most of the C use cases. The only one it's really not trying to take over is old embedded work where C is pretty much the only option.
Otherwise, I can't think of any circumstance Rust isn't trying to muscle in on C and C++'s territory.
I like C fine, and I also taught it when I was in grad school.
New languages mostly don't replace existing ones. Rather, they supplant them for some uses, and open up new kinds of software which are easier to implement or to conceive with the new language. Now, you referred to C++ specifically, and since I'm somewhat familiar with it I'll address some of the points you made with respect to just that one:
Bjarne Stroustrup said: "If you want to create a new language, a new system - it's quite useful not to try to invent every wheel." For a long while now, C++ teachers/trainers encourage their audience _not_ to think of C++ as an "augmented C" or "C with feature X Y and Z", and to avoid most "C-style" code in favor of idioms appropriate to what the language offers today.
Also, C didn't "win the competition" because there isn't a "bestest language for everything" competition. It has been, and is, a popular language with many uses. Writing operating system kernels is one kind of programming task, where C is the most popular. Even at this level (and lower still), other languages are potentially interesting and often used. See, for example:
Finally, C++ doesn't require a huge library nor a complex runtime system because it has a "freestanding mode" in which the requirements are very limited (although more is required than for C). See:
https://en.cppreference.com/w/cpp/freestanding
When it comes to declarations, the world is already moving over to Pascal's syntactic conventions in some ways. In newer languages, you see something like this:
var x: *int;
more often than:
int* x;
i.e. type name follows variable name, and type modifiers work more like unary prefix operators on types. And this is because it's less ambiguous to parse, and makes more complex types a lot easier to read, since you simply go left-to-right, instead of C's "spiraling" declarations.
The rest of Pascal's syntax is not particularly problematic, either. I'd say that the two biggest problems with it were begin..end for blocks, and having all local declarations in a block separate from code. But Modula-2 already dropped "begin", and various Pascal dialects added inline variable declarations eventually. So, on the whole, I think we'd actually be better off in terms of code readability if Modula-2 rather than C became the standard systems programming language.
As opposed to C's syntactic conventions, the absolutely gorgeous mush of *s, &s, __MACRO_LIKE_FUNCTIONS___() and #preprocessor directives
The default is not the best, the default has just beaten the world so many times on the head that anything else became foreign and weird and got laughed out of the room before it even had the chance to say anything.
One programming language history book/blog/paper a day keeps the nonsense notions away, C "Won" like people win the lottery or the roulette.
Programmers have been shown, time after time, not to be particularly trustworthy. Have we not learned the lesson that it's really easy to make mistakes, and that we should trust tools instead of people to check our work?
> Don’t prevent the programmer from doing what needs to be done.
Ditto the above.
> Keep the language small and simple.
It is in some ways, but its "smallness" leads to a serious lack of simplicity, as seemingly simple things are incredibly hard to do right consistently. For instance, avoiding indexing past the end of an array, or rolling over an integer.
> Provide only one way to do an operation.
That's nice, I'll grant you, although there are of course exceptions that prove the rule, like:
a[b]
is synonymous with
*(a + b)
*((uint8_t *)a + (b * sizeof(*a)))
> Make it fast, even if it is not guaranteed to be portable.
The fact that integer rollover is not easy to catch is hard to ignore. The PDP-11 'add' instruction sets the 'C' bit if there is a carry from the MSB; the Z bit is set if the result == 0; the N bit is set if the result < 0; and the V bit is set if there is arithmetic overflow (both operands were of the same sign, but the result is of opposite sign). By simply making these bits available as, say, special names that could be tested (e.g. C, N, Z, V) after an operation, you could determine what happened (if appropriate) and take action. HP's SPL had such a feature (used on the HP3000 series). C is not a well-designed language, but an improved 'lifeform' along the way. Today, I would like to see a C-like language developed specifically for RISC-V; especially one that had many, many fewer edge cases.
> Today, I would like to see a c-like language developed specifically for RISC-V
When I learned that RISC-V had no carry bit, I couldn't help but think it might have been designed for C to begin with. Sure, they give reasons for this choice, none of them linked to C. Still, it hurts the multi-precision arithmetic that any language with BigInt would have benefited from (I recall Python, Haskell, and Scheme at the very least).
if (((a > 0) && (b > 0) && (a + b) < 0) ||
((a < 0) && (b < 0) && (a + b) > 0)) { /* Overflow */ }
Now you may be saying, well, isn't there a branch-if-arithmetic-overflow instruction in practically every single architecture ever? To which I would say simplicity matters.
There's one? The "C" ABI is just the ABI of whatever platform it's running on, which may or may not have funky behaviour that vendor-provided compilers kindly hide for you: functions being prepended with '_' on macOS, the two dozen calling conventions on Windows with i386, SysV and Itanium ABIs...
Do you think you can tell what the ABI of
struct foo some_function(struct bar);
is?
Will bar be passed in a register, or on the stack? Who knows; that depends on your platform, your compiler, etc. Things going cleanly onto the stack is just a convenient lie that your first-year comp sci teachers tell you because it's too early to talk about how the real world works yet.
The leading '_' before symbols isn't just a macOS thing, AFAIK this is the only "name mangling" that C does and from what I've seen so far it's the same across platforms and compilers.
As far as the ABI goes: the important thing is that there is a standard ABI on a specific platform that all compilers on that platform agree on. Sounds kinda obvious, but it's not common in other languages.
ABIs have always been platform-specific in that sense. C ABI is beneficial in that on any specific platform, it's interoperable - which is good enough, because native code has to be compiled for a specific platform, anyway. In your specific example, I don't really care how "struct foo" is passed, so long as all shared libraries compiled for that platform agree on how it's done. And we have that on C level today on all major platforms.
Sure, there are compiler switches and language extensions that can break the ABI if you use them. But, well, you don't have to use them (at the interop boundary), and neither do your API clients.
C was supposed to be one of those "write once, compile anywhere" things, they used to tell us. Then when we actually tried it, we found every CRT was different across platforms. Even for something as ubiquitous as printf, I think I have encountered at least 8 different versions of it. From compiler chain to compiler chain they are usually similar in calling conventions (but not always). But try mixing an msvcrt with a glibc one and you are in for some fun...
The nit of it is that it almost works. You have a good shot at getting it to compile in a short amount of time. The rest of the work will be lots of time in ye olde debugger and going over the docs for your platform. The fun part is figuring out whether the bugs you find were there already, are just part of the platform, or whether you were using it wrong.
An advantage of Rust and Golang (and OCaml actually[1]) is they let you write to the C ABI. We've written both components that call into existing C ABI (nbdkit-{rust|golang}-plugin), and also Rust/Golang/OCaml shared libraries that present C ABI functions / structures to the world.
[1] But you need to write a bit of C glue code for OCaml so it's not quite so seamless.
Some form of C FFI has been standard for most languages that appeared after C became ubiquitous. Even something as dynamic and high-level as Python has ctypes.
The ABI may come from C, but it's not limited to C. Many other languages have FFI and use the C ABI as the lowest common denominator.
But the C ABI is awful to work with. The C language itself offers no help to guarantee ABI compatibility. What ABI it compiles for depends on headers, which may depend on a jungle of ifdefs and typedefs.
The part 'if you want to program video games use C++' is a bit weird though. I remember a talk from a video game developer who, when asked 'why do you use C++', answered 'because we have to', in the sense that everybody does it and management requires it. The features used were basically C.
One reason would be template metaprogramming. In game development and simulations we use numerical computing most of the time, a domain where templates come in handy. The rest is just C. Modern C++ for highly efficient systems is not C with classes anymore, it is more like C with templates. And also functional constructs for specific components (physics) where it makes sense to use it. I have high hopes that a language like Zig will prove less complex and as powerful as C++ for game dev. It won't be a replacement for C++, just an alternative, hard to believe that anything will dethrone the king.
It's certainly more than just templates, but it's also not the whole hairy c++ - it's C + templates, RAII, and namespaces.
It's less template metaprogramming and more just templates to generate efficient code: stuff that used to be done by abusing the preprocessor can now be handled by a (slightly) more elegant templating engine rather than a string-pasting engine.
Classes are used for resource management/RAII; for example, we have an AQUIRE_MUTEX_IN_SCOPE() macro which will release the mutex when the scope is exited. This is supremely useful and generalizes to many resources.
Lastly, namespaces are huge. In big C codebases you have to be super pedantic about naming modules and APIs consistently, because otherwise it becomes a nightmare.
C++ does get in the way still sometimes, like when you want to do something slightly dirty for perf reasons, say aliasing between structs. You first write it in a way that makes sense, basically how you would write it in C, but it's UB in C++, so you rewrite it with virtual calls or memcpys such that the compiler should be smart enough to arrive at the same result as C would have with the straightforward implementation. This works great until it doesn't, your last option is to try to solve it with templates, and that hole is very deep.
C++ templates have a poor reputation in the gamedev community. Years ago some companies went too far with it and got burned. EA has their own fork of the STL that's gotten increasingly crufty and behind in performance. AAA game engines also get big, so there's a lot of pressure on avoiding explosions of build time with more advanced template techniques.
Gamedev tends to use its own patterns, particularly arena allocation for long lived fixed size tables, or their own internal object / entity / component model . It's not really c with classes or c with templates, just kind of its own dialect.
From what I've heard in presentations, many game developers seem to be some of the biggest opponents of templates, especially because of the horrible performance on non-optimized builds of template-heavy code, and in general because of the opaqueness of performance of template-heavy code.
For numerical computing: yes. For games, don't you just use float (for non-integer values)? Found the video I was thinking about: https://www.youtube.com/watch?v=rX0ItVEVjHc
from 1:24:00 onwards.
The "we have to" is because all the game development middleware is written in C++ (and sometimes C), the same way you "have to" use Javascript for writing web applications (even though WASM is beginning to change that, but it took 25 years to get there).
Actually, only half of the universe is using C++ to write games, the other half is using Unity and writes their game code in C#.
If language interoperability were dramatically better, this "lock-in" to a specific programming language wouldn't be half as bad as it currently is, and it would be much easier and less risky to use "fringe languages" for game development.
[1] https://github.com/mpv-player/mpv/commit/1e70e82baa9193f6f02...
What is the standard library for the C which ends all the other standard libs?
https://en.cppreference.com/w/c/string/byte/strtok
> because getenv is unsafe
About getenv: this is sort of fixed with `getenv_s()` in C11
https://en.cppreference.com/w/c/program/getenv
[1] https://github.com/coreutils/coreutils/blob/master/src/yes.c
> We should just have a 64 bit uint representing nanoseconds from epoch
I don't think it has really been proposed anywhere though, unfortunately.
What type (and how big) should length be?
size_t.
rust-lang saves the day, yet again https://www.brandonsmith.ninja/blog/favorite-rust-function
Perhaps a committee was involved.
Ha ha ha ha ha ha ha ha ha ha.
Locales are a massive clusterfuck, basically too simple to handle localization if you actually care about it, but supports enough of it to screw you over if you don't care about it. The "wide character" support is also a nightmare. The time library support is also quite a bit wonky (years are measured as years since 1900 because Y2K is definitely not a pressing issue in 1989!).
> inline Assembly
Fun fact, here is the C specification's entire mention of inline assembly:
> The asm keyword may be used to insert assembly language directly into the translator output (6.8). The most common implementation is via a statement of the form: asm (character-string-literal);
There is no discussion of what inline assembly can and cannot do, how it interacts with the rest of the code in term of semantics, how to pass arguments to and form inline assembly, etc. You might get some of this information from the manuals of compiler implementations, but even that can be surprisingly free of necessary information. Compare this to Rust's inline assembly documentation: https://rust-lang.github.io/rfcs/2873-inline-asm.html (which is more detailed than even gcc's or LLVM's inline assembly documentation).
Deleted Comment
Deleted Comment
Everyone of these language brought new ideas, but they don't stand a chance because their designers don't understand the point of C. The C language didn't win because it was the "best" language or had the best set of features. Far from it. Even in the mid 70s it was a backward language compared to other cool languages of the day like Algol and Lisp.
C won the competition because it just gives programmers the bare minimum functionality to put an operating system and a compiler in place! It is flexible, you can provide your own library if you want, and therefore gives your easy portability. OS writers will chose C any hour of the day or night because it makes their job easier.
By comparison, other languages will require a huge library to be available, and sometimes a complex runtime system, just for you to write a simple "hello world"! Imagine if you need to write a new OS, a compiler, a linker, or a shell interpreter... you get the idea.
My conclusion is that language designers still haven't grasped what made C so successful, and therefore keep coming up with shiny complex things that don't stand a chance of becoming the next C.
Not really. C won because it was the standard compiler for Unix, and was a free compiler for a free operating system in a time when both were highly unusual.
But in the past few decades, C has lost a lot of its market share to other programming languages. In the realm of desktop (and mobile) applications, C is basically unused for new projects--its standard library is truly anemic here, and major support libraries (e.g., GUI toolkits) are often in C++ and not C. Where C is still the dominant language is the land of embedded applications, and it's not had a lot of competition here since most languages don't bother trying to define a freestanding implementation.
Rust is really the first language to try to contest this space. And there are signs that it may supplant C: Intel is apparently looking to move its firmware to Rust; Linux is allowing Rust for device drivers and kernel modules. Hell, even some OS programming courses (e.g., Stanford, Georgia Tech) have moved their curriculum to use Rust instead of C.
Oh, did this happen? I remember some discussion some months ago, has it actually been merged?
Operating systems are currently written in C, and therefore have a C interface. Which language is best at interfacing with C? (No trick here, just a rhetorical question.) C of course. Other languages would need some sort of FFI, which is generally unwieldy enough that the designers hid it behind a comprehensive standard library.
C doesn't need a huge library to be available, you say? Oh, but it does. It's called the kernel. It comes with a freaking huge runtime, too.
> Imagine if you need to write a new OS, a compiler, a linker, or a shell interpreter... you get the idea.
I think I do, but I'm afraid you don't. Writing an OS (and the rest) in Pascal (I'm thinking of Oberon specifically) is no harder than writing it in C. If you write your whole OS in Blub, interfacing with Blub will be easiest in Blub; you won't need an extensive Blub standard library because you already have the kernel; all the tools (debuggers, editors…) will be Blub-friendly…
Lisp machines used to be a thing, you know.
> My conclusion is that language designers still didn't get what made C so successful […]
Language designers can't even address what makes C so successful: network effects.
Not true. Operating systems are written in C with some assembly, and the interface to userspace is universally written in assembly. E.g., https://git.kernel.org/pub/scm/linux/kernel/git/stable/linux...
You can't because it is impossible to escape the type system. Without the ability to cast pointers you can't write a memory allocator. You can't write a function like dlopen()...
It did well enough that C stdlibs (at least MSVC's and LLVM's) and compilers (pretty much all the big ones) are implemented in C++ and just export C symbols nowadays; likewise for newer OSes like Fuchsia.
SerenityOS (https://github.com/SerenityOS/serenity) was written from scratch in two years in C++ and goes as far as having a custom web browser and JS engine. Where is the equivalent in C? Where are the C web browsers, C office suites, C Godot/Unity/Unreal-like game engines? Why is Arduino programmed in C++ and not C?
Objective-C, Java, Swift, and C# have become massively successful as application programming languages because C is/was terrible at it. They learned a lot about how painful it was to do basic higher level programming tasks when you are restricted to C's semantics and memory model.
C is great but I don't think it's worth romanticizing since history has shown that C isn't that great for writing anything but systems code. Which is a restricted domain to begin with, and isn't even that attractive for it anymore.
The one thing C has over anything else is interop. The language of FFI is C. There's no inherent reason for that other than history, and it's not super broken so we're not going to fix it.
C sucks when you need the convenience of a big standard library, or safety above performance. There is nothing like C when you care about speed, memory footprint, and efficient memory management. It's great that other languages took over in the areas C is terrible at, but it's not as if the areas where it's the best, and often the only, option have disappeared.
C started to become popular in the microcomputer world at a time when systems programming was being done in assembly languages. For instance, most arcade games for 8-bit microcomputers were written in assembler. Some applications for the IBM PC were written in assembler as well, such as WordPerfect.
The freedom that pointer arithmetic provides would instantly make sense and appeal to assembly people, who would find a systems language without pointer flexibility too strait-jacketed.
Just _look_ at all these short keywords and special symbols. It's legitimately hard to read without focusing on each character.
This old fart thinks Rust is the new Perl. People proposing every other language that tried to replace C thought the same. Only time will tell, of course, but I wouldn't bet on it. Nowadays C++ is seen by many as a "garbage pile"; the same can happen to Rust.
I am sure it has its place but I think it's just too ugly to be attractive. There is something pleasing about writing C, Python or even Javascript which you will never get with Rust. It will never be a language a lot of people enjoy writing imo.
C++ too to a lesser extent. I work on spacecraft flight software, and there's a significant push to move from pure C to modern C++.
No single language is going to (or wants to IMO) replace C in every single use case, but replacing C in specific use cases has been a huge boon for productivity.
Otherwise, I can't think of any circumstance where Rust isn't trying to muscle in on C and C++'s territory.
New languages mostly don't replace existing ones. Rather, they supplant them for some uses, and open up new kinds of software which are easier to implement or to conceive with the new language. Now, you referred to C++ specifically, and since I'm somewhat familiar with it I'll address some of the points you made with respect to just that one:
C++ was not intended to "replace C", but rather to combine features of BCPL (later C) and Simula. See: https://www.youtube.com/watch?v=69edOm889V4
Bjarne Stroustrup said: "If you want to create a new language, a new system - it's quite useful not to try to invent every wheel." For a long while now, C++ teachers/trainers have encouraged their audience _not_ to think of C++ as an "augmented C" or "C with features X, Y, and Z", and to avoid most "C-style" code in favor of idioms appropriate to what the language offers today.
Also, C didn't "win the competition" because there isn't a "bestest language for everything" competition. It has been, and is, a popular language with many uses. Writing operating system kernels is one kind of programming task, where C is the most popular. Even at this level (and lower still), other languages are potentially interesting and often used. See, for example:
* IncludeOS - Running C++ without an operating system: https://www.youtube.com/watch?v=cQPrtTsM7Zg
* Generating optimal assembly in an embedded setting at (C++) compile time: https://www.youtube.com/watch?v=CNw6Cz8Cb68
Finally, C++ doesn't require a huge library nor a complex runtime system because it has a "freestanding mode" in which the requirements are very limited (although more is required than for C). See: https://en.cppreference.com/w/cpp/freestanding
Imagine a world where most the languages follow Lisp's syntactic conventions, or Pascal's. What a nightmare.
The rest of Pascal's syntax is not particularly problematic, either. I'd say that the two biggest problems with it were begin..end for blocks, and having all local declarations in a block separate from code. But Modula-2 already dropped "begin", and various Pascal dialects added inline variable declarations eventually. So, on the whole, I think we'd actually be better off in terms of code readability if Modula-2 rather than C became the standard systems programming language.
The default is not the best, the default has just beaten the world so many times on the head that anything else became foreign and weird and got laughed out of the room before it even had the chance to say anything.
One programming language history book/blog/paper a day keeps the nonsense notions away. C "won" like people win the lottery or roulette.
Programmers have been shown, time after time, not to be particularly trustworthy. Have we not learned the lesson that it's really easy to make mistakes, and that we should trust tools instead of people to check our work?
> Don’t prevent the programmer from doing what needs to be done.
Ditto the above.
> Keep the language small and simple.
It is in some ways, but its "smallness" leads to a serious lack of simplicity, as seemingly simple things are incredibly hard to do right consistently. For instance, avoiding indexing past the end of an array, or overflowing an integer.
> Provide only one way to do an operation.
That's nice, I'll grant you, although there are of course exceptions that prove the rule, like the fact that a[i] is synonymous with *(a + i).

> Make it fast, even if it is not guaranteed to be portable.

Well... I mean...
When I learned that RISC-V had no carry bit, I couldn't help but think it might have been designed for C to begin with. Sure, they give reasons for this choice, none of them linked to C. Still, it hurts the multi-precision arithmetic that any language with bignums would have benefited from (Python, Haskell, and Scheme come to mind, at the very least).
I'm no hardware designer, though.
[edit] </sarcasm>
I thought this was funny because these two are the same:
Likewise, all of these are the same: As are these: But I guess the point can be applied elsewhere.

* It's everywhere
* There's a standard ABI
There's one? The "C" ABI is just the ABI of whatever platform it's running on, which may or may not have funky behaviour that vendor-provided compilers kindly hide for you: functions being prepended with '_' on macOS, the two dozen calling conventions on Windows with i386, the SysV and Itanium ABIs...
Do you think you can tell what the ABI of a given function is? Will bar be passed in a register, or on the stack? Who knows; that depends on your platform, your compiler, etc. Things going cleanly onto the stack is just a convenient lie that your first-year comp. sci. teachers tell you, because it's too early to talk about how the real world works yet.
But typically the C ABI is the only stable ABI those platforms provide. That's a huge benefit.
As far as the ABI goes: the important thing is that there is a standard ABI on a specific platform that all compilers on that platform agree on. Sounds kinda obvious, but it's not common in other languages.
Sure, there are compiler switches and language extensions that can break the ABI if you use them. But, well, you don't have to use them (at the interop boundary), and neither do your API clients.
The catch is that it almost works. You have a good shot at getting it to compile in a short amount of time. The rest of the work will be lots of time in ye olde debugger and going over the docs for your platform. The fun part is that you will find bugs that were there already; or are they just part of the platform, or were you using it wrong?
[1] But you need to write a bit of C glue code for OCaml so it's not quite so seamless.
But the C ABI is awful to work with. The C language itself offers no help in guaranteeing ABI compatibility. What ABI it compiles for depends on headers, which may depend on a jungle of ifdefs and typedefs.
It's less template metaprogramming and more just templates to generate efficient code: stuff that used to be done by abusing the preprocessor can now be handled by a (slightly) more elegant templating engine rather than a string-pasting engine.
Classes are used for resource management/RAII. For example, we have an AQUIRE_MUTEX_IN_SCOPE() macro which releases the mutex when the scope is exited; this is supremely useful and generalizes to many resources.
Lastly, namespaces are huge. In big C codebases you have to be super pedantic about naming modules and APIs consistently, because otherwise it becomes a nightmare.
C++ does still get in the way sometimes, like when you want to do something slightly dirty for perf reasons, say aliasing between structs. You first write it in a way that makes sense, basically how you would write it in C, but it's UB in C++, so you rewrite it with virtual calls or memcpys such that the compiler should be smart enough to arrive at the same result C would have produced with the straightforward implementation. This works great until it doesn't; your last option is to try to solve it with templates, and that hole is very deep.
Gamedev tends to use its own patterns, particularly arena allocation for long-lived fixed-size tables, or their own internal object/entity/component model. It's not really C with classes or C with templates, just kind of its own dialect.
Actually, only half of the universe is using C++ to write games, the other half is using Unity and writes their game code in C#.
If language interopability would be dramatically better, this "lock-in" into a specific programming language wouldn't be half as bad as it currently is, and it would be much easier and less risky to use "fringe languages" for game development.
Check out the Doom 3 source code ;-)