Except he missed the part where Modula-2 was made available in 1978, precisely to fix the original issues with standard Pascal: everything he complains about, minus type safety.
The thing we are still trying to add back into C, 50 years later.
There is a good argument to be made that this was purposely overlooked, or that the blind spots came from being an employee of Bell Labs and tied to its corporate interests, as with other critics of Pascal.
Many who don't study or know the history can get sucked into the rhetoric of C as the "superior language", versus the reality of it being a heavily corporate-pushed language. Pascal was in the way, so it got hammered by bad press. The same sort of thing happens today in these weird programming language scuffles: Rust, Zig, Vlang, Dlang, Odin, C3, etc. If a competitor is in the way, strange one-sided and highly critical blogs can materialize.
By 1987, pretty much all of Kernighan's criticisms of Pascal and its derivatives were moot. But he never retracted or revised what he had gotten wrong or what had since changed, and he had many years to do so: before the 2nd edition of his book (The C Programming Language, 1988) and before ANSI C (1989). Turbo Pascal and Object Pascal were out by then, both of which took lessons from Modula-2 and were widely known successes.
And C++. While it does have slightly better types than C, I can still call functions in ways the compiler doesn't even warn me about and that crash my program at runtime. If I'm lucky. When the phase of the Moon is adversarial, my program continues running in a corrupt state.
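For what it's worth, a minimal, hypothetical C++ sketch (not from the article; the names are made up) of the failure mode described above: an out-of-bounds write that compilers typically accept without any warning, and that may leave the program running with silently corrupted data rather than crashing.

    // oob.cpp -- hypothetical sketch of an unchecked subscript in C++.
    // Typically compiles with no diagnostic (e.g. g++ -Wall -Wextra oob.cpp);
    // the behaviour at run time is undefined: a crash if you are lucky,
    // silent corruption of neighbouring data if you are not.
    #include <iostream>
    #include <vector>

    int main(int argc, char**) {
        std::vector<int> balance = {100, 200, 300};
        std::size_t index = static_cast<std::size_t>(argc) + 6;  // not known at compile time
        balance[index] = -1;   // out of bounds: no warning, no run-time check
        std::cout << "still running, balance[0] = " << balance[0] << "\n";
        return 0;              // the program may "succeed" despite the corruption
    }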
Apparently, this is a game that two can play. Niklaus Wirth, the creator of Pascal, had this to say in turn:
"From the point of view of software engineering, the rapid spread of C represented a great leap backward. It revealed that the community at large had hardly grasped the true meaning of the term “high-level language” which became an ill-understood buzzword."
Here is C.A.R. Hoare in his 1980 ACM Turing Award Lecture. Guess which programming language he is indirectly making a point about when he observes that even in 1980, language designers and users have not learned this lesson:
"A consequence of this principle is that every occurrence of every subscript of every subscripted variable was on every occasion checked at run time against both the upper and the lower declared bounds of the array. Many years later we asked our customers whether they wished us to provide an option to switch off these checks in the interests of efficiency on production runs. Unanimously, they urged us not to--they already knew how frequently subscript errors occur on production runs where failure to detect them could be disastrous. I note with fear and horror that even in 1980 language designers and users have not learned this lesson. In any respectable branch of engineering, failure to observe such elementary precautions would have long been against the law."
Interesting paper, thanks. The findings concerning C seem reasonable. To the issue of the small Hamming distance from one valid syntactic construct to another, unintended one, I would add the very weak type checker and the silly syntax, of which most people only use a fraction (many C constructs are surprising and not generally known). Though I doubt that (Common) Lisp is a valid alternative for the typical C use cases.
Looks like the 70s and 80s were the Wild West for programming language design. So many ideas were floating around. But over the decades, most converged to the Algol-style (statements, curly braces, often using semicolons, type before identifier, etc.). Look at what we did to programming:
- Java, C, C++, C#, Kotlin, Rust, Swift, Go, TypeScript, JavaScript, ... → they look more or less the same
Compare with these ones that didn't catch on as much as the ones above:
- ML/Haskell, Erlang, Elixir, APL, Common Lisp (and other Lisps), Lua, Pascal, Delphi, BASIC, Visual Basic, VBA, VBScript, Smalltalk, ... → each one bringing something refreshingly new to the PL design space
It feels like you're mixing together a lot of ideas that don't really fit together very well. In terms of syntax, only four of the languages you describe as looking "more or less the same" use "type before identifier" like you mention. On the other hand, four of them use the ML-style syntax of identifier-colon-type, one uses identifier before type with no colon, and the final one doesn't even have type annotations. At least four of them don't require semicolons either, and I'd argue that at least in Go, having semicolons at all would probably strike people as odd. You've also left out Python, which doesn't fit any of those syntax descriptions you gave and is more popular than at least half of the ones you did include.
The timeline of the 70s and 80s also doesn't really seem to fit with what you're saying either. Of the languages you mention either as "more or less the same" or "refreshingly new", the ones from either the 70s or 80s are C, C++, ML, Erlang, Smalltalk, and Pascal; all of the other "refreshingly new" ones are from either before or after those years. Common Lisp was from the 80s, but the syntax originated with LISP in the 50s, so if you're going to go with the most common variant, I'd argue you should also remove ML from the list and replace it with OCaml, which is from the 90s.
Even more significantly, it feels like you're comparing apples and oranges with the discussion of syntax at the beginning and then talking about bringing something new to the PL design space after. I'd guess that most people involved in PL design find syntax to be the least interesting aspect of it, and I think it's hard to argue that none of the languages you described as looking similar brought anything new to the PL design space.
Python is like Algol in terms of syntax, certainly, and in fact is almost an ideal Algol syntactically: It has no (or few) block delimiter characters or keywords, but imposes proper indentation as a syntactic requirement.
Another counterexample to consider is Python. It's quite unlike Algol, but that certainly hasn't stopped it from becoming popular.
While I started with Basic and Turbo Pascal, I came to appreciate curly braces and semicolons a lot, because they make a lot of sense to me (*). Therefore I see the convergence as a good thing: keeping the best parts of everything and improving with good parts from others.
* As opposed to BEGIN-END in SQL, indenting in Python, or the weirdness of Cobol in the 90s.
>But over the decades, most converged to the Algol-style (statements, curly braces, often using semicolons, type before identifier, etc.). Look at what we did to programming:
>- Java, C, C++, C#, Kotlin, Rust, Swift, Go, TypeScript, JavaScript, ... → they look more or less the same
The upside being that if you come from C or Java you will feel at home with Go.
It is interesting to see this Great Convergence. Even PHP, which started off as a quick-and-dirty language with deliberately vague expressions, converges to the same standard.
In my 30-something years of programming (admittedly my first programs were toy-like Pascal pieces of code), I saw only one improvement in this Algol-like style that I considered major: named arguments. Their use improves code readability, at least for me.
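Named arguments don't exist in C or classic C++, but a hypothetical C++20 sketch using designated initializers on an options struct, a common stand-in for named arguments, shows the readability gain being described: the labels travel with the values to the call site. The WindowOptions struct and open_window function below are made up for illustration.

    // named_args.cpp -- hypothetical sketch: positional arguments versus a C++20
    // designated-initializer "options" struct standing in for named arguments.
    #include <string>

    struct WindowOptions {
        int  width      = 800;
        int  height     = 600;
        bool resizable  = true;
        bool fullscreen = false;
    };

    void open_window(const std::string& title, const WindowOptions& opts) {
        // ... create the window from opts (omitted in this sketch) ...
        (void)title; (void)opts;
    }

    int main() {
        // Positional style: what do the two booleans mean at the call site?
        open_window("Editor", {1024, 768, false, true});

        // "Named" style: every value is labelled where it is passed.
        open_window("Editor", {.width = 1024, .height = 768,
                               .resizable = false, .fullscreen = true});
        return 0;
    }

Languages with real named arguments, such as Python, Kotlin, Swift, or C#, give the same effect without the helper struct.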
I miss Pascal. I learned Object Pascal at the end of the 90s but hardly ever used it in my professional career (only a few months with Delphi during a student job). I never really used C or C++ in my professional career either, which is also interesting.
Nowadays it's all JavaScript, and more specifically TypeScript in my case.
That PDF would probably be much longer about JavaScript, but it seems Kernighan is quite okay with that language nowadays: https://www.youtube.com/watch?v=AB60_uUetJs
Interestingly, Donald Knuth created an entire new system of programming to make up for the deficiencies of Pascal in the programs he was trying to write (TeX and METAFONT): http://literateprogramming.com/
WEB is expressive enough that there is a tool to directly transpile to C for compiling:
https://tug.org/web2c/
Why Pascal Is Not My Favorite Programming Language - https://news.ycombinator.com/item?id=37044792 - Aug 2023 (2 comments)
Why Pascal Is Not My Favorite Programming Language (1981) [pdf] - https://news.ycombinator.com/item?id=22222117 - Feb 2020 (62 comments)
Why Pascal Is Not My Favorite Programming Language (1981) - https://news.ycombinator.com/item?id=19221143 - Feb 2019 (55 comments)
Why Pascal Is Not My Favorite Programming Language – Current Status - https://news.ycombinator.com/item?id=8273608 - Sept 2014 (1 comment)
Why Pascal is Not My Favorite Programming Language (1981) - https://news.ycombinator.com/item?id=8260694 - Sept 2014 (64 comments)