For example, using "rm" on the command line, or an SQL "delete". I would very much like those short programs to be invalid until someone provides, in an accident-resistant way, more detail about what should be destroyed.
If I had my 'druthers, the left-to-right prefix of "delete from table" would be invalid, and it would require "where true" as a safety mechanism.
I might write the equivalent signature,
D f(A, B, C)
and then reorganize things to just pass f around, or make a struct if you really want to bake in your first function.

> In text blocks, the leftmost non-whitespace character on any of the lines or the leftmost closing delimiter defines where meaningful white space begins.
From https://blogs.oracle.com/javamagazine/post/text-blocks-come-...
It's not a bad option but it does mean you can't have text where every line is indented. This isn't uncommon - e.g. think about code generation of a function body.
I've only seen two approaches that handle this: the Zig approach, and a postprocessing 'dedent' step.
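For what it's worth, here is a rough sketch of what such a 'dedent' postprocessing step could look like, in C for concreteness. Everything in it (the function name, the space-only indentation handling, the example snippet) is made up for illustration rather than taken from any particular library:

    #include <stdio.h>
    #include <string.h>

    /* Illustrative "dedent" pass: find the smallest leading-space count
       over the non-blank lines, then emit every line with that many
       columns stripped. Assumes space-only indentation. */
    static void print_dedented(const char *text) {
        size_t common = (size_t)-1;

        /* Pass 1: smallest indentation among non-blank lines. */
        for (const char *p = text; *p; ) {
            size_t len = strcspn(p, "\n");
            size_t indent = 0;
            while (indent < len && p[indent] == ' ')
                indent++;
            if (indent < len && indent < common)
                common = indent;
            p += len + (p[len] == '\n');
        }
        if (common == (size_t)-1)
            common = 0;

        /* Pass 2: print each line minus the common prefix. */
        for (const char *p = text; *p; ) {
            size_t len = strcspn(p, "\n");
            size_t skip = len > common ? common : len;
            printf("%.*s\n", (int)(len - skip), p + skip);
            p += len + (p[len] == '\n');
        }
    }

    int main(void) {
        /* Every line of the embedded snippet is indented; the common
           four columns disappear, the relative indentation stays. */
        print_dedented("    if (x > 0) {\n"
                       "        return x * 2;\n"
                       "    }\n");
        return 0;
    }

The Zig approach gets the same effect without the extra pass, because the line-start marker makes the boundary explicit in the source.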
In fact in my academic and professional career most of the highly "functional" C that I've come across has been written by me, when I'd rather amuse myself than make something readable.
(a -> b) -> c -> d
becomes the C declaration
D (*f(B (*)(A)))(C)
and it's no surprise that the former is considered much less fancy than the latter.
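To make that concrete, here is a small C sketch. The types A, B, C, D are just placeholders (typedef'd to int so the file compiles), and the typedef'd rewrite at the end is the usual workaround, not anything prescribed by the comment above:

    /* Placeholders standing in for the a, b, c, d above. */
    typedef int A, B, C, D;

    /* Direct transliteration of (a -> b) -> c -> d:
       f takes a pointer to a function A -> B and returns
       a pointer to a function C -> D. */
    D (*f(B (*)(A)))(C);

    /* The same signature with the function-pointer types named
       (declarations only, no definitions). */
    typedef B (*a_to_b)(A);
    typedef D (*c_to_d)(C);
    c_to_d f2(a_to_b g);

Even with the typedefs you're still spelling out a pile of indirection that the Haskell line expresses in one short line; that gap is the pain being described.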
Of course it's not common — because the language makes it painful :) The causation is the other way around. We've seen in plenty of languages that if first-class functions are ergonomic to use then people use them all over the place.
If you're working with Roman numerals, division is a very fancy thing to do.
I don't really know how else to put it, but it's vaguely like a C-derived spiritual cousin of Lisp, with structs instead of lists.
- we have a language with a particular philosophy of development
- we discover that some concept A is awkward to express in the language
- we add a special case to the language to make it nicer
- someone eventually invents a new base language that natively handles concept A nicely as part of its general model
Lisp in some sense skipped a couple of those steps: it was a very regular language that didn't necessarily have a story for things that people at the time cared about (like static memory management, in the guise of latency). But it's still a paragon of consistency in a usable high-level language.
I agree that it's of course not correct to say that Zig is a descendant or modern equivalent of Lisp. It's more that the virtue that Lisp embodies over all else is a universal goal of language design, just one that has to be traded off against other things, and Zig has managed to do pretty well at it.
Choice of specific line-start marker aside, I think this is the best solution to the indented-string problem I've seen so far.
It’s some sort of mental glitch that a number of people fall into and I have absolutely no idea why.
the thing that stands out to me about Zig's syntax that makes it "lovely" (and what I think matklad is getting at here) is that there is both minimalism and consistency to the design, while ruthlessly prioritizing readability. and it's not the kind of surface-level "aesthetically beautiful" readability that tickles the mind of an abstract thinker; it is brutalist in a way that leaves no room for surprise in an industrial application. it's really, really hard to balance syntax design like this, and Zig has done a lovely and respectable job of it.
Rather, the sort of beauty it's going for here is exactly the type of beauty that requires a bit of abstraction to appreciate: it's not that the concrete syntax is visually beautiful per se so much as that it's elegantly exposing the abstract syntax, which is inherently more regular and unambiguous than the concrete syntax. It's the same reason S-exprs won over M-exprs: consistently good often wins over special-case great because the latter imposes the mental burden of trying to fit into the special case, while the former allows you to forget that the problem ever existed. To see a language do the opposite of this, look at C++: the syntax has been designed with many, many special cases that make specific constructs nicer to write, but the cost of that is that now you have to remember all of them (and account for all of them, if templating — hence the ‘new’ uniform initialization syntax[1]).
This trade-off happens all the time in language design: you're looking for a language that makes all the special cases nice _as a consequence of_ the general case, because _just_ being simple and consistent leads you to the Turing tarpit, where you simplify the language by pushing all the complexity onto the programmer.