High-level languages succeed by exposing the components of computation as combinators one can manipulate like algebra.
I was given access to an APL terminal as a 19-year-old math major familiar with punch cards (first punched with a paperclip). The experience was like taking acid, and I have never replicated it since.
The notation is a red herring, like parentheses in Lisp. APL succeeds by intensively plumbing an extraordinarily limited domain of multidimensional arrays. It conquered a much wider scope because distance from a problem is countered by the intensity of the light; compare the sun to a flashlight in your hand. And the limited domain allowed the combinators to reach critical mass, like the fusion fueling the sun.
APL was intended for mathematicians, who are used to fitting the tools that they have to a problem. Since APL, we have never accepted general purpose languages built on such a limited scope, making it that much harder for our combinators to achieve critical mass.
At 64 I daydream about what could have been, stunned that computer languages haven't developed further by now. I've chased the promise here in every Lisp, finding experiences lost in other languages but the promise ultimately elusive. Haskell comes closest to a practical language realizing my early soaring with APL; one experiences a fluid algebra of combinators even as other aspects are as calcified and off-limits to manipulation as C.
At their best, the parser combinators of functional languages are indistinguishable from the notation in one's mind for parsing. Most programmers see clumsy parsing in college and never master the idea. If one could imagine a world where all programmers are fluent in parsing, a suitable domain for a language could be the application of parser combinators to strongly typed trees. One gains the critical mass advantages of an APL limited domain, but with a domain better suited to modern computer science. In particular, one could achieve the homoiconicity of Lisp, with the hope of better macro tools. Ever stare in amazement that all life on earth developed from the same impoverished code base? Now look at how all macro systems resemble each other. Is that really the best civilization can do, if we had a million runs of the simulation?
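To make "indistinguishable from the notation in one's mind" concrete, here is a minimal parser-combinator sketch in plain Python (all names here are invented for illustration, not any particular library's API): a parser is a function from a string to a `(value, rest)` pair, and the combinators compose parsers the way a grammar composes rules.

```python
# A parser is a function: str -> (value, rest) on success, None on failure.

def char(c):
    """Match a single literal character."""
    def p(s):
        return (c, s[1:]) if s.startswith(c) else None
    return p

def seq(*ps):
    """Match each parser in sequence, collecting the values."""
    def p(s):
        vals = []
        for q in ps:
            r = q(s)
            if r is None:
                return None
            v, s = r
            vals.append(v)
        return (vals, s)
    return p

def alt(*ps):
    """Try each parser in turn, returning the first success."""
    def p(s):
        for q in ps:
            r = q(s)
            if r is not None:
                return r
        return None
    return p

def many(q):
    """Match q zero or more times."""
    def p(s):
        vals = []
        while True:
            r = q(s)
            if r is None:
                return (vals, s)
            v, s = r
            vals.append(v)
    return p

# The grammar now reads almost like BNF:
#   digit  ::= '0' | '1' | ... | '9'
#   number ::= digit digit*
digit = alt(*[char(d) for d in "0123456789"])
number = seq(digit, many(digit))
```

The payoff the comment describes is that `number = seq(digit, many(digit))` *is* the grammar rule; there is no gap between the notation in one's head and the program.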
“[..] to present to the eye some picture by which the course of their reasonings might be traced: it was however necessary to fill up this outline by a tedious description, which in some instances even of no peculiar difficulty became nearly unintelligible, simply from its extreme length”
- Babbage on Java, 1821.
My stumblings in APL involve a continuous adjustment of functions and data to fit each other, an impedance matching, a continuous going up and down locks: now to enclose this, now to align this vector with that nested vector, now to mix that nested vector into a matrix to feed into a function with a first-axis rank adjustment... and it feels like a system built of ill-fitting parts because of it.
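For readers who haven't met these primitives: the "mix" step above takes a nested vector of uneven rows and pads it into a rectangular matrix. A plain-Python sketch of roughly what APL's monadic mix (↑) does, assuming a numeric fill element of 0:

```python
def mix(nested, fill=0):
    """Pad a list of uneven rows into a rectangular matrix,
    roughly what APL's monadic mix does to a nested vector."""
    width = max(len(row) for row in nested)
    return [list(row) + [fill] * (width - len(row)) for row in nested]

# A nested vector of uneven rows...
m = mix([[1, 2, 3], [4], [5, 6]])
# ...becomes a 3x3 matrix with zero fill:
# [[1, 2, 3], [4, 0, 0], [5, 6, 0]]
```

The impedance matching the comment complains about is exactly this: deciding, at each step, whether data lives as a nested vector or as a padded matrix.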
Perhaps it’s not that APL is inherently an elegant notation, but that smart people are able to express their thoughts elegantly in whatever notation they are using?
I think that when you see enough of it, these things become pattern-matched idioms in your head. You hardly even consciously think about it. You have x and y and you need them transformed to z; you've done something like that a thousand times before, and because the language is so terse, you remember the snippets of incantation to do it without looking anything up.
When I programmed Pascal decades ago, I had my own library, which was huge: a massive collection of functions pulled from projects that I wrote myself and all remembered (and still remember). When I had to build something, it was mostly cobbling those functions together. That worked a lot faster (with more stable results) than Google → Stack Overflow or npm search-and-install.
APL (and J and K) has that, but with the added benefit that you don’t have to write or download a library; everything is composable, so instead of remembering or searching for the functions that I (or someone else) wrote, I remember the actual implementation... which, in those languages, is usually the same number of characters as the function name, and usually shorter than the function name with its parameters.
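A small illustration of "the implementation is about as short as the name": in APL, average is often written as the train +⌿÷≢ (sum divided by tally). Even translated into Python, the composed primitives are barely longer than the word "average":

```python
# APL's average is the three-glyph train (+⌿÷≢): sum divided by count.
# The composed implementation rivals the length of the name itself.
average = lambda v: sum(v) / len(v)
```

At that scale there is little point remembering a library function over remembering the definition.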
In my experience, existing programmers struggle the most in learning APL because they struggle to internalize the Array Model and how to talk about things as Arrays. The results are, in essence, very poorly typed data, which then requires tons of mangling to make work.
It's the same as when you learn various other languages and don't yet appreciate how to model data appropriately in that language, which leads to excessively complex or inadequate types.
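One way to picture "talking about things as Arrays" (a sketch in plain Python rather than APL, with invented example data): keep parallel columns and operate on whole columns at once, instead of looping over a list of record objects.

```python
# Array thinking: a record of columns, not a list of records.
names = ["ada", "bob", "eve"]
ages  = [36, 41, 29]

# Whole-column operations instead of per-record control flow:
older  = [a + 1 for a in ages]                            # everyone a year older
adults = [n for n, a in zip(names, ages) if a >= 30]      # filter one column by another
```

Programmers who carry the list-of-objects habit into APL end up with exactly the "poorly typed data" the comment describes, because the primitives want whole columns.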
IMO, the elegance of a notation is not defined by its ability to prevent people from writing bad expressions, but by the capacity for good expressions to be practically and usefully leveraged. A good notation is about empowerment and clarity, rather than about prevention.
'arcfide is definitely right; don't think of it as the same as the rest of the programming languages in your arsenal. Fortran and ALGOL (from which C, Java, Go, Rust, Smalltalk, etc. are all derived) take almost entirely different approaches, and very little of value carries over.
Approaching it sort of like you would a small Lisp is a flawed approach, though it may get you closer: acknowledging it as a separate, distinct paradigm is pretty much necessary to get a feel for it. I think the easiest way to bridge the gap right now is probably learning a concatenative language, just because the teaching resources for those are so far ahead of what APL has right now (the best way I can think of to learn to use the notation effectively is reading one of Iverson's books that use J as notation, or doing one of Iverson's J labs), yet they still give you a feel for where the paradigm is headed.
Summary: give J's labs a try; they might help you get the hang of the paradigm. KEI was a fantastic teacher, and Chris Burke, Roger Hui, and Eric Iverson have done a great job maintaining the old labs and making new ones. If that doesn't work, try learning a concatenative language first and carrying it over.
Not too far from my personal experience, but it seems to me that all these mismatch scenarios arise when you are programming in the large, either directly or indirectly, i.e., by using third-party libraries, code snippets, or even RESTful APIs. In a sense, programming in the large with an APL-like language reminds me of Perl at its best and worst. But things fit together pretty well in the small.
I had exactly this feeling listening to Aaron's presentation (great, as everything Aaron does) about the manipulation and transformation of trees in APL: https://www.youtube.com/watch?v=hzPd3umu78g
I get the feeling it was originally printed like that.
The issue was an APL special. This link has download links for every article from it: IBM Systems Journal, volume 30, issue 4, 1991. It includes Iverson's "A Personal View of APL".
'Tis the dream of each programmer
Before his life is done,
To write three lines of APL
And make the sucker run.
http://gen.lib.rus.ec/scimag/?journal=8991&year=1991&volume=...
No, look at, for instance, page 3 of the PDF (page 557 of the scanned document). You can see faint traces of what was supposed to be there.
> You have exceeded your daily download allowance.
Alright.