This is a reference to Backus's 1977 paper "Can Programming Be Liberated from the von Neumann Style?".
It argues for functional programming, making the point that the imperative style is more common for efficiency reasons: its programming model is close to the underlying computer architecture.
It aims to be a general framework of thought, inviting readers to take a step back from some notions that have been (hastily?) accepted in the programming world.
It makes the same point about Prolog (and logic programming languages in general): they have been strongly shaped by the resolution algorithm. In practice, that means that if you write a non-trivial program and the performance isn't right, you'll need to understand the execution model and adapt to it, mainly with the cut (pruning) operator (!). So while the promise is to "declare" values and not think about implementation details, you're railroaded into thinking in a very specific way.
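To make the railroading concrete, here is a minimal sketch (not from the article) of the classic way a cut couples a predicate's meaning to the execution model:

```prolog
% max/3 written with a cut for efficiency: correct when the third
% argument is unbound, but wrong in other modes, because the second
% clause is only safe if the first was tried and then pruned.
max(X, Y, X) :- X >= Y, !.
max(_, Y, Y).

% ?- max(5, 3, M).   % M = 5, as intended
% ?- max(5, 3, 3).   % wrongly succeeds: clause 1 never matched, so the
%                    % cut never fired and clause 2 applied unconditionally
```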
I personally found it frustrating that good solutions turned out to be essentially unworkable because of this, in comparison with the imperative or functional paradigms, which are significantly more flexible.
As a result, Prolog-style programming feels limited to the small problems for which it is readily a good fit, best embedded in a larger program written in a general-purpose language.
I may be wrong on this, but of the 50 people who learned Prolog around the same time as me, none kept up with it. Meanwhile, other niche languages like OCaml, Haskell, and Scheme had good success.
Rethinking the language foundation could remove these barriers to give the language a broader user base.
It might reference that 1977 paper in name, but unlike the Backus paper, which uses math to make its point, this reads like a shallow ad for the Curry language. The central (and only) point is merely an example of rewriting a Prolog predicate for appending lists into Curry, without even a claim of generality. The rewritten Curry functions, however, trivially fix determinacy and pin variables to input and output roles, when the entire point of logic variables in Prolog is that they work either way: bound to a value upon entering a predicate call, or bound nondeterministically to as many values as needed until the call succeeds (and to still more on backtracking over failed subsequent goals). The paper also glosses over the concept of unification. I sure hope the referenced detail papers come with more substance.
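For reference, the predicate under discussion is the classic append/3, whose two clauses already run in several modes; this flexibility is exactly what fixing input and output roles gives up:

```prolog
% One relation, several usable modes.
append([], Ys, Ys).
append([X | Xs], Ys, [X | Zs]) :- append(Xs, Ys, Zs).

% ?- append([1,2], [3], Zs).    % function-like: Zs = [1,2,3]
% ?- append(Xs, [3], [1,2,3]).  % runs "backwards": Xs = [1,2]
% ?- append(Xs, Ys, [1,2,3]).   % enumerates all four splits on backtracking
```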
The paper title doesn't even make sense? So he wants to "liberate" Logic Programming from predicates? Predicate Logic is First-Order Logic ... i.e., what YeGoblynQueenne says in another comment.
From a purely practical PoV, who is the author even addressing? Prolog, from the get-go, was introduced with NLP, planning problems, and similar large combinatorial search spaces in mind, and is used for the convenience it brings to these applications. That FP could theoretically be more broadly used is completely beside the point; Prolog's focus is its strength.
The way you write imperative programs in Prolog by exploiting the search order, using cuts, etc. seems clever when you see it in school and do a few assignments for a comparative programming languages class (the only 3-credit CS course I took), but it is painfully awkward if you have to do very much of it.
It isn't. I do most of my programming in Prolog, I write oodles of it daily, and it's not a problem. You learn to think that way easily.
The argument is basically that Prolog is not 100% declarative and that if we jump through a few hoops and translate it all to functional notation, we can make it "more declarative". But let's instead compare the incomplete declarativeness of Prolog to a fully imperative, zero-declarative language like Python or C#. We'll find, I believe, that most programmers are perfectly fine programming completely non-declaratively and have no trouble writing very complex programs that way, and that "OMG my language is not purely declarative" is the least of their problems. I hear some old, wizened nerds even manage to program in C, where you can actually drop to the hardware level and push bits around registers entirely by hand O.o
Man, lately, I feel like this stuff has been following me around. I'd really like to deep-dive into logic programming and related paradigms. Just recently came across Answer Set Programming[0] (via Potassco's clingo[1]), and it has made me realize just how ignorant I am of the design space that's being explored here.
More personally, I recently spent enough time with first Scheme and then APL that the paradigms clicked for me, and the effect that had on my entire outlook on work was dramatic. For whatever reason, I feel like breaking down my ingrained technical paradigms has allowed me to integrate and strengthen my soft skills.
Plus, mind-expanding experiences are just plain fun. Looking for more of that juice!
I strongly recommend checking out the Soufflé programming language. It's a dialect of Datalog that can output bulk CSV data, which is easily imported into other tools (like DuckDB or Excel). It provides an extremely intuitive framework for logic programming: you can visualize each relation as a giant table of elements, the "or" operation being akin to SQL `union all`, the "and" operation akin to `join`, the "not" operation akin to `outer join ... where joined is null`, etc.
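A small sketch of that table analogy as Datalog-style rules (Prolog notation here; Soufflé's rule syntax is similar, with added `.decl` type declarations and CSV I/O directives):

```prolog
% Two clauses with the same head act like UNION ALL over the
% 'reachable' table; the comma in a rule body acts like a JOIN.
reachable(X, Y) :- edge(X, Y).
reachable(X, Z) :- edge(X, Y), reachable(Y, Z).
```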
I use ASP at work! I used it as the core of a powerful code generator: I modeled the type system I wanted to implement, some base operations and derivation rules, and had it synthesize implementations for every possible operator between every possible pair of types. I run clasp and it dumps out thousands of lines of C# implementing a simple symbolic matrix linear algebra library. It's one of the most beautiful things I've made, imo.
There already is a pretty major effort in the Prolog community to build everything as much as possible around pure, monotonic Prolog, and to provide means of supporting multiple search strategies depending on the best fit for the problem. CLP libraries are also pretty common and are the go-to for representing algebraic expressions relationally and declaratively.
I wouldn't say that the logic or relational way of describing effects is a bad thing either. By design it allows for multiple return values (foo/1, foo/2, ...), and you can build higher-level predicates that return multiple resources, which is pretty common in many programs. It makes concatenative (compositional) style programming really straightforward, especially for more complex interweaving, which also ends up being quite common. Many Prolog implementations also support shift/reset, so you can easily build things like conditions and restarts, algebraic effects, and/or debugging facilities on top. Prolog is also homoiconic in a unique way compared to Lisp, and it's quite nice because the pattern matching is so powerful. Prolog really is one of the best languages I ever learned; I wish it were more popular. I think Prolog implementations need better C FFI interop and a nicer library ecosystem. Trealla has a good C FFI.
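The homoiconicity point can be made concrete with the classic vanilla meta-interpreter (a textbook sketch, not tied to any particular implementation): clauses are ordinary terms, so a complete evaluator for pure programs is three lines of pattern matching.

```prolog
% mi/1 interprets a goal against the program's own clauses.
mi(true).
mi((A, B)) :- mi(A), mi(B).
mi(Head)   :- clause(Head, Body), mi(Body).
```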
I think logic programming is the future, and a lot of these problems with Prolog are fixable. If it's not going to be Prolog, it'll probably be something like Kanren and Datalog within a Lisp like Scheme or Clojure(Script).
Out of my depth here, but I do have 5 Prolog books I can't seem to let go of. I used to dabble. A lot. Out of all my numerous prog-lang infatuations, only 3 have stuck: Erlang, Prolog and Picolisp. At most, 2 degrees of separation between those 3. No one mentioned Picolisp yet, so I have to, because it has a Prolog embedded in it. This seems like an appropriate place to do it.
There is a really interesting space between queries and Prolog, covering mundane junk like encoding, rendering, and data-format changes, that benefits in evaluation from having less power but maintains the lovely expressibility and analysis that we get from 'real' logic programming.
While I was trying to track down the license for it[1], I found https://git.ps.informatik.uni-kiel.de/curry/curry2go (also BSD-3), which says "A compiler and run-time system to compile and run Curry programs as Go programs".
There's an insightful critique of the paper on Reddit: https://www.reddit.com/r/ProgrammingLanguages/comments/1g1su...
...agree that it's weird the paper doesn't mention constraint logic programming, but it's perhaps pointing at it implicitly by saying "Replacing backtracking by complete search strategies"
Abstract. Logic programming has a long history. The representative of logic programming in practice, the language Prolog, was introduced more than 50 years ago. The main features of Prolog are still present today: a Prolog program is a set of predicate definitions executed by resolution steps with a backtracking search strategy. The use of backtracking was justified by efficiency reasons when Prolog was invented. However, its incompleteness destroys the elegant connection between logic programming and the underlying Horn clause logic, and causes difficulties in teaching logic programming. Moreover, the restriction to predicates hinders an adequate modeling of real-world problems, which are often functions from input to output data, and leads to unnecessarily inefficient executions. In this paper we show a way to overcome these problems. By transforming predicates and goals into functions and nested expressions, one can evaluate them with a demand-driven strategy, which might reduce the number of computation steps and avoid infinite search spaces. Replacing backtracking by complete search strategies with new implementation techniques closes the gap between the theory and practice of logic programming. In this way, we can keep the ideas of logic programming in future programming systems.
I didn't read past the abstract, but it sounds like they are just transforming logic-based programs into function-based programs. But: if I wanted functional programming, I wouldn't be writing in Prolog.
What would be interesting would be to replace depth-first search while remaining in the world of predicates and Horn clauses.
Naïvely, writing only one logic relation of n parameters is about equivalent to writing 2^n functions (just decide, for each parameter, whether you supply it as input or not). So there clearly is value there.
I say naïvely because, on the one hand, you might not need all versions of the function, but on the other hand you can also provide partial values, so it's not strictly either input or output.
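A small example of the partial-value case, using the standard append/3: the third argument is only partly known, yet unification fills in both the hole T and the other argument.

```prolog
?- append([1 | T], Ys, [1, 2, 3]).
% T = [],     Ys = [2, 3] ;
% T = [2],    Ys = [3]    ;
% T = [2, 3], Ys = [].
```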
Depth-first backtracking is trivial to implement, and it's trivial to blend into the language. Breadth-first backtracking is much harder to implement, and I'm not sure that it's trivial to build into a language (though this may be a lack of imagination).
Icon had both, but depth-first backtracking was the default and trivial to use, while breadth-first backtracking required "co-expressions" (co-routines), though at least Icon had a trivial syntax for turning procedure argument expressions into co-expressions. But Icon's approach does not make breadth-first backtracking a first-class language feature the way depth-first backtracking is, and this is where my imagination gets stuck. To be fair, I've not truly thought much about this problem.
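For what it's worth, breadth-first search is easy enough to write as a library predicate in Prolog itself; the hard part is making it the default evaluation strategy. A minimal agenda-based sketch, where goal/1 and next_state/2 are hypothetical predicates supplied by the caller:

```prolog
% bfs(+Queue, -Solution): pop states from the front of the queue and
% enqueue children at the back, so shallower states are expanded first.
bfs([State | _], State) :- goal(State).
bfs([State | Rest], Solution) :-
    findall(Next, next_state(State, Next), Children),
    append(Rest, Children, Queue),   % FIFO queue = breadth-first order
    bfs(Queue, Solution).
```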
Is this about the problem that Prolog tends to ping-pong infinitely between leaves? I wrote a fully declarative solitaire game solver and remember this being a big issue, forcing me to record past states of the backtracking and basically exclude them with another predicate. This is obviously slow. I thought, why not at least have a plugin to avoid trivial cases where backtracking gets stuck switching between A and B, or a plugin for a stochastic traversal of the solution tree?
There are. Tabling (available in most mature implementations) helps when recalculation of the same states is the problem. Meanwhile, a custom search strategy is always an option to implement directly in Prolog. You'll see this in many Advent of Code solutions in Prolog applied to pathfinding puzzles, where depth-first search is rarely a workable solution.
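A minimal illustration of the tabling point (the directive below is accepted by both SWI-Prolog and XSB): this left-recursive path/2 loops forever under plain depth-first resolution, but terminates once tabled.

```prolog
:- table path/2.

path(X, Y) :- edge(X, Y).
path(X, Y) :- path(X, Z), edge(Z, Y).

edge(a, b).
edge(b, a).   % a cycle in the graph, harmless under tabling
edge(b, c).
```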
We had to do goddamn Paint on Prolog. Yup.
[0]: https://en.wikipedia.org/wiki/Answer_set_programming
[1]: https://potassco.org/
This is a great resource for getting a good feel for Prolog: https://www.youtube.com/@ThePowerOfProlog/videos
https://www.swi-prolog.org/pldoc/doc_for?object=section(%27p...
https://software-lab.de/doc/ref.html#pilog
See:
- The fastest non-incremental embedded Datalog engine: https://github.com/s-arash/ascent
- The state-of-the-art non-embedded, non-incremental Datalog engine: https://github.com/knowsys/nemo
- A Python library that contains an embedded incremental Datalog engine: https://github.com/brurucy/pydbsp
- A Rust library that provides an embedded incremental Datalog engine over property graphs: https://github.com/brurucy/materialized-view
There's lots of exploring left to do.
1: Maybe this is it? It does include a lot of Curry-named submodules; anyway, BSD-3-Clause: https://git.ps.informatik.uni-kiel.de/curry/pakcs/-/blob/v3....
For that (replacing depth-first search while staying with predicates and Horn clauses) you want tabled Prolog, or in other words Prolog executed by SLG resolution. The paradigmatic implementation is XSB Prolog:
https://xsb.com/xsb-prolog/
SWI-Prolog also supports tabling but I think the XSB implementation is more mature.
Breadth-first search is complete even if the branches are infinitely deep, in the sense that if there is a solution, it will eventually find it.
Backtracking is a complete search of the problem-space.
What is incomplete is the Horn-SAT problem space, which is a subset of SAT that can be solved in polynomial time, and is what Prolog is based on.
A complete logic system would have to solve SAT, which is NP-complete.
At least that's what I understood they meant by that.