manasij7479 commented on Dataflow Analyses and Compiler Optimizations That Use Them, for Free   blog.regehr.org/archives/... · Posted by u/ingve
efferifick · a year ago
This post and the Hydra paper reminds me a lot of Ruler and Enumo.

  * Nandi, Chandrakana, et al. "Rewrite rule inference using equality saturation." Proceedings of the ACM on Programming Languages 5.OOPSLA (2021): 1-28.
  * Pal, Anjali, et al. "Equality Saturation Theory Exploration à la Carte." Proceedings of the ACM on Programming Languages 7.OOPSLA2 (2023): 1034-1062.
I will need to read more about both of these techniques along with Synthesizing Abstract Transformers.

Thanks for sharing! Really exciting stuff!

manasij7479 · a year ago
These are indeed neat papers!
manasij7479 commented on Logic programming is overrated, at least for logic puzzles (2013)   programming-puzzler.blogs... · Posted by u/alex_stoddard
slaymaker1907 · 2 years ago
Of the logic programming languages, Prolog is actually pretty friendly when it comes to construction. You can easily force evaluation with an empty left clause, i.e. ":- thing1(x), thing2(y).", and Prolog gives you a lot of tools to control the current database with assert/retract (handy for memoization). miniKanren (which core.logic is based on) is much more restrictive and really wants things to be pure, though it does have the advantage of working as a small library-level runtime inside an imperative language.
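To make the "library in an imperative host" point concrete, here is a minimal sketch (my illustration, not from the comment) using kanren, a Python port of miniKanren; core.logic itself lives in Clojure:

  # miniKanren embedded in an imperative language, via the `kanren`
  # package (pip install kanren). Goals passed to run() are conjoined.
  from kanren import run, var, eq, membero

  x = var()

  # Every x that is a member of both tuples; run(0, ...) asks for all answers.
  print(run(0, x, membero(x, (1, 2, 3)), membero(x, (2, 3, 4))))  # -> (2, 3)

  # Unification plus membership: x must be 2 *and* appear in the tuple.
  print(run(0, x, eq(x, 2), membero(x, (1, 2, 3))))               # -> (2,)
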
manasij7479 · 2 years ago
Neat, I wonder if this idea could be developed into a more refined programming experience where you spend time in the imperative world by default and only express some iterations as logic puzzles to be solved like this.
manasij7479 commented on Logic programming is overrated, at least for logic puzzles (2013)   programming-puzzler.blogs... · Posted by u/alex_stoddard
jerf · 2 years ago
For me, I think the problem is that normal, boring, stupid, unsophisticated, plebeian imperative programming lives in a world of O(1) operations. That is to say, a "normal" line of code you type will be O(1), and then you generally start gluing those together with various things that stack on O(n) complexities. As things like the accidentally quadratic blog [1] show, in the normal programming world it is a bit humorous to even accidentally stack two of those things unnecessarily on top of each other.

Certainly, people manage to make poorly performing things even so, but at least at the base level, your primitives may be stupid, yet they are generally fast.

The logic programming world works in a default space of O(n) operations, which stack together more freely than in the imperative world, and which give easy access to O(2^n). Since exponential blowup is essentially unusable in practice, a great deal of work goes into trying to get that complexity back down, but you're always intrinsically starting behind the eight ball. It is easier to work up from O(1) operations than to write an exponential or super-exponential algorithm and then try to trim it back down to a complexity a normal programmer can live with.
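As a toy illustration of that default (my sketch, not jerf's): computing an answer directly with O(1) steps versus recovering the same answer by naive generate-and-test over all 2^n candidate assignments.

  # Toy contrast (illustrative only): the imperative version reads the
  # bits of `target` off in one O(1) step per bit; the naive declarative
  # reading ("find bits satisfying the spec") enumerates up to 2^n
  # candidate assignments before it stumbles on the same answer.
  from itertools import product

  n = 16
  target = 0b1011001101001100

  # Imperative: one O(1) step per bit.
  direct = [(target >> i) & 1 for i in range(n)]

  # Generate-and-test: try assignments until one satisfies the constraint.
  def satisfies(bits):
      return sum(b << i for i, b in enumerate(bits)) == target

  found = next(bits for bits in product((0, 1), repeat=n) if satisfies(bits))
  assert list(found) == direct   # same answer, exponentially more work
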

I think this is the root cause as to why logic programming is generally not something we see a lot of. It's like a wild stallion; it may be powerful but the amount of effort you pour into just keeping it under control may exceed any benefit you could get.

It isn't useless, of course. Amazing work has been done in the field of SAT solvers, and there's certainly a niche for it. The problems that are intrinsically higher-order polynomial or (technically) exponential, well, they are what they are, and if you're going to be stuck in that world, logic programming may offer you a much better toolset than conventional programming on its own. But there was a hope a long time ago, in the Prolog era, that it could become part of the normal toolkit of general-purpose programming, and I don't think that will ever happen, because of this line of logic.

This is a bit tangential to the article, it's just what I happened to read that finally crystallized this in my mind.

[1]: https://accidentallyquadratic.tumblr.com/

manasij7479 · 2 years ago
The issue you have circled around here is deeper and more nuanced than trimming exponential code down to something more tractable. In fact, it's often worse, and O(2^n) can be a blessing!

The problem with logic programming is the 'logic' part.

Modern imperative and functional programming languages are constructive.

Logic programming is not, and the expressive power varies with the exact logic being used.

Elementary logic gives you the following intuition: propositional logic is easy; first-order logic is easy (with caveats); and (some kinds of) second-order logic is only easy if you get lucky with the problem you are trying to solve.

For logic-based programming languages and systems, both the language implementer and the programmer have to be careful to support and use only language constructs that boil down to tractable computation.

This is much more difficult than it seems.

For example, when using SMT solvers you quickly learn that multiplication and division by constants are very fast, while the same operations on variables can be intractable. That is, reasoning about x * C is easy, while x * y is often (but not always) going to hang.
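A minimal sketch of that gap using Z3's Python bindings (Z3 is my choice here; any bit-vector SMT solver shows the same behavior):

  # x * C vs x * y in an SMT solver, via z3py (pip install z3-solver).
  from z3 import BitVec, Solver, ULT

  x, y = BitVec('x', 64), BitVec('y', 64)

  # Multiplication by a constant bit-blasts to shifts and adds: fast.
  s = Solver()
  s.add(x * 7 == 91)
  print(s.check())                     # sat, essentially instantly

  # Multiplication of two variables is a full 64-bit multiplier circuit.
  # Constraining the factors to 32 bits turns this into SAT-based integer
  # factoring, which is notoriously slow; expect `unknown` at the timeout.
  s = Solver()
  s.set(timeout=2000)                  # milliseconds
  s.add(x * y == 0xC7F4D3B10D659DF3,   # an arbitrary odd 64-bit constant
        ULT(1, x), ULT(x, 2**32), ULT(y, 2**32))
  print(s.check())
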

manasij7479 commented on My first superoptimizer   austinhenley.com/blog/sup... · Posted by u/azhenley
harerazer · 2 years ago
This is essentially calculating the Kolmogorov complexity of the final state of memory with respect to the constrained assembly (and giving the associated program). Since any program in the constrained assembly always halts, this is possible via brute force.

It also doesn't seem particularly interesting because it doesn't allow the programs to take input. Obviously that makes things much more difficult with respect to proving program equivalence.

manasij7479 · 2 years ago
Inputs and other side effects are not too tricky to handle.

In most cases, you just slice the program to isolate the pure computation and optimize that.

Most traditional compiler optimizations stick to that as well; the exceptions to this rule are carefully engineered.
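A rough sketch of what that slicing looks like (hypothetical Python standing in for compiler IR; the function names are mine):

  # Hypothetical before/after: the I/O side effects are left in place,
  # and only the pure arithmetic slice in the middle is rewritten.
  def before(n):
      print("starting")      # side effect: not touched
      r = n * 8              # pure slice: candidate for optimization
      print("result:", r)    # side effect: not touched
      return r

  def after(n):
      print("starting")
      r = n << 3             # optimized slice: strength-reduced multiply
      print("result:", r)
      return r
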

manasij7479 commented on My first superoptimizer   austinhenley.com/blog/sup... · Posted by u/azhenley
manasij7479 · 2 years ago
The pruning part can do a lot of heavy lifting to make it a practical tool.
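For a feel of what pruning buys, here is a toy sketch (my construction over a made-up three-op ISA; Souper's real pipeline works on LLVM IR and verifies candidates with an SMT solver):

  # Enumerate-and-prune superoptimization over a hypothetical 8-bit ISA.
  # Two program prefixes that behave identically on the test inputs are
  # interchangeable, so only one representative is ever extended.
  TESTS = [0, 1, 7, 42, 255]
  OPS = {
      "inc": lambda a: (a + 1) & 0xFF,
      "dbl": lambda a: (a * 2) & 0xFF,
      "neg": lambda a: (-a) & 0xFF,
  }

  def behavior(prog):
      """Outputs of `prog` (a tuple of op names) on every test input."""
      outs = []
      for a in TESTS:
          for op in prog:
              a = OPS[op](a)
          outs.append(a)
      return tuple(outs)

  def superoptimize(target, max_len=4):
      goal = behavior(target)
      frontier = {behavior(()): ()}    # behavior -> shortest known program
      for _ in range(max_len):
          fresh = {}
          for prog in frontier.values():
              for op in OPS:
                  cand = prog + (op,)
                  b = behavior(cand)
                  if b == goal:
                      # A real tool would verify equivalence with a solver
                      # here instead of trusting a finite test vector.
                      return cand
                  if b not in frontier and b not in fresh:
                      fresh[b] = cand  # prune behavioral duplicates
          frontier.update(fresh)
      return None

  # inc;neg;inc;neg computes the identity; the search finds neg;neg.
  print(superoptimize(("inc", "neg", "inc", "neg")))   # -> ('neg', 'neg')
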

Related: I work on Souper (https://github.com/google/souper).

Feel free to reach out if anyone has questions!

u/manasij7479

Karma: 22 · Cake day: December 4, 2018
About
CS PhD student at the University of Utah, working on LLVM, formal methods, superoptimizers, and compiler development tools.

manasij7479@gmail.com
