Can logic programming be liberated from predicates and backtracking? [pdf]

197 points, by matt_d, 8 months ago

13 comments

woolion, 8 months ago

This is a reference to "Can programming be liberated from the von Neumann style?" from 1977. That paper argues for functional programming, making the point that the imperative style is more common for efficiency reasons, since its programming model is close to the computer architecture. It aims to provide a general framework for thought, inviting us to take a step back from notions that have been (hastily?) accepted in the programming world.

This paper draws the analogous point that Prolog (and logic programming languages in general) has been strongly influenced by the resolution algorithm. In practice that means that if you write a non-trivial program and performance is not right, you'll need to understand the execution model and adapt to it, mainly with the pruning operator (!). So while the promise is to "declare" values and not think about the implementation details, you're railroaded into thinking in a very specific way.

I personally found it frustrating that good solutions were essentially unworkable because of this, in comparison with either imperative or functional paradigms, which are significantly more flexible. As a result, Prolog-style programming feels limited to the small problems for which it is readily a good fit, to be integrated into a larger program written in a general-purpose language. I may be wrong on this, but of the 50 people who learned Prolog around the same time as me, none kept up with it. Meanwhile, other niche languages like OCaml, Haskell and Scheme have had good success.

Rethinking the language's foundations could remove these barriers and give the language a broader user base.
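A minimal sketch of the kind of adaptation being described (illustrative only, not taken from the comment): the same relation written twice, once purely declaratively and once with a cut added to fit the execution model.

    % Purely declarative reading: two clauses, no control. After the
    % first answer, Prolog still keeps a choice point for the second
    % clause even though it can never succeed.
    min_of(X, Y, X) :- X =< Y.
    min_of(X, Y, Y) :- X > Y.

    % Adapted to the execution model: the cut (!) commits to the first
    % clause once X =< Y holds, pruning the redundant choice point. The
    % price is that this version is only correct when the third argument
    % is an unbound variable, so the programmer is now reasoning about
    % modes and search order, not just about the relation.
    min_of_cut(X, Y, X) :- X =< Y, !.
    min_of_cut(_, Y, Y).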
xelxebar, 8 months ago

Man, lately, I feel like this stuff has been following me around. I'd really like to deep-dive into logic programming and related paradigms. Just recently came across Answer Set Programming [0] (via Potassco's clingo [1]), and it has made me realize just how ignorant I am of the design space being explored here.

More personally, I recently spent enough time with first Scheme and then APL that the paradigms clicked for me, and that dramatically changed my entire outlook on my work. For whatever reason, I feel like breaking down my ingrained technical paradigms has allowed me to integrate and strengthen my soft skills.

Plus, mind-expanding experiences are just plain fun. Looking for more of that juice!

[0]: https://en.wikipedia.org/wiki/Answer_set_programming

[1]: https://potassco.org/
ristos, 8 months ago

There is already a pretty major effort in the Prolog community to build everything as much as possible around pure, monotonic Prolog, and to provide a means of supporting multiple search strategies depending on the best fit for the problem. CLP libraries are also pretty common and are the go-to for representing algebraic expressions relationally and declaratively.

I wouldn't say that the logic or relational way of describing effects is a bad thing either. Because the design allows for multiple return values (foo/1, foo/2, ...), you can build higher-level predicates that return multiple resources, which is pretty common in many programs. It makes concatenative (compositional) style programming really straightforward, especially for more complex interweaving, which also ends up being quite common. Many Prolog implementations also support shift/reset, so you can easily build things like conditions and restarts, algebraic effects, and/or debugging facilities on top. Prolog is also homoiconic in a unique way compared to Lisp, and it's quite nice because the pattern matching is so powerful. Prolog really is one of the best languages I ever learned; I wish it were more popular. I think Prolog implementations need better C FFI interop and a nicer library ecosystem. Trealla has a good C FFI.

I think logic programming is the future, and a lot of these problems with Prolog are fixable. If it's not going to be Prolog, it'll probably be something like Kanren and Datalog within a Lisp like Scheme or Clojure(Script).

This is a great resource for getting a good feel for Prolog: https://www.youtube.com/@ThePowerOfProlog/videos
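As a small illustration of the CLP point (a sketch assuming an SWI-Prolog/SICStus-style library(clpfd), not taken from the comment): arithmetic written as constraints stays relational, so a single definition answers queries in several directions.

    :- use_module(library(clpfd)).

    % pythagorean/3 relates the sides of a right triangle using
    % constraints rather than one-way arithmetic, so the same clause
    % can check, complete, or enumerate solutions.
    pythagorean(A, B, C) :-
        [A, B, C] ins 1..100,
        A #=< B,
        A*A + B*B #= C*C.

    % ?- pythagorean(3, 4, C), label([C]).      % C = 5
    % ?- pythagorean(A, B, 13), label([A, B]).  % A = 5, B = 12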
rebanevapustus, 8 months ago

Datalog does not use backtracking, and is becoming increasingly popular.

See:

- The fastest non-incremental embedded Datalog engine: https://github.com/s-arash/ascent
- The state-of-the-art non-embedded, non-incremental Datalog engine: https://github.com/knowsys/nemo
- A Python library that contains an embedded incremental Datalog engine: https://github.com/brurucy/pydbsp
- A Rust library that provides an embedded incremental Datalog engine over property graphs: https://github.com/brurucy/materialized-view
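For a feel of the style, a minimal Datalog program (my sketch; the rule syntax is shared with Prolog, but a Datalog engine evaluates it bottom-up to a fixed point rather than by backtracking search):

    % Facts: directed edges of a graph.
    edge(a, b).
    edge(b, c).
    edge(c, d).

    % reachable/2 is the transitive closure of edge/2. A Datalog engine
    % derives the complete set of reachable/2 facts by forward chaining
    % until no new facts can be added.
    reachable(X, Y) :- edge(X, Y).
    reachable(X, Z) :- edge(X, Y), reachable(Y, Z).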
tempodox, 8 months ago

The Curry language (https://www.curry-language.org) does look interesting. Does anybody have practical experience with it?
jkbyc, 8 months ago

There's an insightful critique of the paper on Reddit: https://www.reddit.com/r/ProgrammingLanguages/comments/1g1su6c/can_logic_programming_be_liberated_from/ I agree that it's weird the paper doesn't mention constraint logic programming, but it's perhaps pointing at it implicitly by saying "Replacing backtracking by complete search strategies".
xelxebar, 8 months ago

Abstract. Logic programming has a long history. The representative of logic programming in practice, the language Prolog, has been introduced more than 50 years ago. The main features of Prolog are still present today: a Prolog program is a set of predicate definitions executed by resolution steps with a backtracking search strategy. The use of backtracking was justified by efficiency reasons when Prolog was invented. However, its incompleteness destroys the elegant connection of logic programming and the underlying Horn clause logic and causes difficulties to teach logic programming. Moreover, the restriction to predicates hinders an adequate modeling of real world problems, which are often functions from input to output data, and leads to unnecessarily inefficient executions. In this paper we show a way to overcome these problems. By transforming predicates and goals into functions and nested expressions, one can evaluate them with a demand-driven strategy which might reduce the number of computation steps and avoid infinite search spaces. Replacing backtracking by complete search strategies with new implementation techniques closes the gap between the theory and practice of logic programming. In this way, we can keep the ideas of logic programming in future programming systems.
2-3-7-43-1807, 8 months ago

Is this about the problem that Prolog tends to ping-pong infinitely between leaves? I wrote a fully declarative solitaire game solver and remember this being a big issue, forcing me to memorize past states of the backtracking and basically exclude them with another predicate. This is obviously slow. I thought, why not at least have a plugin to avoid trivial cases where backtracking gets stuck switching between A and B? Or a plugin for stochastic traversal of the solution tree.
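A sketch of the workaround described (my own illustration; goal_state/1 and move/3 are hypothetical, game-specific predicates): thread the list of visited states through the search and reject any state that has already been seen.

    % solve(+State, +Visited, -Moves)
    % Depth-first search that carries the states visited so far, so the
    % search cannot ping-pong between two positions indefinitely.
    solve(State, _Visited, []) :-
        goal_state(State).
    solve(State, Visited, [Move|Moves]) :-
        move(State, Move, Next),
        \+ member(Next, Visited),        % reject previously seen states
        solve(Next, [Next|Visited], Moves).

    % member/2 over a growing list makes each step more expensive,
    % which is the slowdown mentioned above.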
amboo7, 8 months ago

https://icfp24.sigplan.org/home/minikanren-2024#event-overview is related. There has been work on turning slow bidirectional code into faster functional code (in 2023, reachable from this URL).
PaulHoule, 8 months ago

Reminded me of the "rules and schemes" concept I was thinking about some 10 years ago, which would separate "pure logical" rules from the stuff it takes to turn them into a program that runs by either forward chaining (production rules) or backward chaining (like Prolog). I got as far as writing a macro compiler that would let you write a base set of rules and rewrite them for different purposes, such as transforming document A into document B or going in the opposite direction.

I like how they let you write functions to control the search order, because boy, that is essential.
carapace, 8 months ago

I'm not an expert, and I've only just skimmed the paper, but it seems to me that this is a kind of partial evaluation for Prolog, eh? (Perhaps "partial resolution" is more technically correct?)

I think some Prolog systems do something like this already as an optimization, but I could be totally off on that.
YeGoblynQueenne, 8 months ago

[Note I'm sick and tired; literally. I may not be firing on all cylinders in the following.]

The article is fudging things with its use of "backtracking" as a stand-in for backtracking Depth First Search (DFS); the latter is the search strategy that is "fixed" in Prolog. And it's not really fixed. Prolog programs can also be executed by "tabling", a.k.a. SLG-Resolution, which basically replaces DFS with Breadth-First Search modulo memoization. Tabling avoids an important source of "incompleteness" in Prolog, that of non-terminating left-recursions.

To clarify, that is what makes Prolog "incomplete": executing Prolog programs by DFS makes Prolog loop infinitely when encountering some left-recursions. The article gives the example of a last/2 predicate:

    last([_H|T],E) :- last(T,E).
    last([E],E).

This does indeed loop. But this one doesn't:

    last([E],E).
    last([_H|T],E) :- last(T,E).

    ?- last(Ls,3).
    Ls = [3] ;
    Ls = [_,3] ;
    Ls = [_,_,3] ;
    Ls = [_,_,_,3] ;
    Ls = [_,_,_,_,3] ;
    Ls = [_,_,_,_,_,3] .

And that's what the article is pointing out with the allusion to "a higher, declarative programming style which frees the programmer from thinking about low-level control details". With such a "higher declarative programming style" the programmer does not have to think about program structure or execution strategy and can write whatever, however, and still get results!

The problem with that argument, which is as old as Prolog and possibly even older than that, is that it's an argument from taste, or, indeed, "style", to use the article's terminology. More specifically, for every non-terminating Prolog program that can be written, we can most likely write a Prolog program that does terminate, and that is (success-set) equivalent to the non-terminating program, by keeping in mind the structure of the search space for proofs traversed by ordinary Prolog with DFS, or tabling. And possibly with a few cuts here and there (oh, the humanity!). The article is simply arguing that it is "more declarative" not to have to do that. But why is "more declarative" better? It's style all the way down.

As to the second axis of the argument, the horror of predicates that do not neatly map to the many real-world problems that are better represented as functions: asking whether we can liberate logic programming from predicates is like asking whether we can liberate the First Order Predicate Calculus from predicates, or, equivalently, make ice cream without the ice or the cream. Sure we can. The question is again: why? I don't see a clear justification for that. In particular, setting implementation details aside (as the article does on this point), SLD-Resolution is already sound and refutation-complete, or complete with subsumption. That is the best that can ever be achieved, and the article doesn't seem to claim anything else (or the ghosts of Alonzo Church and Alan Turing would be very, very sad). So this, too, seems to be a matter of taste: functions are more stylish than predicates. M'kay.

In fact, if I may blaspheme a little, this is what Church did with his Lambda calculus: he turned everything into typed functions to extend First Order Logic (FOL) into a higher order. And in the process completely destroyed the simple elegance of FOL. Why?
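To make the tabling point concrete, a minimal sketch (my own illustration, assuming a Prolog with tabling such as SWI-Prolog or XSB):

    :- table path/2.

    edge(a, b).
    edge(b, a).        % a cycle that would not terminate under plain DFS
    edge(b, c).

    % Left-recursive definition. With SLG resolution, answers to path/2
    % are memoized, so both the left recursion and the cycle terminate.
    path(X, Z) :- path(X, Y), edge(Y, Z).
    path(X, Z) :- edge(X, Z).

    % ?- path(a, Where).
    % Where = b ; Where = a ; Where = c.   (in some order)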
HeralFacker, 8 months ago

Yes, it's called functional programming and it's been around for a while.