        _______               __                   _______
       |   |   |.---.-..----.|  |--..-----..----. |    |  |.-----..--.--.--..-----.
       |       ||  _  ||  __||    < |  -__||   _| |       ||  -__||  |  |  ||__ --|
       |___|___||___._||____||__|__||_____||__|   |__|____||_____||________||_____|
                                                             on Gopher (inofficial)
   URI   Annotated implementation of microKanren: an embeddable logic language
        giraffe_lady wrote 2 days ago:
        Love all the racket I'm seeing around here recently. Most underrated
        lisp imo.
        nerdponx wrote 2 days ago:
        Let's assume I'm a hipster and I wanted an excuse to embed a logic
        programming DSL in my application. What is a valid excuse / usecase?
        Does anyone here have any real-world success stories from something
        like this?
        Logic programming always appealed to me, but I always felt like I never
        understood what to actually do with it other than write toy programs
        about parent-child relationships (I tried learning Prolog a few times,
        but stopped for this reason).
          lukashrb wrote 2 days ago:
          If I recall correctly: cisco threat grid is using core.logic
          rad_gruchalski wrote 2 days ago:
          RBAC. Google Zanzibar comes to mind.
          protomyth wrote 2 days ago:
          For normal enterprise programming, they are fairly good for writing
          rules that normal business folks can eventually figure out.  Every
          business has jargon to describe their business.
          sterlind wrote 2 days ago:
          I attempted to use Microkanren for type deduction in a transpiler I
          had to build at work. I wasn't actually trying to be a hipster, I
          just had to compile a dynamic language into the strictest possible
          static types. I'd reinvented unification, remembered Kanren which is
          built on that, so I reached for it.
          It didn't work out in my case. I found it really fiddly to work with,
          and the implementation I was loosely working from was very idiomatic
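           A minimal sketch of the unification idea mentioned above, in
           Python (not sterlind's code; the "?"-prefixed variable encoding
           and names are my own, for illustration only):

```python
# Unification over nested tuples, with variables as "?"-prefixed strings
# and substitutions as plain dicts. A toy sketch of the core idea.

def is_var(t):
    return isinstance(t, str) and t.startswith("?")

def walk(t, subst):
    """Chase variable bindings until a non-variable or unbound variable."""
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def unify(a, b, subst):
    """Return a substitution unifying a and b, or None on failure."""
    a, b = walk(a, subst), walk(b, subst)
    if a == b:
        return subst
    if is_var(a):
        return {**subst, a: b}
    if is_var(b):
        return {**subst, b: a}
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for x, y in zip(a, b):
            subst = unify(x, y, subst)
            if subst is None:
                return None
        return subst
    return None
```

           For example, `unify(("succ", "?n"), ("succ", "zero"), {})`
           yields `{"?n": "zero"}`.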
          YeGoblynQueenne wrote 2 days ago:
          Here's some stuff I've written in Prolog, some for my own enjoyment,
          one for my degree project.
          Most of the benefits I found come down to two things:
          a) Prolog, like the various kanrens, is a relational language so a
          program is effectively a database. There's no need to do anything
          special to glue together a data layer and a logic layer, because you
           have both written in Prolog. No object-relational impedance mismatch.
          b) Prolog's declarative style makes translating rules and directives
          into code a breeze. The three projects below are all games and
          benefit heavily from this feature. The same would go for business
          logic of any shape or form.
          1. Warhammer 40K simulation: [1] Runs simulations of combat between
          WH40k units.
          2. Gleemin, a Magic: the Gathering expert system: [2] Doesn't work
          anymore! Because backwards compatibility. Includes a) a parser for
          the rules text on M:tG cards written in Prolog's Definite Clause
          Grammars notation, b) a rules engine and c) a (primitive) AI player.
          The parser translates rules text from cards into rules engine calls.
          The cards themselves are Prolog predicates. Your data and your
          program are one and now you can also do stuff with them.
          3. Nests & Insects, a roguelike TTRPG: [3] WIP! Here I use Prolog to
          keep the data about my tabletop rpg organised, and also to
          automatically fill-in the character sheets typeset in the rulebook.
          The Prolog code runs a character creation process and generates
          completed character sheets. I plan to do the same for enemies' stat
          blocks, various procedural generation tables, etc. I also use Prolog
          to typeset the ASCII-styled rulebook, but that's actually not a good
          application of Prolog (too imperative).
          You asked about "logic programming" in general and not miniKanren in
          particular. I haven't actually used miniKanren, so I commented about
          the logic programming language I've used the most, Prolog. I hope
          that's not a thread hijack!
          All three of the projects above are basically games. I have more
          "serious" stuff on my github but I am currently experiencing a
          certain shortfall of gravitas.
   URI    [1]: https://github.com/stassa/wh40ksim
   URI    [2]: https://github.com/stassa/Gleemin
   URI    [3]: https://github.com/stassa/nests-and-insects
            nerdponx wrote 2 days ago:
            Thanks for the examples! I will have to review these.
          evnc wrote 2 days ago:
          I'm casually exploring using logic programming to define rules for
          games, as an embedded scripting language in a game engine, which
          would handle input and rendering etc. and be written in a more
          "conventional" language.
          We'll see if performance is an issue, but the space I'm most
           interested in is turn-based strategy / board games, where it doesn't
           seem to be a big deal if things aren't as fast as possible.
          It feels like a pretty natural fit, e.g.:
             can_move_to(UnitId, Tile) :-
               unit(UnitId), tile(Tile),
               reachable_to(UnitId, Tile, MoveCost),
               unit_move_points_remaining(UnitId, Mv),
               MoveCost =< Mv.
          You can define a lot of rules from basic unit movement to "what
          happens when you activate Super Special Rule-Bending Ability" this
          way, and it's pretty easy to change the definitions if you want to
          experiment with different game rules / logic. In practice, you want
          various syntactic sugar.
          Haven't got a chance to take this very far yet, just casual
          experimentation, but it seems like there's potential.
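           For contrast, the same movement rule hand-rolled in Python over
           invented example data (a logic engine derives this check from
           the declarative rule instead of having it spelled out):

```python
# Hand-rolled version of the can_move_to rule, with made-up data.
units = {"u1": {"move_points": 3}}
tiles = {"t1", "t2"}
# reachable: (unit, tile) -> movement cost, as precomputed example data
reachable = {("u1", "t1"): 2, ("u1", "t2"): 5}

def can_move_to(unit_id, tile):
    # Mirrors the rule body: unit exists, tile exists, tile is
    # reachable, and the move cost fits the remaining move points.
    return (
        unit_id in units
        and tile in tiles
        and (unit_id, tile) in reachable
        and reachable[(unit_id, tile)] <= units[unit_id]["move_points"]
    )
```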
            convolvatron wrote 2 days ago:
            top-down datalog can almost certainly be compiled to something
            quite reasonable.
            you also might enjoy looking at incremental datalog
             (differentiating rules and operating on events). this generally
             does a really good job of updating just what needs to change in
             the global state.
            this is a great application
            YeGoblynQueenne wrote 2 days ago:
            Oh yeah, encoding game rules is a killer app for Prolog's
            declarative style. See my sibling comment- I've done this with M:tG
            and WH40k. With M:tG in particular, translating from the rulebook
            to Prolog, to code a rules engine, is a breeze, just because the
            M:tG rules are in such strict, formal language already.
             The jump from a well-written, unambiguous and precise set of board
             game rules to its translation into Prolog is a tiny step, one that
             could even be automated with some elbow grease.
            Now, you mention "rendering" and that, on the other hand, I'm not
            sure is a very good application for Prolog. Nothing to do with
            speed, but arithmetic in Prolog is a bit meh, and if you want to do
            stuff like matrix arithmetic, you have to roll your own. Most
            Prologs don't even have arrays, as such (they kind of do but it's a
            bit of a hack).
          lukev wrote 2 days ago:
          One thing I'd love to see logic programming used for is in an
          infrastructure-as-code platform to provision cloud resources.
          I don't want to have to define security groups, subnets, route
          tables, etc. Instead, I want to declaratively say "Service A must
          communicate with Service B", and based on the properties of those
          services, all the correct cloud resources will be defined. Subject,
          of course, to constraints around security, etc.
          We are tantalizingly close to this with current IaC tools (CDK,
          Pulumi, etc) but at the end of the day you still need to define each
          granular cloud resource (even if the code to do so can be tucked away
          in a library abstraction.)
          I'm busy with my current gig now, but if you forced me to launch a
          startup at gunpoint, this would definitely be it.
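           The shape of that idea fits in a few lines of Python, with
           invented service names and ports (a sketch of intent-to-resource
           derivation, not any real IaC tool's API):

```python
# Sketch: expand declarative "A must communicate with B" intents into
# granular allow-rules. Services and ports are invented for illustration.

services = {
    "web": {"port": 443},
    "api": {"port": 8080},
    "db":  {"port": 5432},
}
intents = [("web", "api"), ("api", "db")]  # "A must communicate with B"

def derive_rules(intents, services):
    """Expand each intent into an allow-rule targeting the callee's port."""
    return [
        {"from": src, "to": dst, "port": services[dst]["port"],
         "action": "allow"}
        for src, dst in intents
    ]
```

           A real version would add the constraint-checking layer (security
           policy, routing) on top of the same derivation step.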
          rscho wrote 2 days ago:
          You want pure I/O in an untyped language. DCGs are the easiest way to
          have that (although they also have their limits).
            nerdponx wrote 2 days ago:
            What do you mean by "pure I/O" in this case? And how does logic
            programming help with that?
            I've seen things like this in the context of "algebraic effects",
            where instead of performing I/O (and instead of using monadic I/O)
            you build up what amounts to an execution graph of effectful
            operations, which is then "interpreted" at runtime. Is that what
            you're describing?
              rscho wrote 2 days ago:
              No, what I'm describing is the "monadic" behaviour of
              `phrase_from_file/2` and similar predicates. In conjunction with
              definite clause grammars, this means that your I/O operations
              either conform to the provided spec or fail. This is a "hot
              topic" (if LP topics can really still be "hot") and sees active
              development in new prolog systems.
               Check this out for prolog: [1] [2] [3] "New" stuff in the
               MiniKanren world: [4] [5]
   URI        [1]: https://youtu.be/Dqpxy4W7fAo
   URI        [2]: https://www.swi-prolog.org/pldoc/man?section=pio
   URI        [3]: https://github.com/mthom/scryer-prolog/blob/084fc845902f...
   URI        [4]: https://staff.fnwi.uva.nl/c.u.grelck/nl-fp-talks/kourzan...
   URI        [5]: https://ifl2014.github.io/submissions/ifl2014_submission...
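               The observable contract ("conform to the spec, or fail as a
               whole") can be mimicked in Python; this is only the shape of
               the idea, not how phrase_from_file/2 actually works:

```python
import re

# Toy "parse or fail" reader: a line either matches the spec (comma-
# separated integers) and yields data, or the whole parse fails with
# None - no partially-consumed input leaks into the program.

def parse_int_line(text):
    fields = [f.strip() for f in text.strip().split(",")]
    if not all(re.fullmatch(r"-?\d+", f) for f in fields):
        return None  # the entire read fails; nothing gets past the gateway
    return [int(f) for f in fields]
```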
                YeGoblynQueenne wrote 2 days ago:
                (Not the OP) Thanks for the links, because 14 years after I
                first started coding in Prolog I'm still not sure what Pure I/O
                is or why it is.
                I mean, my intuition is that a file is a stream of bytes and
                wading through that stream is not going to be "pure" no matter
                what, just because a byte is not a concept in First Order
                Logic. But, OK, "pure" Prolog is not the same as FOL anyway,
                and so I may be convinced otherwise when I have the chance to
                peruse your links. So thanks again and sorry for any negative
                vibes emanating from my comment.
                  rscho wrote 2 days ago:
                  "pure" prolog I/O is every bit as pure as Haskell pure I/O,
                  which is ... not very much and in a hand-wavy sort of way. My
                  understanding is that it basically means "either it
                  corresponds to the spec, or it fails". At first, it seems
                  trivial and a strange point of pride to have. Until you
                  realize that most languages actually don't have anything
                  built-in to do that, and must resort to some sort of parser.
                  At which point, the data is already past your I/O gateway and
                  may lead to unpredictable effects in your program. This is
                  not an academic opinion, just my 2c.
                    YeGoblynQueenne wrote 2 days ago:
                    Thanks for taking the time to reply! I appreciate your 2c.
                    I guess my confusion about pure I/O stems from the
                    documentation of phrase_from_file/2 in SWI-Prolog. The
                    documentation says the predicate is for pure I/O and how it
                    was implemented, and by whom, but it doesn't say what pure
                    I/O is, and why it is needed. I guess the information is
                    somewhere in one (or more) of the canonical Prolog
                    textbooks, which I've only read several years ago (though
                    multiple times over, each).
                    I have to confess also I'm not much of a purist. My Prolog
                    code looks like bog standard imperative code where I feel
                    it doesn't really matter (e.g. I don't have any prejudice
                    about setof/3 or maplist/2 over findall/3), and I guess
                    file I/O is one of those cases.
                    Anyway, thanks again, for your links and explanation. I owe
                    you a quantum of knowledge :)
          qsort wrote 2 days ago:
           You have to find a valid assignment of resources according to some
           business rules that are unstable/partially unknown/very likely to
           change.
           You have to solve some kind of 3-SAT-reducible problem that would be
          difficult to implement otherwise.
          You have to do some heavy pattern-matching.
          Closely related:
   URI    [1]: https://en.wikipedia.org/wiki/Answer_set_programming
          giraffe_lady wrote 2 days ago:
          Yeah I've had the same questions for a while. What specific concrete
          problem can you solve with this, that does not have as-good solutions
          available in any language powerful enough to easily implement it?
          I am kind of half-considering it for mud scripting? Each line from
          the mud represents a fact or relation so this could be used to query
          for specific information as needed.
           For this case, though, it doesn't really seem better than writing a
           regex-based DSL, something I'm far more familiar with and that is
           better supported.
            rscho wrote 2 days ago:
            Constraint programming. Everyone will tell you that C++ and Java
            solvers are so much faster. That's true, but prolog for example is
            incredibly easy to use in comparison, and in fact not that slow. So
            if you're prototyping or aiming for something exotic that doesn't
            easily fit into existing libs, LP will be miles ahead.
            Generally speaking, you'll see the same pattern for any problem
            involving searching a space of potential solutions. In a word, the
             strength of LP is flexibility when facing the unknown (future
             feature requirement).
            EDIT: you mention parsing. If you're planning anything
            context-sensitive, you have to check out eDCGs. You probably won't
            use them, but they'll give you an idea of what's possible.
              mncharity wrote 2 days ago:
              > You probably won't use them,
              Curious - I fuzzily recall usually ending up manually threading
              state (eg, doing a Perl-Compatible Regular Expression engine
              (apropos "why LP?" - it was just a page of code, with good
              performance)), but I don't recall what drove that decision...
              mncharity wrote 2 days ago:
               > you have to check out eDCGs.
   URI        [1]: https://github.com/kamahen/edcg
   URI        [2]: https://occasionallycogent.com/prolog_edcgs/index.html
              giraffe_lady wrote 2 days ago:
              Ahhhh this is extremely helpful, thank you. I've been puzzling at
              the mud parsing thing because while it's in the shape of natural
              language (and somewhat annotated in it eg there is spurious
              "decorative" information), it does represent a 1:1 mapping with
              computation already.
              This feels like a much better path than trying to get it into an
              AST with like yacc or whatever. But since it isn't actually
              natural language NLP didn't seem right either. I've got a lot of
              reading to do but this is very interesting.
                YeGoblynQueenne wrote 2 days ago:
                I started a comment about DCGs also before seeing rscho's and
                mncharity's comments above which covered it. If you're going to
                do any parsing, I don't think there's any better tool than
                DCGs. It blows my mind that everyone is still trying to kludge
                their way around regexes instead, when there is such a better
                tool around.
                Your mud language is probably a "Controlled Natural Language",
                btw: [1] Game languages often are.
   URI          [1]: https://en.wikipedia.org/wiki/Controlled_natural_langu...
            ModernMech wrote 2 days ago:
            Logic languages are great for expressing database queries, so you
            can treat your program like a database. And if your program is a
            database you can do database-y things on it like using transactions
            to "rewind" the program state. Or in the context of distributed
            programming, well... distributed databases are a thing, and we know
            how to handle updates to databases in that context, so writing your
            distributed logic program is as easy as forming a series of
            database queries, and you can let the DBMS do the magic of
            distributing it.
            I guess I think of logic languages as SQL but I actually want to
            write my whole program in it.
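             A toy Python rendering of the "program is a database" view:
             facts as relations, a rule as a conjunctive query (the data
             here is invented for illustration):

```python
# Facts as a relation (a set of tuples); a rule as a self-join query.
parent = {("alice", "bob"), ("bob", "carol")}

def grandparent():
    """grandparent(G, C) :- parent(G, P), parent(P, C) - as a self-join."""
    return {(g, c) for (g, p1) in parent for (p2, c) in parent if p1 == p2}
```

             Adding a tuple to `parent` immediately changes what the query
             returns - the data layer and the logic layer are the same thing.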
        Jtsummers wrote 3 days ago:
        Along with μKanren, miniKanren is introduced in the book The Reasoned
        Schemer. The language is pretty straightforward to implement in other
        languages (or at least other lisps).
   URI  [1]: http://minikanren.org/
        ashton314 wrote 3 days ago:
        Author here: very happy to see some interest in microKanren. It's such
        a lovely little gem. Has anyone used microKanren as part of a
        production program?
          rscho wrote 2 days ago:
          Not really production, but probably THE most impressive biomedicine
          research work I've seen (and I'm an academic MD): [1] This is a FOL
          theorem prover that uses medical research articles as terms. They use
          it to do genetics and drug repurposing metaresearch. It's like the
          wet dream of all the biomed machine learning fanboys out there,
          except that:
          1. it's not machine learning
          2. it really works
   URI    [1]: https://github.com/webyrd/mediKanren
            YeGoblynQueenne wrote 2 days ago:
            Note you can do machine learning of logic programs. My PhD
            research: [1] In which case it _is_ machine learning and it still
            really works :D
   URI      [1]: https://github.com/stassa/louise
              rscho wrote 2 days ago:
              Yeah, I wanted to try your system on the same database as they're
              using for medikanren. I got stuck bc loading the whole thing into
              the prolog fact database was too huge. Perhaps I should retry by
              querying the SQL database from prolog.
                YeGoblynQueenne wrote 2 days ago:
                Yes, I've seen this before. I'd suggest partitioning the
                database into parts that fit comfortably in memory and
                 processing each separately, then simply combining the learned
                programs into one.
                You can do this with Prolog because you can keep adding clauses
                to a program, or removing them from it, for as long as you
                want, and the program won't (necessarily) break, it will just
                become more specialised, or generalised (respectively). John
                McCarthy called that "elaboration tolerance".
                To, er, elaborate a bit on this, a logic program is a set of
                clauses interpreted as a conjunction (even if they look like
                alternatives!) so that the program as a whole has a set of
                "facts" it entails, but without any other assumptions about the
                relation between the clauses with each other. If you add
                clauses, your program entails more facts. If you remove
                clauses, it entails fewer facts. With negation-as-failure, the
                relation is non-monotonic, but it's still the same idea.
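                 That monotonicity is easy to see with a toy forward-chainer
                 (my own encoding, ground Horn clauses only, no variables):

```python
# Clauses are (head, body) pairs over ground atoms. Adding a clause can
# only grow the set of entailed facts - never shrink it.

def entailed(clauses):
    """Naive forward chaining to a fixpoint."""
    facts = set()
    changed = True
    while changed:
        changed = False
        for head, body in clauses:
            if head not in facts and all(b in facts for b in body):
                facts.add(head)
                changed = True
    return facts

base = [("p", []), ("q", ["p"])]  # a fact and one rule depending on it
```

                 Here `entailed(base)` is `{"p", "q"}`, and appending a
                 clause `("r", ["q"])` yields a strict superset.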
                This is what makes it possible to learn Prolog programs from
                examples, in the first place. With Louise in particular,
                because it builds up programs clause-by-clause, you can even
                train it with a single example at a time. It used to be it was
                limited in its one-shot learning capabilities but it's much
                better now.
                In fact, if you're doing any preliminary work with Louise, you
                really want to start by training it on a handful of examples at
                a time, so you can best figure out the right configuration to
                use to learn the kind of program you want it to learn.
                Otherwise, if you throw the entire database at it, even if it
                fits into memory, you'll get so much in the output that it will
                just be overwhelming.
                ILP and Louise work a bit differently than ordinary machine
                learning. You want to go slowly, processing small data in small
                batches, otherwise you get a firehose of clauses in the face
                and drown. It's the tradeoff you get for being able to inspect
                the learned "model": there's still a lot of it and you have to
                develop strategies to deal with its complexity.
                Hope that makes sense? Please do get in touch if you need help
                with Louise. Or anything ILP!
          lukev wrote 2 days ago:
          Clojure's core.logic (a Clojure implementation of miniKanren)
          definitely gets some use in production.
          avmich wrote 2 days ago:
           No, and for me the problem is the following. I (can) have a variant
           of microKanren embedded in my environment, but the microKanren
           interface is rather inconvenient to use. miniKanren would be better,
           but implementing miniKanren in other languages is a bigger problem -
           actually I hoped to get some help with that using this annotation.
            ashton314 wrote 2 days ago:
            > but the microKanren interface is rather inconvenient to use
             What specifically would make it easier? I'm not that familiar with
             miniKanren. Is it stuff covered by the convenience macros from the
             paper?
              avmich wrote 2 days ago:
              Good question. I suspect the problem is in that the goals are
              expressed as functions in microKanren, and it's hard to figure
               out how to represent what is needed to get as such functions. It
               looks like the Prolog approach is more understandable - there
               are symbols and rules; not sure if Prolog would be easier to
               embed in, say, Java and switch between Java objects and Prolog
               rules. microKanren is practically by definition for
               embeddability - so it's this aspect that presents a problem for
               me.
        avmich wrote 3 days ago:
        For those knowledgeable enough in Scheme the original microKanren - and
        some immediate extensions towards miniKanren - should be rather clear.
        It's for users of other languages, with other popular mechanisms, that
        questions arise. More comments for them would be welcome.
          ejdo wrote 2 days ago:
          I did a mostly-true-to-the-paper python implementation a while back,
          that may be helpful as well:
   URI    [1]: https://github.com/Erik-J-D/microKanren-py
          ashton314 wrote 3 days ago:
          The original microKanren is pretty clear; the one struggle I had with
          the paper was some of the variable names were a little too "cute":
          e.g. `s/c` all over the place; I assume this was a contraction for
          "substitution/counter" (i.e. the state) which is what I expanded it
          to. Super terse variable names are difficult for me to grok when I'm
          looking at something new.
          I did hew to the standard convention of representing a type
          environment with Γ (Gamma); hopefully that's not too confusing for
          people looking at the type checker. :)
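           For readers who haven't seen the paper: the state a goal receives
           and returns really is just that substitution/counter pair. A
           cut-down Python sketch of the idea (names are my own; full
           unification is omitted, so eq only binds a variable directly):

```python
# The microKanren "s/c" state: a (substitution, counter) pair. Fresh
# variables are allocated by bumping the counter; a goal maps a state
# to a list of states.

def empty_state():
    return ({}, 0)  # (substitution, counter)

def call_fresh(f):
    """Hand f a brand-new logic variable, tagged with the counter value."""
    def goal(state):
        subst, counter = state
        return f(("var", counter))((subst, counter + 1))
    return goal

def eq(u, v):
    """Cut-down ==: bind u to v directly (general unification omitted)."""
    def goal(state):
        subst, counter = state
        if u in subst:
            return [state] if subst[u] == v else []
        return [({**subst, u: v}, counter)]
    return goal
```

           With this encoding, `call_fresh(lambda q: eq(q, 5))(empty_state())`
           produces the single state `[({("var", 0): 5}, 1)]`.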
        thebeardisred wrote 3 days ago:
        Because of the way the line wrapped on my phone I initially read that
        as "Annotated implementation of microKaren".  At that point my brain
        just went nuts and filled in the rest with "an embeddable non-logic
        language". :)
        qsort wrote 3 days ago:
        Obligatory reference:
   URI  [1]: https://aphyr.com/posts/354-unifying-the-technical-interview
          gavinray wrote 2 days ago:
          What in the absolute f*ck did I just read?
            ashton314 wrote 2 days ago:
            Now go read Hexing the technical interview and others:
   URI      [1]: https://aphyr.com/posts/341-hexing-the-technical-interview
            qsort wrote 2 days ago:
            I noticed just now that it was also linked at the bottom of the
            article. Definitely a fun one :)