It's by Stuart Russell's group at Berkeley (he coauthored the AI bible with Peter Norvig). Their tutorial: <a href="http://bayesianlogic.github.io/download/BLOG-tutorial-2014.pdf" rel="nofollow">http://bayesianlogic.github.io/download/BLOG-tutorial-2014.p...</a> Page 58 has some good sample code. Semantics are on page 70.<p><i>Every well-formed BLOG model specifies a unique proper probability distribution over all possible worlds definable given its vocabulary
•no infinite receding ancestor chains;
•no conditioned cycles;
•all expressions finitely evaluable;
•functions of countable sets</i><p>They instantiate some parts of the network and do inference with MCMC. I wonder how it compares to the Markov Logic approach from the University of Washington.
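The open-universe idea behind those conditions (the number of objects is itself random, but every world must be finitely evaluable) can be sketched with a toy forward sampler. This is a hypothetical illustration in plain Python, not BLOG syntax; the aircraft/blip scenario and names like `sample_world` are made up for the example:

```python
import math
import random

def poisson(lam):
    """Sample from a Poisson(lam) via Knuth's algorithm (stdlib lacks one)."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

def sample_world(rate=3.0, detect_prob=0.9):
    """Draw one 'possible world': the set of objects varies between draws.

    The number of aircraft is unknown a priori (Poisson-distributed), and
    each aircraft independently produces a radar blip with some probability.
    Every draw terminates, mirroring the finite-evaluability requirement.
    """
    n_aircraft = poisson(rate)
    blips = sum(1 for _ in range(n_aircraft) if random.random() < detect_prob)
    return {"aircraft": n_aircraft, "blips": blips}

random.seed(1)  # fixed seed so the run is reproducible
worlds = [sample_world() for _ in range(2000)]
avg_aircraft = sum(w["aircraft"] for w in worlds) / len(worlds)
```

Forward sampling like this is the easy direction; the hard part BLOG handles is conditioning on observations (e.g. observed blips) and inferring the posterior over worlds.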
Interesting. They talk about plain old Metropolis-Hastings, which is pretty questionable.<p>If you're excited about this, I highly recommend checking out Stan: it's under active development, actually works on real problems, and is used in the real world. With NUTS and HMC they've really made good on their promises, and quite soon they'll have meaningful ADVI support. See this earlier discussion: <a href="https://news.ycombinator.com/item?id=10244771" rel="nofollow">https://news.ycombinator.com/item?id=10244771</a>
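For reference, the "plain old Metropolis-Hastings" in question fits in a few lines. This is a generic random-walk sampler on a 1-D target, a minimal sketch rather than BLOG's or Stan's actual implementation:

```python
import math
import random

def metropolis_hastings(log_p, x0, n_samples, step=1.0):
    """Random-walk Metropolis-Hastings.

    log_p: unnormalized log-density of the target distribution.
    Proposals are symmetric Gaussian perturbations, so the acceptance
    ratio reduces to p(proposal) / p(current).
    """
    x = x0
    samples = []
    for _ in range(n_samples):
        proposal = x + random.gauss(0.0, step)
        # Accept with probability min(1, p(proposal) / p(x)), in log space.
        if math.log(random.random()) < log_p(proposal) - log_p(x):
            x = proposal
        samples.append(x)
    return samples

random.seed(0)  # fixed seed so the run is reproducible
# Target: standard normal, log-density up to an additive constant.
samples = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_samples=20000)
burned = samples[5000:]  # discard burn-in
mean = sum(burned) / len(burned)
```

The reason this is "questionable" for serious work: random-walk proposals mix slowly in high dimensions, which is exactly the problem NUTS/HMC address by using gradient information.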
"We've coded up the application that will run your business. It has an 80% chance of working correctly roughly 20% of the time, with a 95% confidence interval."
More resources related to this subject:<p><a href="http://probabilistic-programming.org" rel="nofollow">http://probabilistic-programming.org</a><p>There are <i>probably</i> some alternative, actively developed projects with the same objective as BLOG listed on that page.
Doesn't seem to be under active development. <a href="https://github.com/BayesianLogic/blog" rel="nofollow">https://github.com/BayesianLogic/blog</a>