I think it would be interesting to use an LLM to distill Wikipedia into a set of assertions, then iterate through combinations of those assertions using OpenCyc.<p>You could look for contradictions between pages on the same subject in different languages, or between different pages on related subjects.<p>You could synthesise new assertions from what the current assertions imply, render each one as a sentence, and fact-check it.<p>You could use verified assertions to benchmark language parsing and comprehension for new models. Basically, unit tests for NLP.<p>You could produce a list of new assertions and implications introduced whenever a page is edited.
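A rough sketch of what the cross-language contradiction check could look like, treating assertions as (subject, predicate, object) triples. Here extract_assertions is a hypothetical LLM wrapper, and the inference step OpenCyc would handle is reduced to a check on single-valued predicates:

    # Sketch: flag contradictions between assertions distilled from two
    # language editions of the same article. extract_assertions is a
    # hypothetical LLM wrapper returning (subject, predicate, object)
    # triples; FUNCTIONAL marks predicates that admit only one value.

    FUNCTIONAL = {"dateOfBirth", "capitalCity", "heightInMetres"}

    def extract_assertions(article_text: str) -> list[tuple[str, str, str]]:
        """Hypothetical: prompt an LLM to emit one triple per fact."""
        raise NotImplementedError

    def contradictions(triples_a, triples_b):
        """Pairs of triples that assign different values to a functional predicate."""
        index = {(s, p): o for s, p, o in triples_a if p in FUNCTIONAL}
        clashes = []
        for s, p, o in triples_b:
            if p in FUNCTIONAL and (s, p) in index and index[(s, p)] != o:
                clashes.append(((s, p, index[(s, p)]), (s, p, o)))
        return clashes

    # Toy example: a stale fact in one language edition.
    en = [("Germany", "capitalCity", "Berlin")]
    de = [("Germany", "capitalCity", "Bonn")]
    print(contradictions(en, de))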
> Is anyone playing with the combination of generative AI and OpenCyc?<p>OpenCyc specifically? No. But related semantic KBs using the Semantic Web / RDF stack? Yes. That's something I'm spending a fair bit of time on right now.<p>And given that there is an RDF encoding of a portion of the OpenCyc data out there[1], which may make it into some of my experiments eventually, I guess the answer is more like "Not exactly, but sort of, maybe-ish" or something.<p>[1]: <a href="https://sourceforge.net/projects/texai/files/open-cyc-rdf/" rel="nofollow">https://sourceforge.net/projects/texai/files/open-cyc-rdf/</a>
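For anyone who wants to poke at that dump, a minimal sketch with rdflib; the file name and serialization below are placeholders, since I haven't checked exactly what the download at [1] contains:

    # Load an RDF export of OpenCyc concepts and query it with SPARQL.
    # Requires rdflib (pip install rdflib). File name and format are
    # placeholders; adjust to whatever the dump from [1] actually is.

    from rdflib import Graph

    g = Graph()
    g.parse("open-cyc.n3", format="n3")   # placeholder path/serialization

    # List a few labelled terms, assuming the dump uses rdfs:label.
    q = """
        PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
        SELECT ?term ?label WHERE { ?term rdfs:label ?label . } LIMIT 10
    """
    for term, label in g.query(q):
        print(term, label)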
I'll be the antiquated person here. Writing and speaking well have brought people to knowledge because authors are uniquely positioned to use genius, humour, gentleness and generosity to draw an inquisitive but ignorant person into a new field. When the value of that can be quantified, then we can compare AI "generation" of "efficiently written, useful knowledge" with what we had/have.<p>Same for fiction, visual art, raising children, caring for old people, and so on, and so on.
Unsure specifically, but there's a long-standing movement to combine GOFAI symbolic approaches with modern neurally-inspired systems. <a href="https://en.wikipedia.org/wiki/Neuro-symbolic_AI" rel="nofollow">https://en.wikipedia.org/wiki/Neuro-symbolic_AI</a>
At the risk of sounding dumb: I don’t understand the practical applications of OpenCyc. I get LLMs: you can ask questions and they’ll answer, they can write an article, they can summarize documents… but what are the practical applications of OpenCyc?
Hmm… maybe we could train/tune a model on symbolic logic similar to (or even using) CycL instead of Python, and then when we have it “write code” it would be solving the problem we want it to think about using symbolic logic?<p>You might be on to something here. The problem being that there aren’t billions of tokens’ worth of CycL out there to train on, or is there?
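A very rough sketch of what that loop could look like: ask the model for s-expression assertions, then reject anything that doesn't parse or uses an unknown predicate before it reaches a reasoner. complete is a hypothetical stand-in for whatever LLM API you use, and the CycL-ish predicates are illustrative, not checked against the real Cyc vocabulary:

    # Ask a model to emit CycL-style assertions instead of Python, then
    # filter out anything malformed before handing it to a reasoner.
    # `complete` is a hypothetical LLM call; predicate names are
    # illustrative, not taken from the real Cyc vocabulary.

    KNOWN_PREDICATES = {"#$isa", "#$genls", "#$capitalCity"}

    PROMPT = ("Express the following facts as CycL-style assertions, "
              "one s-expression per line:\n{facts}")

    def complete(prompt: str) -> str:
        """Hypothetical LLM call."""
        raise NotImplementedError

    def is_plausible_assertion(line: str) -> bool:
        line = line.strip()
        if not (line.startswith("(") and line.endswith(")")):
            return False
        if line.count("(") != line.count(")"):
            return False
        parts = line[1:-1].split()
        return bool(parts) and parts[0] in KNOWN_PREDICATES

    def assert_facts(facts: str) -> None:
        for line in complete(PROMPT.format(facts=facts)).splitlines():
            if is_plausible_assertion(line):
                print("ASSERT", line)   # hand off to the reasoner here
            else:
                print("REJECT", line)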
Haven't LLMs simply obsoleted OpenCyc? What could introducing OpenCyc add to LLMs, and why wouldn't allowing the LLM to look up Wikipedia articles accomplish the same thing?
If anyone is interested:<p><a href="https://2ro.co/post/768337188815536128" rel="nofollow">https://2ro.co/post/768337188815536128</a><p>(EZ - a language for constraint logic programming)