Very cool topic... and not the article I was expecting!<p>I actively work with teams making sense of their massive global supply chains, manufacturing processes, sprawling IT/IoT infrastructure behavior, etc., and I personally bailed from RDF to Bayesian models ~15 years ago... so I'm coming from a pretty different perspective:<p>* The historical killer apps for the semantic web were paired with painfully manual taxonomization efforts. In industry, that made RDF and friends useful... but mostly in specific niches like the above, and only alongside pricey ontology experts. That's why I initially bailed years ago: outside of those important but niche domains, Google search is far more automatic, general, and easy to use!<p>* Except now the tables have turned: knowledge graphs for grounding AI. We're seeing a lot of projects built around the pattern transformer/GNN/... <> knowledge graph. The publicly visible camp is folks sitting on curated systems like Wikidata and OSM, which have a nice back-and-forth. IMO the bigger iceberg is AI tools getting easier colliding with companies that have massive internal curated knowledge bases. I've been seeing them go the knowledge graph <> AI route for areas like chemicals, people/companies/locations, equipment, ... . It's not easy to get teams to talk about it, but this stuff is going on all the way from big tech cos (Google, Uber, ...) to otherwise stodgy megacorps (chemicals, manufacturing, ...).<p>We're more on the viz (JS, GPU) + AI (GNN) side of these projects, for use cases like the above plus cyber/fraud/misinfo. If you're into it, we're definitely hiring; it's an important time for these problems.
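<p>For anyone unfamiliar with the grounding pattern above, here's a minimal sketch (toy data, hypothetical names, not any particular company's pipeline): retrieve curated triples for the entities a query mentions, then flatten them into a context block that a transformer-based model conditions on.

```python
# Toy curated knowledge base: (subject, predicate, object) triples.
# In practice this would be an internal triple store or graph DB.
TRIPLES = [
    ("acme_pump_7", "located_in", "plant_berlin"),
    ("acme_pump_7", "manufactured_by", "acme_corp"),
    ("plant_berlin", "part_of", "emea_supply_chain"),
]

def retrieve(entity):
    """Return every triple that mentions the entity as subject or object."""
    return [t for t in TRIPLES if entity in (t[0], t[2])]

def grounding_context(entities):
    """Flatten retrieved triples into text a model prompt could include."""
    facts = []
    for e in entities:
        for s, p, o in retrieve(e):
            facts.append(f"{s} {p} {o}")
    # Dedupe while preserving order, then join into one context block.
    return "\n".join(dict.fromkeys(facts))

print(grounding_context(["acme_pump_7"]))
```

The real systems differ mainly in scale and retrieval quality (entity linking, multi-hop expansion, GNN-learned relevance), but the shape is the same: curated graph in, grounded context out.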