The most telling thing about the current state of the art is that, somewhere amid all the marketecture of no-code and low-code, we can essentially spin up a recursive crawler on the fly that runs on par with a large cloud provider (because it is one) and ask it to do crawls that at least begin to approach a sensemaking machine, i.e. one that can surface consensus on subjective truth as agreed upon by experts, above and beyond simple objective truth. To me, that's the great innovation: it truly embraces and extends average intelligence by providing a prosthetic device for the brain, making inferences approachable that previously would have been available only to high-functioning individuals, i.e. the kind of argument about consciousness you see from low-latent-inhibition researchers like Peterson.

What any individual does with this is where prompt engineering becomes the human-computer agent collaboration that should be our default mode in computing - a kind of tortoise-wins-the-race story, given how we lost our minds in the rush to interactivity via JavaScript. It's not terribly interesting to watch a computer type. There's a place for batch mode (queuing, etc.), if the tools built up around it handle long-running job management well. Sadly, that seems rarer to me now than it did 30 years ago.
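To make the batch-mode point concrete, here is a minimal sketch of the kind of recursive crawl I mean, written as an explicit work queue rather than actual recursion, so the whole thing behaves like a long-running job that can be paused, resumed, or handed to a scheduler. It uses only the Python standard library; the `crawl` function, the `LinkExtractor` helper, the seed URL, and the depth limit are all illustrative placeholders, not any particular product's API.

```python
# A minimal sketch, not a production crawler: breadth-first over an
# explicit work queue, so crawl state is just (queue, seen) and the job
# is naturally batch-friendly. Seed URL and depth limit are placeholders.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collects href targets from anchor tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed, max_depth=2):
    seen = {seed}
    queue = deque([(seed, 0)])          # the batch of pending work
    while queue:
        url, depth = queue.popleft()
        try:
            with urlopen(url, timeout=10) as resp:
                body = resp.read().decode("utf-8", errors="replace")
        except OSError:
            continue                     # skip unreachable pages
        yield url, body                  # hand the page to downstream sensemaking
        if depth >= max_depth:
            continue
        parser = LinkExtractor()
        parser.feed(body)
        for href in parser.links:
            absolute = urljoin(url, href)
            if absolute.startswith("http") and absolute not in seen:
                seen.add(absolute)
                queue.append((absolute, depth + 1))

if __name__ == "__main__":
    for url, _ in crawl("https://example.com", max_depth=1):
        print(url)
```

The design choice matters more than the code: because the frontier lives in a plain queue, "long-running job management" reduces to serializing that queue, which is exactly the kind of affordance the old batch tooling handled well and most interactive tooling today does not.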