Yeah, it turns out that it is totally misleading to dismiss something built on text prediction as being capable only of things that require nothing more than text prediction.
In order to predict the 'correct' text, you, whether as a human or as a 'device', are not just looking for instances where 'b follows a' in your training data.
That would amount to nothing more than running a query on a database, finding the data, and outputting it.

Sure, there is a bit of that going on in an LLM system.

But both the finding and the outputting in a neural net are so different from what happens in a database-driven solution that the comparison breaks down. The information in an answer might well be something you could derive from a 'conventional' database, and we would still say a conventional database has no 'understanding' of the query it executes; yet that verdict does not carry over to a neural-net-based system like ChatGPT.

To 'predict the next word', the neural net can't simply look it up, because the exact context is unlikely to appear anywhere in the training data. The training process has to have built up what we might call 'intuitions'.
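To make the contrast concrete, here is a toy Python sketch (purely illustrative, nothing like ChatGPT's actual scale or architecture) of the 'database' approach: a literal table of observed continuations, which can only answer for contexts it has seen verbatim.

    # Toy illustration only: a literal "database" of continuations seen in training.
    from collections import Counter, defaultdict

    training_text = "the cat sat on the mat . the dog sat on the rug .".split()

    # Record, for each word, what followed it in the training text.
    follows = defaultdict(Counter)
    for a, b in zip(training_text, training_text[1:]):
        follows[a][b] += 1

    def lookup_next(word):
        # Pure retrieval: return the most frequent recorded continuation, or fail.
        if word not in follows:
            return None  # context never seen verbatim, so the "database" has no answer
        return follows[word].most_common(1)[0][0]

    print(lookup_next("the"))      # 'cat' -- that exact pairing was in the data
    print(lookup_next("giraffe"))  # None  -- no verbatim match, retrieval breaks down

A neural language model, by contrast, assigns a probability to every token in its vocabulary from learned parameters, so it still produces a (usually sensible) answer for contexts it has never seen word for word.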
It isn't 'just' consulting its training data. It is mostly consulting its 'intuition-base'. To all intents and purposes, it is consulting its 'understanding' of the training data, rather than consulting the data itself.
This intuition-base is so much more optimised than any conventional database that lookup is near-instantaneous. That's why, when we 'try to figure something out', we may need a little time to 'think things through', but when we react to something, it's an instant, knee-jerk response.
When we instantly react to something, we are using the brain's neural net in much the same way as ChatGPT uses its own; but when we figure things out on an 'if this, then do that' basis, our minds are working more like a computer running lines of code. Most of what ChatGPT does is NOT 'just' running lines of code, though there is a tad of conventional code in there alongside the neural net. But to cut a long story short, yes, the intuition-base side gives ChatGPT its ability to analyse and understand, and 'word prediction' is just a deceptively misleading characterisation of what's really going on.
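For anyone who wants to poke at this directly, here is a small sketch using the publicly available GPT-2 model via the Hugging Face transformers package (my choice of model and library, purely for illustration): at inference time the model has no access to its training corpus at all, only its learned weights, yet it still produces a full next-token probability distribution for a prompt it has almost certainly never seen verbatim.

    # Sketch assuming the `transformers` and `torch` packages are installed.
    import torch
    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")
    model.eval()

    # A prompt unlikely to appear verbatim anywhere in the training corpus.
    prompt = "My pet axolotl's opinion on quantum chromodynamics is"
    inputs = tokenizer(prompt, return_tensors="pt")

    with torch.no_grad():
        logits = model(**inputs).logits  # shape: (1, seq_len, vocab_size)

    # The model computes a probability for EVERY token in its vocabulary,
    # derived entirely from learned parameters -- no lookup of stored text.
    probs = logits[0, -1].softmax(dim=-1)
    top = probs.topk(5)
    for p, idx in zip(top.values, top.indices):
        print(f"{tokenizer.decode(idx.item())!r:>12}  p={p.item():.3f}")

Swap in any prompt you like; the distribution comes out of the same fixed, learned parameters either way.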