Gen AI doesn't "understand" _anything_. It simply emits the next "likely" token given the patterns that precede it in its training data. I'm not sure why people keep assuming that it "understands" anything.
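For anyone unfamiliar with what "next likely occurrence" means mechanically: here's a toy sketch using a bigram frequency model. A real LLM is vastly more sophisticated (a transformer over learned embeddings, not raw counts), but the core loop is the same idea: pick a continuation from observed statistics, with no model of meaning anywhere.

```python
from collections import Counter, defaultdict

# Toy "training corpus" -- purely illustrative.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count which word follows which.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def next_word(word):
    # Return the most frequent continuation seen in training;
    # no understanding involved, just counting.
    counts = follows.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(next_word("the"))  # -> "cat" (it followed "the" most often)
```

The model will happily "complete" any prompt it has statistics for, and fail on anything it hasn't seen, which is the point being made above.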
Related: <i>Despite its impressive output, generative AI doesn’t have a coherent understanding of the world</i> <a href="https://news.ycombinator.com/item?id=42049482">https://news.ycombinator.com/item?id=42049482</a>