After reading the whole article, I feel the author misses the point (intentionally or not). LLMs are not truth machines, yet the article reads like a long-winded rant about how Gemini is supposed to perform perfectly as a source of knowledge.

There is an enormous amount of utility in LLMs for scaffolding code, recommending services, rephrasing marketing copy, or coming up with ideas.

A hammer isn't a good can opener, but it is still a useful tool. Likewise, stop using LLMs as a replacement for everything (ignore the marketing hype) and you will likely be less cynical.