IBM tried that, too, with Watson. Eventually they had to stop claiming it was of any use in medical diagnosis, and they quit selling it.

ChatGPT's answers all take the form of "sources say..." and are full of conjecture. They lack any specific understanding of the patient they're meant to address, and they're riddled with contradictions. When pressed for details, ChatGPT will often backpedal on what it just told you.

Now some might say, "How like doctors, indeed!" But I still wouldn't recommend seeking out advice from someone, or something, when you know this to be the case.