Far less harmful, but I've had Bing Chat give me a wrong solution to a programming problem it pulled from Stack Overflow, because it mistook the code in the question for an answer to the question. Guilt and innocence by association seem to be something it is prone to without understanding context.
"Microsoft Bing Copilot has falsely described a German journalist as a child molester, an escapee from a psychiatric institution, and a fraudster who preys on widows.<p>Martin Bernklau, who has served for years as a court reporter [...] asked Microsoft Bing Copilot about himself. He found that Microsoft's AI chatbot had blamed him for crimes he had covered."