> exploiting the fact that factual knowledge in an LLM has generally been shown to be localized to particular transformer layers

This is surprising.
Just call it correctness. "Hallucination" as an alternative to "incorrect" is fine for marketing, I guess, but "factuality" is especially awkward, besides being pretty Orwellian.