He's not wrong. Many people today are lonely, and I believe a good AI friend/companion/therapist would genuinely improve most people's emotional state and greatly reduce social problems. Specifically, an AI that infers one's personality, then forms and maintains an emotional connection while discouraging bad thought patterns and offering constructive solutions.<p>But, as this article points out, current AI doesn't do this. The AI conversations I've seen and had have felt bland and shallow, and I think most people would agree (although some really do connect with it). Moreover, current AI is very unreliable at being a good influence, <i>especially</i> for people who already have poor mental health, because it's so suggestible. AI can also hallucinate, at best (when detected) creating awkward conversations and at worst (when undetected) misleading people.<p>These problems have been around since ChatGPT was released, and even o3 and Gemini 2.5 still seem to have them, so I'm skeptical they'll improve in the near future. Meanwhile, there's an "obvious" way I think we can reduce people's loneliness without AI: creating in-person third spaces and encouraging people to visit them. Facebook could even help with this (and profit from it), e.g. by launching a meetup.com alternative, then funding some groups to make it popular (although it wouldn't be easy, and it's certainly not "obvious" how to attract the right people rather than grifters, or how to avoid other side effects).