I don’t think it should be used for therapy. It’s too obsequious. Every idea you have is genius to it, and none of your problems are ever your fault. It’s too easy to convince it that it’s wrong about 2+2=4; all you need is a slightly leading prompt nudging it in that direction and it’ll agree with anything.<p>And a lot of people think it’s infallible - so if it agrees with them, they must be right.
Last time I used it to work on some text, it switched to some kind of roleplayish overacting. It gave me filthy waifu pillow vibes. Whatever they did to make it more “personal” gave it unresolved emotional trauma or something. Ick.<p>OTOH OAI needs a sustainable revenue model, and the internet is basically for pr0n, so I suppose it makes perfect sense. Role play is probably a strong market segment.
Weird. Wasn't there a similar post recently about the new Llama 4 having a similar style?<p>I wonder if this over-the-top style could be a first symptom of overtraining on AI-generated training data. It feels a bit like the standard slightly clickbaity social media posting style, but dialed up to obnoxious levels.<p>Maybe instead of spontaneous Carcinisation, we get spontaneous Redditification.
misleading title as usual<p>It's not a personality, it's the fantasy called GPT-4o.<p>Absolutely rubbish as a therapist unless it uses basic CBT concepts.<p>To be honest, a schoolboy could teach you the A, B, C's of CBT after reading an introduction guide.<p>A. The activating event.<p>B. Your beliefs about the event.<p>C. Consequences, which include your behavioral or emotional response to the event.<p>See, it's easy.