No. Betteridge's law. More seriously, while this piece is ancient in AI terms (May 2023), I don't think genuine emotional and social intelligence can be learned to even an average level by talking to an AI or by reading. Using the voice models is a step up from this, but I still think they're too tuned to following your instructions, without nuance, for something like this. If reading were enough to pick up social and emotional skills, I'd expect people who read the right books to be masters of several trades, if reading gave even 10% of the experience that real-world practice does.

I'm also not trying to be reactionary and dismissive, but how are you supposed to learn social cues from an AI right now? In the best case, the LLM accurately predicts what would happen: you say something awkward and the AI replies with <s/he lets out an exasperated sigh and turns away>. But in real life you have to notice these cues amid a barrage of other signals. Would that really help anyone desperate enough to try this? On top of that, the tone in which you say something matters almost as much as the content, and tone is entirely missing from text.

I concede that in extreme cases some people could learn something from trying this, and that's a good thing. I just don't know how much they'd learn, who exactly would benefit, by what mechanism, and whether they'd pick up incorrect lessons along the way.