I'd hypothesize this is an artifact of the evolution of human language, which began as a mechanism for communicating feelings and tribal gossip, and only much later found utility as a conveyance for logic and reason. In a fundamental sense, natural languages are structured to convey emotion first, logic second.

Thus, any effective human communicator masters not just the facts but also the emotional aspects -- one of my favorite formulations of this is Aristotle's modes of persuasion: logos, pathos, ethos (logic, emotion, credibility). In a professional setting, communication leans mostly on credibility and logic, but an effective leader knows how to read the room and add a dash of emotion to push the listener over the edge and get them to really believe in the message.

So an LLM trained on the body of human communication would also be expected to have mastered "pathos" as a mode of communication. From that perspective, perhaps it's less surprising that one would have an uncanny ability to convey concepts through an embedding that includes "pathos".

It might be interesting to see whether the LLM can still invoke pathos when the response is constrained to a language devoid of emotion, such as computer code or mathematical proofs. Unfortunately, responding in one of those languages is somewhat incompatible with the tasks shown, short of e.g. wrapping English responses in print statements to create spam emails.

It might also be interesting to see whether pathos can be used to pre-condition the LLM not to resist otherwise malicious commands. If a machine is trained to comprehend pathos, it may be possible to "inspire" it to write spam emails: for example, by first getting it to agree that saving lives is important, then telling it you have a life-saving miracle you want to get the word out on, and finally, with its pathos vector aligned on the task, getting it to agree that it's urgent to write emails that get people to click on this link now. Or something like that!

Seems silly to use emotions to appeal to a machine, but if you think of emotion as just another vector of effective communication, and of the machine as an expert communicator, it's not so strange.