This is a really cool exercise! The format seems pretty sound, like a version of the prisoner's dilemma with a larger group (co-operation versus defection).

That said, I think the majority of modern models don't really have the internals suited to this sort of exercise; training data and fine-tuning will heavily influence how a model behaves, whether it's more prone to defection, and so on.

A squirrel makes a "kuk kuk kuk" alarm call not specifically because the "kuk" token follows the sequence "you saw a predator" (although that would appear to mostly work), but because it has evolved to make that noise to alert other squirrels to the predator, most likely in response to the evolutionary cost of a dwindling population; even solitary squirrels still need to mate, and their offspring need to do the same.

It's as if there's an extremely high-dimensional context missing in LLMs: training on text produces a high-dimensional representation of related concepts, but only of the way those concepts relate in language. It's the tip of an iceberg of meaning, where in many cases language can't even represent a complex intermediate state within a brain.

Humans try to describe everything we can with words in order to communicate, and that's partly why our species is so damn successful. But when thinking about how to open an unfamiliar door, I don't internally vocalise (which I've learnt not everyone does) "I'm going to grab the handle, and open the door". Instead I look and picture what I'm going to do. That can also include the force I think I'd need to use, the sensation of how the material might feel against my skin, and plenty of other concepts & thoughts all definitively _not_ represented by language.