For those who might be interested, and in a slightly different vein than the examples in the article, there's the "Sleeping Beauty" paradox: <a href="https://en.wikipedia.org/wiki/Sleeping_Beauty_problem" rel="nofollow">https://en.wikipedia.org/wiki/Sleeping_Beauty_problem</a><p>Basically, an agent is put to sleep and told they will be woken either once (if a fair coin lands heads) or twice (if it lands tails), with no ability to remember any earlier awakening.<p>Upon waking, what probability should the agent assign to the event that the coin landed heads?<p>The intuitive response is 1/3, since tails accounts for two of the three possible awakenings and heads for only one. But this poses obvious epistemological problems: the agent has, ostensibly, gained no new information at all by waking, and their prior was surely 1/2. Hope someone else finds this as interesting as I do!
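<p>For anyone who wants to poke at it, here's a minimal Monte Carlo sketch (Python; the function name and trial count are just illustrative). It shows how both answers fall out of the same experiment, depending on whether you count per coin flip or per awakening:<p>  import random

  def simulate(n_trials=100_000):
      heads_experiments = 0
      total_awakenings = 0
      for _ in range(n_trials):
          heads = random.random() < 0.5  # fair coin flip
          # Heads: woken once; tails: woken twice.
          total_awakenings += 1 if heads else 2
          heads_experiments += heads
      # Halfer count: fraction of experiments in which the coin was heads.
      print(f"P(heads) per experiment: {heads_experiments / n_trials:.3f}")          # ~0.500
      # Thirder count: fraction of awakenings at which the coin was heads
      # (each heads experiment contributes exactly one awakening).
      print(f"P(heads) per awakening:  {heads_experiments / total_awakenings:.3f}")  # ~0.333

  simulate()

<p>Both ratios are computed from the same simulated flips, which is, to my mind, exactly where the paradox lives: the dispute is over which denominator the agent's credence should track.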