I'm going to suggest one metric: that the LLM doesn't, by default, suggest that humans act in a manner likely to result in the termination of fellow humans.

https://news.ycombinator.com/item?id=42092961#42092997

I'm calling it the Pangolin Principle.