I mean, it's not considering anything. It's a GPT-2 bot; it's very good at producing blocks of text that read like something that a real person would type, but there's no semantic understanding of anything it's saying or hearing. It doesn't have ideas, or know things, or want things. It just generates plausible-sounding paragraphs.
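To make the "plausible-sounding without understanding" point concrete, here's a toy sketch (mine, not GPT-2's actual architecture — GPT-2 is a transformer, vastly more capable): a bigram model that picks each next word purely from the surface statistics of its training text. It has no notion of meaning, yet its output looks locally plausible, which is the same basic idea, scaled down enormously, behind a bot's fluent-but-empty paragraphs.

```python
import random

def train_bigrams(text):
    """Map each word to the list of words observed to follow it."""
    words = text.split()
    follows = {}
    for a, b in zip(words, words[1:]):
        follows.setdefault(a, []).append(b)
    return follows

def generate(follows, start, length=10, seed=0):
    """Emit words by repeatedly sampling a statistically likely successor.
    No semantics anywhere -- just 'what tends to come next'."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        choices = follows.get(out[-1])
        if not choices:
            break
        out.append(rng.choice(choices))
    return " ".join(out)

# Hypothetical tiny corpus, just for illustration.
corpus = ("the bot reads the thread and the bot writes a reply "
          "and the reply reads like a person wrote the reply")
model = train_bigrams(corpus)
print(generate(model, "the"))
```

Every word it emits is a statistically reasonable successor to the previous one, so short stretches read naturally, but there's no idea, intent, or knowledge anywhere in the loop.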