The E. E. Cummings poem in this article is actually what most caught my eye. At first it seemed so awkward and stilted that I assumed it was the output of their generator. I quickly realized it wasn't, and decided to read it more closely, but I couldn't quite puzzle out what Cummings was talking about. I read the first stanza over and over again trying to figure it out: "Wait, who does he mean here? Are the second and third lines a question or a statement? Why is the last line such an apparent non-sequitur?"<p>And then, just for a second, I stopped paying attention to the grammar and started paying attention to the <i>words</i>, and I realized that very stanza was <i>mocking me</i> for what I was doing. It was a masterfully laid trap for the analytical mind.<p>The whole poem fell into place after that.
You should use more context than just the one previous word. Condition on the N previous words (or characters) and you'll get more realistic results.<p>I wrote a word game that uses Markov chains to generate fake (but real-looking) words.<p><a href="http://www.michaelfogleman.com/wug/" rel="nofollow">http://www.michaelfogleman.com/wug/</a><p>Read about the algorithm here:<p><a href="http://www.michaelfogleman.com/wug/algorithm" rel="nofollow">http://www.michaelfogleman.com/wug/algorithm</a><p>Also, you can leave punctuation in - you'd just need a larger training set, probably.
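<p>The order-N idea can be sketched as a character-level chain: record which character follows each length-N context, then walk the table from a start marker until you hit an end marker. This is a generic sketch, not necessarily the exact algorithm behind the link; the marker characters, function names, and tiny training list are all illustrative.<p><pre><code>import random
from collections import defaultdict

def build_model(words, order=2):
    # Map each length-`order` context to the list of characters
    # observed after it. '^' pads the start, '$' marks the end.
    model = defaultdict(list)
    for word in words:
        padded = "^" * order + word + "$"
        for i in range(len(padded) - order):
            model[padded[i:i + order]].append(padded[i + order])
    return model

def generate(model, order=2, max_len=12):
    # Walk the chain from the all-start context; sampling from the
    # observed lists reproduces the training frequencies.
    context = "^" * order
    out = []
    while len(out) < max_len:
        ch = random.choice(model[context])
        if ch == "$":
            break
        out.append(ch)
        context = context[1:] + ch
    return "".join(out)

# Tiny illustrative training set; a real generator would use a
# dictionary file with thousands of words.
training = ["banana", "bandana", "cabana", "canal", "band"]
model = build_model(training, order=2)
print(generate(model))
</code></pre><p>Raising `order` makes the output stick closer to real words (at the extreme it just reproduces the training set), while order 1 gives you the less convincing single-character version.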