You can't prevent scraping, but you can poison it. I can think of two approaches:<p>1. Replace bits of text on output with Unicode look-alikes. Humans will still read what you want them to read, but non-humans get crap.<p>2. The mountweazel approach: plant fake entries that humans would never find on their own. Then you can google those fake entries - any site other than your own that mentions them got them by scraping yours.<p>But honestly, most of our efforts to protect "our" work are just misguided busy-work...