I was born a hundred years after this was gifted to America. When I was young, around the same time my elementary school class could have been covering France's importance to American independence, pop culture had decided that making fun of French people was cool. I grew up instilled with the idea that they were lazy, smelly, 'weird' people who never helped in any major wars. My parents said nothing about them, so my impression came entirely from what I consumed as a television watcher.

That disturbing falsehood wouldn't be corrected until I was much older, post-9/11 and "freedom fries" even.

So when did it all change? Did the Greatest Generation come back from the war and stop thinking about its former allies? I always looked up to the Statue of Liberty, and I was surprised to learn it was a gift from France (certainly not something I was told the first time I saw it).