Thursday, July 14, 2011

Are Americans taught in school that they won the War of 1812?

I keep hearing Americans perpetuate the myth that they won the War of 1812, and I was wondering whether this is a result of their education.
