Re: I wonder...
Americans have always been the bad guys. I was brought up on post-Cold War propaganda just like every Westerner, but that history is all lies. America destabilized the Middle East and got its comeuppance. America is the evil empire and needs to collapse for any progress to be made on Earth.