How Hollywood Helps to Obscure the Ugly Truth about American Militarism

Jeff Nall @ Truthout - In American popular culture, the United States government and military are almost always portrayed as agents of good struggling to overcome evil. This is particularly true of Hollywood depictions of US warfare. Read more.