Nearly every war in which the US has participated has been turned into a film. Many films glorify war, many show its horrors, and others focus on human interest and ignore the truth completely. Few of us here have experienced or will ever experience war first-hand, so our only contact with it is through a relative or friend who has. I was just wondering how much war films affect people's perception of war.
A friend of mine came up with the conspiracy theory that Hollywood is against the next Gulf War because they want to throw everyone off the scent: they are actually delighted at the prospect of fresh filmmaking material.
Anyway, I read The Onion and it reminded me: