Forum Thread

The Intertwined History of the Academy Awards and Politics

Displaying 1 Post
  • Strongly Liberal Democrat
    Democrat
    Portland, OR
        
    The Oscars are just around the corner, which means it is time to crown the winners of the year's best films. This year's nominees for Best Film follow a pattern of being overtly political in one form or another, which is no surprise if you follow the politics of Hollywood as closely as I do. Many people bemoan the "liberalization" of Hollywood and how it tries to indoctrinate America's youth. It doesn't matter whether that is a factual statement, which it is not, because the idea has taken on a life of its own regardless of the truth.

    You will not get an argument from me that Hollywood currently tilts to the left, but many of our members will remember a time when Hollywood tilted to the right. Hollywood is not some catch-all liberal community that dictates to the American people what they should think. Hollywood is a hodgepodge of individual capitalists who want to make films that earn them the biggest profits. Plain and simple.

    The right in this country gets up in arms about the political movies that Hollywood releases, but neglects to come to terms with why Hollywood releases these movies: they make a lot of money. The nine movies nominated for Best Film raked in an astounding $1.65 billion this year. Money talks, and that is a heck of a lot of money.

    What many on the right also disdain is overtly political movies that make them feel uncomfortable. Movies like "12 Years a Slave" make the right uncomfortable because they would love nothing more than to whitewash our nation's brutal history with slavery. They would love nothing more than a return to the glory days when Hollywood brushed over our nation's racist past with broad strokes. "The Wolf of Wall Street" is another prime example. To many on the right, it is a film demeaning the rich "job creators."

    That is not to say that Hollywood is without fault when it comes to many societal norms. It is only recently coming around to accepting the fact that it often casts black actors as the villain. Many of you may disagree with me, but I ask you to take a sober look at the way black actors are depicted in many films. You can give me certain examples to the contrary, but I challenge you to look at the bigger picture and ask yourself whether I am wrong.

    So, in this season of the Oscars, let's accept that Hollywood is political and always has been. It is easy for conservatives to complain that Hollywood pushes a liberal agenda, but they often forget that Hollywood is just like the nation: it ebbs and flows with the times. Maybe what conservative critics of Hollywood are really scared of is that the nation is changing and they want to stay in the past.