Is Hollywood really changing, or just showing a biased version of America?


Long considered a bastion of pathological progressiveness and wanton liberalism (Remember the blacklist? The one not starring James Spader?), film and television were accused of obsessing too much over things like transgender rights and how many black actors got Oscar nominations, and not worrying enough about the concerns of "real Americans": Rust Belt unemployment, devotion to guns, fear of porous borders, disillusionment with government, feelings of personal alienation and a general sense of a world run amok.

How, many wondered, could the creators and arbiters of popular culture have been so out of step with the viewers and moviegoers they serve? The answer is that they weren't and aren't, because there is no notion more thoroughly absurd than that of Hollywood's liberal agenda. The real elitism of film and television is simpler: we like to watch people who seem richer than they should be.

Although many members of the entertainment industry espouse, often publicly, a left-leaning political slant, Hollywood is still dominated by white men who prefer to make movies and television shows that revolve around other white men: men beset by feelings of alienation, who often wield guns, who fight (or represent) corrupt government, and who generally attempt to survive and/or save a world run amok. This holds across galaxies, through the centuries, in every genre imaginable. For every film that does not revolve around such a lead character, there are 78 others that do, just as for every series that features a transgender character, there are 8,000 that do not.

In representing the actual demographics of "real" America, television has done a slightly better job than film. But it is still a mostly white, mostly male, mostly straight world fighting forces that range from the daily stress of family life to armed rebellion against encroachment by aliens, zombies and fascists, both historical and imaginary.
