To no one’s surprise, a new report shows that leftwing Hollywood has not come anywhere close to practicing what it both screeches and preaches. On the diversity front, whites and males are not on the wane in Tinseltown. Quite the opposite. Nope, when compared to last year, whites still hold 89% of network television’s executive producer jobs, while women of color are actually losing ground:
About 89 percent of executive producers of new series airing this season on the four U.S. broadcast networks are white, and 79 percent are male. Minorities have advanced by less than two percentage points since the 2016-2017 TV season, and women of color claimed fewer executive producer spots than they did a year ago at the two networks that provided data.
So let’s suss this out a little bit, shall we …
Whites make up only 73% of the American population. And yet, whites make up a whopping 89% of leftwing Hollywood’s executive producers.
Women make up 51% of the American population. And yet, men make up a whopping 79% of leftwing Hollywood’s executive producers.
Moreover, women of color are LOSING ground in Hollywood!
Honestly, how is this even possible? It is not as though the Civil Rights and feminist movements just arrived or are still learning to walk. Both occurred before most of the enlightened, oh-so-woke Leftists who run Hollywood were even born.
We are talking 50 years, for crying out loud.
The report also claims that the television industry is working hard to hire and train more women and minorities, but isn’t that part of the problem? Isn’t that the worst kind of racial pandering and patronizing? Isn’t that a regressive form of segregation that automatically removes the idea of equality by first and foremost categorizing an individual by gender and race?
Then there is the raging hypocrisy.
Hollywood constantly lectures, hectors, and peers down its superior nose at the rest of America. The preaching and moralizing from this industry has, of late, especially in the Trump-era, become as sanctimonious as it is unbearable. It seems as though you can no longer turn on your television or read an interview without some self-regarding Hollywood prig telling you what a racist you are for this or for that.
Well, now we know that they don’t really mean it. It is all BS, all a political tactic to keep us divided, all a scam meant to empower the dangerous Leftists who appoint themselves as the only ones who can grant the rest of us racial absolution.
Naturally, that absolution will never, ever come, no matter how many Barry Obamas we elect, no matter how big the reparations package, no matter how much of our freedom we surrender to the demonic altar of diversity and equality.
Yep, by its very own standards, Hollywood remains one of the most racist and sexist institutions in America.
And it is our job to never let them forget that.