Why do certain TV shows still portray the president as a white man?

Why is the President of the United States still almost always played by a white actor?

This is still overwhelmingly the case, as even the statistics and lists that are supposed to disprove it show. Of the 48 most recent movies on Wikipedia's list of people who have portrayed American presidents, 87.5% (42 of the 48) cast a white man in the role. Why? Mostly because, to date, 97.8% of actual American presidents have been white men. That means there is a powerful and, so far, statistics-backed image of what a president "looks" like.

A bias like that is hard to fight, both internally and externally. Even when people overcome their own tendency to picture a president as a white male, that very expectation has often been used as a storytelling element. Several decades of "black president" roles were cast to show that America has, at least in this one fictional world, moved forward in its conversation on race, or to signal that the story is set vaguely in "the near future". And at least one of the movies from the answer wiki, Chris Rock in Head of State, was the story of a political party's smoky back-room operatives picking a non-white candidate only because they thought the election was doomed anyway, so it might as well be doomed in ways that would lay the groundwork for later elections.

This story will change over time, because much of the inertia I described came from a 100% white data set. It is no longer enough to signal a "more progressive America" by casting a black actor. It would be nice to say that this means the storytelling hack will go away, but it's more likely that it will just change its clothes. Expect future asteroid strikes, zombie plagues, terrorist invasions of Ohio, and Central Asian prisoner extractions to be overseen not by a president who looks like Chris Rock, but by one who looks like Danny Pudi.

So if white people aren't racist, why is everything so whitewashed?

I keep hearing a lot of white people say, "Oh, I'm not racist. I'm for equality. Everyone should be equal." But I can't help noticing that no one (other than people of color) is doing anything about how the media portrays race. There are usually several white characters in a show meant for the general public. You've got your blonde, brunette, black-haired, and second blonde white people, and that might just be the female portion of the cast; the show may have 3 more white males. Then it's like the producers pick a minority out of a hat and place an Asian or a black character in the mix (male OR female, almost never both unless they are a couple). If white people are for equality, then why are there usually 5 white people on a show and 1 minority?

The only time I do not see this is when the show or movie is specifically targeting minorities. A show with a majority-black cast might air on BET or on UPN nights, where shows are made specifically for black audiences; the same goes for the AZN channel. But as far as GENERAL entertainment goes, white people are always in the majority. And on top of that, the few minorities they do decide to add are whitewashed almost beyond recognition. Why is this okay? And why aren't white people doing more, since so many claim to want equality?

So that was a rant. This is my real question: tell me, have you seen a show where the main character is black (just because the director felt like casting a black person), the supporting characters are white, and the show itself is NOT about race, nor is it marketed primarily toward minorities (i.e., screened on BET or UPN, or dubbed a "black" movie)? If you have, please name it.

Are politically correct TV shows like "24" and "Grey's Anatomy" the reason why Obama was elected president?

I think you're right. If you compare the number of actual black head doctors with the number portrayed on TV, you'll see it's pretty lopsided. Blame the liberals.

Why do most black people act so racist toward whites?

I see this all the time in the media and at school. Since I go to a mostly black school, I always hear the black kids say racist stuff to the white teachers and students, like "White people ain't got no booty" or "White people always doing (insert word or phrase)," especially when they get mad.

I especially can't hear the end of it in the media. Every time I turn to a show like House of Payne or anything with black people on it, they always say stuff like "We as African Americans..." or "Too many black youths are...," which wouldn't be a bad thing if it weren't always stressed.

And why do most black people always need a black role model to look up to? For example, a little black kid wants to be a karate master but doesn't pursue it because he doesn't see a black man or woman doing it, even when there are hundreds upon hundreds of other successful men and women who aren't black doing it.

Why do most black people play the race card? If they get kicked out of a job, they always go on the news crying "racial discrimination," even though eight times out of ten it's not. Why do some feel like all white people STILL owe them something for slavery that happened years ago? The Jews were tortured by the Nazis in Germany, but they don't start rioting every time a German or someone with Nazi ancestors walks by. Why? Because it's old news.

Another thing that gets me is how movies about black people almost ALWAYS involve someone in the hood or someone in a gang trying to get out, or someone cheating or getting pregnant at 16, and it's supposed to represent a lot of black people accurately.

I do realize that there is still racism, but that goes for every race, not just black people. Slavery was wrong, but that's dead and gone.
