For years there have been complaints about the misrepresentation of black people and the whitewashing of black culture in Hollywood films, and for just as long those complaints have fallen on deaf ears.
To deal with the backlash, Hollywood has defended its misrepresentation and underrepresentation of blacks in film by claiming there simply is no market for those types of films. Keep in mind that Tyler Perry, when seeking backing for one of his films, was once told by a Hollywood exec that "blacks who go to church don't go to the movies" (imagine the ridiculousness of a white man dictating to a black man what blacks do in their leisure time).
The recent success of three films starring blacks in lead roles (which together held the top spot at the box office for five consecutive weeks) has single-handedly dismantled Hollywood's argument for not producing more films with blacks in leading roles.
Even though it has been proven, both in times past and now, that there certainly is a market for black films and blacks in film, today's question is: should blacks still be seeking acceptance in Hollywood?
[“The Perfect Guy,” a romantic thriller starring Sanaa Lathan, Michael Ealy and Morris Chestnut, debuted at No. 1 this weekend and earned nearly $27 million at the box office. The previous weekend, “War Room,” a Christian drama starring Priscilla C. Shirer, T.C. Stallings and Karen Abercrombie, among other black actors, earned $12.6 million and held the lead spot for its second week in theaters. Prior to that, the N.W.A. biopic “Straight Outta Compton” secured the top spot for three weeks straight and is now the highest-grossing musical biopic.
This streak of success is significant because of Hollywood's ongoing lack of representation of actors of color in its films. The 2015 diversity report by UCLA’s Ralph J. Bunche Center for African American Studies found that the top three talent agencies in the country, which are predominantly led by white executives, are to blame for the lack of minorities on the big screen.] (source)
What are your thoughts? Should blacks and other underrepresented groups in Hollywood still be seeking acceptance, or is it time for them to establish their own movie studios, distribution networks, theaters, etc., where they can tell their own stories and celebrate their accomplishments? Is a mass exodus of blacks from Hollywood to start their own thing the only way blacks will get proper representation in film? Why don't black actors, directors, producers, etc., pool their resources to create their own film industry, completely independent of Hollywood?
You know the routine: View, Share and Discuss... Don't forget to leave a comment below.
Follow me on Twitter @bigjyesupreme