A real quick lesson for those schooled by Hollywood: who has been called "boy" throughout the history of America? You, the African-American man. Working with cattle was, and is, a dirty job — one that men of color performed with great skill before the advent of motion pictures. When Hollywood wanted to make movies about horses, cattle, and cowboys, who got the leading roles? John Wayne and other white actors.
Posted on: Sun, 28 Dec 2014 10:33:54 +0000