For almost a century, the movies depicted "White men" as the good guys and "Indians" as the bad guys.
These images were ingrained in the minds of generations of young Americans, and only now is the truth of what happened in the past being given the prominence it deserves.