FREE WHITEWATER

Film

A Black Cowboy’s Story

Cowboys are among the most iconic figures of the American West. They’re mythologized as strong, independent people who live and die on their own terms on the frontier. And in movies, the people who play them are mostly white. But as with many elements of Americana, the idea of who cowboys are is actually whitewashed…