You can tell what the world thinks of you by looking at the cultural productions in which you are depicted. For Southerners, the images are abundant and varied. From “Forrest Gump” and “Fried Green Tomatoes” to “Beasts of the Southern Wild,” “Treme” and even “Hustle & Flow,” movies and TV shows often portray those below the Mason-Dixon line as outside the mainstream, with rich and unique cultures that transport viewers to new worlds. As if the South weren’t really part of the United States. These images — even when full of admiration for the South, or told by Southern natives — are often paired with the general sentiment that Southerners are not only conservative but also politically regressive and out of touch with everyone else — or at least with those in major coastal cities like Los Angeles.
Ironically, though, “out of touch” with the American mainstream is exactly what many Southerners think of Hollywood and the entertainment it produces.
“The folks in Hollywood, those that went there to ‘make it big,’ they got enamored with the bright lights and the money and forgot where they come from,” says Ann Jones of Flowery Branch, Ga. “They forget that we’re all just people, but I think that comes from getting away from the family farm and getting ensconced in themselves.”
Jones is one of three Southerners we talked with in depth about their entertainment and media consumption — what they like, what they dismiss and what they think Hollywood does and doesn’t get about their lives in this time of cultural and political division. Their views are part of a continuing conversation we began after the presidential election — and will continue to have with others across the country — about the perceived clash of Hollywood values and American values.
‘Hollywood is dominated by people who lean left.’