Outrageously Funny Search Suggestion Engine :: Southernism

What is the definition of Southernism? 🙋

👉 A Southernism is a word, phrase, pronunciation, or idiom characteristic of the Southern United States — for example, "y'all" or "fixin' to." The term can also refer to an attachment to the customs, attitudes, or traditions associated with the American South. In literary criticism, it is sometimes used when discussing how writers render Southern speech and culture on the page.


southernism

https://goldloadingpage.com/word-dictionary/southernism

