Wikipedia
Southernization

In the culture of the United States, the idea of Southernization came from the observation that "Southern" values and beliefs had become more central to political success, reaching an apogee in the 1990s, when the Democratic president and vice president, as well as the Congressional leaders of both parties, came from the South. Some commentators argued that Southern values remained increasingly important in national elections through the early 21st century, and American journalists in the late 2000s used the term "Southernization" to describe these political and cultural effects.