When people think of the South, they think of the Deep South plus Kentucky and Tennessee. And even by your reckoning, those areas are not gaining much.
Don't tell me they don't include Texas in that idea! And Virginia is often included in the South as well. That happens when you were the capital of the Confederacy.
So there has been no change to the culture, unlike in the South, where outsiders are bringing cultural pressure.
There has been as much change to the culture of NYC as there has been to the South. I'm not sure where you've been, but I don't see this "cultural pressure" that is supposedly destroying what the South is. If anything, the South is accepting these new cultures and integrating them with its own. At least that is what I see in Atlanta, Tennessee, and elsewhere.
It's not like you can't tell Atlanta is in the South anymore!