Sure, I admitted it was laughable, but people keep mentioning the benefits of western culture as if the colonizers actually cared to transmit these things. The British did a fair job of teaching natives a brand new way of looking at warfare, showed them a perfect example of oppressive government, and demonstrated methods of brutality that rivalled, if not surpassed, those of the native cultures. I still don't see the British systematically transmitting the humanitarian or educational merits of western society to all of the native cultures they colonized. You say that colonization did this quicker than more peaceful means would have, but what has it done? In most cases the British ruled for centuries, and it can't be clearly said that they left the natives better off than when they came.
As you said yourself, the local ways of governing among the colonized peoples were often a match for the worst of the colonizers. That holds even in comparison to the Spaniards.
But imperialism and colonialism brought you lots of technological developments that wouldn't exist without the US (which is itself a product of colonialism), and they also introduced crops to the old world that enabled its population to grow immensely.
I'd say that it was a mixed blessing.