contrary to popular opinion...
i don't know how this idea of the continent of africa (or afrika) being a paradise has been sustained for so long. everything from barber shops to halal meat stores has signs with the shape of Africa on them... and i noticed that while hearing/reading words like "Europe" or "Asia" stirs no emotion in me, Africa (as well as America) oddly makes me feel good, if that makes sense...
but the point is, africa has been Edenized or something despite the fact that it is almost entirely a continent of poverty, dictators, feudal warlords, and roving bands of teenage mercenaries. am i wrong in this summary?
and are people blind or something? miseducated? why does everyone seem to think africa is a paradise? i understand the black nationalists back in the day hyped it up, but look at the news, people!
(and am i right in assuming it's just an urban/black opinion that africa is great... y'all don't got people in the suburbs dreaming of africa?)
thanks