Originally posted by Boris Godunov
Absolutely not. Since the time of the Roman Empire, the Germans have altered the course of Western history. First, both England and France owe their origins to the Germanic tribes, which spread from northern Europe as far as northern Africa. The descendants of the Franks became the French, and those of the Angles and Saxons became the English.
In addition, the Protestant Reformation, which began in Germany, has arguably had as much worldwide impact as anything done by any other European power. And then, of course, much of the 20th century's history was a product of the world dealing with Germany's military and imperialist aspirations.
I'd also point out that the German contributions to culture--music, art, literature, religion, SCIENCE, philosophy, etc.--far outstrip anything coming from the Spanish or Scandinavians in terms of world impact.