I had an interesting thought when someone on a thread said the US is not the center of the world. It really made me think.
Is the US the center of the world?
By no means do I wish to convey that Americans are superior; after all, "Americans" is a term that covers so many different ethnicities and nationalities that it is impossible to name them all.
Think of this: if Panama were wiped cleanly from existence, would there be a big problem? No bananas, no cocoa, things like that. If Germany were wiped from existence, things would be bad: the money would be gone, along with the various things they make and produce, and the world would be severely fucked up for a long time. The same goes for many other countries like Japan, France, Italy, Switzerland, Canada, etc.
But if the USA were wiped cleanly from existence, what would happen? World chaos? A huge economic depression? Lost scientific and medical discoveries, lost great minds. The impact of losing the USA would be greater than that of any other nation in the world, given the roles it plays. The same goes for the Middle East and losing the lifeblood that is oil.
So in some ways, yes, the USA is the center of the world. That doesn't mean everything revolves around it.
------------
The French have only ever won ONE war, the French Revolution, because the opponent was also French.