Did Germany Finally Conquer Europe?

Unnoticed by most, and right on the heels of Japan's aggressive remilitarization, is the brilliant subtlety with which Germany has finally conquered continental Europe, prompting many in the Bible Belt to see this as a precursor to the seventh resurrection of the Holy Roman Empire. (I get aggressive at times when I see the results of Obamian foreign policy, or rather that of his handlers.)

After all these years, I thought the Germans had already taken over America, since they headed our space program after World War II while other Germans landed in key positions in the U.S. government. But what do I know?

Did Germany Finally Conquer Europe?

Yes. Germany now controls Europe.
