No, Germany did not win World War 2.
Lost
Schlieffen Plan
The Allied Powers won WWI.
Russia
Germany!
No, they did not win the war.
World War 1 ended when Germany, having lost a lot of ground and realizing it could not win, surrendered on November 11. World War 2 ended when Soviet forces captured Berlin and Japan surrendered later in 1945.
The Allies, joined by America's less worn-out troops, were able to push Germany back and win World War I.
No. Germany was defeated, and the Kaiser went into exile in the Netherlands.
No. Germany lost World War I.
Japan emerged as a world power, but Germany was left weak and humiliated.