Have the Germans ever won the World Cup?

Updated: 8/18/2019

Wiki User

13y ago

Best Answer

Yes, but only as West Germany. They have won three World Cups, the most recent in 1990.

Related questions

Has Honduras ever won the World Cup?

No, Honduras has never won the World Cup in the history of the tournament.


Has Venezuela ever won a World Cup?

In soccer (football)? No, Venezuela has never won the World Cup.


Has Ivory Coast ever won the world cup?

No, they have not won the World Cup yet.


Has the US football team ever won the World Cup?

No, they have not yet won the World Cup.


Did salvador ever won the World Cup?

I think you mean to ask, "Did El Salvador ever win the World Cup?" The answer is no; no CONCACAF team has ever won the World Cup.


Has Tunisia ever won the World Cup?

No, Tunisia has never won the World Cup.


Has Swaziland ever won the FIFA World Cup?

No, they have never won the World Cup; they have never even qualified for one.


Who won the first ever FIFA World Cup?

The first World Cup was won by Uruguay in 1930.


Has Paraguay ever won a FIFA World Cup title?

No, Paraguay has never won the World Cup.


Has Uruguay ever won the FIFA world cup?

Yes, Uruguay won the World Cup in 1930.


Who won the 1st ever World Cup?

The first FIFA World Cup was won by Uruguay in 1930.


Did Wales ever win a World Cup at anything?

No, Wales has not won any World Cups in football.