Did Americans ever win the World Cup?

Best Answer

The American men's team has never won the World Cup. They finished third once, in 1930. The American women's team has won the World Cup twice (1991, 1999) and finished third on three other occasions (1995, 2003, and 2007).

Related questions

Did Mexico ever win a World Cup?

No, Mexico has yet to win a World Cup.


Has Kenya ever won the World Cup?

No.


Did China ever win the World Cup?

No.


Did Portugal ever win the World Cup?

No.


Did Korea ever win the World Cup?

No.


Which was the first team ever to win the World Cup?

Uruguay was the first country to win the World Cup, in 1930.


Is the England football team ever going to win the World Cup?

No, they are not likely to win the World Cup any time soon.


Did the US ever win the World Cup?

No, the US men's team has never won it, though the women's team has won twice.


Has an Asian country ever won the World Cup?

No.


Did North Korea ever win the World Cup?

No.


Did Russia ever win the FIFA World Cup?

No.


Did Chile ever win the football World Cup?

No, Chile has never won the World Cup.