Q: Has America ever won the World Cup?
Best Answer

No. The Americans are weak soccer players and should not be in the Cup; why they are even allowed a jersey is beyond me. They are incredibly overrated because they come from the most media-frenzied country on earth.

Wiki User

13y ago
More answers

Wiki User

14y ago

No, the USA has never won the World Cup. Their best performance was reaching the quarter-finals in 2002.


Wiki User

13y ago

Never. Their best performance was in 1930, when they placed third.


Wiki User

13y ago

No, the USA has not won a World Cup.


Related questions

How many times has America won the cup?

America has not yet won the World Cup.


Has NZ ever won the America's Cup?

Yes, Team New Zealand won the America's Cup in 1995 and again in 2000.


Has Honduras ever won the World Cup?

No, Honduras has never won the World Cup.


Has Venezuela ever won a World Cup?

In soccer (football)? No, they have not won the World Cup.


Has Ivory Coast ever won the World Cup?

No, they have not won the World Cup yet.


Has the US football team ever won the World Cup?

No, they have not yet won the World Cup.


Did Salvador ever win the World Cup?

I think you mean to ask, "Did El Salvador ever win the World Cup?" The answer is no; no CONCACAF team has won the World Cup.


Has Tunisia ever won the World Cup?

No, Tunisia has never won the soccer World Cup.


Has Swaziland ever won the FIFA World Cup?

No, they have never won the World Cup; I do not think they have even qualified for one.


Who won the first ever FIFA World Cup?

The first World Cup was won by Uruguay in 1930.


Has Paraguay ever won a FIFA World Cup title?

No, Paraguay has never won the World Cup.


Has Uruguay ever won the FIFA World Cup?

Yes, Uruguay won the World Cup in 1930 and again in 1950.