Q: Has America ever won the World Cup?

No, the Americans are weak soccer players and should not be in the cup. Why they are even allowed a jersey is beyond me. They are incredibly overrated because they come from the most media-frenzied country on earth.

Wiki User, 14y ago

More answers

No, the USA has never won the World Cup. Their best performance was reaching the quarter-finals in 2002.

Wiki User, 15y ago

Never. Their best performance was in 1930, when they placed third.

Wiki User, 14y ago

No, the USA has not won a World Cup.

Wiki User, 14y ago
