
No, the Americans are weak soccer players and should not be in the Cup. Why they are even allowed a jersey is beyond me. They are incredibly overrated because they come from the most media-frenzied country on earth.


Wiki User

14y ago

More answers

No, the USA has never won the World Cup. Their best performance in the modern era was reaching the quarter-finals in 2002.


Wiki User

14y ago

Never. Their best performance was in 1930, when they placed third.


Wiki User

14y ago

No, the U.S.A. has not won a World Cup.


Wiki User

14y ago

Q: Has America ever won the World Cup?