The United States men's national soccer team has never won the World Cup. Its best finish came in 1930, at the first World Cup tournament, where the US placed third.

The US women's national team has won the Women's World Cup twice, in 1991 and again in 1999.

Wiki User · 14y ago
