Anonymous
The U.S. men's team has never won the FIFA World Cup. The U.S. women's team has won it four times (1991, 1999, 2015, and 2019).
Wiki User
The U.S.A. has yet to win a men's World Cup.