The US Men's team has never won a FIFA World Cup. The US Women's team has won four FIFA Women's World Cups (1991, 1999, 2015, and 2019).
The USA has hosted the FIFA World Cup once, in 1994. It has also hosted the FIFA Women's World Cup twice, in 1999 and 2003.