No.
The United States has never won the World Cup.
Yes, in 1994. Brazil won the World Cup that year.
The United States men's soccer team first played in the World Cup in 1930. The United States women's soccer team first played in the World Cup in 1991.
It was held in the United States.
The United States.
Willow Cricket, DirecTV, and Dish Network will broadcast the 2011 Cricket World Cup in the United States.
The United States advanced to the semifinals in the first World Cup, held in Uruguay in 1930. Since then, the US has made it no further than the quarterfinals.
The 1994 FIFA World Cup was held in the United States of America.
The United States of America came third in the 1930 World Cup.
The United States women did, defeating China on penalty kicks.
The United States.
United States.