Was the world cup ever played in the United States?

Yes, in 1994. Brazil won the World Cup that year.

Related questions

Did the United States ever win the World Cup?

No, the United States men's team has never won the World Cup.


Did the US ever win the World Cup?

The United States has never won the World Cup.


Will the United States ever collapse?

The economy of the United States of America could collapse, but it has not done so.


What was the significance of Woodrow Wilson attending the Peace Conference at Versailles?

No president had ever represented the United States at a peace conference before.


Does the law ever change in the United States?

Yes; laws in the United States change regularly through new legislation, constitutional amendments, and court rulings.


Did the United States ever invade Jamaica?

No.


What was different about the USA and Switzerland game in the 1994 World Cup?

The match between the United States and Switzerland was the first ever to take place indoors, having been played under the roof of the Pontiac Silverdome.


Did the Nazis ever attack the United States during World War 2?

German attacks were limited to shipping along the Atlantic coast.


Is Australia part of the United States?

No. Australia is not now, nor has it ever been, a part of the United States.


Will there ever be a way to treat AIDS in the United States?

There is not yet a cure, but treatments for HIV/AIDS are already available in the United States.


Has there ever been a woman as president?

Not in the United States.


Has Clarence Seedorf ever played for Newcastle United?

Clarence Seedorf has never played for Newcastle United.