While the USA men’s national team failed even to qualify for the 2018 World Cup, their female compatriots won the 2019 FIFA Women’s World Cup, held in France.
They did not draw a single match throughout the tournament, beating every opponent along the way. Thailand fared worst of all, losing 13-0. This triumph is not the first in the history of the USA women’s national team; in fact, they have reached at least the semi-finals at every World Cup. They won the inaugural 1991 tournament, took bronze in 1995, gold in 1999, bronze again in 2003 and 2007, silver in 2011, and gold once more at the 2015 tournament hosted by Canada. Impressive, right? This success is surely having a positive influence on women’s soccer in the US. However, there is a second, less positive trend: in the USA, soccer is increasingly associated with
women. As children, boys are steered toward American football, which is considered a sport “for real men” because it demands physical strength, while girls are steered toward soccer, which is deemed a gentler game. This is why, although men’s soccer is less popular in the USA than in Europe, women’s soccer in the USA is more popular than anywhere else in the world.