So just how important are sports in America? I believe that sports are very important. Sports allow children and adults to be active, whether outdoors or indoors, depending on the sport. Sports have a great impact on how we live our lives. For children, a focus on sports can build self-confidence and teach them to work as a team, skills that will later help them in the workforce. Sports have an impact on everyone, from celebrities to the government.
In movies, often war and crime movies, sports are used as a way for characters to discuss and plan what is going to happen next. The impact falls mostly on boys and shapes how men are portrayed on screen. Boys are made to believe that they can't act like 'girls', can't cry, and can't show that they are hurt. The same thing is seen in sports such as football, where players will keep playing even when they are injured. In this way, boys are taught that they have to grow up to act like the hero of the story, whether in movies or in sports. Boys are made to believe that only boys can be the leader or the hero, while anyone seen as a 'girl' would only get in the way. Movies and TV shows also constantly reinforce this by referring to some kind of sport, like football or baseball.