Sunday, February 26, 2012

The Importance of Sports in America

So just how important are sports in America? I believe they are very important. Sports get children and adults outside or inside, depending on the sport, and they have a great impact on how we live our lives. For children, sports can build self-confidence and teach them to work as a team, skills that will later help them in the workforce. Sports touch everyone, from celebrities to the government.

In movies, especially war and crime movies, sports are often used as a backdrop for discussing and planning what will happen next. The impact falls mostly on boys and on how men are portrayed. Boys are made to believe that they can't act like 'girls', can't cry, and can't show that they're hurt. The same thing is seen in sports like football, where players will keep playing even when they are injured. In this way, boys are taught that they have to grow up to act like the hero of the story, whether in movies or in sports, and that only boys can be seen as the leader or the hero, while boys who are seen as 'girls' just get in the way. Movies and TV shows also always seem to be referring to some kind of sport, like football or baseball.

2 comments:

  1. I agree with what you are saying, that sports are an important thing in America, especially in our society today. With that, boys in our society are taught to be tough and not to act like girls.

  2. Some sports are single-person games, like tennis or track. You may be on a team, but you are playing alone against your opponent, and there is no need for teamwork, just team support. Which not everyone gives anyway.
