Saturday, May 5, 2012

Sex Roles

Google Images 2012

Sex roles, or gender roles, are the roles that society assigns to men and women based on their gender. These roles especially influence relationships between men and women, and they weigh heavily on women. Society expects a woman to get married and stay home to raise a family, while the man is expected to go out to work and support his family. Yet if a woman chooses to have a career, she is considered a bad mother, and her husband is looked down on because he is thought to be unable to support his family and to need his wife to work.
