By: Sierra Ramos
I once overheard a conversation between a couple of girls during one of our class discussions about internships and future jobs; they were talking about how women are treated unfairly in the workplace. What mainly stood out to me was when one of my classmates said that she felt violated because she had been assaulted by a male co-worker at her recent internship at a magazine. She said the man touched her “derriere” and would say vulgar things to her, but when she told her boss about the incident, the man denied it and went on his way. I think stories like this need to be taken more seriously, because next time the incident could be much worse than being touched inappropriately.
Another thing that did not help my classmate’s case was that she was one of the only girls working among a group of men, so they automatically felt they had more power than her. I do not know why men are treated as more dominant than women in the workplace, such as getting paid more and earning better opportunities to advance in their careers, but I do not think that is fair. Unfortunately, I am not sure this way of thinking will ever change, though I hope it does soon. Society has come to believe that a man is a “man” and that women are just there to take care of men and please them, that women are supposed to be glamorous and not have high career goals like men do. Even hundreds of years ago, when men went off to war, women stayed at home and took care of the house and family, because it was unheard of for a woman to do a “man’s job.”
As a woman myself, I will try to keep these things in mind when I have a real career of my own. I hope to work at a magazine one day, preferably an entertainment or fashion magazine like Cosmopolitan, so there will probably be more women than men there, but you never know. I will just keep my head high and always strive to do the best I possibly can.