What do those words mean to you?
I took a class on Women and Communication last semester. We started with a brief overview of feminism, and then discussed the ways that women communicate and are communicated about. It was a women's studies/comm class.
Admittedly, I don't have much historical knowledge of feminist movements, aside from the obvious (the right to vote, etc.).
The class really opened my eyes to so many things, and I'm very glad I took it. I think a lot of people have negative reactions to the word "feminist." They picture some masculine, militant, bra-burning, man-hating extremist. (Kind of like how PETA gives all animal rights activists a bad name.)
I think our culture is still ridiculously prejudiced against women. There are still a zillion and one stereotypes that work against us, and no matter what we do, we always end up being objects of sexual attention.
My particular area of concern is the job market. Statistics don't lie: affluent white men (especially good-looking ones) have the biggest advantage when applying for jobs and are the most promotable. Women face sexism, and we're also expected to be attractive, but not too attractive, because then we're just seen as a pretty face. I think the treatment of women is practically an epidemic, and I also think women aren't aware of how bad it still is for us. According to one of my professors, women hold fewer than 15% of executive positions, despite the fact that more women than men are entering the workforce these days.
So, I have a few questions.
1) What does feminism mean to you? Do you consider yourself a feminist? Why or why not?
2) Do you think women like Jessica Simpson, who flaunt their bodies and portray an image of helpless stupidity, harm women, or does it not make a difference?
3) Have you been affected by sexism in the workplace (not limited to sexual harassment)? How did you deal with it?
I'm incredibly interested in this topic. Also, a big pet peeve of mine is guys who think it's okay to HONK, yell lewd things, or shout dumb comments at me from their cars while I'm walking. I've taken to swearing back at them, which I'm sure will get me slapped (or worse) someday. If I had a nickel for every time this has happened to me, I could buy myself an Hermès Birkin. It disgusts me that men think they have the right to yell out comments about my body or say things like "hey honey... blah blah blah." I'm sorry, but just because I'm in public doesn't mean you can treat me like an object! Also, women should NOT have to be afraid of walking alone at night. But guess what: we are.
So many things to be pissed off about! Lol.
Just interested in your responses.