What's a feminist? Lately the question has become a subject of real confusion and debate in our society. I hear it all the time. What constitutes a feminist? Do you have to be a man-hating militant to be a feminist? Or does simply believing that, as a woman, you should be in control of your own life and your own future, and should have the same opportunities as men, make you a feminist?
If that's the case, why has feminism become a dirty word? Much of what we know about feminism, and much of what shapes our opinions on it, comes from the media, so let's look at what some currently relevant cultural icons have to say on the topic.
Strictly speaking, feminism is defined as "the doctrine advocating social, political, and all other rights of women equal to those of men." Doesn't sound too scary, does it? I think most women today would be offended if someone told them they couldn't do something just because they were women. I mean, this is the 21st century; we're supposed to be past that, right? Why, then, has "feminist" become such a loaded, controversial and largely misunderstood term?
Unfortunately, the stereotype of a feminist is a misleading one, often depicting a radical and angry misandrist who walks around naked and takes irrational offense at anything a man might say to her. As a result, many women find themselves prefacing statements about their desire for equality with the phrase "I'm not a feminist, but..." just as Katy Perry did when accepting Billboard's Woman of the Year award last year.
While some public figures have (perhaps unwittingly) reinforced this stereotype through comments like that one, many others have used their celebrity to combat the misconception. For example, in a recent interview with The Observer, the British actress Louise Brealey was quoted as saying, "Seriously, though, I'd like every man who doesn't call himself a feminist to explain to the women in his life why he doesn't believe in equality for women."
On the other side of things are people like Carla Bruni, who claimed in a September interview with Vogue that "We don't need feminism in my generation...I'm not at all a feminist...I love family life," insinuating that to be a feminist you cannot love family life, an insinuation I don't buy into.
Check out the photo gallery to see what some other contemporary icons have said regarding feminism!
What's your stance, Lovelies? Is feminism still an important aspect of our society? What do you think about celebrities speaking out about it?