I am a feminist. And I'm neither embarrassed nor unwilling to say it. What does embarrass me is the bad press the term gets from the perception of the "stereotypical feminist". I am a feminist, and I hate the stigma that so often comes with such a simple and honest statement. Feminism is simply about equal rights between the sexes; why should that be such a hard thing to accept?

More often than I should, I hear people grumbling about how feminism is outdated and unnecessary in this wonderful modern society of ours. But just because women are equal in the eyes of the law does not mean that all social inequality is magically eradicated. In fact, a documentary I watched recently [Blurred Lines: The New Battle of the Sexes] suggested that, due to the rise of social media and sexism in comedy, the situation is actually worsening. Sure, women can vote and hold the same jobs as men, but the law can't change social attitudes or distasteful jokes. Feminism still exists because equality is something you don't give up on.
|Homage to some of the important women in my life, past and present.|