Yesterday at work, I heard one of my customers exclaim, "it's because they're feminists." It doesn't really matter what they were talking about; the comment was not made in a positive light. Why is feminism always talked about in such a negative context?
Feminism means "the advocacy of women's rights on the grounds of political, social, and economic equality to men." Oh yeah, I can see how that's horrible and humorous…just kidding. Feminism is believing in equality between men and women: pay, opportunities, education, employment. Who doesn't want that? I am so tired of people putting feminism down. Women have fought long and hard for the freedoms and opportunities that we have today! Uhm…the 19th Amendment, anyone? Women's suffrage? We were not allowed to vote in our own country until 1920; that is insane. In the 1930s, coeducation was still controversial, and there were two primary fields that women studied: education and nursing. The first wave of feminism occurred in the 19th century, when women were pushing for equal education. Nowadays, women earn more master's degrees than men; how far we've come is pretty awesome!
Most men don't think twice about things such as going for a jog at night, running on underpopulated trails, or walking through the city alone after dark. I don't know about you, but I am constantly aware of what I am doing and who is around me. I don't jog at night or on trails by myself, and I always have pepper spray in my purse. I wish I didn't have to think about things like this, but I do. We should not be viewed as objects.
Let's be grateful for the opportunities that we have today, yet at the same time put our feet down against the endless catcalls.