
The Real Meaning of Feminism


Feminism. Probably one of the most infamous words out there right now. Just uttering that word in front of a boy or man will almost certainly be met with a scoff or a shudder.


Why?


Because feminists, they hate men, or they are too preachy, or they are feminazis. There is such a hateful, negative stigma surrounding this one word.


Personally, when I think of this word, I think of equality, confidence, and self-worth. I think it encompasses all the feelings of love and opportunity that women are deprived of. We are body shamed, s*** shamed, and constantly overlooked.


Right now, any woman in this world has to work twice as hard to get half as much as any man.


Feminism: it is not hatred of men, not misandry, not a demand for better opportunities than men have.


So, to clear the air, for that boy who scoffed at the mention of feminism in second period yesterday, I will tell you exactly what it means.


Feminism is the need for equality between men and women. It is equal pay for equal work; it is the end of baseless s*** shaming and of being called bossy just for trying to be a leader. It is the end of women being treated as inferior to men.


Feminists, we look for change. It's all about perception: to be seen as more than weak or worthless.


Feminism is not a word to fear. Feminism itself is about acceptance, confidence, and equality, so please don't shudder or turn your back when it comes up in conversation. It needs to be addressed. People need to treat the inequality it confronts as a real problem and raise awareness.


Because if anyone passes up their chance to defend this word, they are passing up a chance to defend someone's dignity, self-identity, and rights.



