Why feminism is no longer relevant in the United States

February 21, 2018
By ProudRepublican BRONZE, NYC, New York

Is Feminism Even Relevant?

Feminism as we know it should not be relevant in the United States of America. Let us start with the basics, shall we? The definition of feminism is the advocacy of women's rights on the basis of the equality of the sexes. This belief was very important years ago, when women did not have the right to vote, run for office, or yell at their husbands. But now, women can do anything men can do. As a matter of fact, some of the most powerful people in recent years are women: the Queen of England, Hillary Clinton, Sarah Palin, and many more. What feminists propose is that women do not have the same rights as men. However, this is untrue. For example, a statistic that I have heard from feminists over and over again is that women make 75 cents for every dollar that men make. This statistic takes into account only one thing: the raw salary of men compared to the raw salary of women in the workplace. It does not take into account a few critical points: there are more men in the workplace, men typically have higher-paying jobs (I will explain why in a moment), and women are not able to work as many hours as men because they take care of children (I will explain this as well).

Men have higher positions and higher-paying jobs because of a little thing called pregnancy. Men can't get pregnant; women can. Imagine you work for a million-dollar company, and the owner happens to be a woman. What would you do if she held a meeting saying that she would miss months of work because she would have to take care of the baby inside of her? You would probably not like that, considering that her job would have to be filled by someone either new, inexperienced, or borderline not ready for the responsibilities that come with an ownership role in a big company. Now, to explain why women tend not to work long hours. Women do not work long hours because it is their duty to take care of the elderly, children, and their spouses. But this duty does not come from gender stereotyping; rather, it comes from a national study that shows why and how women choose to become caregivers. (The information here is given by caregiving.org.) Although men do help out, women spend 50% more time caregiving than men do. 33% of women decreased their working hours, 29% passed up a job promotion, training, or assignment, 22% took a leave of absence, and 20% switched from full-time to part-time employment. These statistics show one thing: men don't give this responsibility to women. Women give it to themselves. Women are just naturally better caregivers, and there is nothing wrong with that and nothing that feminist groups can change. Since evolution began, men were always the big, buff, dumb creatures. They had one job: provide for their family. Women were always more careful, more intelligent, and gentle. Be honest, would you really want a 15-year-old boy taking care of your infant, or a 15-year-old girl? I sure would choose the girl, simply because of her naturally more responsible genes.

Now, I'm going to talk about something that has really riled people up over the past year: sexual harassment. Sexual harassment is a huge problem that people have the right to protest. But there are a few things the protesters choose to leave out, and here they are. Did you know that if a man rapes a woman, he is expected to receive a maximum of 25 years in prison, but if a woman rapes a man, she is expected to receive a maximum of 5 years? Why is this? Well, it's pretty simple. Men are seen as aggressive, dominant creatures (and they have been since the beginning of time), which is why they get in so much more trouble. Is this fair? Not at all. Another thing that feminists leave out about sexual abuse is the number of false accusations there actually are. In the 21st century, thousands of white women have falsely accused black men. Why do feminists choose to leave this out? Oh, I know: it makes them look very bad. (Keep in mind that the evidence for these last sentences was found in a National Post article by a woman, Barbara Kay.)

Feminism as we know it is no longer about standing up for the rights of women. It has become a tool to shame the other gender, to make us look bad. But why? What have men done in recent years to deserve this? It is completely unfair to blame men now for the mistakes we made years ago. What has the world come to? Now it is apparently sexual harassment to give a flirtatious comment, a hug, or a smile. This is not even the worst part. The worst part is when parents brainwash their children, taking them to rallies with a bunch of people wearing Pink Pussyhats without explaining to them what is going on. All they do is bring their children and tell them what to believe. These poor kids, who don't know any better, just nod their heads. What their parents don't tell them is to go research, find evidence, and find a side that suits them the most. You think that I am an anti-feminist and a proud Republican because of what my parents do and say? No. I do my research; if I have a question, I attempt to find the answer, and I disagree with my parents on political views all the time. I do not just nod my head and agree.


David Ford


The author's comments:

I am a 14-year-old student in New York. In case you have never been able to visit my city, it is filled with liberals and feminists. But that's just how it is, and it does not bother me a bit. I have seen rallies and protests about women's rights. Did anyone ever seem to realize that feminism is no longer about women's rights, but rather about criticizing and harassing men because of events that happened in the past? Follow me on Instagram and slide into my DMs if you want to chat 1v1. (david.4ord)

