Sadly, I have encountered many more "bad" men than "good" ones. Today, I automatically distrust men. I am constantly suspicious, and I usually assume that they are up to no good and are using women for their own pleasure. I know that it is completely unfair to generalize about men like this, as not every guy I meet will be a dick. Still, I can't say that I know more than five guys who treat others the right way.
Men these days often act superior and entitled. I feel like more and more men are being abusive, whether sexually, physically, mentally or emotionally. Why is it that in the world we live in, it is not unusual for a girl to have experienced some kind of abuse? In fact, I have met more girls who have gone through a traumatic experience than girls who haven't. Why has this become the norm?
Last week, while feeling exhausted, I went on a rant in which I used a phrase I had recently heard: "If you have a dick, you are a dick." After a few minutes of raving, I was interrupted by one of the boys in the room. He started yelling about how it was unfair of me to say that all guys were the same, that not all guys are abusive or have anger issues or commitment issues, and that not all guys just pick up and leave.
It struck a chord in me, even in my tired state, that he was right, and that it was wrong of me to assume all men are the same.
Recently, while talking to my (male) best friend L, I mentioned that I had androphobia. He then told me that he does too. Surprise is not a strong enough word to describe my feelings. Shock, pain. I was disappointed, disturbed. He explained that men often taunted him, or even harassed him, for not being "manly enough." My best friend is a great person. He suffers from schizophrenia and struggles with eating disorders, amongst other things. He has an unsupportive family, and he is one of the strongest people I know. To hear that he gets anxious around men, and that he has even shut himself in his room for days at a time out of fear of the people at the supermarket or on the street, terrified me. Is this what society is coming to these days?
Why is there such a high number of people who feel they have been abused by men? Why is this the norm? More importantly, why is it considered okay?
I could go on for hours about this subject, but instead I think I will end with a question:
Why are men never the ones to blame?
Often, a girl is blamed for harassment because of her clothes or her attitude, and boys are picked on because they choose to dress or act a certain way. Instead of shaming the people who are being abused, why aren't men taught that the way they act is despicable?