Have Women Become Their Own Enemies?

Ladies, look around you. I don't mean literally look around; I mean take a look at your life. Do any of you have that one friend who loves rubbing it in your face how much more sex she's having than you, or how many more guys are going after her than your measly one, or two on a good day? Have you ever sat down for a chat with said friend and, ten minutes in, found yourself feeling slightly deflated?

A while back, a friend of mine said something that got me thinking. She said, "At our age, in our generation, women tend to see every other woman, friend, foe, best-whatever, as a rival."

Truth is, I agree with her. We all have that select few we wouldn't want to cross on a day that isn't already going our way, for fear of it getting worse. True, we could go out and try meeting charming men, but honestly, do we need another person reminding us of what we don't have?

Whatever happened to the times when we all stood together and fought a sexist society to gain our rights and respect? We used to be such a team, and now here we are, ripping each other to shreds (verbally and metaphorically speaking, of course) over men and careers and money.

So, I have to ask: what has happened to us women? Why do we need to maintain the illusion that we are better than the other women around us? When did it become acceptable to secretly dislike your friend because she complains about having too many men after her, or gloats about the attention while innocently peeking at you out of the corner of her eye?

In this world full of powerful women, beautiful men, and rising societal expectations, have women become their own enemies? 

The End