The Hard Truth about Women's Health
Let's talk about women's health. Something so essential, yet often not discussed other than at our yearly check-ups. When I think about the word health and what it has come to mean for women, I think about exercise and diet restrictions. I think about our doctors and gynecologists telling us we've gained too much weight this year, the stress of feeling like we need birth control but having to deal with all of its side effects, the relentless UTIs and yeast infections, the bleeding, the cramping, the hormones...you name it, we deal with it - and all because we have a vagina.
And while we have been taught to be hush-hush about the most intimate part of ourselves, it is this very organ that gives us our womanhood. It is this which allows life to be brought into this world. It is this which gives us strength. Take it from Betty White when she said, “Why do people say 'grow some balls'? Balls are weak and sensitive. If you wanna be tough, grow a vagina. Those things can take a pounding.” And, as every woman knows, they sure do.
Our lady parts are sacred - this we know. So why are they considered so taboo to talk about? How often do we Google "is this smell normal", "why am I itchy", and "what is this bump"? While Google might be grateful to be our loyal confidant, can we normalize talking about women's health with actual people for God's sake? Many of us women might not feel so insecure if we just had a little more education about our own bodies. What is considered "normal" and when should we raise an eyebrow?
And, education isn't the only thing we need. We as women know the drill - it's been instilled in us for generations that we have to morph our bodies to fit the expectations society holds. Whether that means not being allowed to wear tank tops at school because our bra straps will show, maintaining a certain weight range, or buying all the products to make sure our vajayjay smells like roses (even though they're awful for us), we are taught that we need to alter ourselves to make others comfortable.
Can I just be the first to say...what the eff is up with that?! Do men have a strict dress code? Do they have to take birth control? Does society have any qualms about the male genitalia? Who made these rules and how can we reverse them? Is it even possible?
We've all read the articles - How to tighten up! Eat your pineapple! Use your wipes! Why should we as women be made to feel self-conscious about how we are in our natural state? Why do we need to shave our entire bodies and get rid of every last speck of hair our bodies naturally grow? Why must we go to tanning booths and risk skin cancer to get that tan "glow" because our natural skin isn't deemed pretty enough?
My main question is, why is women's health not focused on women? Why is so much of what we do daily to maintain "acceptable" hygiene really for...well...everyone else?! And, maybe this is a feminist way of thinking - but don't we want to live in a world where women feel not only comfortable in their own skin, but supported in both their physical and mental health? Why must we be constantly changing, tweezing, shaving, plucking, and spraying our bodies to be something they're naturally not?
This isn't to say these things should be outlawed - many of us do them for ourselves too, and we should do whatever makes us feel good. But we should be brought up with the knowledge instilled in us that our bodies are perfect as they are. While we may shave or tan for our own benefit, we should be changing the narrative so that it truly is for us, and for us alone.
So, what do you say? Are we ready to put the focus of women's health on us? Are we ready to instill in our daughters, granddaughters, and nieces that our natural state of being is perfect and doesn't need to be altered? If there's one thing we as women can do in this world, I hope it is to stand up to those who oppose us and start leading the healthy, happy lives we deserve.