Chances are, you’re thinking: Feminism?! What’s that got to do with my health?
Well, more than you’d think.
Over the past century, women’s rights have come a long way in many countries around the world. From the right to vote to access to birth control and equal pay, a lot of effort has gone into closing the gender gap.
But there’s one important area where there’s still work to be done: medicine.
And I’m not talking about having more female doctors in hospitals, although that of course would be great too. No, I’m referring to the fact that, until recently, the medical world wrongfully assumed that women and men were pretty much the same, health-wise.
Historically, women have been discouraged from taking part in clinical trials. Except for obvious areas like gynecology and breast cancer, the (white) male body has long been the standard for the human body. That meant half the population was overlooked when it came to developing new medications and diagnostic guidelines.
Let’s take a closer look at 5 reasons why we still need feminist medicine today.