One of my favorite articles explaining why men are seen as superior to women in the great majority of cultures is Sherry Ortner’s “Is Female to Male as Nature Is to Culture?”. You can read it here. Ortner argues that women’s devaluation stems from the fact that women are symbolically identified with nature, while men are identified with culture, and culture devalues nature. Women are depicted as wild, unpredictable, and raw; men as logical, sound, and thoughtful.
Take food as an example: women are most often the ones cooking at home, while most professional chefs are men. Being a chef is all about overcoming nature, making the natural “subordinate” to culture – to pots and pans, to heat. Tomatoes growing naturally in the field will not be eaten as they are, but chopped, cooked, and preserved as tomato sauce. At that point they are no longer simply “natural”; they have been cultured into a better version of their original selves.
With this sense of superiority comes a societal attitude toward the two genders: women are simply not taken as seriously. I mean, come on, we see a chef as far more important than a home cook who feeds a family of five, right? In the same way, we treat women’s issues as small and unimportant, their problems as trivial.
In this light, it is no surprise that women’s pain is also belittled. Take a look at this article, which documents one couple’s experience in a hospital and the way doctors and nurses ignored the woman’s pain for hours on end – a widespread pattern, not an isolated case.
The article, of course, argues that our society as a whole should take women’s suffering, issues, and ideas more seriously, and that medical professionals should not dismiss them. I certainly agree with that.
But I’d like to suggest a more radical idea for women’s empowerment, which I truly believe in.
Why should nature be inferior to culture? Why should women (and men, of course) rush to the doctor’s office with every little problem, receiving their “cultured” care? Why should we look down on more natural treatments and solutions to our issues?
I’d like to suggest that nature and women’s empowerment need to come together in a more interesting way:
We as women should learn to trust our bodies, to get to know them, to be in touch with them. Nature is within us (and I do believe this is not unique to us – it is within men just as much). And we should trust nature, often much more than culture, to lead us to healing.
Empowerment is not about trusting other professionals to solve our problems. I don’t deny that the medical field can sometimes do wonders for the human body, and that it CAN repair our bodies far beyond nature’s wildest dreams.
However, I argue that much of the time, nature is just as important as culture, if not more so. When I observe nature, I receive answers to my deepest concerns about how I want to live my life; I receive lessons about the human spirit. I also receive answers to disease: in every plant and herb there is something medicinal. In every weed, there is magic. Women’s empowerment is about returning to our intuition, to nature. It’s about eating RAW FOOD that allows us to hear our intuition better, with a clearer body. It’s about returning to our roots and taking our health into our own hands, as much of the time as we can.
Doctors are human too, and they make mistakes. We as women need to know when to say, “I know something about my body that you do not. And you need to hear me.” We also need to know when it’s okay to heal ourselves using nature’s magical bounty. This is true empowerment.