Since it first grew popular in post-war Europe, naturism—the practice of nonsexual social nudity—has been touted as a sort of antidote to many modern social ailments. When industry overtook cities with its smoke, factories and dehumanizing working conditions, and World War I razed countrysides, people were searching for a return to nature, a return to being real, autonomous humans instead of working and fighting automatons. Then, when naturism spread to North America, it offered a refreshing liberation from the regressive Puritanical ideals that had shaped Canada and the United States, along with a new perspective on human bodies outside of marketing and advertising. To be in nature was an escape from a culture that had removed humans further from their natural environments than ever before. And to do that naked with others was to re-learn, in a way, what it meant to be human. To be more than a shameful, sexual, machine-like object.
And so today naturist parks and resorts have become quaint, idyllic and seemingly egalitarian utopias. Here you don’t have to worry about what you look like; no one will judge you based on your size or scars. You can see what real, natural people look like and experience a deeper connection to nature. This is where you can unlearn body shame, hypersexualization and internalized industrialization (the idea that we must always be working, producing or consuming). Naturism is effective as an antidote: few practices offer such deep healing through something so simple.
But have we taken our lessons back with us from our naturist parks and resorts? Is the antidote of naturism a true antidote and not just a temporary bandage?
Are we actually healing or just escaping?
It’s always worth repeating the International Naturist Federation’s definition of naturism: “Naturism is a way of life in harmony with nature characterized by the practice of communal nudity with the intention of encouraging self-respect, respect for others and for the environment.”
Spending time naked can heal our relationship with our own body. It teaches us to accept ourselves as we are. Spending time naked with others can heal our social relationships. It allows us to see others as more than commodified, hypersexualized objects. And spending time naked in nature can heal our relationship to the earth, reminding us of our place in nature. While this healing may begin in a state of undress, it shouldn’t be confined to a naturist setting. We need to shift our entire consciousness to a better understanding of the naked human body. And that affects every area of our lives, both inside and outside clothes-free communities. If we’re to truly respect others and the environment, naturism should affect our economics and our politics. Naturism isn’t a means to escape, but to heal. To grow as both individuals and as a society.
Original publication 30 September, 2020
Posted on NatCorn 19th October 2020
Reference to an article does not imply endorsement of any views expressed.