The Global Wellness Institute (GWI) recently reported that the wellness industry, which encompasses everything from nutrition and fitness to personalized and preventive medicine, accounted for 5.3% of worldwide economic output in 2017. That translates to a staggering $4.2 trillion, up from $3.4 trillion in 2013, the year of GWI’s inaugural report.
You might think that this revolution in holistic health would correspond with declining rates of sickness and disease, but the numbers tell a much more complicated story. When it comes to chronic conditions like asthma, allergies, obesity, and even diabetes, we seem to be getting sicker, both in the United States and around the world.
Let’s look at the data.
In 2009, a sweeping study of 17 European countries showed that the prevalence of Type 1 diabetes rose 3.9% annually from 1989 to 2003, and predicted that cases would double in the years between 2005 and 2020.1 In 2018, a study of American adults found that obesity rates rose dramatically, from 33.7% to 39.6%, between the 2007–2008 and 2015–2016 survey periods.2
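For a back-of-the-envelope sense of what a growth rate like 3.9% per year implies, note that anything growing at a steady compound rate r doubles after ln(2)/ln(1 + r) years. A quick sketch in Python (the 3.9% figure is the one reported in the study above; the calculation itself is just compound-growth arithmetic):

```python
import math

# Annual growth rate reported for Type 1 diabetes prevalence (3.9%).
annual_rate = 0.039

# Under steady compound growth, prevalence after t years scales as
# (1 + r)**t, so it doubles when (1 + r)**t = 2,
# i.e. t = ln(2) / ln(1 + r).
doubling_time = math.log(2) / math.log(1 + annual_rate)

print(f"At {annual_rate:.1%} per year, prevalence doubles in "
      f"about {doubling_time:.0f} years")
# At 3.9% per year, prevalence doubles in about 18 years
```

Note that this is a simplification: the study's steeper prediction, a doubling between 2005 and 2020, implies it expected growth above the historical 3.9% average in at least some groups.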
In the rapidly developing economies of the East, similar trends are emerging. A pair of cross-sectional studies performed ten years apart showed that the prevalence of food allergies in China more than doubled between 1999 and 2009, from 3.5% to 7.7%.3 Meanwhile, a 2018 study of urban schoolchildren in Jaipur, India showed that asthma prevalence jumped from 7.59% to 18.2% between 2008 and 2018.4
And that’s just the tip of the iceberg. While these conditions and geographies may seem (and in some cases may be) disparate or disconnected, a growing number of scientists believe there could be a surprising link between them—bacteria.
But it’s probably not the kind of bacteria with which you are familiar.
Our relationship with bacteria is complex, to say the least. By cell count, they comprise about half of the human body. There are 38 trillion bacteria in your microbiome alone.5 Yet, for the vast majority of our shared history, we have completely misunderstood them—and done so at our own peril.
We didn’t even catch our first glimpse of the microbial world until 1674, when a Dutch lens maker named Antonie van Leeuwenhoek described the very first microorganisms, which he called “animalcules.” Even then, Leeuwenhoek was way ahead of the scientific curve.
It would be nearly 200 years before we started to understand the role microbes play in human health. In 1857, a French biologist named Louis Pasteur (think: pasteurization) collected the first evidence that microorganisms were responsible for spoiling beverages.6 In the 1860s, a British surgeon named Joseph Lister (think: Listerine) discovered a way to disinfect wounds.7
The work of these two scientists, along with other pioneering researchers of the day—Robert Koch, Theodor Escherich, and Paul Ehrlich—laid the foundation for the Germ Theory of Disease, which states that many diseases are caused by microorganisms. Those four words would define the next 100 years of medicine, for better and for worse.
Germ Theory paved the way for miracle inventions like antibiotics, but it also had a side effect: it made us fear bacteria. What followed was a century of purification, sterilization, and eradication. We developed and deployed powerful (and indiscriminate) weapons to defeat the microbial menace: commercial and industrial antibiotics, antibacterial soaps, and bleach.
The good news? They’ve saved (and continue to save) hundreds of millions of lives.
The bad news? In waging war against microbes, we inadvertently waged war on ourselves.
In 1989, a British professor named David Strachan became the first to sound the alarm. He believed that the rapid 20th century rise of allergic conditions like hay fever, asthma, and eczema was linked to our increasing obsession with hygiene. In his view, allergies could be avoided by exposing children to infectious microbes while they were young—to help bolster the immune system.
To test this idea, he analyzed data collected from 17,414 British schoolchildren over a period of 23 years, from 1958 to 1981. This epidemiological study showed that children from larger families were less likely to report hay fever, asthma, and eczema than only children.8
Strachan concluded that as family size decreases, so does the chance of the 'cross infection' seen in larger families. He also theorized that improving standards of cleanliness were partially, if not largely, to blame for the imbalance. In other words, the escalating use of antibiotics, bleach, and other indiscriminate antimicrobial products was keeping children's immune systems from developing properly, and chronic diseases were creeping in as a result.
This idea would become known as the Hygiene Hypothesis.
Strachan was definitely onto something, but he was missing key pieces of the puzzle. Those wouldn't be revealed until the dawn of metagenomics in the mid-1990s. Suddenly, we had the tools to sequence the genome of every bacterium, virus, and archaeon on the planet, including the ones inside of us. It was a technological leap as great as the microscope, and every bit as important.
Since then, our understanding of the microbial world has expanded dramatically. We now know that 99% of known bacteria are harmless to humans. We know that many have evolved to live in and on us, and perform critical roles in our bodies. We know that diversity of gut microbes is essential for good health. And we know that our obsession with hygiene has depleted that diversity.
We also know that the Hygiene Hypothesis is incomplete.
A growing number of scientists believe it’s time to update or replace Strachan’s theory—taking into account the knowledge we’ve gathered over the last three decades.9 The problem, they say, isn’t just that we’re eradicating too many bad microbes, although that’s fueling its own crisis—antibiotic resistance. It’s that we’re eradicating too many good microbes as well.
It turns out that our resident microbes play a huge role in training our immune systems.10 Much of this training happens when we receive our first microbes, a process scientists call seeding. Disruptions to the microbiome during this window, such as a course of antibiotics that indiscriminately wipes out both 'bad' and 'good' bacteria in the gut, can compromise the development of the immune system.
This idea is called Microbiome Depletion Theory, or the 'Old Friends' hypothesis, and it goes way beyond hay fever and asthma. Over the last few years, scientists have found correlations between the human microbiome and food allergies,11 obesity,12 autoimmune diseases,13 and even depression.14 At the same time, we've also started to figure out how to prevent these disruptions.
In June 2019, researchers announced the discovery of a distinct microbial 'signature' in the guts of food-allergic children that healthy children do not have.15 From this signature, they were able to identify specific gut bacteria that could help prevent food allergies from developing. These bacteria work by activating special immune cells that suppress the body's immune response to otherwise harmless substances like food proteins.
Their research marks the first time in history that scientists have witnessed microbes communicating directly with immune cells. It also marks the first time we’ve unraveled one of the core mechanisms governing allergies. Now that we’ve identified the bacteria that can ‘switch’ food allergies on and off, scientists can begin to develop targeted treatments that harness those strains.
The discovery also provides a roadmap for scientists trying to figure out how other chronic conditions take hold. If we can find a way to harness the full power and potential of our microbial allies, we may be able to rid ourselves of these conditions for good. Until then, there are plenty of ways to take care of your beneficial microbes.