I know I feel better and lose some belly fat when I remove white flour from my diet (sugar and white flour tend to go hand in hand, so is it the flour or the sugar?), but more and more of these types of articles keep appearing. I won't argue with the idea that something else could be going on with our bodies; the sickness in our children alone sends up that red flag. Obviously the Standard American Diet is a real problem for everyone who eats it.
I had to look up Karelia; a map is included below for your ease, dear reader.
I took this from The New York Times online.
By MOISES VELASQUEZ-MANOFF
JULY 4, 2015
As many as one in three Americans tries to avoid gluten, a protein found in wheat, barley and rye. Gluten-free menus, gluten-free labels and gluten-free guests at summer dinners have proliferated.
Some of the anti-glutenists argue that we haven’t eaten wheat for long enough to adapt to it as a species. Agriculture began just 12,000 years ago, not enough time for our bodies, which evolved over millions of years, primarily in Africa, to adjust. According to this theory, we’re intrinsically hunter-gatherers, not bread-eaters. If exposed to gluten, some of us will develop celiac disease or gluten intolerance, or we’ll simply feel lousy.
Most of these assertions, however, are contradicted by significant evidence, and distract us from our actual problem: an immune system that has become overly sensitive.
Wheat was first domesticated in southeastern Anatolia perhaps 11,000 years ago. (An archaeological site in Israel, called Ohalo II, indicates that people have eaten wild grains, like barley and wheat, for much longer — about 23,000 years.)
Is this enough time to adapt? To answer that question, consider how some populations have adapted to milk consumption. We can digest lactose, a sugar in milk, as infants, but many stop producing the enzyme that breaks it down — called lactase — in adulthood. For these “lactose intolerant” people, drinking milk can cause bloating and diarrhea. To cope, milk-drinking populations have evolved a trait called “lactase persistence”: the lactase gene stays active into adulthood, allowing them to digest milk.
Milk-producing animals were first domesticated about the same time as wheat in the Middle East. As the custom of dairying spread, so did lactase persistence. What surprises scientists today, though, is just how recently, and how completely, that trait has spread in some populations. Few Scandinavian hunter-gatherers living 5,400 years ago had lactase persistence genes, for example. Today, most Scandinavians do.
Here’s the lesson: Adaptation to a new foodstuff can occur quickly — in a few millenniums in this case. So if it happened with milk, why not with wheat?
“If eating wheat was so bad for us, it’s hard to imagine that populations that ate it would have tolerated it for 10,000 years,” Sarah A. Tishkoff, a geneticist at the University of Pennsylvania who studies lactase persistence, told me.
For Dr. Bana Jabri, director of research at the University of Chicago Celiac Disease Center, it’s the genetics of celiac disease that contradict the argument that wheat is intrinsically toxic.
Active celiac disease can cause severe health problems, from stunting and osteoporosis to miscarriage. It strikes a relatively small number of people — just around 1 percent of the population. Yet given the significant costs to fitness, you’d anticipate that the genes associated with celiac would be gradually removed from the gene pool of those eating wheat.
A few years ago, Dr. Jabri and the population geneticist Luis B. Barreiro tested that assumption and discovered precisely the opposite. Not only were celiac-associated genes abundant in the Middle Eastern populations whose ancestors first domesticated wheat; some celiac-linked variants showed evidence of having spread in recent millenniums.
People who had them, in other words, had some advantage compared with those who didn’t.
Dr. Barreiro, who’s at the University of Montreal, has observed this pattern in many genes associated with autoimmune disorders. They’ve become more common in recent millenniums, not less. As population density increased with farming, and as settled living and animal domestication intensified exposure to pathogens, these genes, which amp up aspects of the immune response, helped people survive, he thinks.
In essence, humanity’s growing filth selected for genes that increase the risk of autoimmune disease, because those genes helped defend against deadly pathogens. Our own pestilence has shaped our genome.
The benefits of having these genes (survival) may have outweighed their costs (autoimmune disease). So it is with the sickle cell trait: Having one copy protects against cerebral malaria, another plague of settled living; having two leads to congenital anemia.
But there’s another possibility: Maybe these genes don’t always cause quite as much autoimmune disease.
Perhaps the best support for this idea comes from a place called Karelia. It’s bisected by the Finno-Russian border. Celiac-associated genes are similarly prevalent on both sides of the border; both populations eat similar amounts of wheat. But celiac disease is almost five times as common on the Finnish side compared with the Russian. The same holds for other immune-mediated diseases, including Type 1 diabetes, allergies and asthma. All occur more frequently in Finland than in Russia.
What’s the difference? The Russian side is poorer; fecal-oral infections are more common. Russian Karelia, some Finns say, resembles Finland 50 years ago. Evidently, in that environment, these disease-associated genes don’t carry the same liability.
Are the gluten haters correct that modern wheat varietals contain more gluten than past cultivars, making them more toxic? Unlikely, according to a recent analysis by Donald D. Kasarda, a scientist with the United States Department of Agriculture. He analyzed records of protein content in wheat harvests going back nearly a century. It hasn’t changed.
Do we eat more wheat these days? Wheat consumption has, in fact, increased since the 1970s, according to the U.S.D.A. But that followed an earlier decline. In the late 19th century, Americans consumed nearly twice as much wheat per capita as we do today.
We don’t really know the prevalence of celiac disease back then, of course. But analysis of serum stored since the mid-20th century suggests that the disease was roughly one-fourth as prevalent just 60 years ago. And at that point, Americans ate about as much wheat as we do now.
Overlooked in all this gluten-blaming is the following: Our default response to gluten, says Dr. Jabri, is to treat it as the harmless protein it is — to not respond.
So the real mystery of celiac disease is what breaks that tolerance, and whatever that agent is, why has it become more common in recent decades?
An important clue comes from the fact that other disorders of immune dysfunction have also increased. We’re more sensitive to pollens (hay fever), our own microbes (inflammatory bowel disease) and our own tissues (multiple sclerosis).
Perhaps the sugary, greasy Western diet — increasingly recognized as pro-inflammatory — is partly responsible. Maybe shifts in our intestinal microbial communities, driven by antibiotics and hygiene, have contributed. Whatever the eventual answer, just-so stories about what we evolved eating, and what that means, blind us to this bigger, and really much more worrisome, problem: The modern immune system appears to have gone on the fritz.
Maybe we should stop asking what’s wrong with wheat, and begin asking what’s wrong with us.
Moises Velasquez-Manoff is a science writer and the author of “An Epidemic of Absence.”
A version of this op-ed appears in print on July 5, 2015, on page SR6 of the New York edition with the headline: The Myth of Big, Bad Gluten.