Households tend to take pantry food for granted, but canned beans, powdered cheese, and bags of semi-moist cookies were not designed for everyday convenience. These products were engineered to meet the needs of the military.
Food and combat have been intertwined ever since the earliest military rations. Ancient Sumerians rode into battle with barley cakes and beer. In the 11th and 12th centuries, Mongol horsemen preserved meat by storing it under their saddles, letting salt from the horses' sweat and the weight of the riders cure it on the move.
Drying, salting, smoking, and pickling were the go-to methods until 1795, when the French government held a contest to find a new preservation technique. A chef named Nicolas Appert answered with canning, which revolutionized food preservation.
During World War II, however, the United States realized it needed preserved-food production that could ramp up quickly in times of crisis, and began investing heavily in food technologies.
In the 1950s, the Combat Feeding Directorate was established at the Natick Soldier Systems Center, a U.S. Army base in Massachusetts. Today it remains the epicenter of the modern military diet.
The primary purpose of the Natick Center is to overcome the challenges inherent in food: it spoils, grows mold, and loses flavor. Its food scientists have come up with inventions like the MRE (Meal, Ready-to-Eat).
MREs come packaged with chemical heaters to warm food, oxygen scavengers to prevent spoilage, and carefully concocted meals made to be edible for years after their creation date.
MREs may also contain condiments and side dishes, all the various packets tucked into a lightweight pouch and designed to survive in any climate.
One of the Natick Center’s current goals is to finally grant a longtime wish of servicemen: pizza on the battlefield. They hope to have a shelf-stable pizza, which would last for years without refrigeration, available to the military by 2017. And, soon after that, in your grocery store.
As a means of cost reduction, and as a way to readily tap the private sector during wartime, the government has forged a series of public-private partnerships with commercial food producers. The military’s technology and influence can be seen in virtually every grocery aisle.
Many military innovations make their way, in some form or another, into American kitchens. TV dinners, freeze-dried coffee, semi-moist cookies, and condiment packets were all developed to feed soldiers, sailors, and pilots stationed remotely.
While all these processed and packaged foods have become familiar fare for the American household, most of these products are made to last far longer than the average civilian would need.
There haven’t been many studies about the long-term health impacts of the specific food technologies pioneered by the military, but whether it’s good for us or not, in the years to come, pizza is moving out of the freezer section and onto room-temperature shelves.
Reporter Tina Antolini, host of the podcast Gravy, spoke with Anastacia Marx de Salcedo, author of Combat-Ready Kitchen; Stephen Moody, the director of combat feeding at the Natick Soldier Research, Development, and Engineering Center; and Louisiana native Ben Armstrong, who spent five years in the United States Marine Corps.
Not all sugars are created equal. Glucose and fructose are simple sugars naturally found in fruit and have the same number of calories, but new research suggests there are important differences in how the body responds to these sweeteners. While glucose is absorbed directly into the bloodstream to produce energy, fructose—which is used to sweeten soft drinks and processed foods—is metabolized in the liver. The body reacts to glucose in the blood by producing insulin, which triggers feelings of fullness. “Fructose doesn’t stimulate insulin secretion, and if there’s no insulin, you don’t get the information that you’re full,” the study’s senior author, Dr. Kathleen Page, tells The New York Times. Consuming fructose also triggers more activity in areas of the brain involved in reward processing, which intensifies cravings for high-calorie foods such as candy, cookies, and pizza. Researchers do not recommend that people forgo fruit, since it provides fiber and nutrients and has relatively small amounts of fructose compared with soft drinks and processed foods. But researchers say it does make sense to limit overall sugar intake.
People who are hospitalized with an infection early in life can lose IQ points, new research suggests. Scientists analyzed the hospital records of 190,000 Danish men born between 1974 and 1994. Before taking an IQ test at age 19, about 35 percent of the men had landed in hospitals with serious infections, such as an STD or a urinary tract infection. The average IQ score of those subjects was 1.76 points below average. Those with five or more hospitalizations for infection had an average IQ that was 9.44 points lower than average. The more severe or recent the infection, the lower the score. Researchers theorize that inflammation caused by immune responses may damage the brain. “Infections in the brain affected the cognitive ability the most,” the study’s lead author, Dr. Michael Eriksen Benrós, tells Forbes.com. But he said any infection severe enough to require hospitalization had a negative impact on IQ.
For years, I insisted I didn’t want children, said Michelle Goldberg. But life is full of surprises.
Twelve years ago, I penned an essay for a Salon.com series called “To Breed or Not to Breed,” about the decision to have children or not. It began this way: “When I tell people that I’m 27, happily married, and that I don’t think I ever want children, they respond one of two ways. Most of the time they smile patronizingly and say, ‘You’ll change your mind.’ Sometimes they do me the favor of taking me seriously, in which case they warn, ‘You’ll regret it.’” The series inspired an anthology titled Maybe Baby. It was divided into three parts: “No Thanks, Not for Me,” “On the Fence,” and “Taking the Leap.” My essay was the first in the “No” section.
So I felt a little sheepish when, a year and a half ago, the writer Meghan Daum asked me if I’d be interested in contributing to the book that would become Selfish, Shallow, and Self-Absorbed: Sixteen Writers on the Decision Not to Have Kids. I wrote back to tell her that I couldn’t: My son had just turned 1.
My transformation didn’t begin with an unbidden outbreak of baby lust or a sudden longing for domesticity. It began, weirdly enough, when I learned about corpses becoming fathers. In 2011, I reported a piece for Tablet Magazine about the strange Israeli campaign for posthumous reproduction. Israel is the world capital of reproductive technology, and a legal group called New Family wanted to give parents who had lost adult sons the right to extract their sperm and create grandchildren. I have mixed feelings about making dads out of dead men, but I remember being seized by the realization that if my husband were to die young, I’d want to be able to do it to him.
Children, I suddenly understood, would hedge against the unthinkable fact of my husband’s mortality. Not long ago, I learned the Arabic word Ya’aburnee from a friend’s cheesy Facebook graphic. Literally “You bury me,” it means wanting to die before a loved one so as not to have to face the world without him or her in it. It’s a word that captures exactly my feeling for my husband. Part of the reason I didn’t want kids was because I feared they’d come between us, but if he were gone, I’d be frantic to hold on to a piece of him. Grasping this didn’t make me want a baby, exactly, but it started pushing me from “No” to, well, ambivalent.
My husband, Matt, was ambivalent, too. We were pleased with our two-person family, with our consuming careers, constant travel, and many tipsy nights out, all the things people say you lose when you become a parent. We met very young, the summer after my freshman year of college, and we’d never grown bored with each other. Sometimes we puzzled over what people meant when they said that marriage is hard work. We assumed it had something to do with parenthood.
As happy as I am with my marriage, I’m not by nature a cheerful person. Like a lot of writers, I’m given to tedious bouts of anxiety, depression, and self-loathing. I am introverted, and feel shattered if I don’t have time alone every day. Worse, from a parental perspective, I am impatient, easily undone by quotidian frustrations. As much as I love to visit faraway places, I’m often reduced to tears by the indignities of air travel. When I’m stuck in a taxi in traffic, I unconsciously shred my cuticles until my fingers bleed. I pictured parenthood as a clammy never-ending coach flight.
Also, there was my work. As a little girl, I had never imagined myself with babies, or, for that matter, with a husband. My vision of the future had involved an apartment in New York City, a cat, and a typewriter. I was sure children would get in the way of my ambitions—and, worse, that I’d poison them with my resentment.
I started looking online for stories about people who’d had children and then wished they hadn’t. I read about a famous Ann Landers reader survey from the 1970s, undertaken in response to a letter from a young couple who feared, as I did, that parenthood would ruin their marriage. “Will you please ask your readers the question: If you had it to do over again, would you have children?” they asked. She did, and received 10,000 responses. To her dismay, 70 percent answered no. A 40-year-old mother of twins wrote, “I was an attractive, fulfilled career woman before I had these kids. Now I’m an exhausted, nervous wreck who misses her job and sees very little of her husband. He’s got a ‘friend,’ I’m sure, and I don’t blame him.” This helped shore up my faith in our decision.
Looking back, the fact that my faith needed shoring up was a sign that something was changing. As I got older, the constant travel that once thrilled me became wearying. My work still meant a lot to me, but while I once thought that publishing a book would make me feel that I’d arrived, publishing two taught me that arrival is elusive. Where I’d once seen family and intellectual life in opposition, over time I started worrying that it was an intellectual loss to go through life without experiencing something so fundamental to so many people’s existence. Meanwhile, 35 was creeping up on me.
Matt and I went back and forth, and back and forth some more. We both felt like we were atop a fulcrum and could be pushed either way if only the other knew what to do. At some point, we decided that I’d go off the pill and see what happened.
For a few months, nothing did. I started to wonder if I were infertile, if biology had decided the issue for me. I wasn’t sure if I was disappointed or relieved by this. Then—in a development that shocked me despite being completely predictable—I got pregnant, and was immediately convinced I’d made an awful mistake.
Within a couple of weeks, the queasiness came on like a portent, though at the same time I longed for the drinks I couldn’t have. We had a trip coming up—my husband had work to do in London, and I was going to accompany him, then go to Israel and Palestine for work of my own. I wasn’t sure how I’d get through it, but I was determined to go, since it might be my last chance to travel for a very long time.
The first few days in London, I cried constantly. Then, one afternoon, I called my doctor in New York for the results of some routine tests. The news wasn’t good. My progesterone was low, which the doctor said could be either a cause or a symptom of a failing pregnancy. When we got off the phone, I was hysterical with worry over this pregnancy that I didn’t want at all.
Back in New York, I went immediately to the doctor, shaking as I waited to see the result of my 10-week ultrasound. When I saw the beating heart of the ghostly, paisley-shaped creature, I was, for all my qualms, hugely thankful. Over the next two weeks, I started to get a little bit excited about the baby. It helped that the sickness and sleepiness had lifted. When I returned to the doctor at 12 weeks—the end of my first trimester, and the danger zone for pregnancy loss—I was almost relaxed. But this time, the ultrasound showed no heartbeat.
I had never felt as sad about anything as I did about that miscarriage. Actually, sad isn’t the right word, since it suggests a watercolor melancholy, and this was jagged, putrid desolation. The only way to make the anguish disappear, I thought, was to get pregnant again. Before, I’d been baffled by some women’s animal desperation for a baby. Now that desperation took hold of me.
It took five months for me to get pregnant again—not a very long time, though it felt endless, and makes me so sorry for those condemned to spend years in that hideous limbo. I white-knuckled it through much of the pregnancy, terrified of seeing a still heart at each ultrasound.
Perhaps it says something about my pre-baby life that a lot of my metaphors for new motherhood were drug-related. Those endless hours my son and I spent in bed, alternately nursing, dozing, and staring, amazed, at each other, reminded me of the time I’d smoked opium in Thailand. Lugging him around on errands brought to mind the first few times I got stoned as a teenager, when doing normal things like going to school or the drugstore became complicated, strange, and full of misadventure. The oxytocin felt like ecstasy.
Why, I kept thinking, hadn’t anyone told me how great this was? It was a stupid thing to think, because in fact people tell you that all the time. In general, though, the way people describe having a baby is much like the way they describe marriage—as a sacrifice that’s worth it, as a rewarding challenge, as a step toward growing up. Nobody had told me it would be fun.
The fact that it was, of course, was largely a matter of my good fortune and privilege. Getting what a friend of mine calls “the good hormones,” instead of those that cause postpartum debilitation, is largely a matter of dumb luck. I also had a husband who was a full, enthusiastic partner; an established, flexible career; and, crucially, money to afford good child care. My son was (and is) sweet-natured and easy.
Certainly, it sucked sometimes. A purple-clad lactation consultant prescribed a regimen of round-the-clock feeding, pumping, and tea guzzling that, had I followed it, would have broken me in a day; her visit left me feeling crushed, inadequate, and then humiliated for not having stood up to her. I’d worried, throughout my pregnancy, that I would resent my son for taking me away from my work. Instead, I resented my work for taking me away from my son, which created its own sort of identity crisis.
For all that, though, my son’s first year was the best of my life. I learned that while travel with a baby isn’t easy, it’s doable. We took him to Malaysia, where I was speaking at a conference, when he was 6 months old, and then on a reporting trip to Panama a few months later. Both of these were countries we’d been to before; seeing them again with our son made travel feel new. He made staying home feel new too. When I was with him, the habitual churning of my mind eased. Instead of arguing with strangers on Twitter, I spent hours in neighborhood parks I’d barely noticed before. Ultimately, even my work life improved: The crisis motherhood brought on led me to refocus on more satisfying long-form writing. Something Louis C.K. said recently was true for me: “I realized that a lot of the things that my kid was taking away from me, she was freeing me of.”
Matt and I were so delighted by our baby that we started half-seriously mulling a second. I was now in my late 30s and assumed that if and when we resolved to go for it, it would take even longer than before. One night, thinking we needn’t work so hard to prevent a pregnancy that we might soon wish for, we didn’t use birth control. In the morning, we came to our senses, decided we weren’t ready, and vowed not to be so sloppy again. It was too late. Our daughter was born nine months later, almost two years to the day after her brother.
She is a wonder, but having two children in diapers actually is pretty hard, particularly when you live in a fourth-floor walk-up. There are evenings when my husband and I are too harried to say more than a few words to each other as we tag-team two bedtimes and then collapse in front of the television. I’m occasionally incredulous that I’ve ended up with exactly the sort of life I once publicly pledged to avoid.
Unlike Ann Landers’ survey respondents, I swear I don’t regret it, though sometimes I’m mortified when I think about how my 27-year-old self would regard the frazzled, stroller-pushing woman I am now. I try to figure out how to explain myself in a way that would be intelligible to her, but I don’t think I can. The best I can come up with is that before, there was one person in the world for whom I would use the word Ya’aburnee. Now there are three.
Excerpted from an article that originally appeared in NYMag.com.
At the 11th hour on the 11th day of the 11th month of 1918, the Great War ends. At 5 a.m. that morning, Germany, bereft of manpower and supplies and faced with imminent invasion, signed an armistice agreement with the Allies in a railroad car outside Compiègne, France. The First World War left 9 million soldiers dead and 21 million wounded, with Germany, Russia, Austria-Hungary, France, and Great Britain each losing nearly a million or more lives. In addition, at least 5 million civilians died from disease, starvation, or exposure.
On June 28, 1914, in an event widely regarded as the spark that ignited World War I, Archduke Franz Ferdinand, heir to the Austro-Hungarian empire, was shot to death along with his wife by the Bosnian Serb nationalist Gavrilo Princip in Sarajevo, Bosnia. Ferdinand had been inspecting his uncle’s imperial armed forces in Bosnia and Herzegovina, despite the threat of Serbian nationalists who wanted these Austro-Hungarian possessions to join newly independent Serbia. Austria-Hungary blamed the Serbian government for the attack and hoped to use the incident as justification for settling the problem of Slavic nationalism once and for all. However, because Russia supported Serbia, an Austro-Hungarian declaration of war was delayed until its leaders received assurances from German leader Kaiser Wilhelm II that Germany would support their cause in the event of a Russian intervention.
On July 28, Austria-Hungary declared war on Serbia, and the tenuous peace between Europe’s great powers collapsed. On July 29, Austro-Hungarian forces began to shell the Serbian capital, Belgrade, and Russia, Serbia’s ally, ordered a troop mobilization against Austria-Hungary. France, allied with Russia, began to mobilize on August 1. France and Germany declared war against each other on August 3. After crossing through neutral Luxembourg, the German army invaded Belgium on the night of August 3-4, prompting Great Britain, Belgium’s ally, to declare war against Germany.
For the most part, the people of Europe greeted the outbreak of war with jubilation. Most patriotically assumed that their country would be victorious within months. Of the initial belligerents, Germany was most prepared for the outbreak of hostilities, and its military leaders had formulated a sophisticated strategy known as the “Schlieffen Plan,” which envisioned the conquest of France by way of a great arcing offensive through Belgium and into northern France. Russia, slow to mobilize, was to be kept occupied by Austro-Hungarian forces while Germany attacked France.
The Schlieffen Plan was nearly successful, but in early September the French rallied and halted the German advance at the bloody Battle of the Marne near Paris. By the end of 1914, well over a million soldiers of various nationalities had been killed on the battlefields of Europe, and no final victory was in sight for either the Allies or the Central Powers. On the western front—the battle line that stretched across northern France and Belgium—the combatants settled down in the trenches for a terrible war of attrition.
In 1915, the Allies attempted to break the stalemate with an amphibious invasion of Turkey, which had joined the Central Powers in October 1914, but after heavy bloodshed the Allies were forced to retreat in early 1916. The year 1916 saw great offensives by Germany and Britain along the western front, but neither side accomplished a decisive victory. In the east, Germany was more successful, and the disorganized Russian army suffered terrible losses, spurring the outbreak of the Russian Revolution in 1917. By the end of 1917, the Bolsheviks had seized power in Russia and immediately set about negotiating peace with Germany. In 1918, the infusion of American troops and resources into the western front finally tipped the scale in the Allies’ favor. Germany signed an armistice agreement with the Allies on November 11, 1918.
World War I was known as the “war to end all wars” because of the great slaughter and destruction it caused. Unfortunately, the peace treaty that officially ended the conflict—the Treaty of Versailles of 1919—forced punitive terms on Germany that destabilized Europe and laid the groundwork for World War II.
An epidemic of childhood obesity in the U.S. has sparked nationwide concern that kids spend too much sedentary time in front of screens and too little time playing outdoors. Now research suggests that children who are aerobically fit aren’t just healthier; they also do better in math, ScienceDaily.com reports. A University of Illinois study found that cardiorespiratory fitness is linked to gray-matter thinning, a crucial part of neurological development in children. “The theory is that the brain is pruning away unnecessary connections and strengthening useful connections,” says study leader Laura Chaddock-Heyman. Her team analyzed 48 children ages 9 and 10 who had completed a treadmill test. Half the kids were at or above the 70th percentile for aerobic fitness, while the other half were out of shape, falling below the 30th percentile. The researchers then imaged the subjects’ brains with MRIs and gave them an achievement test to gauge their math, reading, and spelling skills. The fitter children showed significantly more thinning in the outermost layer of the cerebrum, a process associated with better mathematics performance—and, in fact, they scored higher on their math tests, but showed no such edge in reading or spelling. “These findings arrive at an important time,” says researcher Charles Hillman, noting that many schools have cut back on physical activity during the school day “in response to mandates for increased academic time.”
Pasta sales are declining around the world because of the growing popularity of low-carb diets. In the U.S., sales of dried pasta have dropped 6 percent since 2009, while in Italy, they’ve plummeted 25 percent.
A wild Australian sheep that could barely walk because of its unshorn fleece yielded a record-breaking 89 pounds of wool—enough to make 30 sweaters. The merino ram, dubbed Chris, had spent five years in the wild, despite being bred to produce a maximum amount of wool, before a hiker spotted him. An animal welfare charity took Chris in and gave him the epic haircut. “He looks like a new man,” charity official Tammy Ven Dange said of the now 97-pound sheep. “For one thing, he’s only half the weight he used to be.”
Medical experts have long suspected that obesity in midlife increases the risk for Alzheimer’s disease, but new research suggests that typical middle-age spread could also hasten the onset of the degenerative illness. A National Institutes of Health study found that people who are overweight at age 50 may be more likely to develop Alzheimer’s disease sooner than their healthy-weight peers, CBSNews.com reports. The researchers tracked the body mass index (BMI)—the ratio of weight to the square of height—of 142 people who eventually developed Alzheimer’s. They found symptoms of the disease appeared six and a half months earlier for every step up on the BMI chart. Lead author Madhav Thambisetty says the results could help provide clues about the cause of the brain-wasting disease, which has struck 46 million people worldwide. “Understanding how risk factors in midlife may accelerate the onset of Alzheimer’s,” he says, could speed “efforts to develop interventions and treatments.”
The origins of schizophrenia are mysterious, but new research suggests a possible connection between the devastating mental disorder and microorganisms found in the mouth and throat. A team from George Washington University analyzed viruses, bacteria, and fungi in 32 people, half of whom were diagnosed with schizophrenia, MedicalDaily.com reports. Those who had the disease displayed levels of lactic acid bacteria, which originate in the gut and travel to the mouth and throat, that were at least 400 times higher than those of people in the control group. The findings add to a growing body of evidence that the trillions of bacteria that colonize the body may influence the brain and behavior. Larger studies are needed to confirm an association between gut and throat bacteria and schizophrenia, but “the results are quite intriguing,” says co-author Keith Crandall. He said the research could lead to earlier diagnoses and new treatments for the illness, which afflicts 3.5 million Americans.
In the first 700 million years after the Earth’s formation, scientists have long believed, our planet was a hellish realm devoid of life—with asteroids raining down on a landscape riddled with volcanoes, molten rock, and poisonous gases. But new research suggests that life may have taken root in the Earth’s turbulent youth—300 million years earlier than previously suspected, reports HuffingtonPost.com. Our planet formed roughly 4.5 billion years ago, and was heavily volcanic for eons as it slowly cooled. The earliest fossil records date to about 3.8 billion years ago, when single-celled creatures began to appear. But by studying tiny crystals that form in magma, called zircons, geochemists at the University of California at Los Angeles found microscopic flecks of pure carbon with a signature indicating it had been left behind by living organisms 4.1 billion years ago. “Life on Earth may have started almost instantaneously,” says study co-author Mark Harrison. “With the right ingredients, life seems to form very quickly.” He said the study suggests that simple life-forms may be quite common throughout the universe.
With her flame-red hair tumbling down her shoulders, Blaze Starr would writhe on a couch while slowly shedding her clothes. Then, just as the burlesque performer was about to peel off the last stitch, she would hit a hidden button, causing smoke to billow from the cushions, along with ribbons shaped like flames. That brand of playful eroticism earned Starr the title Queen of Burlesque in the 1950s and ’60s, but she was even more renowned as a seductress. Her brazen affair with former Louisiana Gov. Earl Long inspired the 1989 Paul Newman film Blaze. Despite her scarlet reputation, Starr carried herself with dignity. “I always felt that I was an artist,” she said. “If there is such a thing as getting nude with class, then I did it.”
Starr was born Fannie Belle Fleming in Wilsondale, W.Va., one of 11 children of a railroad worker. At 15, she hopped a bus for Washington, D.C., and was working in a doughnut shop there when a promoter “persuaded her to become a stripper,” said The New York Times. After moving to Baltimore in 1950, she settled into a decades-long residency at the rowdy 2 O’Clock Club, which she would later buy. Starr’s act was in demand around the country, said The Washington Post, and she toured with “an elaborate set of costumes she sewed herself.”
After one performance in New Orleans in 1959, Starr met the married, 62-year-old Long. They carried on their relationship openly and were, she said, engaged to be married when Long died in 1960. Starr boasted of other high-profile lovers, but named only one, John F. Kennedy, claiming she met him on the campaign trail in 1960. She went on to become a successful Baltimore businesswoman, said The Baltimore Sun, and “was so unthreatening to local morals” that she appeared in ads for the city’s gas and electric companies. Starr stripped until the 1980s and spent her later years designing jewelry. Asked in 1988 if she would change anything about her life, she gave a categorical no. “I would just do more of it,” she said. “And seduce a lot more men.”
Exposure to nature is good for kids’ brains, a new study has found. During a 12-month study of 2,593 second- through fourth-graders in Barcelona, researchers used satellite images to assess the amount of “green space” around the children’s homes and schools—grassy fields, trees, and plants. They also measured local levels of traffic-related air pollution. Cognitive tests revealed the kids exposed to more green spaces, particularly at school, experienced a 5 percent increase in working memory and a 1 percent drop in inattentiveness, The Washington Post reports. Why? Scientists theorize that trees and shrubbery help absorb air pollution and cut down on noise; natural environments also improve cognitive development by allowing children to make more discoveries and feel a sense of wonder. “I think it’s also some kind of direct effect,” says study author Mark Nieuwenhuijsen. “You see quite a beneficial effect of green space on mental health.”
Doughnuts. Cake frosting. Microwave popcorn. Besides “being delicious,” these foods have one thing in common, said Lexi Pandell in Wired.com: They generally contain trans fats, an artificial ingredient used for extending shelf life and improving flavor and texture. But not for much longer. The Food and Drug Administration last week implemented a near-zero-tolerance ban on partially hydrogenated oils, the main source of trans fats, giving food companies three years to remove the ingredient from their products. Trans fats were long considered a healthy alternative to lard, but recent studies have linked them to serious health problems like obesity, memory loss, and heart disease. This may be the “most important change in our food supply” in decades, said Roberto Ferdman in WashingtonPost.com. While the FDA has banned numerous ingredients over the years, including artificial sweeteners such as cyclamate, none has been “so clearly linked to tens of thousands of deaths like trans fat.”
But most Americans already know this stuff is bad for them, said USA Today in an editorial. Trans-fat consumption has dropped by about 80 percent since 2003, when the FDA required manufacturers to list trans-fat content on food labels. Worried that “the truth would hurt sales,” big firms simply removed the ingredient from their products. Washington should have stuck with that successful “give-them-the-facts strategy rather than a nanny-state approach sure to produce a backlash.” If the FDA’s goal is to prevent cardiovascular disease, why stop with trans fat? asked David Harsanyi in TheFederalist.com. What about high-fructose corn syrup, a major contributor to obesity? Or cigarettes, which kill 443,000 Americans every year? The FDA’s ban will have only a negligible effect on public health, but it will “create precedents that allow further intrusions into how and what we eat.”
If anything, the “trans fat saga shows how hard it is to get nutritional science right,” said Sarah Kaplan in WashingtonPost.com. Until relatively recently, experts called trans fats “a great boon to Americans’ arteries” and warned us to avoid the kinds of saturated fats found in butter, eggs, and meat. Now that advice has been reversed. But we can’t be too hard on researchers, because determining exactly what’s healthy is an inexact science. Some nutrients work only in conjunction with others; all chemicals affect different bodies in different ways. “It’s a difficult recipe to get right.”