Fat Land: How Americans Became the Fattest People in the World


Puericulture had begun as an informal system of health education in the late nineteenth century, principally to teach new mothers how to prevent and treat tuberculosis, then on the rise. By the early twentieth century, puericulture was adapted to teach better mothering techniques to a new generation of mothers. When the first result of parental overindulgence showed up as a two-hundred-pound teenager (as it did), the advocates of puericulture retooled again. Their prescription: Adults had to take control of a child's diet. Period. If they did not, the child would certainly become a sickling.

Soon, French mothers were being taught a new puericulture dogma. The essentials were these: Plump children were not necessarily a point of pride; mealtimes should be as nearly set in stone as possible; snacks, except on rare occasions, were to be forbidden; second helpings were out of the question, save, perhaps, on a holiday; a child should eat separately from adults, so as "to avoid arousing his desires" with richer adult fare. And the child was never to be left to his or her own personal choice. Augusta Moll-Weiss, the mother of puericulture and the founder of the influential Paris School for Mothers, put it thus: "It is unimportant how much freedom is left in this choice; the essential thing is that the quality and quantity of the diet correspond to the exertion of the young human being." Lastly, all meals should be supervised by an adult. "The basic message was surprisingly persistent," writes the cultural historian Peter N. Stearns, the principal American chronicler of puericulture. "Too much food was bad. Children must learn to discipline their appetites and eating habits, sitting for meals regularly, chewing carefully, expecting adult supervision."

For the French, struggle and tension at the table were simply part of the process of setting reasonable boundaries for children.

About this the Diamonds and the Hirschmanns and their many present-day imitators have had nothing to say. Yet this very lack of pragmatic boundary-setting may well be wreaking nutritional havoc on children.


Consider perhaps the central dogma in the child-as-food-sage theology — that a child "knows" when he or she is full. Such is the belief, repeated emphatically to this day, of many of the nation's leading nutritional authorities, both academic and popular. This despite new research showing that children, just like adults, increasingly do not know when they are full. In a recent study by the Penn State nutrition scholar Barbara Rolls, researchers examined the eating habits of two groups of children, one of three-year-olds, another of five-year-olds. Both groups reported equal levels of energy expenditure and hunger. The children were then presented with a series of plates of macaroni and cheese. The first plate was a normal serving built around age-appropriate baseline nutritional needs; the second plate was slightly larger; the third was what we might now call "supersized." The results were both revealing and worrisome. The younger children consistently ate the same baseline amount, leaving more and more food on the plate as the servings grew in size. The five-year-olds acted as if they were from another planet, devouring whatever was put on their plates. Something had happened. As was the case with their adult counterparts in another of Rolls's studies (cited in chapter 2), the mere presence of larger portions had induced increased eating. Far from trusting their own (proverbial and literal) guts, children, the author concluded, should instead get "clear information on appropriate portion sizes."

Theorizing aside, the continuing disinclination to restrain a child's eating flies in the face of overwhelming evidence that, of all age groups, children seem to be the ones who respond best to clear dietary advice. In four randomized studies of obese six- to twelve-year-olds, those offered frequent, simple behavioral advice — in other words those who were lovingly "hassled" — were substantially less overweight ten years later than those who did not get the advice. And thirty of those children were no longer obese at all.

The case for early intervention has been further buttressed by new studies on another age-old medical injunction: Never put a child on a diet. For decades, the concern was that such undernutrition could lead to stunted growth. But the authors of a study of 1062 children under age three have concluded differently. Writing in the journal Pediatrics, they state that "a supervised, low-saturated-fat and low-cholesterol diet has no influence on growth during the first three years of life." And overweight children who were put on such a diet ended up with better, more moderate eating habits, to boot.

In other words, it's good to tell Johnny when enough is enough.

Another way to find out where food intake minus mitigation leads is simply to look at the food world that children were "allowed" to create, a world that can be summarized by one word: snacking.

In the 1980s, snacking was flat-out encouraged. The first to do so were the decade's ever more economically busy parents, who simply wanted to make sure that their kids ate something. Fair enough. But snacking was also indirectly encouraged by new understandings in nutritional science, which suggested that many people, and particularly children, needed to eat more than three meals a day. Although such insights have a strong basis in fact, their real-world utility was often twisted by the media and food companies. Suddenly it was "unnatural" to eat three times a day. Progressive people ate "when their bodies told them to." Snacking was not only not bad; it was good to eat all day long. Such was the message of the diet craze known as "grazing," a quasi-regimen endlessly fawned over and packaged by the mainstream media.

Food companies, of course, were happy to join in the party. There would be "Snack Good," "Snack Healthy," and, by the early 1990s, "SnackWell." And with sugar and fat prices lower than ever, it was easy for new, less bridled players to share the fun and profit. The number and variety of high-calorie snack foods and sweets soared; where all through the 1960s and 1970s the number of yearly new candy and snack products remained stable — at about 250 a year — that number jumped to about 1000 by the mid-1980s and to about 2000 by the late 1980s. The rate of new, high-calorie bakery foods also jumped substantially. A revealing graphic of this trend, charted against the rise in obesity rates, was published by the American Journal of Clinical Nutrition in 1999; the two lines rise in remarkable tandem.

The increased variety in snacks and sweets enabled by the Butzian revolution in agriculture conjured a new and ever fattening pattern of eating. Just as the presence of supersized portions had stimulated Americans to eat more at mealtime, the sheer presence of a large variety of new high-calorie snacks was deeply reshaping the overall habits of the American eater. Studying the eating patterns of adults, and using the most advanced monitoring and tracking systems available, researchers at the USDA Human Nutrition Research Center at Tufts University were able to document an amazing phenomenon: The higher the variety of snack foods present in their subjects' diets, the higher the number of calories from those foods they would consume, and the higher would be the subjects' consequent body fatness. This was stunning. Historically, the drive to eat a variety of food had been a positive element in human evolution, helping early humans to increase and balance fuel intake, and, consequently, improve their metabolic, physical, and mental abilities. The drive for novelty had been healthful. Now the same drive had become unhealthful. "Today," the Tufts researchers noted, "a drive to overeat when variety is plentiful is disadvantageous for weight regulation because dietary variety is greater than ever before and comes primarily from energy-dense commercial foods rather than from the energy-poor but micronutrient-rich vegetables and fruit for which the variety principle originally evolved." In short, variety had become the enemy.

You could see the phenomenon everywhere you went. One of the more insidious of the new snacks appeared in California, where the Snak Club company began selling huge (as much as five portions) but inexpensive ($.99) bags of unbranded candy. The bags were routinely placed near checkout stands, where a telling ad campaign forthrightly proclaimed that the bag of candy just within Junior's reach was "a meal in itself." Ten years later the label was changed to "a treat in itself."

And snack kids did. In the '80s, in every single age group, between-meal chomping was louder than ever. Moreover, the troubling tendency to snack several times every day — in essence making snacking part of a de facto meal pattern — was perpetuating itself into adolescence and young adulthood. To find out how much so, the pre-eminent nutrition scholar Barry Popkin and his associates at the University of North Carolina at Chapel Hill studied the dietary patterns of 8493 nineteen- to twenty-nine-year-olds over the period 1977-1996. The results showed that not only had snacking prevalence soared, but so had the number of snacks per day and the number of calories per snacking occasion.

The demographics of increased snacking also revealed a new and disturbing trend: The most avid snackers were the poor. In the same period the snacking rate per day among low-income households went from 67 percent to 82 percent. Snacking by whites increased the least while snacking by Hispanics and African Americans increased the most. The greatest increases were in the poor-to-middle-class South. And like meals in fast-food joints, the caloric density of snacks was growing. As Popkin concluded, "This large increase in total energy and energy density of snacks among young adults in the U.S. may be contributing to our obesity epidemic."

Beyond the immediate contribution of more calories to the diet, the very nature of modern snacking may be pushing children toward obesity. New studies show that, far from the romanticized "eat when you feel like it" philosophy, eating more often in itself may make one fat, regardless of the calorie count. In a recent summary paper in the British medical journal Lancet, the scholars Gary Frost and Anne Dornhorst explained: "Not only did hunter-gatherers eat a diet low in fat and derived mainly from slowly absorbed carbohydrates, but also by eating less frequently they spent long periods of the day post-absorptively [fasting]. Today's grazing culture results in a disproportionate amount of time being spent post-prandially, which favors glycogen synthesis and fat deposition."

In other words, a perpetually snacking child — whether he knows best or not — is literally a walking, talking, fat-making machine. One that knows no limits.

If the parents of the early '80s had, in essence, let the calories in, they would soon be aided in doing so by a most unlikely accomplice: the public school system.

Until the mid-'70s, public high schools were still a bastion of traditional postwar culture, a place where the boundaries, however frayed, still held. In postwar America, a teacher's ability to act under the legal cover of in loco parentis was rarely questioned. Hence, at least on campus, teachers wielded broad cultural influence. This was because a teacher was, for the most part, assumed to be acting in the best interests of the child. The arch of his eyebrow or the pursing of her lip meant something. School was their empire.

A second standard-bearer of campus life concerned food. Nutritionally, the cafeteria of the '70s still reigned as the center of activity for those cool enough to have parents who didn't — or couldn't, or wouldn't — pack a lunch for them. There were Coke machines, but they were few and they dispensed a mere six to eight ounces at a time, and were peripheral to campus life, the places where amateur smokers cadged a quick one between classes.

Such, at least, were the lingering images of public schools held by many '80s parents, who were (sometimes consciously and often not) hoping that the duties they no longer had time for at home might somehow be fulfilled at school.

By the time Me Generation parents began handing their children over to the schools, though, the empire had changed. The broad, boundary-imposing authority of the teacher had crumbled under cultural, legal, and economic attack. The old, wide-ranging interpretation of in loco parentis had been eroded by court case after court case. Many of these turned on the issue of free speech — something Me Generation parents held particularly dear. (And perhaps even dearer since many of the high school speech cases involved the "symbolic free speech value" of wearing one's hair long.) Other legal findings limited the ability of teachers to discipline students — corporally or otherwise. The net effect of such schoolroom jurisprudence — and of the constant hectoring and second-guessing from society in general — was to make the teacher hunker down and back off. As Thomas R. McDaniel wrote in his 1983 essay "The Teacher's Ten Commandments," the best thing a truly concerned teacher could do was simple: "Sign up for a course in school law."

The final blow to the old empire came in the form of budgetary cutbacks. Ironically, many of these were supported by — if not originated by — the very same generation that was now hoping for the old system to come through just one more time. Their support for California's Proposition 13 was a case in point.

Fueled by inflation and rising property taxes, the 1978 ballot measure capped property taxes at 1 percent of assessed value. Its principal proponent, a cigar-chomping Orange County businessman named Howard Jarvis, was a longtime anti-tax activist with a penchant for public speaking. As a small businessman and property owner himself, Jarvis easily connected to the growing legions of "Invisible Americans" — the same folk, many of them traditional Democrats, who had grown tired of government inefficiency and overtaxation and who would, two years later, elect Ronald Reagan president. Persuasively Jarvis argued their case: If property taxes weren't capped, the very people who had helped build the Golden State would no longer be able to live in their own modest postwar tract homes. The measure's opponents — they were, in truth, few — took a different tack. Proposition 13, they claimed, would bring an end to the Golden State itself; it
