Sunday, January 28, 2007

Nutrition/dietary article

NYTimes

January 28, 2007
Unhappy Meals
By MICHAEL POLLAN

Eat food. Not too much. Mostly plants.

That, more or less, is the short answer to the supposedly incredibly complicated and confusing question of what we humans should eat in order to be maximally healthy. I hate to give away the game right here at the beginning of a long essay, and I confess that I’m tempted to complicate matters in the interest of keeping things going for a few thousand more words. I’ll try to resist but will go ahead and add a couple more details to flesh out the advice. Like: A little meat won’t kill you, though it’s better approached as a side dish than as a main. And you’re much better off eating whole fresh foods than processed food products. That’s what I mean by the recommendation to eat “food.” Once, food was all you could eat, but today there are lots of other edible foodlike substances in the supermarket. These novel products of food science often come in packages festooned with health claims, which brings me to a related rule of thumb: if you’re concerned about your health, you should probably avoid food products that make health claims. Why? Because a health claim on a food product is a good indication that it’s not really food, and food is what you want to eat.

Uh-oh. Things are suddenly sounding a little more complicated, aren’t they? Sorry. But that’s how it goes as soon as you try to get to the bottom of the whole vexing question of food and health. Before long, a dense cloud bank of confusion moves in. Sooner or later, everything solid you thought you knew about the links between diet and health gets blown away in the gust of the latest study.

Last winter came the news that a low-fat diet, long believed to protect against breast cancer, may do no such thing — this from the monumental, federally financed Women’s Health Initiative, which has also found no link between a low-fat diet and rates of coronary disease. The year before we learned that dietary fiber might not, as we had been confidently told, help prevent colon cancer. Just last fall two prestigious studies on omega-3 fats published at the same time presented us with strikingly different conclusions. While the Institute of Medicine stated that “it is uncertain how much these omega-3s contribute to improving health” (and they might do the opposite if you get them from mercury-contaminated fish), a Harvard study declared that simply by eating a couple of servings of fish each week (or by downing enough fish oil), you could cut your risk of dying from a heart attack by more than a third — a stunningly hopeful piece of news. It’s no wonder that omega-3 fatty acids are poised to become the oat bran of 2007, as food scientists micro-encapsulate fish oil and algae oil and blast them into such formerly all-terrestrial foods as bread and tortillas, milk and yogurt and cheese, all of which will soon, you can be sure, sprout fishy new health claims. (Remember the rule?)

By now you’re probably registering the cognitive dissonance of the supermarket shopper or science-section reader, as well as some nostalgia for the simplicity and solidity of the first few sentences of this essay. Which I’m still prepared to defend against the shifting winds of nutritional science and food-industry marketing. But before I do that, it might be useful to figure out how we arrived at our present state of nutritional confusion and anxiety.

The story of how the most basic questions about what to eat ever got so complicated reveals a great deal about the institutional imperatives of the food industry, nutritional science and — ahem — journalism, three parties that stand to gain much from widespread confusion surrounding what is, after all, the most elemental question an omnivore confronts. Humans deciding what to eat without expert help — something they have been doing with notable success since coming down out of the trees — is seriously unprofitable if you’re a food company, distinctly risky if you’re a nutritionist and just plain boring if you’re a newspaper editor or journalist. (Or, for that matter, an eater. Who wants to hear, yet again, “Eat more fruits and vegetables”?) And so, like a large gray fog, a great Conspiracy of Confusion has gathered around the simplest questions of nutrition — much to the advantage of everybody involved. Except perhaps the ostensible beneficiary of all this nutritional expertise and advice: us, and our health and happiness as eaters.

FROM FOODS TO NUTRIENTS

It was in the 1980s that food began disappearing from the American supermarket, gradually to be replaced by “nutrients,” which are not the same thing. Where once the familiar names of recognizable comestibles — things like eggs or breakfast cereal or cookies — claimed pride of place on the brightly colored packages crowding the aisles, now new terms like “fiber” and “cholesterol” and “saturated fat” rose to large-type prominence. More important than mere foods, the presence or absence of these invisible substances was now generally believed to confer health benefits on their eaters. Foods by comparison were coarse, old-fashioned and decidedly unscientific things — who could say what was in them, really? But nutrients — those chemical compounds and minerals in foods that nutritionists have deemed important to health — gleamed with the promise of scientific certainty; eat more of the right ones, fewer of the wrong, and you would live longer and avoid chronic diseases.

Nutrients themselves had been around, as a concept, since the early 19th century, when the English doctor and chemist William Prout identified what came to be called the “macronutrients”: protein, fat and carbohydrates. It was thought that that was pretty much all there was going on in food, until doctors noticed that an adequate supply of the big three did not necessarily keep people nourished. At the end of the 19th century, British doctors were puzzled by the fact that Chinese laborers in the Malay states were dying of a disease called beriberi, which didn’t seem to afflict Tamils or native Malays. The mystery was solved when someone pointed out that the Chinese ate “polished,” or white, rice, while the others ate rice that hadn’t been mechanically milled. A few years later, Casimir Funk, a Polish chemist, discovered the “essential nutrient” in rice husks that protected against beriberi and called it a “vitamine,” the first micronutrient. Vitamins brought a kind of glamour to the science of nutrition, and though certain sectors of the population began to eat by its expert lights, it really wasn’t until late in the 20th century that nutrients managed to push food aside in the popular imagination of what it means to eat.

No single event marked the shift from eating food to eating nutrients, though in retrospect a little-noticed political dust-up in Washington in 1977 seems to have helped propel American food culture down this dimly lighted path. Responding to an alarming increase in chronic diseases linked to diet — including heart disease, cancer and diabetes — a Senate Select Committee on Nutrition, headed by George McGovern, held hearings on the problem and prepared what by all rights should have been an uncontroversial document called “Dietary Goals for the United States.” The committee learned that while rates of coronary heart disease had soared in America since World War II, other cultures that consumed traditional diets based largely on plants had strikingly low rates of chronic disease. Epidemiologists also had observed that in America during the war years, when meat and dairy products were strictly rationed, the rate of heart disease temporarily plummeted.

Naïvely putting two and two together, the committee drafted a straightforward set of dietary guidelines calling on Americans to cut down on red meat and dairy products. Within weeks a firestorm, emanating from the red-meat and dairy industries, engulfed the committee, and Senator McGovern (who had a great many cattle ranchers among his South Dakota constituents) was forced to beat a retreat. The committee’s recommendations were hastily rewritten. Plain talk about food — the committee had advised Americans to actually “reduce consumption of meat” — was replaced by artful compromise: “Choose meats, poultry and fish that will reduce saturated-fat intake.”

A subtle change in emphasis, you might say, but a world of difference just the same. First, the stark message to “eat less” of a particular food has been deep-sixed; don’t look for it ever again in any official U.S. dietary pronouncement. Second, notice how distinctions between entities as different as fish and beef and chicken have collapsed; those three venerable foods, each representing an entirely different taxonomic class, are now lumped together as delivery systems for a single nutrient. Notice too how the new language exonerates the foods themselves; now the culprit is an obscure, invisible, tasteless — and politically unconnected — substance that may or may not lurk in them called “saturated fat.”

The linguistic capitulation did nothing to rescue McGovern from his blunder; the very next election, in 1980, the beef lobby helped rusticate the three-term senator, sending an unmistakable warning to anyone who would challenge the American diet, and in particular the big chunk of animal protein sitting in the middle of its plate. Henceforth, government dietary guidelines would shun plain talk about whole foods, each of which has its trade association on Capitol Hill, and would instead arrive clothed in scientific euphemism and speaking of nutrients, entities that few Americans really understood but that lacked powerful lobbies in Washington. This was precisely the tack taken by the National Academy of Sciences when it issued its landmark report on diet and cancer in 1982. Organized nutrient by nutrient in a way guaranteed to offend no food group, it codified the official new dietary language. Industry and media followed suit, and terms like polyunsaturated, cholesterol, monounsaturated, carbohydrate, fiber, polyphenols, amino acids and carotenes soon colonized much of the cultural space previously occupied by the tangible substance formerly known as food. The Age of Nutritionism had arrived.

THE RISE OF NUTRITIONISM

The first thing to understand about nutritionism — I first encountered the term in the work of an Australian sociologist of science named Gyorgy Scrinis — is that it is not quite the same as nutrition. As the “ism” suggests, it is not a scientific subject but an ideology. Ideologies are ways of organizing large swaths of life and experience under a set of shared but unexamined assumptions. This quality makes an ideology particularly hard to see, at least while it’s exerting its hold on your culture. A reigning ideology is a little like the weather, all pervasive and virtually inescapable. Still, we can try.

In the case of nutritionism, the widely shared but unexamined assumption is that the key to understanding food is indeed the nutrient. From this basic premise flow several others. Since nutrients, as compared with foods, are invisible and therefore slightly mysterious, it falls to the scientists (and to the journalists through whom the scientists speak) to explain the hidden reality of foods to us. To enter a world in which you dine on unseen nutrients, you need lots of expert help.

But expert help to do what, exactly? This brings us to another unexamined assumption: that the whole point of eating is to maintain and promote bodily health. Hippocrates’s famous injunction to “let food be thy medicine” is ritually invoked to support this notion. I’ll leave the premise alone for now, except to point out that it is not shared by all cultures and that the experience of these other cultures suggests that, paradoxically, viewing food as being about things other than bodily health — like pleasure, say, or socializing — makes people no less healthy; indeed, there’s some reason to believe that it may make them more healthy. This is what we usually have in mind when we speak of the “French paradox” — the fact that a population that eats all sorts of unhealthful nutrients is in many ways healthier than we Americans are. So there is at least a question as to whether nutritionism is actually any good for you.

Another potentially serious weakness of nutritionist ideology is that it has trouble discerning qualitative distinctions between foods. So fish, beef and chicken through the nutritionists’ lens become mere delivery systems for varying quantities of fats and proteins and whatever other nutrients are on their scope. Similarly, any qualitative distinctions between processed foods and whole foods disappear when your focus is on quantifying the nutrients they contain (or, more precisely, the known nutrients).

This is a great boon for manufacturers of processed food, and it helps explain why they have been so happy to get with the nutritionism program. In the years following McGovern’s capitulation and the 1982 National Academy report, the food industry set about re-engineering thousands of popular food products to contain more of the nutrients that science and government had deemed the good ones and less of the bad, and by the late ’80s a golden era of food science was upon us. The Year of Eating Oat Bran — also known as 1988 — served as a kind of coming-out party for the food scientists, who succeeded in getting the material into nearly every processed food sold in America. Oat bran’s moment on the dietary stage didn’t last long, but the pattern had been established, and every few years since then a new oat bran has taken its turn under the marketing lights. (Here comes omega-3!)

By comparison, the typical real food has more trouble competing under the rules of nutritionism, if only because something like a banana or an avocado can’t easily change its nutritional stripes (though rest assured the genetic engineers are hard at work on the problem). So far, at least, you can’t put oat bran in a banana. So depending on the reigning nutritional orthodoxy, the avocado might be either a high-fat food to be avoided (Old Think) or a food high in monounsaturated fat to be embraced (New Think). The fate of each whole food rises and falls with every change in the nutritional weather, while the processed foods are simply reformulated. That’s why when the Atkins mania hit the food industry, bread and pasta were given a quick redesign (dialing back the carbs; boosting the protein), while the poor unreconstructed potatoes and carrots were left out in the cold.

Of course it’s also a lot easier to slap a health claim on a box of sugary cereal than on a potato or carrot, with the perverse result that the most healthful foods in the supermarket sit there quietly in the produce section, silent as stroke victims, while a few aisles over, the Cocoa Puffs and Lucky Charms are screaming about their newfound whole-grain goodness.

EAT RIGHT, GET FATTER

So nutritionism is good for business. But is it good for us? You might think that a national fixation on nutrients would lead to measurable improvements in the public health. But for that to happen, the underlying nutritional science, as well as the policy recommendations (and the journalism) based on that science, would have to be sound. This has seldom been the case.

Consider what happened immediately after the 1977 “Dietary Goals” — McGovern’s masterpiece of politico-nutritionist compromise. In the wake of the panel’s recommendation that we cut down on saturated fat, a recommendation seconded by the 1982 National Academy report on cancer, Americans did indeed change their diets, endeavoring for a quarter-century to do what they had been told. Well, kind of. The industrial food supply was promptly reformulated to reflect the official advice, giving us low-fat pork, low-fat Snackwell’s and all the low-fat pasta and high-fructose (yet low-fat!) corn syrup we could consume. Which turned out to be quite a lot. Oddly, America got really fat on its new low-fat diet — indeed, many date the current obesity and diabetes epidemic to the late 1970s, when Americans began binging on carbohydrates, ostensibly as a way to avoid the evils of fat.

This story has been told before, notably in these pages (“What if It’s All Been a Big Fat Lie?” by Gary Taubes, July 7, 2002), but it’s a little more complicated than the official version suggests. In that version, which inspired the most recent Atkins craze, we were told that America got fat when, responding to bad scientific advice, it shifted its diet from fats to carbs, suggesting that a re-evaluation of the two nutrients is in order: fat doesn’t make you fat; carbs do. (Why this should have come as news is a mystery: as long as people have been raising animals for food, they have fattened them on carbs.)

But there are a couple of problems with this revisionist picture. First, while it is true that Americans post-1977 did begin binging on carbs, and that fat as a percentage of total calories in the American diet declined, we never did in fact cut down on our consumption of fat. Meat consumption actually climbed. We just heaped a bunch more carbs onto our plates, obscuring perhaps, but not replacing, the expanding chunk of animal protein squatting in the center.

How did that happen? I would submit that the ideology of nutritionism deserves as much of the blame as the carbohydrates themselves do — that and human nature. By framing dietary advice in terms of good and bad nutrients, and by burying the recommendation that we should eat less of any particular food, it was easy for the take-home message of the 1977 and 1982 dietary guidelines to be simplified as follows: Eat more low-fat foods. And that is what we did. We’re always happy to receive a dispensation to eat more of something (with the possible exception of oat bran), and one of the things nutritionism reliably gives us is some such dispensation: low-fat cookies then, low-carb beer now. It’s hard to imagine the low-fat craze taking off as it did if McGovern’s original food-based recommendations had stood: eat fewer meat and dairy products. For how do you get from that stark counsel to the idea that another case of Snackwell’s is just what the doctor ordered?

BAD SCIENCE

But if nutritionism leads to a kind of false consciousness in the mind of the eater, the ideology can just as easily mislead the scientist. Most nutritional science involves studying one nutrient at a time, an approach that even nutritionists who do it will tell you is deeply flawed. “The problem with nutrient-by-nutrient nutrition science,” points out Marion Nestle, the New York University nutritionist, “is that it takes the nutrient out of the context of food, the food out of the context of diet and the diet out of the context of lifestyle.”

If nutritional scientists know this, why do they do it anyway? Because a nutrient bias is built into the way science is done: scientists need individual variables they can isolate. Yet even the simplest food is a hopelessly complex thing to study, a virtual wilderness of chemical compounds, many of which exist in complex and dynamic relation to one another, and all of which together are in the process of changing from one state to another. So if you’re a nutritional scientist, you do the only thing you can do, given the tools at your disposal: break the thing down into its component parts and study those one by one, even if that means ignoring complex interactions and contexts, as well as the fact that the whole may be more than, or just different from, the sum of its parts. This is what we mean by reductionist science.

Scientific reductionism is an undeniably powerful tool, but it can mislead us too, especially when applied to something as complex as, on the one side, a food, and on the other, a human eater. It encourages us to take a mechanistic view of that transaction: put in this nutrient; get out that physiological result. Yet people differ in important ways. Some populations can metabolize sugars better than others; depending on your evolutionary heritage, you may or may not be able to digest the lactose in milk. The specific ecology of your intestines helps determine how efficiently you digest what you eat, so that the same input of 100 calories may yield more or less energy depending on the proportion of Firmicutes and Bacteroidetes living in your gut. There is nothing very machinelike about the human eater, and so to think of food as simply fuel is wrong.

Also, people don’t eat nutrients, they eat foods, and foods can behave very differently than the nutrients they contain. Researchers have long believed, based on epidemiological comparisons of different populations, that a diet high in fruits and vegetables confers some protection against cancer. So naturally they ask, What nutrients in those plant foods are responsible for that effect? One hypothesis is that the antioxidants in fresh produce — compounds like beta carotene, lycopene, vitamin E, etc. — are the X factor. It makes good sense: these molecules (which plants produce to protect themselves from the highly reactive oxygen atoms produced in photosynthesis) vanquish the free radicals in our bodies, which can damage DNA and initiate cancers. At least that’s how it seems to work in the test tube. Yet as soon as you remove these useful molecules from the context of the whole foods they’re found in, as we’ve done in creating antioxidant supplements, they don’t work at all. Indeed, in the case of beta carotene ingested as a supplement, scientists have discovered that it actually increases the risk of certain cancers. Big oops.

What’s going on here? We don’t know. It could be the vagaries of human digestion. Maybe the fiber (or some other component) in a carrot protects the antioxidant molecules from destruction by stomach acids early in the digestive process. Or it could be that we isolated the wrong antioxidant. Beta is just one of a whole slew of carotenes found in common vegetables; maybe we focused on the wrong one. Or maybe beta carotene works as an antioxidant only in concert with some other plant chemical or process; under other circumstances, it may behave as a pro-oxidant.

Indeed, to look at the chemical composition of any common food plant is to realize just how much complexity lurks within it. Here’s a list of just the antioxidants that have been identified in garden-variety thyme:

4-Terpineol, alanine, anethole, apigenin, ascorbic acid, beta carotene, caffeic acid, camphene, carvacrol, chlorogenic acid, chrysoeriol, eriodictyol, eugenol, ferulic acid, gallic acid, gamma-terpinene, isochlorogenic acid, isoeugenol, isothymonin, kaempferol, labiatic acid, lauric acid, linalyl acetate, luteolin, methionine, myrcene, myristic acid, naringenin, oleanolic acid, p-coumaric acid, p-hydroxy-benzoic acid, palmitic acid, rosmarinic acid, selenium, tannin, thymol, tryptophan, ursolic acid, vanillic acid.

This is what you’re ingesting when you eat food flavored with thyme. Some of these chemicals are broken down by your digestion, but others are going on to do undetermined things to your body: turning some gene’s expression on or off, perhaps, or heading off a free radical before it disturbs a strand of DNA deep in some cell. It would be great to know how this all works, but in the meantime we can enjoy thyme in the knowledge that it probably doesn’t do any harm (since people have been eating it forever) and that it may actually do some good (since people have been eating it forever) and that even if it does nothing, we like the way it tastes.

It’s also important to remind ourselves that what reductive science can manage to perceive well enough to isolate and study is subject to change, and that we have a tendency to assume that what we can see is all there is to see. When William Prout isolated the big three macronutrients, scientists figured they now understood food and what the body needs from it; when the vitamins were isolated a few decades later, scientists thought, O.K., now we really understand food and what the body needs to be healthy; today it’s the polyphenols and carotenoids that seem all-important. But who knows what the hell else is going on deep in the soul of a carrot?

The good news is that, to the carrot eater, it doesn’t matter. That’s the great thing about eating food as compared with nutrients: you don’t need to fathom a carrot’s complexity to reap its benefits.

The case of the antioxidants points up the dangers in taking a nutrient out of the context of food; as Nestle suggests, scientists make a second, related error when they study the food out of the context of the diet. We don’t eat just one thing, and when we are eating any one thing, we’re not eating another. We also eat foods in combinations and in orders that can affect how they’re absorbed. Drink coffee with your steak, and your body won’t be able to fully absorb the iron in the meat. The trace of limestone in the corn tortilla unlocks essential amino acids in the corn that would otherwise remain unavailable. Some of those compounds in that sprig of thyme may well affect my digestion of the dish I add it to, helping to break down one compound or possibly stimulating the production of an enzyme to detoxify another. We have barely begun to understand the relationships among foods in a cuisine.

But we do understand some of the simplest relationships, like the zero-sum relationship: that if you eat a lot of meat you’re probably not eating a lot of vegetables. This simple fact may explain why populations that eat diets high in meat have higher rates of coronary heart disease and cancer than those that don’t. Yet nutritionism encourages us to look elsewhere for the explanation: deep within the meat itself, to the culpable nutrient, which scientists have long assumed to be the saturated fat. So they are baffled when large-population studies, like the Women’s Health Initiative, fail to find that reducing fat intake significantly reduces the incidence of heart disease or cancer.

Of course thanks to the low-fat fad (inspired by the very same reductionist fat hypothesis), it is entirely possible to reduce your intake of saturated fat without significantly reducing your consumption of animal protein: just drink the low-fat milk and order the skinless chicken breast or the turkey bacon. So maybe the culprit nutrient in meat and dairy is the animal protein itself, as some researchers now hypothesize. (The Cornell nutritionist T. Colin Campbell argues as much in his recent book, “The China Study.”) Or, as the Harvard epidemiologist Walter C. Willett suggests, it could be the steroid hormones typically present in the milk and meat; these hormones (which occur naturally in meat and milk but are often augmented in industrial production) are known to promote certain cancers.

But people worried about their health needn’t wait for scientists to settle this question before deciding that it might be wise to eat more plants and less meat. This is of course precisely what the McGovern committee was trying to tell us.

Nestle also cautions against taking the diet out of the context of the lifestyle. The Mediterranean diet is widely believed to be one of the most healthful ways to eat, yet much of what we know about it is based on studies of people living on the island of Crete in the 1950s, who in many respects lived lives very different from our own. Yes, they ate lots of olive oil and little meat. But they also did more physical labor. They fasted regularly. They ate a lot of wild greens — weeds. And, perhaps most important, they consumed far fewer total calories than we do. Similarly, much of what we know about the health benefits of a vegetarian diet is based on studies of Seventh-day Adventists, who muddy the nutritional picture by drinking absolutely no alcohol and never smoking. These extraneous but unavoidable factors are called, aptly, “confounders.” One last example: People who take supplements are healthier than the population at large, but their health probably has nothing whatsoever to do with the supplements they take — which recent studies have suggested are worthless. Supplement-takers are better-educated, more-affluent people who, almost by definition, take a greater-than-normal interest in personal health — confounding factors that probably account for their superior health.

But if confounding factors of lifestyle bedevil comparative studies of different populations, the supposedly more rigorous “prospective” studies of large American populations suffer from their own arguably even more disabling flaws. In these studies — of which the Women’s Health Initiative is the best known — a large population is divided into two groups. The intervention group changes its diet in some prescribed manner, while the control group does not. The two groups are then tracked over many years to learn whether the intervention affects relative rates of chronic disease.

When it comes to studying nutrition, this sort of extensive, long-term clinical trial is supposed to be the gold standard. It certainly sounds sound. In the case of the Women’s Health Initiative, sponsored by the National Institutes of Health, the eating habits and health outcomes of nearly 49,000 women (ages 50 to 79 at the beginning of the study) were tracked for eight years. One group of the women was told to reduce their consumption of fat to 20 percent of total calories. The results were announced early last year, producing front-page headlines of which the one in this newspaper was typical: “Low-Fat Diet Does Not Cut Health Risks, Study Finds.” And the cloud of nutritional confusion over the country darkened.

But even a cursory analysis of the study’s methods makes you wonder why anyone would take such a finding seriously, let alone order a Quarter Pounder With Cheese to celebrate it, as many newspaper readers no doubt promptly went out and did. Even the beginning student of nutritionism will immediately spot several flaws: the focus was on “fat,” rather than on any particular food, like meat or dairy. So women could comply simply by switching to lower-fat animal products. Also, no distinctions were made between types of fat: women getting their allowable portion of fat from olive oil or fish were lumped together with women getting their fat from low-fat cheese or chicken breasts or margarine. Why? Because when the study was designed 16 years ago, the whole notion of “good fats” was not yet on the scientific scope. Scientists study what scientists can see.

But perhaps the biggest flaw in this study, and other studies like it, is that we have no idea what these women were really eating because, like most people when asked about their diet, they lied about it. How do we know this? Deduction. Consider: When the study began, the average participant weighed in at 170 pounds and claimed to be eating 1,800 calories a day. It would take an unusual metabolism to maintain that weight on so little food. And it would take an even freakier metabolism to drop only one or two pounds after getting down to a diet of 1,400 to 1,500 calories a day — as the women on the “low-fat” regimen claimed to have done. Sorry, ladies, but I just don’t buy it.

In fact, nobody buys it. Even the scientists who conduct this sort of research conduct it in the knowledge that people lie about their food intake all the time. They even have scientific figures for the magnitude of the lie. Dietary trials like the Women’s Health Initiative rely on “food-frequency questionnaires,” and studies suggest that people on average eat between a fifth and a third more than they claim to on the questionnaires. How do the researchers know that? By comparing what people report on questionnaires with interviews about their dietary intake over the previous 24 hours, thought to be somewhat more reliable. In fact, the magnitude of the lie could be much greater, judging by the huge disparity between the total number of food calories produced every day for each American (3,900 calories) and the average number of those calories Americans own up to chomping: 2,000. (Waste accounts for some of the disparity, but nowhere near all of it.) All we really know about how much people actually eat is that the real number lies somewhere between those two figures.

To try to fill out the food-frequency questionnaire used by the Women’s Health Initiative, as I recently did, is to realize just how shaky the data on which such trials rely really are. The survey, which took about 45 minutes to complete, started off with some relatively easy questions: “Did you eat chicken or turkey during the last three months?” Having answered yes, I was then asked, “When you ate chicken or turkey, how often did you eat the skin?” But the survey soon became harder, as when it asked me to think back over the past three months to recall whether when I ate okra, squash or yams, they were fried, and if so, were they fried in stick margarine, tub margarine, butter, “shortening” (in which category they inexplicably lump together hydrogenated vegetable oil and lard), olive or canola oil or nonstick spray? I honestly didn’t remember, and in the case of any okra eaten in a restaurant, even a hypnotist could not get out of me what sort of fat it was fried in. In the meat section, the portion sizes specified haven’t been seen in America since the Hoover administration. If a four-ounce portion of steak is considered “medium,” was I really going to admit that the steak I enjoyed on an unrecallable number of occasions during the past three months was probably the equivalent of two or three (or, in the case of a steakhouse steak, no less than four) of these portions? I think not. In fact, most of the “medium serving sizes” to which I was asked to compare my own consumption made me feel piggish enough to want to shave a few ounces here, a few there. (I mean, I wasn’t under oath or anything, was I?)

This is the sort of data on which the largest questions of diet and health are being decided in America today.

THE ELEPHANT IN THE ROOM

In the end, the biggest, most ambitious and widely reported studies of diet and health leave more or less undisturbed the main features of the Western diet: lots of meat and processed foods, lots of added fat and sugar, lots of everything — except fruits, vegetables and whole grains. In keeping with the nutritionism paradigm and the limits of reductionist science, the researchers fiddle with single nutrients as best they can, but the populations they recruit and study are typical American eaters doing what typical American eaters do: trying to eat a little less of this nutrient, a little more of that, depending on the latest thinking. (One problem with the control groups in these studies is that they too are exposed to nutritional fads in the culture, so over time their eating habits come to more closely resemble the habits of the intervention group.) It should not surprise us that the findings of such research would be so equivocal and confusing.

But what about the elephant in the room — the Western diet? It might be useful, in the midst of our deepening confusion about nutrition, to review what we do know about diet and health. What we know is that people who eat the way we do in America today suffer much higher rates of cancer, heart disease, diabetes and obesity than people eating more traditional diets. (Four of the 10 leading killers in America are linked to diet.) Further, we know that simply by moving to America, people from nations with low rates of these “diseases of affluence” will quickly acquire them. Nutritionism by and large takes the Western diet as a given, seeking to moderate its most deleterious effects by isolating the bad nutrients in it — things like fat, sugar, salt — and encouraging the public and the food industry to limit them. But after several decades of nutrient-based health advice, rates of cancer and heart disease in the U.S. have declined only slightly (mortality from heart disease is down since the ’50s, but this is mainly because of improved treatment), and rates of obesity and diabetes have soared.

No one likes to admit that his or her best efforts at understanding and solving a problem have actually made the problem worse, but that’s exactly what has happened in the case of nutritionism. Scientists operating with the best of intentions, using the best tools at their disposal, have taught us to look at food in a way that has diminished our pleasure in eating it while doing little or nothing to improve our health. Perhaps what we need now is a broader, less reductive view of what food is, one that is at once more ecological and cultural. What would happen, for example, if we were to start thinking about food as less of a thing and more of a relationship?

In nature, that is of course precisely what eating has always been: relationships among species in what we call food chains, or webs, that reach all the way down to the soil. Species co-evolve with the other species they eat, and very often a relationship of interdependence develops: I’ll feed you if you spread around my genes. A gradual process of mutual adaptation transforms something like an apple or a squash into a nutritious and tasty food for a hungry animal. Over time and through trial and error, the plant becomes tastier (and often more conspicuous) in order to gratify the animal’s needs and desires, while the animal gradually acquires whatever digestive tools (enzymes, etc.) are needed to make optimal use of the plant. Similarly, cow’s milk did not start out as a nutritious food for humans; in fact, it made them sick until humans who lived around cows evolved the ability to digest lactose as adults. This development proved much to the advantage of both the milk drinkers and the cows.

“Health” is, among other things, the byproduct of being involved in these sorts of relationships in a food chain — involved in a great many of them, in the case of an omnivorous creature like us. Further, when the health of one link of the food chain is disturbed, it can affect all the creatures in it. When the soil is sick or in some way deficient, so will be the grasses that grow in that soil and the cattle that eat the grasses and the people who drink the milk. Or, as the English agronomist Sir Albert Howard put it in 1945 in “The Soil and Health” (a founding text of organic agriculture), we would do well to regard “the whole problem of health in soil, plant, animal and man as one great subject.” Our personal health is inextricably bound up with the health of the entire food web.

In many cases, long familiarity between foods and their eaters leads to elaborate systems of communications up and down the food chain, so that a creature’s senses come to recognize foods as suitable by taste and smell and color, and our bodies learn what to do with these foods after they pass the test of the senses, producing in anticipation the chemicals necessary to break them down. Health depends on knowing how to read these biological signals: this smells spoiled; this looks ripe; that’s one good-looking cow. This is easier to do when a creature has long experience of a food, and much harder when a food has been designed expressly to deceive its senses — with artificial flavors, say, or synthetic sweeteners.

Note that these ecological relationships are between eaters and whole foods, not nutrients. Even though the foods in question eventually get broken down in our bodies into simple nutrients, as corn is reduced to simple sugars, the qualities of the whole food are not unimportant — they govern such things as the speed at which the sugars will be released and absorbed, which we’re coming to see as critical to insulin metabolism. Put another way, our bodies have a longstanding and sustainable relationship to corn that we do not have to high-fructose corn syrup. Such a relationship with corn syrup might develop someday (as people evolve superhuman insulin systems to cope with regular floods of fructose and glucose), but for now the relationship leads to ill health because our bodies don’t know how to handle these biological novelties. In much the same way, human bodies that can cope with chewing coca leaves — a longstanding relationship between native people and the coca plant in South America — cannot cope with cocaine or crack, even though the same “active ingredients” are present in all three. Reductionism as a way of understanding food or drugs may be harmless, even necessary, but reductionism in practice can lead to problems.

Looking at eating through this ecological lens opens a whole new perspective on exactly what the Western diet is: a radical and rapid change not just in our foodstuffs over the course of the 20th century but also in our food relationships, all the way from the soil to the meal. The ideology of nutritionism is itself part of that change. To get a firmer grip on the nature of those changes is to begin to know how we might make our relationships to food healthier. These changes have been numerous and far-reaching, but consider as a start these four large-scale ones:

From Whole Foods to Refined. The case of corn points up one of the key features of the modern diet: a shift toward increasingly refined foods, especially carbohydrates. Call it applied reductionism. Humans have been refining grains since at least the Industrial Revolution, favoring white flour (and white rice) even at the price of lost nutrients. Refining grains extends their shelf life (precisely because it renders them less nutritious to pests) and makes them easier to digest, by removing the fiber that ordinarily slows the release of their sugars. Much industrial food production involves an extension and intensification of this practice, as food processors find ways to deliver glucose — the brain’s preferred fuel — ever more swiftly and efficiently. Sometimes this is precisely the point, as when corn is refined into corn syrup; other times it is an unfortunate byproduct of food processing, as when freezing food destroys the fiber that would slow sugar absorption.

So fast food is fast in this other sense too: it is to a considerable extent predigested, in effect, and therefore more readily absorbed by the body. But while the widespread acceleration of the Western diet offers us the instant gratification of sugar, in many people (and especially those newly exposed to it) the “speediness” of this food overwhelms the insulin response and leads to Type II diabetes. As one nutrition expert put it to me, we’re in the middle of “a national experiment in mainlining glucose.” To encounter such a diet for the first time, as when people accustomed to a more traditional diet come to America, or when fast food comes to their countries, delivers a shock to the system. Public-health experts call it “the nutrition transition,” and it can be deadly.

From Complexity to Simplicity. If there is one word that covers nearly all the changes industrialization has made to the food chain, it would be simplification. Chemical fertilizers simplify the chemistry of the soil, which in turn appears to simplify the chemistry of the food grown in that soil. Since the widespread adoption of synthetic nitrogen fertilizers in the 1950s, the nutritional quality of produce in America has, according to U.S.D.A. figures, declined significantly. Some researchers blame the quality of the soil for the decline; others cite the tendency of modern plant breeding to select for industrial qualities like yield rather than nutritional quality. Whichever it is, the trend toward simplification of our food continues on up the chain. Processing foods depletes them of many nutrients, a few of which are then added back in through “fortification”: folic acid in refined flour, vitamins and minerals in breakfast cereal. But food scientists can add back only the nutrients food scientists recognize as important. What are they overlooking?

Simplification has occurred at the level of species diversity, too. The astounding variety of foods on offer in the modern supermarket obscures the fact that the actual number of species in the modern diet is shrinking. For reasons of economics, the food industry prefers to tease its myriad processed offerings from a tiny group of plant species, corn and soybeans chief among them. Today, a mere four crops account for two-thirds of the calories humans eat. When you consider that humankind has historically consumed some 80,000 edible species, and that 3,000 of these have been in widespread use, this represents a radical simplification of the food web. Why should this matter? Because humans are omnivores, requiring somewhere between 50 and 100 different chemical compounds and elements to be healthy. It’s hard to believe that we can get everything we need from a diet consisting largely of processed corn, soybeans, wheat and rice.

From Leaves to Seeds. It’s no coincidence that most of the plants we have come to rely on are grains; these crops are exceptionally efficient at transforming sunlight into macronutrients — carbs, fats and proteins. These macronutrients in turn can be profitably transformed into animal protein (by feeding them to animals) and processed foods of every description. Also, the fact that grains are durable seeds that can be stored for long periods means they can function as commodities as well as food, making these plants particularly well suited to the needs of industrial capitalism.

The needs of the human eater are another matter. An oversupply of macronutrients, as we now have, itself represents a serious threat to our health, as evidenced by soaring rates of obesity and diabetes. But the undersupply of micronutrients may constitute a threat just as serious. Put in the simplest terms, we’re eating a lot more seeds and a lot fewer leaves, a tectonic dietary shift the full implications of which we are just beginning to glimpse. If I may borrow the nutritionist’s reductionist vocabulary for a moment, there are a host of critical micronutrients that are harder to get from a diet of refined seeds than from a diet of leaves. There are the antioxidants and all the other newly discovered phytochemicals (remember that sprig of thyme?); there is the fiber, and then there are the healthy omega-3 fats found in leafy green plants, which may turn out to be the most important benefit of all.

Most people associate omega-3 fatty acids with fish, but fish get them from green plants (specifically algae), which is where they all originate. Plant leaves produce these essential fatty acids (“essential” because our bodies can’t produce them on their own) as part of photosynthesis. Seeds contain more of another essential fatty acid: omega-6. Without delving too deeply into the biochemistry, the two fats perform very different functions, in the plant as well as the plant eater. Omega-3s appear to play an important role in neurological development and processing, the permeability of cell membranes, the metabolism of glucose and the calming of inflammation. Omega-6s are involved in fat storage (which is what they do for the plant), the rigidity of cell membranes, clotting and the inflammation response. (Think of omega-3s as fleet and flexible, omega-6s as sturdy and slow.) Since the two lipids compete with each other for the attention of important enzymes, the ratio between omega-3s and omega-6s may matter more than the absolute quantity of either fat. Thus too much omega-6 may be just as much a problem as too little omega-3.

And that might well be a problem for people eating a Western diet. As we’ve shifted from leaves to seeds, the ratio of omega-6s to omega-3s in our bodies has shifted, too. At the same time, modern food-production practices have further diminished the omega-3s in our diet. Omega-3s, being less stable than omega-6s, spoil more readily, so we have selected for plants that produce fewer of them; further, when we partly hydrogenate oils to render them more stable, omega-3s are eliminated. Industrial meat, raised on seeds rather than leaves, has fewer omega-3s and more omega-6s than preindustrial meat used to have. And official dietary advice since the 1970s has promoted the consumption of polyunsaturated vegetable oils, most of which are high in omega-6s (corn and soy, especially). Thus, without realizing what we were doing, we significantly altered the ratio of these two essential fats in our diets and bodies, with the result that the ratio of omega-6 to omega-3 in the typical American today stands at more than 10 to 1; before the widespread introduction of seed oils at the turn of the last century, it was closer to 1 to 1.

The role of these lipids is not completely understood, but many researchers say that these historically low levels of omega-3 (or, conversely, high levels of omega-6) bear responsibility for many of the chronic diseases associated with the Western diet, especially heart disease and diabetes. (Some researchers implicate omega-3 deficiency in rising rates of depression and learning disabilities as well.) To remedy this deficiency, nutritionism classically argues for taking omega-3 supplements or fortifying food products, but because of the complex, competitive relationship between omega-3 and omega-6, adding more omega-3s to the diet may not do much good unless you also reduce your intake of omega-6.

From Food Culture to Food Science. The last important change wrought by the Western diet is not, strictly speaking, ecological. But the industrialization of our food that we call the Western diet is systematically destroying traditional food cultures. Before the modern food era — and before nutritionism — people relied for guidance about what to eat on their national or ethnic or regional cultures. We think of culture as a set of beliefs and practices to help mediate our relationship to other people, but of course culture (at least before the rise of science) has also played a critical role in helping mediate people’s relationship to nature. Eating being a big part of that relationship, cultures have had a great deal to say about what and how and why and when and how much we should eat. Of course when it comes to food, culture is really just a fancy word for Mom, the figure who typically passes on the food ways of the group — food ways that, although they were never “designed” to optimize health (we have many reasons to eat the way we do), would not have endured if they did not keep eaters alive and well.

The sheer novelty and glamour of the Western diet, with its 17,000 new food products introduced every year, and the marketing muscle used to sell these products, has overwhelmed the force of tradition and left us where we now find ourselves: relying on science and journalism and marketing to help us decide questions about what to eat. Nutritionism, which arose to help us better deal with the problems of the Western diet, has largely been co-opted by it, used by the industry to sell more food and to undermine the authority of traditional ways of eating. You would not have read this far into this article if your food culture were intact and healthy; you would simply eat the way your parents and grandparents and great-grandparents taught you to eat. The question is, Are we better off with these new authorities than we were with the traditional authorities they supplanted? The answer by now should be clear.

It might be argued that, at this point in history, we should simply accept that fast food is our food culture. Over time, people will get used to eating this way and our health will improve. But for natural selection to help populations adapt to the Western diet, we’d have to be prepared to let those whom it sickens die. That’s not what we’re doing. Rather, we’re turning to the health-care industry to help us “adapt.” Medicine is learning how to keep alive the people whom the Western diet is making sick. It’s gotten good at extending the lives of people with heart disease, and now it’s working on obesity and diabetes. Capitalism is itself marvelously adaptive, able to turn the problems it creates into lucrative business opportunities: diet pills, heart-bypass operations, insulin pumps, bariatric surgery. But while fast food may be good business for the health-care industry, surely the cost to society — estimated at more than $200 billion a year in diet-related health-care costs — is unsustainable.

BEYOND NUTRITIONISM

To medicalize the diet problem is of course perfectly consistent with nutritionism. So what might a more ecological or cultural approach to the problem recommend? How might we plot our escape from nutritionism and, in turn, from the deleterious effects of the modern diet? In theory nothing could be simpler — stop thinking and eating that way — but this is somewhat harder to do in practice, given the food environment we now inhabit and the loss of sharp cultural tools to guide us through it. Still, I do think escape is possible, to which end I can now revisit — and elaborate on, but just a little — the simple principles of healthy eating I proposed at the beginning of this essay, several thousand words ago. So try these few (flagrantly unscientific) rules of thumb, collected in the course of my nutritional odyssey, and see if they don’t at least point us in the right direction.

1. Eat food. Though in our current state of confusion, this is much easier said than done. So try this: Don’t eat anything your great-great-grandmother wouldn’t recognize as food. (Sorry, but at this point Moms are as confused as the rest of us, which is why we have to go back a couple of generations, to a time before the advent of modern food products.) There are a great many foodlike items in the supermarket your ancestors wouldn’t recognize as food (Go-Gurt? Breakfast-cereal bars? Nondairy creamer?); stay away from these.

2. Avoid even those food products that come bearing health claims. They’re apt to be heavily processed, and the claims are often dubious at best. Don’t forget that margarine, one of the first industrial foods to claim that it was more healthful than the traditional food it replaced, turned out to give people heart attacks. When Kellogg’s can boast about its Healthy Heart Strawberry Vanilla cereal bars, health claims have become hopelessly compromised. (The American Heart Association charges food makers for their endorsement.) Don’t take the silence of the yams as a sign that they have nothing valuable to say about health.

3. Especially avoid food products containing ingredients that are a) unfamiliar, b) unpronounceable, c) more than five in number — or that contain high-fructose corn syrup. None of these characteristics are necessarily harmful in and of themselves, but all of them are reliable markers for foods that have been highly processed.

4. Get out of the supermarket whenever possible. You won’t find any high-fructose corn syrup at the farmer’s market; you also won’t find food harvested long ago and far away. What you will find are fresh whole foods picked at the peak of nutritional quality. Precisely the kind of food your great-great-grandmother would have recognized as food.

5. Pay more, eat less. The American food system has for a century devoted its energies and policies to increasing quantity and reducing price, not to improving quality. There’s no escaping the fact that better food — measured by taste or nutritional quality (which often correspond) — costs more, because it has been grown or raised less intensively and with more care. Not everyone can afford to eat well in America, which is shameful, but most of us can: Americans spend, on average, less than 10 percent of their income on food, down from 24 percent in 1947, and less than the citizens of any other nation. And those of us who can afford to eat well should. Paying more for food well grown in good soils — whether certified organic or not — will contribute not only to your health (by reducing exposure to pesticides) but also to the health of others who might not themselves be able to afford that sort of food: the people who grow it and the people who live downstream, and downwind, of the farms where it is grown.

“Eat less” is the most unwelcome advice of all, but in fact the scientific case for eating a lot less than we currently do is compelling. “Calorie restriction” has repeatedly been shown to slow aging in animals, and many researchers (including Walter Willett, the Harvard epidemiologist) believe it offers the single strongest link between diet and cancer prevention. Food abundance is a problem, but culture has helped here, too, by promoting the idea of moderation. Once one of the longest-lived people on earth, the Okinawans practiced a principle they called “Hara Hachi Bu”: eat until you are 80 percent full. To make the “eat less” message a bit more palatable, consider that quality may have a bearing on quantity: I don’t know about you, but the better the quality of the food I eat, the less of it I need to feel satisfied. All tomatoes are not created equal.

6. Eat mostly plants, especially leaves. Scientists may disagree on what’s so good about plants — the antioxidants? Fiber? Omega-3s? — but they do agree that they’re probably really good for you and certainly can’t hurt. Also, by eating a plant-based diet, you’ll be consuming far fewer calories, since plant foods (except seeds) are typically less “energy dense” than the other things you might eat. Vegetarians are healthier than carnivores, but near vegetarians (“flexitarians”) are as healthy as vegetarians. Thomas Jefferson was on to something when he advised treating meat more as a flavoring than a food.

7. Eat more like the French. Or the Japanese. Or the Italians. Or the Greeks. Confounding factors aside, people who eat according to the rules of a traditional food culture are generally healthier than we are. Any traditional diet will do: if it weren’t a healthy diet, the people who follow it wouldn’t still be around. True, food cultures are embedded in societies and economies and ecologies, and some of them travel better than others: Inuit not so well as Italian. In borrowing from a food culture, pay attention to how a culture eats, as well as to what it eats. In the case of the French paradox, it may not be the dietary nutrients that keep the French healthy (lots of saturated fat and alcohol?!) so much as the dietary habits: small portions, no seconds or snacking, communal meals — and the serious pleasure taken in eating. (Worrying about diet can’t possibly be good for you.) Let culture be your guide, not science.

8. Cook. And if you can, plant a garden. To take part in the intricate and endlessly interesting processes of providing for our sustenance is the surest way to escape the culture of fast food and the values implicit in it: that food should be cheap and easy; that food is fuel and not communion. The culture of the kitchen, as embodied in those enduring traditions we call cuisines, contains more wisdom about diet and health than you are apt to find in any nutrition journal or journalism. Plus, the food you grow yourself contributes to your health long before you sit down to eat it. So you might want to think about putting down this article now and picking up a spatula or hoe.

9. Eat like an omnivore. Try to add new species, not just new foods, to your diet. The greater the diversity of species you eat, the more likely you are to cover all your nutritional bases. That of course is an argument from nutritionism, but there is a better one, one that takes a broader view of “health.” Biodiversity in the diet means less monoculture in the fields. What does that have to do with your health? Everything. The vast monocultures that now feed us require tremendous amounts of chemical fertilizers and pesticides to keep from collapsing. Diversifying those fields will mean fewer chemicals, healthier soils, healthier plants and animals and, in turn, healthier people. It’s all connected, which is another way of saying that your health isn’t bordered by your body and that what’s good for the soil is probably good for you, too.

Michael Pollan, a contributing writer, is the Knight professor of journalism at the University of California, Berkeley. His most recent book, “The Omnivore’s Dilemma,” was chosen by the editors of The New York Times Book Review as one of the 10 best books of 2006.

Costco impulse shopping

NYTimes

January 28, 2007
Spending
24 Rolls of Toilet Paper, a Tub of Salsa and a Plasma TV
By JULIE BICK

SHOPPING at Costco often goes something like this: Customer comes to buy bulk necessities like toilet paper and dish detergent. Customer buys those items, as well as a pack of giant muffins, three cashmere sweaters and a power tool.

It’s more than impulse buying. It is a calculated part of the company’s business plan. Call it the Costco effect.

“We always come out with too much,” said Linda Curtis Schneider, who lives in Nashville. “It’s hard to get out of there for under $200.”

Even when they are on vacation, the Schneider family seeks out the nearest Costco to gas up their rental car, grab a familiar lunch and browse for local specialties to bring back home. They have bought cases of chocolate-covered macadamia nuts from a Costco in Hawaii, gallon-sized salsa in Tucson, Ariz., and a crate of ruby red grapefruits in Marina del Rey, Calif.

The Costco Wholesale Corporation, based in Issaquah, Wash., aims to offer an inviting mix of necessities and indulgences — bulk detergent and megapacks of yogurts, stocked along with giant plasma TVs and crystal stemware.

From its first Seattle warehouse in 1983, Costco has grown to more than 500 warehouse stores worldwide and finished the 2006 fiscal year with its highest-ever sales, $58.96 billion. Costco is the largest player in the warehouse market. The rival Sam’s Club, a division of Wal-Mart Stores, operates more than 670 warehouse clubs worldwide, with a sales volume of approximately $40 billion.

Richard A. Galanti, Costco’s chief financial officer, said that while a grocery store might stock 40,000 separate types of items, and a Wal-Mart might stock 100,000, Costco will stock only the 4,000 most popular items it can find. “We try to figure out what people really want,” he said.

So, along with purchases of jumbo packs of paper towels and other supplies, impulse buying can be a big part of the Costco experience, because only the most well-liked, trendy, and fast-moving items are stocked.

Those items include iPods and individually wrapped cheese sticks to put in a child’s lunch box, as well as a few of the latest fashions.

Recently, Ms. Schneider and her college-age daughter were excited to find Ugg boots, Smashbox makeup in leather cases and Seven jeans at their Costco in Nashville. “Costco seems to go for the upper crust in taste,” she said.

Some offerings rotate in and out of the warehouse based on the season, sales volume and other factors. As a result, people may go to Costco more often than necessary to see what is new, said Steve Hoch, a retail professor at the Wharton School of the University of Pennsylvania. “When they see something they want,” he added, “they’ll be likely to go ahead and buy it, because next time they return, the item may be gone.”

While most consumers become annoyed when something they expect to find at a store is out of stock, a Costco shopper is likely to think, “I should have gotten it last time,” Professor Hoch said.

Other retailers may also seek to entice shoppers by setting limits and creating scarcity. For example, Target offers limited-edition designer clothing and home furnishings that are unique to its stores, and that are often stocked for a period of only 60 to 90 days.

And at BJ’s Wholesale Club, customers may come for their everyday grocery items, “but if they spot some jewelry or the new capri pants at a great price they will be happier,” said Teleia Farrell, a company spokeswoman. BJ’s uses items like 42-inch televisions and topaz rings to turn “ho-hum shopping into an exciting environment,” she said.

It is the same at Sam’s Club, where “members enjoy looking throughout the club for unexpected deals,” said Susan Koehler, a spokeswoman for the company.

Temporarily stocked surprises are also a calculated part of the Costco shopping experience. “We try to have hundreds of items that are different each time a customer comes to the warehouse, to create a treasure-hunt atmosphere,” said Joel Benoliel, a senior vice president. “We’ll always have the same staples — the cereal, the detergent — and then we add in the ‘wow’ items.” But at the same time, there can be a comforting sameness to each cavernous location.

Psychological factors can strongly influence buying behavior, according to Pamela N. Danziger, author of “Shopping: Why We Love It and How Retailers Can Create the Ultimate Customer Experience” (2006). Shoppers can experience an emotional thrill when they spot a deep discount, or find a particular item before it disappears from the shelves, she said, and creating those kinds of feelings has helped Costco. “Shopping is recreational there,” she said. “People seek out this psychological reward.”

Ted Reisdorf, 43, chief executive of Paragon Custom Homes of Scottsdale, Ariz., goes to Costco once every month or two and stocks up on household supplies, to save him more frequent trips to the grocery store. Once he is there, however, he walks up and down every aisle to see “what jumps out” at him. Mr. Reisdorf usually adds some books, DVDs or baked goods to his cart. “I always buy stuff I don’t exactly need,” he said.

Everyone seems to have an opinion about the Costco shopping experience. Some say they avoid going there because they always spend too much money. Others say they do not mind overspending at Costco because the company treats its workers well. A typical full-time cashier will earn $40,000 a year plus benefits after four years with the company.

Others, however, decry the essence of Costco. Teri Franklin, a mother of two in Seattle, said that Costco fed American consumerism and waste. “Instead of a single board game, you’re offered seven shrink-wrapped together,” she said. “You’ll probably end up playing with a couple and the rest will sit in the closet. But you really only wanted one.” She said she was not tempted to buy anything beyond bottled water and diapers at Costco. “How many things do you need 42 of, really?” she asked.

FOR those who want to minimize impulse buying, consumer experts say, it is helpful to shop as infrequently as possible, to arrive at the store with a list and a budget, and to walk down only the aisles that contain an item on the list. Conventional wisdom would also say that it is a good idea not to shop when hungry.

But those are not the types of shoppers who have made Costco successful. Professor Hoch said that increasing impulse buying or the number of items bought per visit was crucial to the company’s success.

Costco makes the bulk of its profit by charging an annual membership fee for access to its stores, he noted. A larger membership allows the company to buy items in bigger quantities and to pass along savings to customers. Customers who buy more items may feel that the membership fee is worth paying, because the cost is spread over all the products they buy.

Current annual membership rates are $50 for an individual, couple or business, and $100 for an Executive Membership, entitling the customer to other services.
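That arithmetic is worth making concrete. Below is a quick sketch (in Python, not from the article) of how a flat membership fee amortizes across a year of shopping: the more a member buys, the smaller the fee looms as a share of total spending. The $50 fee is the article’s figure; the annual-spend amounts are hypothetical illustrations.

```python
# How a flat membership fee amortizes across a year of purchases.
# The $50 fee comes from the article; the spend levels are hypothetical.

def fee_overhead_pct(annual_fee: float, annual_spend: float) -> float:
    """Membership fee as a percentage of a member's annual spending."""
    return 100 * annual_fee / annual_spend

for spend in (1_000, 3_000, 10_000):
    print(f"${spend:>6,}/yr spend -> {fee_overhead_pct(50, spend):.1f}% effective overhead")
```

At $1,000 a year the fee acts like a 5 percent surcharge; at $10,000 it is half a percent — one way to read Professor Hoch’s point that the more customers buy, the more the membership feels worth paying.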

“People laughed at the idea of charging someone to shop at your warehouse, but our membership fees are north of $1 billion a year,” Mr. Benoliel of Costco said. The company has more than 24 million member households in the United States and Canada.

Crucial to the company’s continued growth will be people like the Schneiders, who find shopping at Costco both utilitarian and serendipitous. “I might be going in for lettuce,” said Ms. Schneider, who on the spur of the moment once bought a $2,000 baby grand electronic piano at Costco, “but if I come out with other things, I don’t mind.”

Tuesday, January 23, 2007

Why are Americans single?

NYTimes

January 21, 2007
Ideas & Trends
Why Are There So Many Single Americans?
By KATE ZERNIKE

THE news that 51 percent of all women live without a spouse might be enough to make you invest in cat futures.

But consider, too, the flip side: about half of all men find themselves in the same situation. As the number of people marrying has dropped off in the last 45 years, the marriage rate has declined equally for men and for women.

The stereotype has been cemented in the popular culture: the hard-charging career girl who gets her comeuppance, either violently or through a slow death by late-night memo and Chinese takeout. Think Glenn Close in “Fatal Attraction” and Sigourney Weaver in “Working Girl,” two enduring icons. In last year’s model, Meryl Streep in “The Devil Wears Prada” ends up single, if still singularly successful.

But when it comes to marriage, the two Americas aren’t divided by gender. And it’s not the career girls on the losing end. It’s their less educated manicurists or housekeepers, women who might arguably be less able to live on their own.

The emerging gulf is instead one of class — what demographers, sociologists and those who study the often depressing statistics about the wedded state call a “marriage gap” between the well-off and the less so.

Statistics show that college-educated women are more likely to marry than non-college-educated women — although they marry, on average, two years later. The popular image might have been true even 20 years ago — though generally speaking, most women probably didn’t boil the bunny rabbit the way Ms. Close’s character did in 1987. In the past, less educated women often “married up.” In “Working Girl,” Melanie Griffith triumphs. Now, marriage has become more a union of equals; when highly educated men marry, they tend to marry highly educated women. Today, Harrison Ford and Sigourney Weaver would live happily ever after.

Women with more education are also becoming less likely to divorce, or to be inclined to divorce, than those with less education. They are even less likely to be widowed; all in all, they are less likely to end up alone.

“Educated women used to have a difficult time,” said David Popenoe, co-director of the National Marriage Project at Rutgers University. “Now they’re the most desired.” In Princeton, where he lives, men used to marry “way down the line,” Mr. Popenoe said. No more.

The difference extends across race lines: black women are significantly less likely to marry than white women, but among blacks, women with a college education are more likely to marry than those without one.

Among women ages 25-34, 59 percent of college graduates are married, compared with 51 percent of non-college graduates, according to an analysis of the Census Bureau’s June 2006 Current Population Survey by Steven P. Martin, a sociologist at the University of Maryland. The same is true at older age groups: the difference is 75 percent to 62 percent for those ages 35-44, and 50 percent to 41 percent among those 65 and older.

The difference is smaller between men and women. According to the census, 55 percent of men are married, down from 69.3 percent in 1960, and 51.5 percent of women are, down from 65.9 percent in 1960.

The number of women living without a spouse is greater largely because women live longer, leaving them more likely to be widowed. Older men are also more likely to remarry. To control for these variables, consider 35-44 year olds. In 2005, according to the census, 66.2 percent of men in this age group were married, down from 88 percent in 1960; 67.2 percent of women were married, down from 87.4 percent.

The marriage gap exists for men, too. But particularly at younger ages, it is not nearly as wide as it is among women.

Commitment-averse men in their 20’s and 30’s, it turns out, look the same whether or not they have a college degree. In surveys and focus groups, they fit depressingly well into the old stereotypes: they fear marriage means a loss of liberty; they worry a wife will want to change them. They don’t trust women to tell the truth about past relationships, or they are waiting for the soul mate who hasn’t appeared. With the rising frequency of cohabitation, they can get sex without marriage, and they might lose their hard-earned money in a divorce, so what’s the rush?

As a Marriage Project report concluded, with no biological or sociological clock ticking, “boys can remain boys indefinitely.”

But that gap widens among older men. Among men ages 25 to 34, 50 percent of college graduates are married, compared to 47 percent of those who did not graduate from college. In older age brackets, there is a difference of 12 percentage points.

The class gap happens in large part because, as Christopher Jencks, a professor of social policy at Harvard, said, “like marries like.”

“If you wanted to predict the characteristics of who I would marry,” he said, “knowing my education, the strongest correlation you could observe is that someone who is educated is more likely to marry someone who is educated, and someone who is not educated is more likely to marry someone who is not educated.”

Why have things changed so much for women who don’t have the choices that educated women have? While marriage used to be something you did before launching a life or career, now it is seen as something you do after you’re financially stable — when you can buy a house, say. The same is true for all classes. But the less educated may not get there.

“Women are saying, ‘I’m not ready, I want to work for a while, the guys I hang around with don’t make enough money and they don’t want a commitment,’ ” Mr. Jencks said. “It’s the same thing a lot of African-American women in poor neighborhoods are saying. But there’s the difference that they’re having children.”

Women of all education levels figure their earning power will flatten out after they have children, he said. “The longer you wait, the higher the level it flattens out at,” he said. “That’s a good argument to wait. For the less educated, there isn’t a steep increase in salary, so there’s less incentive to wait.”

Maybe in the past, a man with little education nevertheless had a good-paying manufacturing job, with a health care and pension plan. He was a catch and represented stability.

Today, it may be hyperbolic to talk about the emasculation of the blue-collar man. But it is not only liberals concerned with the wealth gap who are watching these national trends with alarm. Social and religious conservatives have called on society to do more to address economic strains faced by this class.

“Marriage is more difficult today than it was in the past,” Mr. Popenoe said. “The people who excel in one area probably excel in the other area, too. And people who are high school dropouts probably have a higher propensity to drop out of marriage.”

The last 30 years have seen a huge shift in educated women’s attitudes about divorce. Mr. Martin, who has written about women and divorce, said that three decades ago, about 30 percent of women who had graduated from college said it should be harder to get a divorce. Now, about 65 percent say so, he said.

But for less educated women and for men, the numbers have not changed; only 40 percent — a minority — say it should be harder to get a divorce.

“The way we used to look at marriage was that if women were highly educated, they had higher earning power, they were more culturally liberal and people might have predicted less marriage among them,” Mr. Martin said. “What’s becoming more powerful is the idea that economic resources are conducive to stable marriages. Women who have more money or the potential for more money are married to men who have more stable income.”

All this leads to a happiness gap, too. According to the Marriage Project, the percentage of spouses who rate their marriage as “very happy” has dropped among those without a college education, while it has risen or held steady among those better educated.

The better educated husbands and wives tend to share intellectual interests and economic backgrounds, as well as ideas about the division of household roles. They also have more earning power. And as in so many other things, in marriage, money helps ease the way.

Tuesday, January 16, 2007

Condo demand down

NYTimes

January 16, 2007
Buyers Scarce, Many Condos Are for Rent
By VIKAS BAJAJ

WASHINGTON — David Franco’s illuminated model of a proposed 10-story condominium tower dominates a sales center that, in spite of the “Now Selling” banner still fluttering outside, is conspicuously closed for business.

“We could have waited it out and kept pushing and pushing,” Mr. Franco said about the decision to abandon plans to sell 180 luxury condominiums with floor-to-ceiling windows offering views of the Washington Monument and Capitol Hill. “But it would have taken significantly longer.”

After six weeks of failing to lure more than a couple of dozen buyers, Mr. Franco and his partner, Jeff Blum, joined the builders of nearly 6,000 condominium units in the Washington metropolitan area who have decided in the last three months to recast their projects as rental apartment buildings.

Since the middle of 2006, the frenzied condominium market here and in several other big cities like Las Vegas, Miami and Boston has collapsed. Once roaring sales have slowed to a trickle, sparse inventory has mushroomed into a glut and soaring prices have flattened out and started falling.

In many cities, banks have significantly scaled back loans to condominium builders. Some have demanded that developers sell half or more of the units in a building before even beginning construction.

In hopes of salvaging something from their costly plans, hundreds of developers like Mr. Franco are looking to the strong market for apartments, planning to rent their units for at least a couple of years while waiting for today’s condo surplus to shrink. Mr. Franco and Mr. Blum hope to break ground on what will be a somewhat less expensive building this spring.

In some cases, developers are even turning older buildings back to rentals after a brief or aborted attempt at condo conversion. Meanwhile, another 2,500 proposed condominiums in the Washington area have been scrapped altogether, according to Delta Associates, a real estate research firm.

The latest salvage operation on the part of condo developers is far from a sure bet, however. Condominium buildings generally cost more to build and operate than buildings designed from scratch as rental apartments. And while rents are high and rising in most cities, in many cases they still are not sufficient to turn a profit.

Industry analysts also point out that rents may start sagging if too many condos are converted into apartments too quickly. While rents were rising at a robust 6.1 percent annual pace in the Washington area late last year, according to the Bureau of Labor Statistics, some buildings in the suburbs have recently started promoting move-in specials and other incentives to lure renters.

“You can do it, but it isn’t as attractive,” Tom Meagher, a Boston real estate consultant, said about converting condos into apartments. “You are not going to get enough rent to cover the cost. You might have to go back and redesign the floor plans.”

In the Boston area, Mr. Meagher is tracking 600 condo projects representing about 49,000 units in various stages, from applying for permits to active construction. While the recent slowdown is forcing developers to consider converting their projects to apartments and offices, he expects as many as a third of them will never be built at all.

Mr. Franco said that he and Mr. Blum were able to cut 10 percent from the costs of their planned building, on land in the trendy U Street corridor. That should be enough to make a profit, he said. Beyond switching to some less expensive materials, they also decided to subdivide some larger units into smaller apartments.

The partners are now going through a similar financial exercise on another proposed building across the street, which was to house 225 condominiums but now could be recast as a rental building as well.

Lenders started tightening the purse strings for the condominium market in early 2006 as sales weakened first in cities like Miami and Las Vegas.

“Did the lenders pull back soon enough?” asked Robert Brennan, managing director of real estate finance at Credit Suisse in New York. “I don’t think we know yet.”

Real estate experts say condos are more susceptible to booms and busts than single-family homes are because they attract more investors who do not intend to live in them and are easier to build than a new subdivision in many cities.

And while there are tentative signs that the worst of the overall housing slump may be easing as builders cut back and interest rates remain relatively modest, condo markets continue to suffer.

Take the owner trying to sell a spacious two-bedroom condo for $879,000 in the former Columbia Hospital for Women, which closed in 2002, in the Foggy Bottom neighborhood of Washington. In 2004, the investor was so confident that he would make a handsome resale profit that he told his agent, Thomas P. Murphy, he wanted to buy five condos. Mr. Murphy said he flatly told his client he would only assist him in purchasing one unit in any one building.

“He needs $890,000 to break even, but the offers are at $800,000 to $840,000,” Mr. Murphy said. “He does remember that I told him he was not getting five of them.”

Could he rent the condo? Yes, but that option is not appealing, either. Mr. Murphy estimates that the unit could rent for $4,000 a month, far short of the $6,800 a month the condo costs in mortgage interest, maintenance fees, insurance and taxes.
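The shortfall is easy to put in annual terms. Here is a minimal sketch using only the two figures Mr. Murphy cites; the article does not break the $6,800 down among interest, fees, insurance and taxes, so no split is assumed.

```python
# Monthly and annual cash flow for the condo investor described above.
# Both figures are Mr. Murphy's estimates as reported in the article.

monthly_rent = 4_000           # estimated achievable rent
monthly_carrying_cost = 6_800  # mortgage interest, fees, insurance and taxes

monthly_loss = monthly_carrying_cost - monthly_rent
print(f"Monthly shortfall: ${monthly_loss:,}")       # $2,800
print(f"Annual shortfall:  ${monthly_loss * 12:,}")  # $33,600
```

Renting the unit out means losing about $33,600 a year — the drip-by-drip option in Mr. Murphy’s phrase.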

“They have a choice of how they want to lose it,” Mr. Murphy said of investors and condo developers. “Drip by drip or in one slap.”

Mr. Murphy said he believed condo sales had picked up somewhat lately and he even ran a four-way bidding contest on one well-priced condo in Foggy Bottom, near the State Department. But the supply of newly built condos is so large and so many of them are similar to each other that many sellers are having to sharply cut their asking price. Others have simply given up.

At the end of 2006, 24,200 units were on the market in the Washington area, up from 13,000 at the start of 2005. Sales have slowed to 663 in the fourth quarter of 2006 from 3,520 in the first quarter of 2005, according to Delta Associates. Recorded prices have been flat, which probably masks an effective decline since only the most attractive properties are selling and many owners throw in extra inducements that do not show up in official figures.

One of the few exceptions to the trend is in Manhattan, particularly at the high end. Condo and co-op sales increased to 2,441 in the fourth quarter, from 1,574 a year ago, and inventory was relatively flat at 5,900, said Jonathan J. Miller, an appraiser. Much of the increase can be attributed to a legal change in how sales of co-ops are recorded, but Mr. Miller said a 5.5 percent drop in prices from the third quarter also helped.

Nationally, condominium sales have fallen further than those of single-family properties, 13.6 percent from November 2005 to the same month in 2006; free-standing homes showed a 10.7 percent decline in the same period. Inventories have risen 38.1 percent for condos and 29.6 percent for individual homes, according to the National Association of Realtors. The national median price — half the condos sold for more and half for less — was $224,600 in November, unchanged from November 2005.

But there is no comprehensive, national source of data for new condominium sales. The Realtors group measures only sales of existing units, and the Commerce Department, which tracks sales of new single-family homes, does not collect data on condominiums.

In the recent housing boom, many cities welcomed condos, hoping the young, upper-income set they attract would help revitalize older neighborhoods. In some cities, condominium construction also gave municipal officials an opportunity to demand that developers set aside some units for affordable housing in exchange for zoning and building approvals.

In Washington, the area around 14th and U Streets was one of several formerly run-down neighborhoods to get a facelift largely from new condo projects. The area was once a hub of African-American civic and cultural life, but the neighborhood was ravaged by riots in 1968 after Martin Luther King Jr.’s assassination and fell into decades of neglect and disrepair.

The area has now become home to trendy cafes, a Whole Foods grocery and other stores. But signs of its hardscrabble past linger on in dilapidated apartment buildings and storefronts. The influx of transplants from nearby Dupont Circle and Adams Morgan has also raised the usual strains that accompany gentrification — rising rents, increased traffic and the displacement of local residents.

Mr. Franco, who lives in the neighborhood, said he was sensitive to those concerns. His company, Level 2 Development, contributed $1 million to help a group of tenants in low-income apartments buy their building as part of a deal with the local government for the approval of his condo project.

He had hoped to take up residence in a 3,200-square-foot corner unit with an expansive terrace, which will now be cut up into smaller rental apartments.

But as he drove around the neighborhood recently pointing out rows of redeveloped buildings, he acknowledged that the market might have reached its limit for now. As an example, he pointed to a used car lot that seemed to be a vestige of a bygone era.

“The reality is not everything can make way for condos,” Mr. Franco said. “This guy may be doing so much business that it has far more value than what a real estate sale can fetch.”

Majority of women living without spouse

NYTimes

January 16, 2007
51% of Women Are Now Living Without Spouse
By SAM ROBERTS

For what experts say is probably the first time, more American women are living without a husband than with one, according to a New York Times analysis of census results.

In 2005, 51 percent of women said they were living without a spouse, up from 35 percent in 1950 and 49 percent in 2000.

Coupled with the fact that in 2005 married couples became a minority of all American households for the first time, the trend could ultimately shape social and workplace policies, including the ways government and employers distribute benefits.

Several factors are driving the statistical shift. At one end of the age spectrum, women are marrying later or living with unmarried partners more often and for longer periods. At the other end, women are living longer as widows and, after a divorce, are more likely than men to delay remarriage, sometimes delighting in their newfound freedom.

In addition, marriage rates among black women remain low. Only about 30 percent of black women are living with a spouse, according to the Census Bureau, compared with about 49 percent of Hispanic women, 55 percent of non-Hispanic white women and more than 60 percent of Asian women.

In a relatively small number of cases, the living arrangement is temporary, because the husbands are working out of town, are in the military or are institutionalized. But while most women eventually marry, the larger trend is unmistakable.

“This is yet another of the inexorable signs that there is no going back to a world where we can assume that marriage is the main institution that organizes people’s lives,” said Prof. Stephanie Coontz, director of public education for the Council on Contemporary Families, a nonprofit research group. “Most of these women will marry, or have married. But on average, Americans now spend half their adult lives outside marriage.”

Professor Coontz said this was probably unprecedented, with the possible exceptions of major wartime mobilizations and the era of slavery, when black couples were separated.

William H. Frey, a demographer with the Brookings Institution, a research group in Washington, described the shift as “a clear tipping point, reflecting the culmination of post-1960 trends associated with greater independence and more flexible lifestyles for women.”

“For better or worse, women are less dependent on men or the institution of marriage,” Dr. Frey said. “Younger women understand this better, and are preparing to live longer parts of their lives alone or with nonmarried partners. For many older boomer and senior women, the institution of marriage did not hold the promise they might have hoped for, growing up in an ‘Ozzie and Harriet’ era.”

Emily Zuzik, a 32-year-old musician and model who lives in the East Village of Manhattan, said she was not surprised by the trend.

“A lot of my friends are divorced or single or living alone,” Ms. Zuzik said. “I know a lot of people in their 30s who have roommates.”

Ms. Zuzik has lived with a boyfriend twice, once in California where the couple registered as domestic partners to qualify for his health insurance plan. “I don’t plan to live with anyone else again until I am married,” she said, “and I may opt to keep a place of my own even then.”

Linda Barth, a 56-year-old magazine editor in Houston who has never married, said, “I used to divide my women friends into single friends and married friends. Now that doesn’t seem to be an issue.”

Sheila Jamison, who also lives in the East Village and works for a media company, is 45 and single. She says her family believes she would have had a better chance of finding a husband had she attended a historically black college instead of Duke.

“Considering all the weddings I attended in the ’80s that have ended so very, very badly, I consider myself straight up lucky,” Ms. Jamison said. “I have not sworn off marriage, but if I do wed, it will be to have a companion with whom I can travel and play parlor games in my old age.”

Carol Crenshaw, 57, of Roswell, Ga., was divorced in 2005 after 33 years and says she is in no hurry to marry again.

“I’m in a place in my life where I’m comfortable,” said Ms. Crenshaw, who has two grown sons. “I can do what I want, when I want, with whom I want. I was a wife and a mother. I don’t feel like I need to do that again.”

Similarly, Shelley Fidler, 59, a public policy adviser at a law firm, has sworn off marriage. She moved from rural Virginia to the vibrant Adams Morgan neighborhood of Washington, D.C., when her 30-year marriage ended.

“The benefits were completely unforeseen for me,” Ms. Fidler said, “the free time, the amount of time I get to spend with friends, the time I have alone, which I value tremendously, the flexibility in terms of work, travel and cultural events.”

Among the more than 117 million women over the age of 15, according to the marital status category in the Census Bureau’s latest American Community Survey, 63 million are married. Of those, 3.1 million are legally separated and 2.4 million said their husbands were not living at home for one reason or another.

That brings the number of American women actually living with a spouse to 57.5 million, compared with the 59.9 million who are single or whose husbands were not living at home when the survey was taken in 2005.

Some of those situations, which the census identifies as “spouse absent” and “other,” are temporary, and, of course, even some people who describe themselves as separated eventually reunite with their spouses.

Over all, a larger share of men are married and living with their spouse — about 53 percent compared with 49 percent among women.

“Since women continue to outlive men, they have reached the nonmarital tipping point — more nonmarried than married,” Dr. Frey said. “This suggests that most girls growing up today can look forward to spending more of their lives outside of a traditional marriage.”

Pamela J. Smock, a researcher at the University of Michigan Population Studies Center, agreed, saying that “changing patterns of courtship and marriage, and the fact that we are living longer lives, all play a role.”

“Men also remarry more quickly than women after a divorce,” Ms. Smock added, “and both are increasingly likely to cohabit rather than remarry after a divorce.”

The proportion of married people, especially among younger age groups, has been declining for decades. Between 1950 and 2000, the share of women 15-to-24 who were married plummeted to 16 percent, from 42 percent. Among 25-to-34-year-olds, the proportion dropped to 58 percent, from 82 percent.

“Although we can help people ‘do’ marriage better, it is simply delusional to construct social policy or make personal life decisions on the basis that you can count on people spending most of their adult lives in marriage,” said Professor Coontz, the author of “Marriage, a History: How Love Conquered Marriage.”

Besse Gardner, 24, said she and her boyfriend met as college freshmen and started living together last April “for all the wrong reasons” — they found a great apartment on the beach in Los Angeles.

“We do not see living together as an end or even for the rest of our lives — it’s just fun right now,” Ms. Gardner said. “My roommate is someone I’d be thrilled to marry one day, but it just doesn’t make sense right now.”

Ms. Crenshaw said that some of the women in her support group for divorced women were miserable, but that she was surprised how happy she was to be single again.

“That’s not how I grew up,” she said. “That’s not how society thinks. It’s a marriage culture.”

Elissa B. Terris, 59, of Marietta, Ga., divorced in 2005 after being married for 34 years and raising a daughter, who is now an adult.

“A gentleman asked me to marry him and I said no,” she recalled. “I told him, ‘I’m just beginning to fly again, I’m just beginning to be me. Don’t take that away.’ ”

“Marriage kind of aged me because there weren’t options,” Ms. Terris said. “There was only one way to go. Now I have choices. One night I slept on the other side of the bed, and I thought, I like this side.”

She said she was returning to college to get a master’s degree (her former husband “didn’t want me to do that because I was more educated than he was”), had taken photography classes and was auditioning for a play.

“Once you go through something you think will kill you and it doesn’t,” she said, “every day is like a present.”

Saturday, January 13, 2007

Some median home prices

Homes still unaffordable for many

Despite a decline in prices, Americans aren't finding it easier to buy homes because of rising mortgage rates and relatively stagnant wages, a nonprofit group says.
By Reuters

Home prices may have dipped over the past year, but many U.S. workers would still struggle to afford a median-priced home in many cities, according to a study released today.

"American workers are really not gaining ground, and they're so far behind in the first place," said Barbara Lipman, the research director for the nonprofit Center for Housing Policy, which conducted the study.

The median home price in the 202 biggest metropolitan areas declined 2% in the third quarter of 2006 from a year earlier. But homes actually became less affordable, the study said, as mortgage rates rose and workers' pay did not keep pace.

"The real story is what happened to salaries," Lipman said. "Lower-paid occupations -- such as in retail or home health workers -- their salaries went up only about 3%."

The study said an annual income of nearly $85,000 was needed to afford the median-priced U.S. home.

In the New York area, a $500,000 median-priced home required a $171,000 annual salary. The median-priced home in San Francisco, the most expensive U.S. market, was $759,000, requiring income of $260,000. In less expensive Chicago, the median-priced home cost $254,000, requiring an $87,000 salary.

On the opposite end of the spectrum, Mansfield, Ohio, homes cost a median $85,000, requiring $29,118 in income.

Why families lack health insurance

The study assumed homebuyers needed a 10% down payment and could afford to pay 28% of their income on mortgage payments, property taxes and home insurance.
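The study's rule of thumb can be applied to any market with the standard 30-year amortization formula. The sketch below takes the study's stated assumptions (10% down, housing costs capped at 28% of gross income) but has to guess at the other inputs: the 6.5% mortgage rate and the property-tax and insurance rates are illustrative, since the article does not report the figures the study used, so the output only approximates the study's published numbers.

```python
# Income needed to afford a home under the study's stated rules:
# 10% down, and no more than 28% of gross income going to mortgage
# payments, property taxes and insurance. The 6.5% rate and the
# 1.25% tax / 0.5% insurance rates are illustrative assumptions,
# not figures reported in the article.

def required_income(price, rate=0.065, years=30, tax=0.0125, ins=0.005):
    loan = price * 0.90                # loan after a 10% down payment
    r, n = rate / 12, years * 12       # monthly rate, number of payments
    mortgage = loan * r * (1 + r) ** n / ((1 + r) ** n - 1)  # amortization formula
    monthly_cost = mortgage + price * (tax + ins) / 12
    return 12 * monthly_cost / 0.28    # invert the 28%-of-income cap

for city, price in [("Chicago", 254_000), ("New York", 500_000),
                    ("San Francisco", 759_000), ("Mansfield, Ohio", 85_000)]:
    print(f"{city:15s} ${price:>8,} -> ${required_income(price):>10,.0f}")
```

With these guessed rates, Chicago comes out near $78,000 against the study's $87,000, a reminder of how sensitive the rule is to the mortgage-rate and tax inputs.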

In reality, many households expend a much higher percentage of their incomes on mortgage payments, Lipman said.

To afford that, consumers cut other expenses, such as for health care and transportation, she said, citing research indicating that the cost of housing is the major reason families lack health insurance.

Other ways families cope with high housing expenses are to work longer hours or extra jobs, or to crowd in more income producers, she said.

Lipman's group found in an October 2006 survey that families that seek to buy less expensive homes in more distant suburbs -- adding to urban sprawl -- pay so much more for transportation that it eliminates the savings.

Though home prices range widely across the country, wages for low-wage jobs -- from teachers to janitors -- are about the same no matter where they are, Lipman said.

The report cited housing-aid programs offered by some big-city hospitals that have plenty of modestly paid workers.

"For the low- to moderate-income individuals that we're talking about, they're not going to be helped by marginal declines in home prices," Lipman said. "The only way to address the problem is to create more affordable (homes), which may mean higher density units, townhouses and condos."