NYTimes
December 25, 2006
The Doctor’s World
The Man on the Table Devised the Surgery
By LAWRENCE K. ALTMAN
In late afternoon last Dec. 31, Dr. Michael E. DeBakey, then 97, was alone at home in Houston in his study preparing a lecture when a sharp pain ripped through his upper chest and between his shoulder blades, then moved into his neck.
Dr. DeBakey, one of the most influential heart surgeons in history, assumed his heart would stop in a few seconds.
“It never occurred to me to call 911 or my physician,” Dr. DeBakey said, adding: “As foolish as it may appear, you are, in a sense, a prisoner of the pain, which was intolerable. You’re thinking, What could I do to relieve myself of it. If it becomes intense enough, you’re perfectly willing to accept cardiac arrest as a possible way of getting rid of the pain.”
But when his heart kept beating, Dr. DeBakey suspected that he was not having a heart attack. As he sat alone, he decided that a ballooning had probably weakened the aorta, the main artery leading from the heart, and that the inner lining of the artery had torn, a condition known as a dissecting aortic aneurysm.
No one in the world was more qualified to make that diagnosis than Dr. DeBakey because, as a younger man, he devised the operation to repair such torn aortas, a condition virtually always fatal. The operation has been performed at least 10,000 times around the world and is among the most demanding for surgeons and patients.
Over the past 60 years, Dr. DeBakey has changed the way heart surgery is performed. He was one of the first to perform coronary bypass operations. He trained generations of surgeons at the Baylor College of Medicine; operated on more than 60,000 patients; and in 1996 was summoned to Moscow by Boris Yeltsin, then the president of Russia, to aid in his quintuple heart bypass operation.
Now Dr. DeBakey is making history in a different way — as a patient. He was released from Methodist Hospital in Houston in September and is back at work. At 98, he is the oldest survivor of his own operation, proving that a healthy man of his age could endure it.
“He’s probably right out there at the cutting edge of a whole generation of people in their 90s who are going to survive” after such medical ordeals, one of his doctors, Dr. James L. Pool, said.
But beyond the medical advances, Dr. DeBakey’s story is emblematic of the difficulties that often accompany care at the end of life. It is a story of debates over how far to go in treating someone so old, late-night disputes among specialists about what the patient would want, and risky decisions that, while still being argued over, clearly saved Dr. DeBakey’s life.
It is also a story of Dr. DeBakey himself, a strong-willed pioneer who at one point was willing to die, concedes he was at times in denial about how sick he was and is now plowing into life with as much zest and verve as ever.
But Dr. DeBakey’s rescue almost never happened.
He refused to be admitted to a hospital until late January. As his health deteriorated and he became unresponsive in the hospital in early February, his surgical partner of 40 years, Dr. George P. Noon, decided an operation was the only way to save his life. But the hospital’s anesthesiologists refused to put Dr. DeBakey to sleep because such an operation had never been performed on someone his age and in his condition. Also, they said Dr. DeBakey had signed a directive that forbade surgery.
As the hospital’s ethics committee debated in a late-night emergency meeting on the 12th floor of Methodist Hospital, Dr. DeBakey’s wife, Katrin, barged in to demand that the operation begin immediately.
In the end, the ethics committee approved the operation; an anesthesiology colleague of Dr. DeBakey’s, who now works at a different hospital, agreed to put him to sleep; and the seven-hour operation began shortly before midnight on Feb. 9. “It is a miracle,” Dr. DeBakey said as he sat eating dinner in a Houston restaurant recently. “I really should not be here.”
The costs of Dr. DeBakey’s care easily exceeded $1 million. Methodist Hospital and his doctors say they have not charged Dr. DeBakey. His hospitalizations were under pseudonyms to help protect his privacy, which could make collecting insurance difficult. Methodist Hospital declined to say what the costs were or discuss the case further. Dr. DeBakey says he thinks the hospital should not have been secretive about his illness.
Dr. DeBakey’s doctors acknowledge that he got an unusually high level of care. But they said that they always tried to abide by a family’s wishes and that they would perform the procedure on any patient regardless of age, if the patient’s overall health was otherwise good.
Dr. DeBakey agreed to talk, and permitted his doctors to talk, because of a professional relationship of decades with this reporter, who is also a physician, and because he wanted to set the record straight for the public about what happened and explain how a man nearly 100 years old could survive.
A Preliminary Diagnosis
As Dr. DeBakey lay on the couch alone that night, last New Year’s Eve, he reasoned that a heart attack was unlikely because periodic checkups had never indicated he was at risk. An aortic dissection was more likely because of the pain, even though there was no hint of that problem in a routine echocardiogram a few weeks earlier.
Mrs. DeBakey and their daughter, Olga, had left for the beach in Galveston, but turned back because of heavy traffic. They arrived home to find Dr. DeBakey lying on the couch. Not wanting to alarm them, he lied and said he had fallen asleep and awakened with a pulled muscle.
“I did not want Katrin to be aware of my self-diagnosis because, in a sense, I would be telling her that I am going to die soon,” he said.
An anxious Mrs. DeBakey called two of her husband’s colleagues: Dr. Mohammed Attar, his longtime physician, and Dr. Matthias Loebe, who was covering for Dr. Noon. They came to the house quickly and became concerned because Dr. DeBakey had been in excellent health. After listening to him give a more frank account of his pain, they shared his suspicion of an aortic dissection.
Dr. DeBakey and his doctors agreed that for a firm diagnosis he would need a CT scan and other imaging tests, but he delayed them until Jan. 3.
The tests showed that Dr. DeBakey had a type 2 dissecting aortic aneurysm, according to a standard classification system he himself devised years earlier. Rarely did anyone survive that without surgery.
Still, Dr. DeBakey says that he refused admission to Methodist Hospital, in part because he did not want to be confined and he “was hopeful that this was not as bad as I first thought.” He feared the operation that he had developed to treat this condition might, at his age, leave him mentally or physically crippled. “I’d rather die,” he said.
Over the years, he had performed anatomically perfect operations on some patients who nevertheless died or survived with major complications. “I was trying to avoid all that,” he said.
Instead, he gambled on long odds that his damaged aorta would heal on its own. He chose to receive care at home. For more than three weeks, doctors made frequent house calls to make sure his blood pressure was low enough to prevent the aorta from rupturing. Around the clock, nurses monitored his food and drink. Periodically, he went to Methodist Hospital for imaging tests to measure the aneurysm’s size.
On Jan. 6, he insisted on giving the lecture he had been preparing on New Year’s Eve to the Academy of Medicine, Engineering and Science of Texas, of which he is a founding member. The audience in Houston included Nobel Prize winners and Senator Kay Bailey Hutchison.
Mrs. DeBakey stationed people around the podium to catch her husband if he slumped. Dr. DeBakey looked gray and spoke softly, but finished without incident. Then he listened to another lecture — which, by coincidence, was about the lethal dangers of dissecting aneurysms.
Dr. DeBakey, a master politician, said he could not pass up a chance to chat with the senator. He attended the academy luncheon and then went home.
In providing the extraordinary home care, the doctors were respecting Dr. DeBakey's wishes; their actions also reflected their awe of his power.
“People are very scared of him around here,” said Dr. Loebe, the heart surgeon who came to Dr. DeBakey’s home on New Year’s Eve. “He is the authority. It is very difficult to stand up and tell him what to do.”
But as time went on, the doctors could not adequately control Dr. DeBakey’s blood pressure. His nutrition was poor. He became short of breath. His kidneys failed. Fluid collected in the pericardial sac covering his heart, suggesting the aneurysm was leaking.
Dr. DeBakey now says that he was in denial. He did not admit to himself that he was getting worse. But on Jan. 23, he yielded and was admitted to the hospital.
Tests showed that the aneurysm was enlarging dangerously; the diameter increased to 6.6 centimeters on Jan. 28, up from 5.2 centimeters on Jan. 3. Dr. Noon said that when he and other doctors showed Dr. DeBakey the scans and recommended surgery, Dr. DeBakey said he would re-evaluate the situation in a few days.
By Feb. 9, with the aneurysm up to 7.5 centimeters and Dr. DeBakey unresponsive and near death, a decision had to be made.
“If we didn’t operate on him that day that was it, he was gone for sure,” Dr. Noon said.
At that point, Dr. DeBakey was unable to speak for himself. The surgeons gathered and decided they should proceed, despite the dangers. “We were doing what we thought was right,” Dr. Noon said, adding that “nothing made him a hopeless candidate for the operation except for being 97.” All family members agreed to the operation.
Dr. Bobby R. Alford, one of Dr. DeBakey’s physicians and a successor as chancellor of Baylor College of Medicine, said the doctors had qualms. “We could have walked away,” he said.
He and Dr. Noon discussed the decision several times. “We recognized the condemnation that could occur,” Dr. Alford said. “The whole surgical world would come down on us for doing something stupid, which it might have seemed to people who were not there.”
Surgery would be enormously risky and unlikely to offer clear-cut results — either a full recovery or death, Dr. Noon and his colleagues told Mrs. DeBakey, Olga, sons from a first marriage, and Dr. DeBakey’s sisters, Lois and Selma. The doctors said Dr. DeBakey might develop new ailments and need dialysis and a tracheostomy to help his breathing. They said the family’s decision could inflict prolonged suffering for all involved.
She and Olga “prayed a lot,” said Mrs. DeBakey, who is from Germany. “We had a healer in Europe who advised us that he will come through it. That helped us.”
Then things got more complicated.
A Refusal to Treat
At that point the Methodist Hospital anesthesiologists adamantly refused to accept Dr. DeBakey as a patient. They cited a standard form he had signed directing that he not be resuscitated if his heart stopped and a note in the chart saying he did not want surgery for the aortic dissection and aneurysm. They were concerned about his age and precarious physical condition.
Dr. Alford, the 72-year-old chancellor, said he was stunned by the refusal, an action he had never seen or heard about in his career.
Dr. Noon said none of the anesthesiologists had been involved in Dr. DeBakey’s care, yet they made a decision based on grapevine information without reading his medical records. So he insisted that the anesthesiologists state their objections directly to the DeBakey family.
Mrs. DeBakey said the anesthesiologists feared that Dr. DeBakey would die on the operating table and did not want to become known as the doctors who killed him. Dr. Joseph J. Naples, the hospital’s chief anesthesiologist, did not return repeated telephone calls to his office for comment.
Around 7 p.m., Mrs. DeBakey called Dr. Salwa A. Shenaq, an anesthesiologist friend who had worked with Dr. DeBakey for 22 years at Methodist Hospital and who now works at the nearby Michael E. DeBakey Veterans Affairs Medical Center.
Dr. Shenaq rushed from home. When she arrived, she said, Dr. Naples told her that he and his staff would not administer anesthesia to Dr. DeBakey. She said that a medical staff officer, whom she declined to name, warned her that she could be charged with assault if she touched Dr. DeBakey. The officer also told Dr. Shenaq that she could not give Dr. DeBakey anesthesia because she did not have Methodist Hospital privileges. She made it clear that she did, she said.
Administrators, lawyers and doctors discussed the situation, in particular the ambiguities of Dr. DeBakey’s wishes. Yes, Dr. Pool had written on his chart that Dr. DeBakey said he did not want surgery for a dissection. But Dr. Noon and the family thought the note in the chart no longer applied because Dr. DeBakey’s condition had so deteriorated and his only hope was his own procedure.
“They were going back and forth,” Dr. Shenaq said. “One time, they told me go ahead. Then, no, we cannot go ahead.”
To fulfill its legal responsibilities, Methodist Hospital summoned members of its ethics committee, who arrived in an hour. They met with Dr. DeBakey’s doctors in a private dining room a few yards from Dr. DeBakey’s room, according to five of his doctors who were present.
Their patient was a man who had always been in command. Now an unresponsive Dr. DeBakey had no control over his own destiny.
The ethics committee representatives wanted to follow Texas law, which, in part, requires assurance that doctors respect patient and family wishes.
Each of Dr. DeBakey’s doctors had worked with him for more than 20 years. One, Dr. Pool, said they felt they knew Dr. DeBakey well enough to answer another crucial question from the ethics committee: As his physicians, what did they believe he would choose for himself in such a dire circumstance if he had the ability to make that decision?
Dr. Noon said that Dr. DeBakey had told him it was time for nature to take its course, but also told him that the doctors had “to do what we need to do.” Members of Dr. DeBakey’s medical team said they interpreted the statements differently. Some thought he meant that they should do watchful waiting, acting only if conditions warranted; others thought it meant he wanted to die.
The question was whether the operation would counter Dr. DeBakey’s wishes expressed in his signed “do not resuscitate” order. Some said that everything Dr. DeBakey did was for his family. And the family wanted the operation.
After the committee members had met for an hour, Mrs. DeBakey could stand it no longer. She charged into the room.
“My husband’s going to die before we even get a chance to do anything — let’s get to work,” she said she told them.
The discussion ended. The majority ruled in a consensus without a formal vote. No minutes were kept, the doctors said.
“Boy, when that meeting was over, it was single focus — the best operation, the best post-operative care, the best recovery we could give him,” Dr. Pool said.
The Operation
As the ethics committee meeting ended about 11 p.m. on Feb. 9, the doctors rushed to start Dr. DeBakey’s anesthesia.
The operation was to last seven hours.
For part of that time, Dr. DeBakey’s body was cooled to protect his brain and other organs. His heart was stilled while a heart-lung bypass machine pumped oxygen-rich blood through his body. The surgeons replaced the damaged portion of Dr. DeBakey’s aorta with a six- to eight-inch graft made of Dacron, similar to material used in shirts. The graft was the type that Dr. DeBakey devised in the 1950s.
Afterward, Dr. DeBakey was taken to an intensive care unit.
Some doctors were waiting for Dr. DeBakey to die during the operation or soon thereafter, Dr. Noon said. “But he just got better.”
As feared, however, his recovery was stormy.
Surgeons had to cut separate holes into the trachea in his neck and stomach to help him breathe and eat. He needed dialysis because of kidney failure. He was on a mechanical ventilator for about six weeks because he was too weak to breathe on his own. He developed infections. His blood pressure often fell too low when aides lifted him to a sitting position. Muscle weakness left him unable to stand.
For a month, Dr. DeBakey was in the windowless intensive care unit, sometimes delirious, sometimes unresponsive, depending in part on his medications. The doctors were concerned that he had suffered severe, permanent brain damage. To allow him to tell day from night and lift his spirits, the hospital converted a private suite into an intensive care unit.
Some help came from unexpected places. On Sunday, April 2, Dr. William W. Lunn, the team’s lung specialist, took his oldest daughter, Elizabeth, 8, with him when he made rounds at the hospital and told her that a patient was feeling blue. While waiting, Elizabeth drew a cheery picture of a rainbow, butterflies, trees and grass and asked her father to give it to the patient. He did.
“You should have seen Dr. DeBakey’s eyes brighten,” Dr. Lunn said. Dr. DeBakey asked to see Elizabeth, held her hand and thanked her.
“At that point, I knew he was going to be O.K.,” Dr. Lunn said.
Dr. DeBakey was discharged on May 16. But on June 2, he was back in the hospital.
“He actually scared us because his blood pressure and heart rate were too high, he was gasping for breath” and he had fluid in his lungs, Dr. Lunn said.
But once the blood pressure was controlled with medicine, Dr. DeBakey began to recover well.
The Aftermath
Dr. DeBakey says that at times he played possum with the medical team, pretending to be asleep while he listened to conversations.
On Aug. 21, when Dr. Loebe asked Dr. DeBakey to wake up, and he did not, Dr. Loebe announced that he had found an old roller pump that Dr. DeBakey devised in the 1930s to transfuse blood. Dr. DeBakey immediately opened his eyes. Then he gave the doctors a short lecture about how he had improved it over existing pumps.
As he recovered and Dr. DeBakey learned what had happened, he told his doctors he was happy they had operated on him. The doctors say they were relieved because they had feared he regretted their decision.
“If they hadn’t done it, I’d be dead,” he said.
The doctors and family had rolled the dice and won.
Dr. DeBakey does not remember signing an order saying not to resuscitate him and now thinks the doctors did the right thing. Doctors, he said, should be able to make decisions in such cases, without committees.
Throughout, Dr. DeBakey’s mental recovery was far ahead of his physical response.
When Dr. DeBakey first became aware of his post-operative condition, he said he “felt limp as a rag” and feared he was a quadriplegic. Kenneth Miller and other physical therapists have helped Dr. DeBakey strengthen his withered muscles.
“There were times where he needed a good bit of encouragement to participate,” Mr. Miller said. “But once he saw the progress, he was fully committed to what we were doing.”
Now he walks increasingly long distances without support. But his main means of locomotion is a motorized scooter. He races it around corridors, sometimes trailed by quick-stepping doctors of all ages.
Dr. DeBakey said he hoped to regain the stamina to resume traveling, though not at his former pace.
Dr. William L. Winters Jr., a cardiologist on Dr. DeBakey’s team, said: “I am impressed with what the body and mind can do when they work together. He absolutely has the desire to get back to where he was before. I think he’ll come close.”
Already, Dr. DeBakey is back working nearly a full day.
“I feel very good,” he said Friday. “I’m getting back into the swing of things.”
Embracing room clutter
NYTimes
December 21, 2006
Saying Yes to Mess
By PENELOPE GREEN
IT is a truism of American life that we’re too darn messy, or we think we are, and we feel really bad about it. Our desks and dining room tables are awash with paper; our closets are bursting with clothes and sports equipment and old files; our laundry areas boil; our basements and garages seethe. And so do our partners — or our parents, if we happen to be teenagers.
This is why sales of home-organizing products, like accordion files and labelmakers and plastic tubs, keep going up and up, from $5.9 billion last year to a projected $7.6 billion by 2009, as do the revenues of companies that make closet organizing systems, an industry that is pulling in $3 billion a year, according to Closets magazine.
This is why January is now Get Organized Month, thanks also to the efforts of the National Association of Professional Organizers, whose 4,000 clutter-busting members will be poised, clipboards and trash bags at the ready, to minister to the 10,000 clutter victims the association estimates will be calling for its members’ services just after the new year.
But contrarian voices can be heard in the wilderness. An anti-anticlutter movement is afoot, one that says yes to mess and urges you to embrace your disorder. Studies are piling up that show that messy desks are the vivid signatures of people with creative, limber minds (who reap higher salaries than those with neat “office landscapes”) and that messy closet owners are probably better parents and nicer and cooler than their tidier counterparts. It’s a movement that confirms what you have known, deep down, all along: really neat people are not avatars of the good life; they are humorless and inflexible prigs, and have way too much time on their hands.
“It’s chasing an illusion to think that any organization — be it a family unit or a corporation — can be completely rid of disorder on any consistent basis,” said Jerrold Pollak, a neuropsychologist at Seacoast Mental Health Center in Portsmouth, N.H., whose work involves helping people tolerate the inherent disorder in their lives. “And if it could, should it be? Total organization is a futile attempt to deny and control the unpredictability of life. I live in a world of total clutter, advising on cases where you’d think from all the paper it’s the F.B.I. files on the Unabomber,” when, in fact, he said, it’s only “a person with a stiff neck.”
“My wife has threatened divorce over all the piles,” continued Dr. Pollak, who has an office at home, too. “If we had kids the health department would have to be alerted. But what can I do?”
Stop feeling bad, say the mess apologists. There are more urgent things to worry about. Irwin Kula is a rabbi based in Manhattan and author of “Yearnings: Embracing the Sacred Messiness of Life,” which was published by Hyperion in September. “Order can be profane and life-diminishing,” he said the other day. “It’s a flippant remark, but if you’ve never had a messy kitchen, you’ve probably never had a home-cooked meal. Real life is very messy, but we need to have models about how that messiness works.”
His favorite example? His 15-year-old daughter Talia’s bedroom, a picture of utter disorder — and individuality, he said.
“One day I’m standing in front of the door,” he said, “and it’s out of control and my wife, Dana, is freaking out, and suddenly I see in all the piles the dress she wore to her first dance and an earring she wore to her bat mitzvah. She’s so trusting her journal is wide open on the floor, and there are photo-booth pictures of her friends strewn everywhere. I said, ‘Omigod, her cup overflows!’ And we started to laugh.”
The room was an invitation, he said, to search for a deeper meaning under the scurf.
Last week David H. Freedman, another amiable mess analyst (and science journalist), stood bemused in front of the heathery tweed collapsible storage boxes with clear panels ($29.99) at the Container Store in Natick, Mass., and suggested that the main thing most people’s closets are brimming with is unused organizing equipment. “This is another wonderful trend,” Mr. Freedman said dryly, referring to the clear panels. “We’re going to lose the ability to put clutter away. Inside your storage box, you’d better be organized.”
Mr. Freedman is co-author, with Eric Abrahamson, of “A Perfect Mess: The Hidden Benefits of Disorder,” out in two weeks from Little, Brown & Company. The book is a meandering, engaging tour of beneficial mess and the systems and individuals reaping those benefits, like Gov. Arnold Schwarzenegger, whose mess-for-success tips include never making a daily schedule.
As a corollary, the book’s authors examine the high cost of neatness — measured in shame, mostly, and family fights, as well as wasted dollars — and generally have a fine time tipping over orthodoxies and poking fun at clutter busters and their ilk, and at the self-help tips they live or die by. They wonder: Why is it better to pack more activities into one day? By whose standards are procrastinators less effective than their well-scheduled peers? Why should children have to do chores to earn back their possessions if they leave them on the floor, as many professional organizers suggest?
In their book Mr. Freedman and Mr. Abrahamson describe the properties of mess in loving terms. Mess has resonance, they write, which means it can vibrate beyond its own confines and connect to the larger world. It was the overall scumminess of Alexander Fleming’s laboratory that led to his discovery of penicillin, from a moldy bloom in a petri dish he had forgotten on his desk.
Mess is robust and adaptable, like Mr. Schwarzenegger’s open calendar, as opposed to brittle, like a parent’s rigid schedule that doesn’t allow for a small child’s wool-gathering or balkiness. Mess is complete, in that it embraces all sorts of random elements. Mess tells a story: you can learn a lot about people from their detritus, whereas neat — well, neat is a closed book. Neat has no narrative and no personality (as any cover of Real Simple magazine will demonstrate). Mess is also natural, as Mr. Freedman and Mr. Abrahamson point out, and a real time-saver. “It takes extra effort to neaten up a system,” they write. “Things don’t generally neaten themselves.”
Indeed, the most valuable dividend of living with mess may be time. Mr. Freedman, who has three children and a hard-working spouse, Laurie Tobey-Freedman, a preschool special-needs coordinator, is studying Mandarin in his precious spare moments. Perusing a four-door stainless steel shoe cabinet ($149) at the Container Store, and imagining gussying up a shoe collection, he shook his head and said, “I don’t get the appeal of this, which may be a huge defect on my part in terms of higher forms of entertainment.”
The success of the Container Store notwithstanding, there is indeed something messy — and not in a good way — about so many organizing options. “When I think about this urge to organize, it reminds me of how it was when Americans began to take more and more control of their weight: they got fatter,” said Marian Salzman, chief marketing officer of J. Walter Thompson and co-author, with Ira Matathia, of “Next Now: Trends for the Future,” which is about to be published by Palgrave Macmillan. “I never gained weight until I went on a diet,” she said, adding that she has a room in which she hides a treadmill and, now, two bags of organizing supplies.
“I got sick of looking at them so I bought plastic tubs and stuffed the bags in the tubs and put the tubs in the room.” Right now, she said, “we are emotionally overloaded, and so what this is about is that we are getting better and better at living superficially.”
“Superficial is the new intimate,” Ms. Salzman said, gaining steam, “and these boxes, these organizing supplies, are the containers for all our superficial selves. ‘I will be a neater mom, a hipper mom, a mom that gets more done.’ Do I sound cynical?”
Nah.
In the semiotics of mess, desks may be the richest texts. Messy-desk research borrows from cognitive ergonomics, a field of study dealing with how a work environment supports productivity. Consider that desks, our work landscapes, are stand-ins for our brains, and so the piles we array on them are “cognitive artifacts,” or data cues, of our thoughts as we work.
To a professional organizer brandishing colored files and stackable trays, cluttered horizontal surfaces are a horror; to cognitive psychologists like Jay Brand, who works in the Ideation Group of Haworth Inc., the huge office furniture company, their peaks and valleys glow with intellectual intent and showcase a mind whirring away: sorting, linking, producing. (By extension, a clean desk can be seen as a dormant area, an indication that no thought or work is being undertaken.)
His studies and others, like a survey conducted last year by Ajilon Professional Staffing, in Saddle Brook, N.J., which linked messy desks to higher salaries (and neat ones to salaries under $35,000), answer Einstein’s oft-quoted remark, “If a cluttered desk is a sign of a cluttered mind, of what, then, is an empty desk?”
Don Springer, 61, is an information technology project manager and the winner of the Type O-No! contest sponsored by Dymo, the labelmaker manufacturer, in October. The contest offered $5,000 worth of clutter management — for the tools (the boxes, the bins and the systems, as well as a labelmaker) and the services of a professional organizer — to the best example of a “clutter nightmare,” as expressed by contestants in a photograph and a 100-word essay. “Type O-Nos,” reads a definition on the Dymo Web site, are “outlaws on the tidy trail, clutter criminals twice over.”
Mr. Springer, who in a phone interview spoke softly, precisely and with great humor, professed deep shame over the contents of what he calls his oh-by-the-way room, a library/junk room that his wife would like cleaned to make a nursery for a new grandchild. With a full-time job and membership in various clubs and organizations, and a desire to spend his free time seeing a movie with his wife instead of “expending the emotional energy it would take to sort through all the stuff,” Mr. Springer said, he is unable to prune the piles to his wife’s satisfaction. “There are emotional treasures buried in there, and I don’t want to part with them,” he said.
So, why bother?
“Because I love my wife and I want to make her happy,” he said.
According to a small survey that Mr. Freedman and Mr. Abrahamson conducted for their book — 160 adults representing a cross section of genders, races and incomes, Mr. Freedman said — of those who had split up with a partner, one in 12 had done so over a struggle involving one partner’s idea of mess. Happy partnerships turn out not necessarily to be those in which products from Staples figure largely. Mr. Freedman and his wife, for example, have been married for over two decades, and live in an offhandedly messy house with a violently messy basement — the latter area, where their three children hang out, decorated (though that’s not quite the right word) in a pre-1990s Tompkins Square Park lean-to style.
The room’s chaos is an example of one of Mr. Freedman and Mr. Abrahamson’s mess strategies, which is to create a mess-free DMZ (in this case, the basement stairs) and acknowledge areas of complementary mess. Cherish your mess management strategies, suggested Mr. Freedman, speaking approvingly of the pile builders and the under-the-bed stuffers; of those who let their messes wax and wane — the cyclers, he called them; and those who create satellite messes (in storage units off-site). “Most people don’t realize their own efficiency or effectiveness,” he said with a grin.
It’s also nice to remember, as Mr. Freedman pointed out, that almost anything looks pretty neat if it’s shuffled into a pile.
“My wife has threatened divorce over all the piles,” continued Dr. Pollak, who has an office at home, too. “If we had kids the health department would have to be alerted. But what can I do?”
Stop feeling bad, say the mess apologists. There are more urgent things to worry about. Irwin Kula is a rabbi based in Manhattan and author of “Yearnings: Embracing the Sacred Messiness of Life,” which was published by Hyperion in September. “Order can be profane and life-diminishing,” he said the other day. “It’s a flippant remark, but if you’ve never had a messy kitchen, you’ve probably never had a home-cooked meal. Real life is very messy, but we need to have models about how that messiness works.”
His favorite example? His 15-year-old daughter Talia’s bedroom, a picture of utter disorder — and individuality, he said.
“One day I’m standing in front of the door,” he said, “and it’s out of control and my wife, Dana, is freaking out, and suddenly I see in all the piles the dress she wore to her first dance and an earring she wore to her bat mitzvah. She’s so trusting her journal is wide open on the floor, and there are photo-booth pictures of her friends strewn everywhere. I said, ‘Omigod, her cup overflows!’ And we started to laugh.”
The room was an invitation, he said, to search for a deeper meaning under the scurf.
Last week David H. Freedman, another amiable mess analyst (and science journalist), stood bemused in front of the heathery tweed collapsible storage boxes with clear panels ($29.99) at the Container Store in Natick, Mass., and suggested that the main thing most people’s closets are brimming with is unused organizing equipment. “This is another wonderful trend,” Mr. Freedman said dryly, referring to the clear panels. “We’re going to lose the ability to put clutter away. Inside your storage box, you’d better be organized.”
Mr. Freedman is co-author, with Eric Abrahamson, of “A Perfect Mess: The Hidden Benefits of Disorder,” out in two weeks from Little, Brown & Company. The book is a meandering, engaging tour of beneficial mess and the systems and individuals reaping those benefits, like Gov. Arnold Schwarzenegger, whose mess-for-success tips include never making a daily schedule.
As a corollary, the book’s authors examine the high cost of neatness — measured in shame, mostly, and family fights, as well as wasted dollars — and generally have a fine time tipping over orthodoxies and poking fun at clutter busters and their ilk, and at the self-help tips they live or die by. They wonder: Why is it better to pack more activities into one day? By whose standards are procrastinators less effective than their well-scheduled peers? Why should children have to do chores to earn back their possessions if they leave them on the floor, as many professional organizers suggest?
In their book Mr. Freedman and Mr. Abrahamson describe the properties of mess in loving terms. Mess has resonance, they write, which means it can vibrate beyond its own confines and connect to the larger world. It was the overall scumminess of Alexander Fleming’s laboratory that led to his discovery of penicillin, from a moldy bloom in a petri dish he had forgotten on his desk.
Mess is robust and adaptable, like Mr. Schwarzenegger’s open calendar, as opposed to brittle, like a parent’s rigid schedule that doesn’t allow for a small child’s wool-gathering or balkiness. Mess is complete, in that it embraces all sorts of random elements. Mess tells a story: you can learn a lot about people from their detritus, whereas neat — well, neat is a closed book. Neat has no narrative and no personality (as any cover of Real Simple magazine will demonstrate). Mess is also natural, as Mr. Freedman and Mr. Abrahamson point out, and a real time-saver. “It takes extra effort to neaten up a system,” they write. “Things don’t generally neaten themselves.”
Indeed, the most valuable dividend of living with mess may be time. Mr. Freedman, who has three children and a hard-working spouse, Laurie Tobey-Freedman, a preschool special-needs coordinator, is studying Mandarin in his precious spare moments. Perusing a four-door stainless steel shoe cabinet ($149) at the Container Store, and imagining gussying up a shoe collection, he shook his head and said, “I don’t get the appeal of this, which may be a huge defect on my part in terms of higher forms of entertainment.”
The success of the Container Store notwithstanding, there is indeed something messy — and not in a good way — about so many organizing options. “When I think about this urge to organize, it reminds me of how it was when Americans began to take more and more control of their weight: they got fatter,” said Marian Salzman, chief marketing officer of J. Walter Thompson and co-author, with Ira Matathia, of “Next Now: Trends for the Future,” which is about to be published by Palgrave Macmillan. “I never gained weight until I went on a diet,” she said, adding that she has a room in which she hides a treadmill and, now, two bags of organizing supplies.
“I got sick of looking at them so I bought plastic tubs and stuffed the bags in the tubs and put the tubs in the room.” Right now, she said, “we are emotionally overloaded, and so what this is about is that we are getting better and better at living superficially.”
“Superficial is the new intimate,” Ms. Salzman said, gaining steam, “and these boxes, these organizing supplies, are the containers for all our superficial selves. ‘I will be a neater mom, a hipper mom, a mom that gets more done.’ Do I sound cynical?”
Nah.
In the semiotics of mess, desks may be the richest texts. Messy-desk research borrows from cognitive ergonomics, a field of study dealing with how a work environment supports productivity. Consider that desks, our work landscapes, are stand-ins for our brains, and so the piles we array on them are “cognitive artifacts,” or data cues, of our thoughts as we work.
To a professional organizer brandishing colored files and stackable trays, cluttered horizontal surfaces are a horror; to cognitive psychologists like Jay Brand, who works in the Ideation Group of Haworth Inc., the huge office furniture company, their peaks and valleys glow with intellectual intent and showcase a mind whirring away: sorting, linking, producing. (By extension, a clean desk can be seen as a dormant area, an indication that no thought or work is being undertaken.)
His studies and others, like a survey conducted last year by Ajilon Professional Staffing, in Saddle Brook, N.J., which linked messy desks to higher salaries (and neat ones to salaries under $35,000), answer Einstein’s oft-quoted remark, “If a cluttered desk is a sign of a cluttered mind, of what, then, is an empty desk?”
Don Springer, 61, is an information technology project manager and the winner of the Type O-No! contest sponsored by Dymo, the labelmaker manufacturer, in October. The contest offered $5,000 worth of clutter management — for the tools (the boxes, the bins and the systems, as well as a labelmaker) and the services of a professional organizer — to the best example of a “clutter nightmare,” as expressed by contestants in a photograph and a 100-word essay. “Type O-Nos,” reads a definition on the Dymo Web site, are “outlaws on the tidy trail, clutter criminals twice over.”
Mr. Springer, who in a phone interview spoke softly, precisely and with great humor, professed deep shame over the contents of what he calls his oh-by-the-way room, a library/junk room that his wife would like cleaned to make a nursery for a new grandchild. With a full-time job and membership in various clubs and organizations, and a desire to spend his free time seeing a movie with his wife instead of “expending the emotional energy it would take to sort through all the stuff,” Mr. Springer said, he is unable to prune the piles to his wife’s satisfaction. “There are emotional treasures buried in there, and I don’t want to part with them,” he said.
So, why bother?
“Because I love my wife and I want to make her happy,” he said.
According to a small survey that Mr. Freedman and Mr. Abrahamson conducted for their book — 160 adults representing a cross section of genders, races and incomes, Mr. Freedman said — of those who had split up with a partner, one in 12 had done so over a struggle involving one partner’s idea of mess. Happy partnerships turn out not necessarily to be those in which products from Staples figure largely. Mr. Freedman and his wife, for example, have been married for over two decades, and live in an offhandedly messy house with a violently messy basement — the latter area, where their three children hang out, decorated (though that’s not quite the right word) in a pre-1990s Tompkins Square Park lean-to style.
The room’s chaos is an example of one of Mr. Freedman and Mr. Abrahamson’s mess strategies, which is to create a mess-free DMZ (in this case, the basement stairs) and acknowledge areas of complementary mess. Cherish your mess management strategies, suggested Mr. Freedman, speaking approvingly of the pile builders and the under-the-bed stuffers; of those who let their messes wax and wane — the cyclers, he called them; and those who create satellite messes (in storage units off-site). “Most people don’t realize their own efficiency or effectiveness,” he said with a grin.
It’s also nice to remember, as Mr. Freedman pointed out, that almost anything looks pretty neat if it’s shuffled into a pile.
Monday, December 18, 2006
Questions to ask before marriage
Questions Couples Should Ask (Or Wish They Had) Before Marrying
Relationship experts report that too many couples fail to ask each other critical questions before marrying. Here are a few key ones that couples should consider asking:
1) Have we discussed whether or not to have children, and if the answer is yes, who is going to be the primary caregiver?
2) Do we have a clear idea of each other’s financial obligations and goals, and do our ideas about spending and saving mesh?
3) Have we discussed our expectations for how the household will be maintained, and are we in agreement on who will manage the chores?
4) Have we fully disclosed our health histories, both physical and mental?
5) Is my partner affectionate to the degree that I expect?
6) Can we comfortably and openly discuss our sexual needs, preferences and fears?
7) Will there be a television in the bedroom?
8) Do we truly listen to each other and fairly consider one another’s ideas and complaints?
9) Have we reached a clear understanding of each other’s spiritual beliefs and needs, and have we discussed when and how our children will be exposed to religious/moral education?
10) Do we like and respect each other’s friends?
11) Do we value and respect each other’s parents, and is either of us concerned about whether the parents will interfere with the relationship?
12) What does my family do that annoys you?
13) Are there some things that you and I are NOT prepared to give up in the marriage?
14) If one of us were to be offered a career opportunity in a location far from the other’s family, are we prepared to move?
15) Does each of us feel fully confident in the other’s commitment to the marriage and believe that the bond can survive whatever challenges we may face?
How Much Should One Donate - Peter Singer
NYTimes
December 17, 2006
What Should a Billionaire Give – and What Should You?
By PETER SINGER
What is a human life worth? You may not want to put a price tag on it. But if we really had to, most of us would agree that the value of a human life would be in the millions. Consistent with the foundations of our democracy and our frequently professed belief in the inherent dignity of human beings, we would also agree that all humans are created equal, at least to the extent of denying that differences of sex, ethnicity, nationality and place of residence change the value of a human life.
With Christmas approaching, and Americans writing checks to their favorite charities, it’s a good time to ask how these two beliefs — that a human life, if it can be priced at all, is worth millions, and that the factors I have mentioned do not alter the value of a human life — square with our actions. Perhaps this year such questions lurk beneath the surface of more family discussions than usual, for it has been an extraordinary year for philanthropy, especially philanthropy to fight global poverty.
For Bill Gates, the founder of Microsoft, the ideal of valuing all human life equally began to jar against reality some years ago, when he read an article about diseases in the developing world and came across the statistic that half a million children die every year from rotavirus, the most common cause of severe diarrhea in children. He had never heard of rotavirus. “How could I never have heard of something that kills half a million children every year?” he asked himself. He then learned that in developing countries, millions of children die from diseases that have been eliminated, or virtually eliminated, in the United States. That shocked him because he assumed that, if there are vaccines and treatments that could save lives, governments would be doing everything possible to get them to the people who need them. As Gates told a meeting of the World Health Assembly in Geneva last year, he and his wife, Melinda, “couldn’t escape the brutal conclusion that — in our world today — some lives are seen as worth saving and others are not.” They said to themselves, “This can’t be true.” But they knew it was.
Gates’s speech to the World Health Assembly concluded on an optimistic note, looking forward to the next decade when “people will finally accept that the death of a child in the developing world is just as tragic as the death of a child in the developed world.” That belief in the equal value of all human life is also prominent on the Web site of the Bill and Melinda Gates Foundation, where under Our Values we read: “All lives — no matter where they are being led — have equal value.”
We are very far from acting in accordance with that belief. In the same world in which more than a billion people live at a level of affluence never previously known, roughly a billion other people struggle to survive on the purchasing power equivalent of less than one U.S. dollar per day. Most of the world’s poorest people are undernourished, lack access to safe drinking water or even the most basic health services and cannot send their children to school. According to Unicef, more than 10 million children die every year — about 30,000 per day — from avoidable, poverty-related causes.
Last June the investor Warren Buffett took a significant step toward reducing those deaths when he pledged $31 billion to the Gates Foundation, and another $6 billion to other charitable foundations. Buffett’s pledge, set alongside the nearly $30 billion given by Bill and Melinda Gates to their foundation, has made it clear that the first decade of the 21st century is a new “golden age of philanthropy.” On an inflation-adjusted basis, Buffett has pledged to give more than double the lifetime total given away by two of the philanthropic giants of the past, Andrew Carnegie and John D. Rockefeller, put together. Bill and Melinda Gates’s gifts are not far behind.
Gates’s and Buffett’s donations will now be put to work primarily to reduce poverty, disease and premature death in the developing world. According to the Global Forum for Health Research, less than 10 percent of the world’s health research budget is spent on combating conditions that account for 90 percent of the global burden of disease. In the past, diseases that affect only the poor have been of no commercial interest to pharmaceutical manufacturers, because the poor cannot afford to buy their products. The Global Alliance for Vaccines and Immunization (GAVI), heavily supported by the Gates Foundation, seeks to change this by guaranteeing to purchase millions of doses of vaccines, when they are developed, that can prevent diseases like malaria. GAVI has also assisted developing countries to immunize more people with existing vaccines: 99 million additional children have been reached to date. By doing this, GAVI claims to have already averted nearly 1.7 million future deaths.
Philanthropy on this scale raises many ethical questions: Why are the people who are giving doing so? Does it do any good? Should we praise them for giving so much or criticize them for not giving still more? Is it troubling that such momentous decisions are made by a few extremely wealthy individuals? And how do our judgments about them reflect on our own way of living?
Let’s start with the question of motives. The rich must — or so some of us with less money like to assume — suffer sleepless nights because of their ruthlessness in squeezing out competitors, firing workers, shutting down plants or whatever else they have to do to acquire their wealth. When wealthy people give away money, we can always say that they are doing it to ease their consciences or generate favorable publicity. It has been suggested — by, for example, David Kirkpatrick, a senior editor at Fortune magazine — that Bill Gates’s turn to philanthropy was linked to the antitrust problems Microsoft had in the U.S. and the European Union. Was Gates, consciously or subconsciously, trying to improve his own image and that of his company?
This kind of sniping tells us more about the attackers than the attacked. Giving away large sums, rather than spending the money on corporate advertising or developing new products, is not a sensible strategy for increasing personal wealth. When we read that someone has given away a lot of their money, or time, to help others, it challenges us to think about our own behavior. Should we be following their example, in our own modest way? But if the rich just give their money away to improve their image, or to make up for past misdeeds — misdeeds quite unlike any we have committed, of course — then, conveniently, what they are doing has no relevance to what we ought to do.
A famous story is told about Thomas Hobbes, the 17th-century English philosopher, who argued that we all act in our own interests. On seeing him give alms to a beggar, a cleric asked Hobbes if he would have done this if Christ had not commanded us to do so. Yes, Hobbes replied, he was in pain to see the miserable condition of the old man, and his gift, by providing the man with some relief from that misery, also eased Hobbes’s pain. That reply reconciles Hobbes’s charity with his egoistic theory of human motivation, but at the cost of emptying egoism of much of its bite. If egoists suffer when they see a stranger in distress, they are capable of being as charitable as any altruist.
Followers of the 18th-century German philosopher Immanuel Kant would disagree. They think an act has moral worth only if it is done out of a sense of duty. Doing something merely because you enjoy doing it, or enjoy seeing its consequences, they say, has no moral worth, because if you happened not to enjoy doing it, then you wouldn’t do it, and you are not responsible for your likes and dislikes, whereas you are responsible for your obedience to the demands of duty.
Perhaps some philanthropists are motivated by their sense of duty. Apart from the equal value of all human life, the other “simple value” that lies at the core of the work of the Gates Foundation, according to its Web site, is “To whom much has been given, much is expected.” That suggests the view that those who have great wealth have a duty to use it for a larger purpose than their own interests. But while such questions of motive may be relevant to our assessment of Gates’s or Buffett’s character, they pale into insignificance when we consider the effect of what Gates and Buffett are doing. The parents whose children could die from rotavirus care more about getting the help that will save their children’s lives than about the motivations of those who make that possible.
Interestingly, neither Gates nor Buffett seems motivated by the possibility of being rewarded in heaven for his good deeds on earth. Gates told a Time interviewer, “There’s a lot more I could be doing on a Sunday morning” than going to church. Put them together with Andrew Carnegie, famous for his freethinking, and three of the four greatest American philanthropists have been atheists or agnostics. (The exception is John D. Rockefeller.) In a country in which 96 percent of the population say they believe in a supreme being, that’s a striking fact. It means that in one sense, Gates and Buffett are probably less self-interested in their charity than someone like Mother Teresa, who as a pious Roman Catholic believed in reward and punishment in the afterlife.
More important than questions about motives are questions about whether there is an obligation for the rich to give, and if so, how much they should give. A few years ago, an African-American cabdriver taking me to the Inter-American Development Bank in Washington asked me if I worked at the bank. I told him I did not but was speaking at a conference on development and aid. He then assumed that I was an economist, but when I said no, my training was in philosophy, he asked me if I thought the U.S. should give foreign aid. When I answered affirmatively, he replied that the government shouldn’t tax people in order to give their money to others. That, he thought, was robbery. When I asked if he believed that the rich should voluntarily donate some of what they earn to the poor, he said that if someone had worked for his money, he wasn’t going to tell him what to do with it.
At that point we reached our destination. Had the journey continued, I might have tried to persuade him that people can earn large amounts only when they live under favorable social circumstances, and that they don’t create those circumstances by themselves. I could have quoted Warren Buffett’s acknowledgment that society is responsible for much of his wealth. “If you stick me down in the middle of Bangladesh or Peru,” he said, “you’ll find out how much this talent is going to produce in the wrong kind of soil.” The Nobel Prize-winning economist and social scientist Herbert Simon estimated that “social capital” is responsible for at least 90 percent of what people earn in wealthy societies like those of the United States or northwestern Europe. By social capital Simon meant not only natural resources but, more important, the technology and organizational skills in the community, and the presence of good government. These are the foundation on which the rich can begin their work. “On moral grounds,” Simon added, “we could argue for a flat income tax of 90 percent.” Simon was not, of course, advocating so steep a rate of tax, for he was well aware of disincentive effects. But his estimate does undermine the argument that the rich are entitled to keep their wealth because it is all a result of their hard work. If Simon is right, that is true of at most 10 percent of it.
In any case, even if we were to grant that people deserve every dollar they earn, that doesn’t answer the question of what they should do with it. We might say that they have a right to spend it on lavish parties, private jets and luxury yachts, or, for that matter, to flush it down the toilet. But we could still think that for them to do these things while others die from easily preventable diseases is wrong. In an article I wrote more than three decades ago, at the time of a humanitarian emergency in what is now Bangladesh, I used the example of walking by a shallow pond and seeing a small child who has fallen in and appears to be in danger of drowning. Even though we did nothing to cause the child to fall into the pond, almost everyone agrees that if we can save the child at minimal inconvenience or trouble to ourselves, we ought to do so. Anything else would be callous, indecent and, in a word, wrong. The fact that in rescuing the child we may, for example, ruin a new pair of shoes is not a good reason for allowing the child to drown. Similarly if for the cost of a pair of shoes we can contribute to a health program in a developing country that stands a good chance of saving the life of a child, we ought to do so.
Perhaps, though, our obligation to help the poor is even stronger than this example implies, for we are less innocent than the passer-by who did nothing to cause the child to fall into the pond. Thomas Pogge, a philosopher at Columbia University, has argued that at least some of our affluence comes at the expense of the poor. He bases this claim not simply on the usual critique of the barriers that Europe and the United States maintain against agricultural imports from developing countries but also on less familiar aspects of our trade with developing countries. For example, he points out that international corporations are willing to make deals to buy natural resources from any government, no matter how it has come to power. This provides a huge financial incentive for groups to try to overthrow the existing government. Successful rebels are rewarded by being able to sell off the nation’s oil, minerals or timber.
In their dealings with corrupt dictators in developing countries, Pogge asserts, international corporations are morally no better than someone who knowingly buys stolen goods — with the difference that the international legal and political order recognizes the corporations, not as criminals in possession of stolen goods but as the legal owners of the goods they have bought. This situation is, of course, beneficial for the industrial nations, because it enables us to obtain the raw materials we need to maintain our prosperity, but it is a disaster for resource-rich developing countries, turning the wealth that should benefit them into a curse that leads to a cycle of coups, civil wars and corruption and is of little benefit to the people as a whole.
In this light, our obligation to the poor is not just one of providing assistance to strangers but one of compensation for harms that we have caused and are still causing them. It might be argued that we do not owe the poor compensation, because our affluence actually benefits them. Living luxuriously, it is said, provides employment, and so wealth trickles down, helping the poor more effectively than aid does. But the rich in industrialized nations buy virtually nothing that is made by the very poor. During the past 20 years of economic globalization, although expanding trade has helped lift many of the world’s poor out of poverty, it has failed to benefit the poorest 10 percent of the world’s population. Some of the extremely poor, most of whom live in sub-Saharan Africa, have nothing to sell that rich people want, while others lack the infrastructure to get their goods to market. If they can get their crops to a port, European and U.S. subsidies often mean that they cannot sell them, despite — as for example in the case of West African cotton growers who compete with vastly larger and richer U.S. cotton producers — having a lower production cost than the subsidized producers in the rich nations.
The remedy to these problems, it might reasonably be suggested, should come from the state, not from private philanthropy. When aid comes through the government, everyone who earns above the tax-free threshold contributes something, with more collected from those with greater ability to pay. Much as we may applaud what Gates and Buffett are doing, we can also be troubled by a system that leaves the fate of hundreds of millions of people hanging on the decisions of two or three private citizens. But the amount of foreign development aid given by the U.S. government is, at 22 cents for every $100 the nation earns, about the same, as a percentage of gross national income, as Portugal gives and about half that of the U.K. Worse still, much of it is directed where it best suits U.S. strategic interests — Iraq is now by far the largest recipient of U.S. development aid, and Egypt, Jordan, Pakistan and Afghanistan all rank in the Top 10. Less than a quarter of official U.S. development aid — barely a nickel in every $100 of our G.N.I. — goes to the world’s poorest nations.
Adding private philanthropy to U.S. government aid improves this picture, because Americans privately give more per capita to international philanthropic causes than the citizens of almost any other nation. Even when private donations are included, however, countries like Norway, Denmark, Sweden and the Netherlands give three or four times as much foreign aid, in proportion to the size of their economies, as the U.S. gives — with a much larger percentage going to the poorest nations. At least as things now stand, the case for philanthropic efforts to relieve global poverty is not susceptible to the argument that the government has taken care of the problem. And even if official U.S. aid were better-directed and comparable, relative to our gross domestic product, with that of the most generous nations, there would still be a role for private philanthropy. Unconstrained by diplomatic considerations or the desire to swing votes at the United Nations, private donors can more easily avoid dealing with corrupt or wasteful governments. They can go directly into the field, working with local villages and grass-roots organizations.
Nor are philanthropists beholden to lobbyists. As The New York Times reported recently, billions of dollars of U.S. aid is tied to domestic goods. Wheat for Africa must be grown in America, although aid experts say this often depresses local African markets, reducing the incentive for farmers there to produce more. In a decision that surely costs lives, hundreds of millions of condoms intended to stop the spread of AIDS in Africa and around the world must be manufactured in the U.S., although they cost twice as much as similar products made in Asia.
In other ways, too, private philanthropists are free to venture where governments fear to tread. Through a foundation named for his wife, Susan Thompson Buffett, Warren Buffett has supported reproductive rights, including family planning and pro-choice organizations. In another unusual initiative, he has pledged $50 million for the International Atomic Energy Agency’s plan to establish a “fuel bank” to supply nuclear-reactor fuel to countries that meet their nuclear-nonproliferation commitments. The idea, which has been talked about for many years, is widely agreed to be a useful step toward discouraging countries from building their own facilities for producing nuclear fuel, which could then be diverted to weapons production. It is, Buffett said, “an investment in a safer world.” Though it is something that governments could and should be doing, no government had taken the first step.
Aid has always had its critics. Carefully planned and intelligently directed private philanthropy may be the best answer to the claim that aid doesn’t work. Of course, as in any large-scale human enterprise, some aid can be ineffective. But provided that aid isn’t actually counterproductive, even relatively inefficient assistance is likely to do more to advance human wellbeing than luxury spending by the wealthy.
The rich, then, should give. But how much should they give? Gates may have given away nearly $30 billion, but that still leaves him sitting at the top of the Forbes list of the richest Americans, with $53 billion. His 66,000-square-foot high-tech lakeside estate near Seattle is reportedly worth more than $100 million. Property taxes are about $1 million. Among his possessions is the Leicester Codex, the only handwritten book by Leonardo da Vinci still in private hands, for which he paid $30.8 million in 1994. Has Bill Gates done enough? More pointedly, you might ask: if he really believes that all lives have equal value, what is he doing living in such an expensive house and owning a Leonardo Codex? Are there no more lives that could be saved by living more modestly and adding the money thus saved to the amount he has already given?
Yet we should recognize that, if judged by the proportion of his wealth that he has given away, Gates compares very well with most of the other people on the Forbes 400 list, including his former colleague and Microsoft co-founder, Paul Allen. Allen, who left the company in 1983, has given, over his lifetime, more than $800 million to philanthropic causes. That is far more than nearly any of us will ever be able to give. But Forbes lists Allen as the fifth-richest American, with a net worth of $16 billion. He owns the Seattle Seahawks, the Portland Trail Blazers, a 413-foot oceangoing yacht that carries two helicopters and a 60-foot submarine. He has given only about 5 percent of his total wealth.
Is there a line of moral adequacy that falls between the 5 percent that Allen has given away and the roughly 35 percent that Gates has donated? Few people have set a personal example that would allow them to tell Gates that he has not given enough, but one who could is Zell Kravinsky. A few years ago, when he was in his mid-40s, Kravinsky gave almost all of his $45 million real estate fortune to health-related charities, retaining only his modest family home in Jenkintown, near Philadelphia, and enough to meet his family’s ordinary expenses. After learning that thousands of people with failing kidneys die each year while waiting for a transplant, he contacted a Philadelphia hospital and donated one of his kidneys to a complete stranger.
After reading about Kravinsky in The New Yorker, I invited him to speak to my classes at Princeton. He comes across as anguished by the failure of others to see the simple logic that lies behind his altruism. Kravinsky has a mathematical mind — a talent that obviously helped him in deciding what investments would prove profitable — and he says that the chances of dying as a result of donating a kidney are about 1 in 4,000. For him this implies that to withhold a kidney from someone who would otherwise die means valuing one’s own life at 4,000 times that of a stranger, a ratio Kravinsky considers “obscene.”
What sets Kravinsky apart from the rest of us is that he takes the equal value of all human life as a guide to life, not just as a nice piece of rhetoric. He acknowledges that some people think he is crazy, and even his wife says she believes that he goes too far. One of her arguments against the kidney donation was that one of their children may one day need a kidney, and Zell could be the only compatible donor. Kravinsky’s love for his children is, as far as I can tell, as strong as that of any normal parent. Such attachments are part of our nature, no doubt the product of our evolution as mammals who give birth to children, who for an unusually long time require our assistance in order to survive. But that does not, in Kravinsky’s view, justify our placing a value on the lives of our children that is thousands of times greater than the value we place on the lives of the children of strangers. Asked if he would allow his child to die if it would enable a thousand children to live, Kravinsky said yes. Indeed, he has said he would permit his child to die even if this enabled only two other children to live. Nevertheless, to appease his wife, he recently went back into real estate, made some money and bought the family a larger home. But he still remains committed to giving away as much as possible, subject only to keeping his domestic life reasonably tranquil.
Buffett says he believes in giving his children “enough so they feel they could do anything, but not so much that they could do nothing.” That means, in his judgment, “a few hundred thousand” each. In absolute terms, that is far more than most Americans are able to leave their children and, by Kravinsky’s standard, certainly too much. (Kravinsky says that the hard part is not giving away the first $45 million but the last $10,000, when you have to live so cheaply that you can’t function in the business world.) But even if Buffett left each of his three children a million dollars, he would still have given away more than 99.99 percent of his wealth. When someone does that much — especially in a society in which the norm is to leave most of your wealth to your children — it is better to praise them than to cavil about the extra few hundred thousand dollars they might have given.
Philosophers like Liam Murphy of New York University and my colleague Kwame Anthony Appiah at Princeton contend that our obligations are limited to carrying our fair share of the burden of relieving global poverty. They would have us calculate how much would be required to ensure that the world’s poorest people have a chance at a decent life, and then divide this sum among the affluent. That would give us each an amount to donate, and having given that, we would have fulfilled our obligations to the poor.
What might that fair amount be? One way of calculating it would be to take as our target, at least for the next nine years, the Millennium Development Goals, set by the United Nations Millennium Summit in 2000. On that occasion, the largest gathering of world leaders in history jointly pledged to meet, by 2015, a list of goals that include:
Reducing by half the proportion of the world’s people in extreme poverty (defined as living on less than the purchasing-power equivalent of one U.S. dollar per day).
Reducing by half the proportion of people who suffer from hunger.
Ensuring that children everywhere are able to take a full course of primary schooling.
Ending sex disparity in education.
Reducing by two-thirds the mortality rate among children under 5.
Reducing by three-quarters the rate of maternal mortality.
Halting and beginning to reverse the spread of H.I.V./AIDS and halting and beginning to reduce the incidence of malaria and other major diseases.
Reducing by half the proportion of people without sustainable access to safe drinking water.
Last year a United Nations task force, led by the Columbia University economist Jeffrey Sachs, estimated the annual cost of meeting these goals to be $121 billion in 2006, rising to $189 billion by 2015. When we take account of existing official development aid promises, the additional amount needed each year to meet the goals is only $48 billion for 2006 and $74 billion for 2015.
Now let’s look at the incomes of America’s rich and superrich, and ask how much they could reasonably give. The task is made easier by statistics recently provided by Thomas Piketty and Emmanuel Saez, economists at the École Normale Supérieure, Paris-Jourdan, and the University of California, Berkeley, respectively, based on U.S. tax data for 2004. Their figures are for pretax income, excluding income from capital gains, which for the very rich are nearly always substantial. For simplicity I have rounded the figures, generally downward. Note too that the numbers refer to “tax units,” that is, in many cases, families rather than individuals.
Piketty and Saez’s top bracket comprises 0.01 percent of U.S. taxpayers. There are 14,400 of them, earning an average of $12,775,000, with total earnings of $184 billion. The minimum annual income in this group is more than $5 million, so it seems reasonable to suppose that they could, without much hardship, give away a third of their annual income, an average of $4.3 million each, for a total of around $61 billion. That would still leave each of them with an annual income of at least $3.3 million.
Next comes the rest of the top 0.1 percent (excluding the category just described, as I shall do henceforth). There are 129,600 in this group, with an average income of just over $2 million and a minimum income of $1.1 million. If they were each to give a quarter of their income, that would yield about $65 billion, and leave each of them with at least $846,000 annually.
The top 0.5 percent consists of 575,900 taxpayers, with an average income of $623,000 and a minimum of $407,000. If they were to give one-fifth of their income, they would still have at least $325,000 each, and they would be giving a total of $72 billion.
Coming down to the level of those in the top 1 percent, we find 719,900 taxpayers with an average income of $327,000 and a minimum of $276,000. They could comfortably afford to give 15 percent of their income. That would yield $35 billion and leave them with at least $234,000.
Finally, the remainder of the nation’s top 10 percent earn at least $92,000 annually, with an average of $132,000. There are nearly 13 million in this group. If they gave the traditional tithe — 10 percent of their income, or an average of $13,200 each — this would yield about $171 billion and leave them a minimum of $83,000.
You could spend a long time debating whether the fractions of income I have suggested for donation constitute the fairest possible scheme. Perhaps the sliding scale should be steeper, so that the superrich give more and the merely comfortable give less. And it could be extended beyond the top 10 percent of American families, so that everyone able to afford more than the basic necessities of life gives something, even if it is as little as 1 percent. Be that as it may, the remarkable thing about these calculations is that a scale of donations that is unlikely to impose significant hardship on anyone yields a total of $404 billion — from just 10 percent of American families.
Obviously, the rich in other nations should share the burden of relieving global poverty. The U.S. is responsible for 36 percent of the gross domestic product of all Organization for Economic Cooperation and Development nations. Arguably, because the U.S. is richer than all other major nations, and its wealth is more unevenly distributed than wealth in almost any other industrialized country, the rich in the U.S. should contribute more than 36 percent of total global donations. So somewhat more than 36 percent of all aid to relieve global poverty should come from the U.S. For simplicity, let’s take half as a fair share for the U.S. On that basis, extending the scheme I have suggested worldwide would provide $808 billion annually for development aid. That’s more than six times what the task force chaired by Sachs estimated would be required for 2006 in order to be on track to meet the Millennium Development Goals, and more than 16 times the shortfall between that sum and existing official development aid commitments.
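For readers who want to check the arithmetic, the bracket-by-bracket calculation above can be reproduced directly from the figures quoted in the text. This is only a sketch: the count of roughly 13 million tax units for the remainder of the top 10 percent is my own rounding of the article's "nearly 13 million," and the dollar figures are the rounded Piketty-Saez numbers as given.

```python
# Back-of-the-envelope check of the donation scheme, using the rounded
# bracket figures quoted in the text (Piketty-Saez, U.S. tax data, 2004).
brackets = [
    # (tax units, average income in $, suggested donation fraction)
    (14_400,     12_775_000, 1 / 3),  # top 0.01 percent
    (129_600,     2_000_000, 1 / 4),  # rest of top 0.1 percent
    (575_900,       623_000, 1 / 5),  # rest of top 0.5 percent
    (719_900,       327_000, 0.15),   # rest of top 1 percent
    (12_960_000,    132_000, 0.10),   # rest of top 10 percent (count assumed)
]

us_total = sum(count * income * share for count, income, share in brackets)
print(f"U.S. top-decile total: ${us_total / 1e9:.0f} billion")  # ≈ $404 billion

# If the U.S. fair share of a worldwide scheme is half, the global total doubles.
world_total = 2 * us_total          # ≈ $808 billion
mdg_cost_2006 = 121e9               # Sachs task force estimate for 2006
mdg_shortfall_2006 = 48e9           # gap beyond existing aid commitments
print(f"Multiple of 2006 MDG cost:  {world_total / mdg_cost_2006:.1f}")
print(f"Multiple of 2006 shortfall: {world_total / mdg_shortfall_2006:.1f}")
```

Run as written, the sketch recovers the article's figures: about $404 billion from the U.S. top decile, and a worldwide total more than six times the estimated 2006 cost of the Millennium Development Goals and more than sixteen times the funding shortfall.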
If we are obliged to do no more than our fair share of eliminating global poverty, the burden will not be great. But is that really all we ought to do? Since we all agree that fairness is a good thing, and none of us like doing more because others don’t pull their weight, the fair-share view is attractive. In the end, however, I think we should reject it. Let’s return to the drowning child in the shallow pond. Imagine it is not 1 small child who has fallen in, but 50 children. We are among 50 adults, unrelated to the children, picnicking on the lawn around the pond. We can easily wade into the pond and rescue the children, and the fact that we would find it cold and unpleasant sloshing around in the knee-deep muddy water is no justification for failing to do so. The “fair share” theorists would say that if we each rescue one child, all the children will be saved, and so none of us have an obligation to save more than one. But what if half the picnickers prefer staying clean and dry to rescuing any children at all? Is it acceptable if the rest of us stop after we have rescued just one child, knowing that we have done our fair share, but that half the children will drown? We might justifiably be furious with those who are not doing their fair share, but our anger with them is not a reason for letting the children die. In terms of praise and blame, we are clearly right to condemn, in the strongest terms, those who do nothing. In contrast, we may withhold such condemnation from those who stop when they have done their fair share. Even so, they have let children drown when they could easily have saved them, and that is wrong.
Similarly, in the real world, it should be seen as a serious moral failure when those with ample income do not do their fair share toward relieving global poverty. It isn’t so easy, however, to decide on the proper approach to take to those who limit their contribution to their fair share when they could easily do more and when, because others are not playing their part, a further donation would assist many in desperate need. In the privacy of our own judgment, we should believe that it is wrong not to do more. But whether we should actually criticize people who are doing their fair share, but no more than that, depends on the psychological impact that such criticism will have on them, and on others. This in turn may depend on social practices. If the majority are doing little or nothing, setting a standard higher than the fair-share level may seem so demanding that it discourages people who are willing to make an equitable contribution from doing even that. So it may be best to refrain from criticizing those who achieve the fair-share level. In moving our society’s standards forward, we may have to progress one step at a time.
For more than 30 years, I’ve been reading, writing and teaching about the ethical issue posed by the juxtaposition, on our planet, of great abundance and life-threatening poverty. Yet it was not until, in preparing this article, I calculated how much America’s Top 10 percent of income earners actually make that I fully understood how easy it would be for the world’s rich to eliminate, or virtually eliminate, global poverty. (It has actually become much easier over the last 30 years, as the rich have grown significantly richer.) I found the result astonishing. I double-checked the figures and asked a research assistant to check them as well. But they were right. Measured against our capacity, the Millennium Development Goals are indecently, shockingly modest. If we fail to achieve them — as on present indications we well might — we have no excuses. The target we should be setting for ourselves is not halving the proportion of people living in extreme poverty, and without enough to eat, but ensuring that no one, or virtually no one, needs to live in such degrading conditions. That is a worthy goal, and it is well within our reach.
Peter Singer is the Ira W. DeCamp professor of bioethics at the Center for Human Values at Princeton University. He is the author of many books, including most recently “The Way We Eat: Why Our Food Choices Matter.”
December 17, 2006
What Should a Billionaire Give – and What Should You?
By PETER SINGER
What is a human life worth? You may not want to put a price tag on it. But if we really had to, most of us would agree that the value of a human life would be in the millions. Consistent with the foundations of our democracy and our frequently professed belief in the inherent dignity of human beings, we would also agree that all humans are created equal, at least to the extent of denying that differences of sex, ethnicity, nationality and place of residence change the value of a human life.
With Christmas approaching, and Americans writing checks to their favorite charities, it’s a good time to ask how these two beliefs — that a human life, if it can be priced at all, is worth millions, and that the factors I have mentioned do not alter the value of a human life — square with our actions. Perhaps this year such questions lurk beneath the surface of more family discussions than usual, for it has been an extraordinary year for philanthropy, especially philanthropy to fight global poverty.
For Bill Gates, the founder of Microsoft, the ideal of valuing all human life equally began to jar against reality some years ago, when he read an article about diseases in the developing world and came across the statistic that half a million children die every year from rotavirus, the most common cause of severe diarrhea in children. He had never heard of rotavirus. “How could I never have heard of something that kills half a million children every year?” he asked himself. He then learned that in developing countries, millions of children die from diseases that have been eliminated, or virtually eliminated, in the United States. That shocked him because he assumed that, if there are vaccines and treatments that could save lives, governments would be doing everything possible to get them to the people who need them. As Gates told a meeting of the World Health Assembly in Geneva last year, he and his wife, Melinda, “couldn’t escape the brutal conclusion that — in our world today — some lives are seen as worth saving and others are not.” They said to themselves, “This can’t be true.” But they knew it was.
Gates’s speech to the World Health Assembly concluded on an optimistic note, looking forward to the next decade when “people will finally accept that the death of a child in the developing world is just as tragic as the death of a child in the developed world.” That belief in the equal value of all human life is also prominent on the Web site of the Bill and Melinda Gates Foundation, where under Our Values we read: “All lives — no matter where they are being led — have equal value.”
We are very far from acting in accordance with that belief. In the same world in which more than a billion people live at a level of affluence never previously known, roughly a billion other people struggle to survive on the purchasing power equivalent of less than one U.S. dollar per day. Most of the world’s poorest people are undernourished, lack access to safe drinking water or even the most basic health services and cannot send their children to school. According to Unicef, more than 10 million children die every year — about 30,000 per day — from avoidable, poverty-related causes.
Last June the investor Warren Buffett took a significant step toward reducing those deaths when he pledged $31 billion to the Gates Foundation, and another $6 billion to other charitable foundations. Buffett’s pledge, set alongside the nearly $30 billion given by Bill and Melinda Gates to their foundation, has made it clear that the first decade of the 21st century is a new “golden age of philanthropy.” On an inflation-adjusted basis, Buffett has pledged to give more than double the lifetime total given away by two of the philanthropic giants of the past, Andrew Carnegie and John D. Rockefeller, put together. Bill and Melinda Gates’s gifts are not far behind.
Gates’s and Buffett’s donations will now be put to work primarily to reduce poverty, disease and premature death in the developing world. According to the Global Forum for Health Research, less than 10 percent of the world’s health research budget is spent on combating conditions that account for 90 percent of the global burden of disease. In the past, diseases that affect only the poor have been of no commercial interest to pharmaceutical manufacturers, because the poor cannot afford to buy their products. The Global Alliance for Vaccines and Immunization (GAVI), heavily supported by the Gates Foundation, seeks to change this by guaranteeing to purchase millions of doses of vaccines, when they are developed, that can prevent diseases like malaria. GAVI has also assisted developing countries to immunize more people with existing vaccines: 99 million additional children have been reached to date. By doing this, GAVI claims to have already averted nearly 1.7 million future deaths.
Philanthropy on this scale raises many ethical questions: Why are the people who are giving doing so? Does it do any good? Should we praise them for giving so much or criticize them for not giving still more? Is it troubling that such momentous decisions are made by a few extremely wealthy individuals? And how do our judgments about them reflect on our own way of living?
Let’s start with the question of motives. The rich must — or so some of us with less money like to assume — suffer sleepless nights because of their ruthlessness in squeezing out competitors, firing workers, shutting down plants or whatever else they have to do to acquire their wealth. When wealthy people give away money, we can always say that they are doing it to ease their consciences or generate favorable publicity. It has been suggested — by, for example, David Kirkpatrick, a senior editor at Fortune magazine — that Bill Gates’s turn to philanthropy was linked to the antitrust problems Microsoft had in the U.S. and the European Union. Was Gates, consciously or subconsciously, trying to improve his own image and that of his company?
This kind of sniping tells us more about the attackers than the attacked. Giving away large sums, rather than spending the money on corporate advertising or developing new products, is not a sensible strategy for increasing personal wealth. When we read that someone has given away a lot of their money, or time, to help others, it challenges us to think about our own behavior. Should we be following their example, in our own modest way? But if the rich just give their money away to improve their image, or to make up for past misdeeds — misdeeds quite unlike any we have committed, of course — then, conveniently, what they are doing has no relevance to what we ought to do.
A famous story is told about Thomas Hobbes, the 17th-century English philosopher, who argued that we all act in our own interests. On seeing him give alms to a beggar, a cleric asked Hobbes if he would have done this if Christ had not commanded us to do so. Yes, Hobbes replied, he was in pain to see the miserable condition of the old man, and his gift, by providing the man with some relief from that misery, also eased Hobbes’s pain. That reply reconciles Hobbes’s charity with his egoistic theory of human motivation, but at the cost of emptying egoism of much of its bite. If egoists suffer when they see a stranger in distress, they are capable of being as charitable as any altruist.
Followers of the 18th-century German philosopher Immanuel Kant would disagree. They think an act has moral worth only if it is done out of a sense of duty. Doing something merely because you enjoy doing it, or enjoy seeing its consequences, they say, has no moral worth, because if you happened not to enjoy doing it, then you wouldn’t do it, and you are not responsible for your likes and dislikes, whereas you are responsible for your obedience to the demands of duty.
Perhaps some philanthropists are motivated by their sense of duty. Apart from the equal value of all human life, the other “simple value” that lies at the core of the work of the Gates Foundation, according to its Web site, is “To whom much has been given, much is expected.” That suggests the view that those who have great wealth have a duty to use it for a larger purpose than their own interests. But while such questions of motive may be relevant to our assessment of Gates’s or Buffett’s character, they pale into insignificance when we consider the effect of what Gates and Buffett are doing. The parents whose children could die from rotavirus care more about getting the help that will save their children’s lives than about the motivations of those who make that possible.
Interestingly, neither Gates nor Buffett seems motivated by the possibility of being rewarded in heaven for his good deeds on earth. Gates told a Time interviewer, “There’s a lot more I could be doing on a Sunday morning” than going to church. Put them together with Andrew Carnegie, famous for his freethinking, and three of the four greatest American philanthropists have been atheists or agnostics. (The exception is John D. Rockefeller.) In a country in which 96 percent of the population say they believe in a supreme being, that’s a striking fact. It means that in one sense, Gates and Buffett are probably less self-interested in their charity than someone like Mother Teresa, who as a pious Roman Catholic believed in reward and punishment in the afterlife.
More important than questions about motives are questions about whether there is an obligation for the rich to give, and if so, how much they should give. A few years ago, an African-American cabdriver taking me to the Inter-American Development Bank in Washington asked me if I worked at the bank. I told him I did not but was speaking at a conference on development and aid. He then assumed that I was an economist, but when I said no, my training was in philosophy, he asked me if I thought the U.S. should give foreign aid. When I answered affirmatively, he replied that the government shouldn’t tax people in order to give their money to others. That, he thought, was robbery. When I asked if he believed that the rich should voluntarily donate some of what they earn to the poor, he said that if someone had worked for his money, he wasn’t going to tell him what to do with it.
At that point we reached our destination. Had the journey continued, I might have tried to persuade him that people can earn large amounts only when they live under favorable social circumstances, and that they don’t create those circumstances by themselves. I could have quoted Warren Buffett’s acknowledgment that society is responsible for much of his wealth. “If you stick me down in the middle of Bangladesh or Peru,” he said, “you’ll find out how much this talent is going to produce in the wrong kind of soil.” The Nobel Prize-winning economist and social scientist Herbert Simon estimated that “social capital” is responsible for at least 90 percent of what people earn in wealthy societies like those of the United States or northwestern Europe. By social capital Simon meant not only natural resources but, more important, the technology and organizational skills in the community, and the presence of good government. These are the foundation on which the rich can begin their work. “On moral grounds,” Simon added, “we could argue for a flat income tax of 90 percent.” Simon was not, of course, advocating so steep a rate of tax, for he was well aware of disincentive effects. But his estimate does undermine the argument that the rich are entitled to keep their wealth because it is all a result of their hard work. If Simon is right, that is true of at most 10 percent of it.
In any case, even if we were to grant that people deserve every dollar they earn, that doesn’t answer the question of what they should do with it. We might say that they have a right to spend it on lavish parties, private jets and luxury yachts, or, for that matter, to flush it down the toilet. But we could still think that for them to do these things while others die from easily preventable diseases is wrong. In an article I wrote more than three decades ago, at the time of a humanitarian emergency in what is now Bangladesh, I used the example of walking by a shallow pond and seeing a small child who has fallen in and appears to be in danger of drowning. Even though we did nothing to cause the child to fall into the pond, almost everyone agrees that if we can save the child at minimal inconvenience or trouble to ourselves, we ought to do so. Anything else would be callous, indecent and, in a word, wrong. The fact that in rescuing the child we may, for example, ruin a new pair of shoes is not a good reason for allowing the child to drown. Similarly if for the cost of a pair of shoes we can contribute to a health program in a developing country that stands a good chance of saving the life of a child, we ought to do so.
Perhaps, though, our obligation to help the poor is even stronger than this example implies, for we are less innocent than the passer-by who did nothing to cause the child to fall into the pond. Thomas Pogge, a philosopher at Columbia University, has argued that at least some of our affluence comes at the expense of the poor. He bases this claim not simply on the usual critique of the barriers that Europe and the United States maintain against agricultural imports from developing countries but also on less familiar aspects of our trade with developing countries. For example, he points out that international corporations are willing to make deals to buy natural resources from any government, no matter how it has come to power. This provides a huge financial incentive for groups to try to overthrow the existing government. Successful rebels are rewarded by being able to sell off the nation’s oil, minerals or timber.
In their dealings with corrupt dictators in developing countries, Pogge asserts, international corporations are morally no better than someone who knowingly buys stolen goods — with the difference that the international legal and political order recognizes the corporations, not as criminals in possession of stolen goods but as the legal owners of the goods they have bought. This situation is, of course, beneficial for the industrial nations, because it enables us to obtain the raw materials we need to maintain our prosperity, but it is a disaster for resource-rich developing countries, turning the wealth that should benefit them into a curse that leads to a cycle of coups, civil wars and corruption and is of little benefit to the people as a whole.
In this light, our obligation to the poor is not just one of providing assistance to strangers but one of compensation for harms that we have caused and are still causing them. It might be argued that we do not owe the poor compensation, because our affluence actually benefits them. Living luxuriously, it is said, provides employment, and so wealth trickles down, helping the poor more effectively than aid does. But the rich in industrialized nations buy virtually nothing that is made by the very poor. During the past 20 years of economic globalization, although expanding trade has helped lift many of the world’s poor out of poverty, it has failed to benefit the poorest 10 percent of the world’s population. Some of the extremely poor, most of whom live in sub-Saharan Africa, have nothing to sell that rich people want, while others lack the infrastructure to get their goods to market. If they can get their crops to a port, European and U.S. subsidies often mean that they cannot sell them, despite — as for example in the case of West African cotton growers who compete with vastly larger and richer U.S. cotton producers — having a lower production cost than the subsidized producers in the rich nations.
The remedy to these problems, it might reasonably be suggested, should come from the state, not from private philanthropy. When aid comes through the government, everyone who earns above the tax-free threshold contributes something, with more collected from those with greater ability to pay. Much as we may applaud what Gates and Buffett are doing, we can also be troubled by a system that leaves the fate of hundreds of millions of people hanging on the decisions of two or three private citizens. But the amount of foreign development aid given by the U.S. government is, at 22 cents for every $100 the nation earns, about the same, as a percentage of gross national income, as Portugal gives and about half that of the U.K. Worse still, much of it is directed where it best suits U.S. strategic interests — Iraq is now by far the largest recipient of U.S. development aid, and Egypt, Jordan, Pakistan and Afghanistan all rank in the Top 10. Less than a quarter of official U.S. development aid — barely a nickel in every $100 of our G.N.I. — goes to the world’s poorest nations.
Adding private philanthropy to U.S. government aid improves this picture, because Americans privately give more per capita to international philanthropic causes than the citizens of almost any other nation. Even when private donations are included, however, countries like Norway, Denmark, Sweden and the Netherlands give three or four times as much foreign aid, in proportion to the size of their economies, as the U.S. gives — with a much larger percentage going to the poorest nations. At least as things now stand, the case for philanthropic efforts to relieve global poverty is not susceptible to the argument that the government has taken care of the problem. And even if official U.S. aid were better-directed and comparable, relative to our gross domestic product, with that of the most generous nations, there would still be a role for private philanthropy. Unconstrained by diplomatic considerations or the desire to swing votes at the United Nations, private donors can more easily avoid dealing with corrupt or wasteful governments. They can go directly into the field, working with local villages and grass-roots organizations.
Nor are philanthropists beholden to lobbyists. As The New York Times reported recently, billions of dollars of U.S. aid is tied to domestic goods. Wheat for Africa must be grown in America, although aid experts say this often depresses local African markets, reducing the incentive for farmers there to produce more. In a decision that surely costs lives, hundreds of millions of condoms intended to stop the spread of AIDS in Africa and around the world must be manufactured in the U.S., although they cost twice as much as similar products made in Asia.
In other ways, too, private philanthropists are free to venture where governments fear to tread. Through a foundation named for his wife, Susan Thompson Buffett, Warren Buffett has supported reproductive rights, including family planning and pro-choice organizations. In another unusual initiative, he has pledged $50 million for the International Atomic Energy Agency’s plan to establish a “fuel bank” to supply nuclear-reactor fuel to countries that meet their nuclear-nonproliferation commitments. The idea, which has been talked about for many years, is widely agreed to be a useful step toward discouraging countries from building their own facilities for producing nuclear fuel, which could then be diverted to weapons production. It is, Buffett said, “an investment in a safer world.” Though it is something that governments could and should be doing, no government had taken the first step.
Aid has always had its critics. Carefully planned and intelligently directed private philanthropy may be the best answer to the claim that aid doesn’t work. Of course, as in any large-scale human enterprise, some aid can be ineffective. But provided that aid isn’t actually counterproductive, even relatively inefficient assistance is likely to do more to advance human well-being than luxury spending by the wealthy.
The rich, then, should give. But how much should they give? Gates may have given away nearly $30 billion, but that still leaves him sitting at the top of the Forbes list of the richest Americans, with $53 billion. His 66,000-square-foot high-tech lakeside estate near Seattle is reportedly worth more than $100 million. Property taxes are about $1 million. Among his possessions is the Leicester Codex, the only handwritten book by Leonardo da Vinci still in private hands, for which he paid $30.8 million in 1994. Has Bill Gates done enough? More pointedly, you might ask: if he really believes that all lives have equal value, what is he doing living in such an expensive house and owning a Leonardo Codex? Are there no more lives that could be saved by living more modestly and adding the money thus saved to the amount he has already given?
Yet we should recognize that, if judged by the proportion of his wealth that he has given away, Gates compares very well with most of the other people on the Forbes 400 list, including his former colleague and Microsoft co-founder, Paul Allen. Allen, who left the company in 1983, has given, over his lifetime, more than $800 million to philanthropic causes. That is far more than nearly any of us will ever be able to give. But Forbes lists Allen as the fifth-richest American, with a net worth of $16 billion. He owns the Seattle Seahawks, the Portland Trail Blazers and a 413-foot oceangoing yacht that carries two helicopters and a 60-foot submarine. He has given only about 5 percent of his total wealth.
Is there a line of moral adequacy that falls between the 5 percent that Allen has given away and the roughly 35 percent that Gates has donated? Few people have set a personal example that would allow them to tell Gates that he has not given enough, but one who could is Zell Kravinsky. A few years ago, when he was in his mid-40s, Kravinsky gave almost all of his $45 million real estate fortune to health-related charities, retaining only his modest family home in Jenkintown, near Philadelphia, and enough to meet his family’s ordinary expenses. After learning that thousands of people with failing kidneys die each year while waiting for a transplant, he contacted a Philadelphia hospital and donated one of his kidneys to a complete stranger.
After reading about Kravinsky in The New Yorker, I invited him to speak to my classes at Princeton. He comes across as anguished by the failure of others to see the simple logic that lies behind his altruism. Kravinsky has a mathematical mind — a talent that obviously helped him in deciding what investments would prove profitable — and he says that the chances of dying as a result of donating a kidney are about 1 in 4,000. For him this implies that to withhold a kidney from someone who would otherwise die means valuing one’s own life at 4,000 times that of a stranger, a ratio Kravinsky considers “obscene.”
What sets Kravinsky apart from the rest of us is that he takes the equal value of all human life as a guide to life, not just as a nice piece of rhetoric. He acknowledges that some people think he is crazy, and even his wife says she believes that he goes too far. One of her arguments against the kidney donation was that one of their children may one day need a kidney, and Zell could be the only compatible donor. Kravinsky’s love for his children is, as far as I can tell, as strong as that of any normal parent. Such attachments are part of our nature, no doubt the product of our evolution as mammals who give birth to children, who for an unusually long time require our assistance in order to survive. But that does not, in Kravinsky’s view, justify our placing a value on the lives of our children that is thousands of times greater than the value we place on the lives of the children of strangers. Asked if he would allow his child to die if it would enable a thousand children to live, Kravinsky said yes. Indeed, he has said he would permit his child to die even if this enabled only two other children to live. Nevertheless, to appease his wife, he recently went back into real estate, made some money and bought the family a larger home. But he still remains committed to giving away as much as possible, subject only to keeping his domestic life reasonably tranquil.
Buffett says he believes in giving his children “enough so they feel they could do anything, but not so much that they could do nothing.” That means, in his judgment, “a few hundred thousand” each. In absolute terms, that is far more than most Americans are able to leave their children and, by Kravinsky’s standard, certainly too much. (Kravinsky says that the hard part is not giving away the first $45 million but the last $10,000, when you have to live so cheaply that you can’t function in the business world.) But even if Buffett left each of his three children a million dollars, he would still have given away more than 99.99 percent of his wealth. When someone does that much — especially in a society in which the norm is to leave most of your wealth to your children — it is better to praise them than to cavil about the extra few hundred thousand dollars they might have given.
Philosophers like Liam Murphy of New York University and my colleague Kwame Anthony Appiah at Princeton contend that our obligations are limited to carrying our fair share of the burden of relieving global poverty. They would have us calculate how much would be required to ensure that the world’s poorest people have a chance at a decent life, and then divide this sum among the affluent. That would give us each an amount to donate, and having given that, we would have fulfilled our obligations to the poor.
What might that fair amount be? One way of calculating it would be to take as our target, at least for the next nine years, the Millennium Development Goals, set by the United Nations Millennium Summit in 2000. On that occasion, the largest gathering of world leaders in history jointly pledged to meet, by 2015, a list of goals that include:
Reducing by half the proportion of the world’s people in extreme poverty (defined as living on less than the purchasing-power equivalent of one U.S. dollar per day).
Reducing by half the proportion of people who suffer from hunger.
Ensuring that children everywhere are able to take a full course of primary schooling.
Ending sex disparity in education.
Reducing by two-thirds the mortality rate among children under 5.
Reducing by three-quarters the rate of maternal mortality.
Halting and beginning to reverse the spread of H.I.V./AIDS and halting and beginning to reduce the incidence of malaria and other major diseases.
Reducing by half the proportion of people without sustainable access to safe drinking water.
Last year a United Nations task force, led by the Columbia University economist Jeffrey Sachs, estimated the annual cost of meeting these goals to be $121 billion in 2006, rising to $189 billion by 2015. When we take account of existing official development aid promises, the additional amount needed each year to meet the goals is only $48 billion for 2006 and $74 billion for 2015.
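The gap between the task force’s two sets of figures can be made explicit with a short sketch. The “existing commitments” values below are implied by subtraction, not stated directly in the text:

```python
# Annual cost of meeting the Millennium Development Goals, and the additional
# amount still needed once promised aid is counted (billions of U.S. dollars),
# per the Sachs task force figures quoted above.
total_cost = {2006: 121, 2015: 189}
additional_needed = {2006: 48, 2015: 74}

# Implied official development aid already promised in each year.
existing_commitments = {
    year: total_cost[year] - additional_needed[year] for year in total_cost
}
print(existing_commitments)  # {2006: 73, 2015: 115}
```

In other words, roughly $73 billion a year was already pledged for 2006, rising to about $115 billion by 2015, under the task force’s assumptions.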
Now let’s look at the incomes of America’s rich and superrich, and ask how much they could reasonably give. The task is made easier by statistics recently provided by Thomas Piketty and Emmanuel Saez, economists at the École Normale Supérieure, Paris-Jourdan, and the University of California, Berkeley, respectively, based on U.S. tax data for 2004. Their figures are for pretax income, excluding income from capital gains, which for the very rich are nearly always substantial. For simplicity I have rounded the figures, generally downward. Note too that the numbers refer to “tax units,” that is, in many cases, families rather than individuals.
Piketty and Saez’s top bracket comprises 0.01 percent of U.S. taxpayers. There are 14,400 of them, earning an average of $12,775,000, with total earnings of $184 billion. The minimum annual income in this group is more than $5 million, so it seems reasonable to suppose that they could, without much hardship, give away a third of their annual income, an average of $4.3 million each, for a total of around $61 billion. That would still leave each of them with an annual income of at least $3.3 million.
Next comes the rest of the top 0.1 percent (excluding the category just described, as I shall do henceforth). There are 129,600 in this group, with an average income of just over $2 million and a minimum income of $1.1 million. If they were each to give a quarter of their income, that would yield about $65 billion, and leave each of them with at least $846,000 annually.
The top 0.5 percent consists of 575,900 taxpayers, with an average income of $623,000 and a minimum of $407,000. If they were to give one-fifth of their income, they would still have at least $325,000 each, and they would be giving a total of $72 billion.
Coming down to the level of those in the top 1 percent, we find 719,900 taxpayers with an average income of $327,000 and a minimum of $276,000. They could comfortably afford to give 15 percent of their income. That would yield $35 billion and leave them with at least $234,000.
Finally, the remainder of the nation’s top 10 percent earn at least $92,000 annually, with an average of $132,000. There are nearly 13 million in this group. If they gave the traditional tithe — 10 percent of their income, or an average of $13,200 each — this would yield about $171 billion and leave them a minimum of $83,000.
You could spend a long time debating whether the fractions of income I have suggested for donation constitute the fairest possible scheme. Perhaps the sliding scale should be steeper, so that the superrich give more and the merely comfortable give less. And it could be extended beyond the Top 10 percent of American families, so that everyone able to afford more than the basic necessities of life gives something, even if it is as little as 1 percent. Be that as it may, the remarkable thing about these calculations is that a scale of donations that is unlikely to impose significant hardship on anyone yields a total of $404 billion — from just 10 percent of American families.
Obviously, the rich in other nations should share the burden of relieving global poverty. The U.S. is responsible for 36 percent of the gross domestic product of all Organization for Economic Cooperation and Development nations. Arguably, because the U.S. is richer than all other major nations, and its wealth is more unevenly distributed than wealth in almost any other industrialized country, the rich in the U.S. should contribute more than 36 percent of total global donations. For simplicity, let’s take half as a fair share for the U.S. On that basis, extending the scheme I have suggested worldwide would provide $808 billion annually for development aid. That’s more than six times what the task force chaired by Sachs estimated would be required for 2006 in order to be on track to meet the Millennium Development Goals, and more than 16 times the shortfall between that sum and existing official development aid commitments.
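The bracket arithmetic above can be verified with a short script. The bracket sizes, average incomes and donation fractions are the Piketty-Saez-based figures quoted in the text; the size of the last bracket, given only as “nearly 13 million,” is approximated here as 12.95 million, which is the assumption consistent with the $171 billion figure:

```python
# Each tuple: (number of tax units, average income in dollars, donation fraction).
brackets = [
    (14_400,     12_775_000, 1 / 3),  # top 0.01 percent
    (129_600,     2_000_000, 1 / 4),  # rest of top 0.1 percent
    (575_900,       623_000, 1 / 5),  # rest of top 0.5 percent
    (719_900,       327_000, 0.15),   # rest of top 1 percent
    (12_950_000,    132_000, 0.10),   # rest of top 10 percent (assumed size)
]

# Sum each bracket's contribution: count * average income * suggested fraction.
us_total = sum(n * avg * frac for n, avg, frac in brackets)
print(f"U.S. total: ${us_total / 1e9:.0f} billion")  # about $404 billion

# If the U.S. fair share of a worldwide scheme is half, the global total doubles.
world_total = 2 * us_total
print(f"Worldwide:  ${world_total / 1e9:.0f} billion")  # about $808 billion

# Compare with the Sachs task force's 2006 figures (billions of dollars).
print(world_total / 1e9 / 121)  # more than 6 times the full 2006 cost
print(world_total / 1e9 / 48)   # more than 16 times the 2006 shortfall
```

The five bracket contributions come to roughly $61, $65, $72, $35 and $171 billion respectively, matching the figures in the text.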
If we are obliged to do no more than our fair share of eliminating global poverty, the burden will not be great. But is that really all we ought to do? Since we all agree that fairness is a good thing, and none of us like doing more because others don’t pull their weight, the fair-share view is attractive. In the end, however, I think we should reject it. Let’s return to the drowning child in the shallow pond. Imagine it is not 1 small child who has fallen in, but 50 children. We are among 50 adults, unrelated to the children, picnicking on the lawn around the pond. We can easily wade into the pond and rescue the children, and the fact that we would find it cold and unpleasant sloshing around in the knee-deep muddy water is no justification for failing to do so. The “fair share” theorists would say that if we each rescue one child, all the children will be saved, and so none of us have an obligation to save more than one. But what if half the picnickers prefer staying clean and dry to rescuing any children at all? Is it acceptable if the rest of us stop after we have rescued just one child, knowing that we have done our fair share, but that half the children will drown? We might justifiably be furious with those who are not doing their fair share, but our anger with them is not a reason for letting the children die. In terms of praise and blame, we are clearly right to condemn, in the strongest terms, those who do nothing. In contrast, we may withhold such condemnation from those who stop when they have done their fair share. Even so, they have let children drown when they could easily have saved them, and that is wrong.
Similarly, in the real world, it should be seen as a serious moral failure when those with ample income do not do their fair share toward relieving global poverty. It isn’t so easy, however, to decide on the proper approach to take to those who limit their contribution to their fair share when they could easily do more and when, because others are not playing their part, a further donation would assist many in desperate need. In the privacy of our own judgment, we should believe that it is wrong not to do more. But whether we should actually criticize people who are doing their fair share, but no more than that, depends on the psychological impact that such criticism will have on them, and on others. This in turn may depend on social practices. If the majority are doing little or nothing, setting a standard higher than the fair-share level may seem so demanding that it discourages people who are willing to make an equitable contribution from doing even that. So it may be best to refrain from criticizing those who achieve the fair-share level. In moving our society’s standards forward, we may have to progress one step at a time.
For more than 30 years, I’ve been reading, writing and teaching about the ethical issue posed by the juxtaposition, on our planet, of great abundance and life-threatening poverty. Yet it was not until, in preparing this article, I calculated how much America’s Top 10 percent of income earners actually make that I fully understood how easy it would be for the world’s rich to eliminate, or virtually eliminate, global poverty. (It has actually become much easier over the last 30 years, as the rich have grown significantly richer.) I found the result astonishing. I double-checked the figures and asked a research assistant to check them as well. But they were right. Measured against our capacity, the Millennium Development Goals are indecently, shockingly modest. If we fail to achieve them — as on present indications we well might — we have no excuses. The target we should be setting for ourselves is not halving the proportion of people living in extreme poverty, and without enough to eat, but ensuring that no one, or virtually no one, needs to live in such degrading conditions. That is a worthy goal, and it is well within our reach.
Peter Singer is the Ira W. DeCamp professor of bioethics at the Center for Human Values at Princeton University. He is the author of many books, including most recently “The Way We Eat: Why Our Food Choices Matter.”
Sunday, November 26, 2006
Leaving academia for consulting
NYTimes
November 27, 2006
Gilded Paychecks
Very Rich Are Leaving the Merely Rich Behind
By LOUIS UCHITELLE
A decade into the practice of medicine, still striving to become “a well regarded physician-scientist,” Robert H. Glassman concluded that he was not making enough money. So he answered an ad in the New England Journal of Medicine from a business consulting firm hiring doctors.
And today, after moving on to Wall Street as an adviser on medical investments, he is a multimillionaire.
Such routes to great wealth were just opening up to physicians when Dr. Glassman was in school, graduating from Harvard College in 1983 and Harvard Medical School four years later. Hoping to achieve breakthroughs in curing cancer, his specialty, he plunged into research, even dreaming of a Nobel Prize, until Wall Street reordered his life.
Just how far he had come from a doctor’s traditional upper-middle-class expectations struck home at the 20th reunion of his college class. By then he was working for Merrill Lynch and soon would become a managing director of health care investment banking.
“There were doctors at the reunion — very, very smart people,” Dr. Glassman recalled in a recent interview. “They went to the top programs, they remained true to their ethics and really had very pure goals. And then they went to the 20th-year reunion and saw that somebody else who was 10 times less smart was making much more money.”
The opportunity to become abundantly rich is a recent phenomenon not only in medicine, but in a growing number of other professions and occupations. In each case, the great majority still earn fairly uniform six-figure incomes, usually less than $400,000 a year, government data show. But starting in the 1990s, a significant number began to earn much more, creating a two-tier income stratum within such occupations.
The divide has emerged as people like Dr. Glassman, who is 45, latched onto opportunities within their fields that offered significantly higher incomes. Some lawyers and bankers, for example, collect much larger fees than others in their fields for their work on business deals and cases.
Others have moved to different, higher-paying fields — from academia to Wall Street, for example — and a growing number of entrepreneurs have seen windfalls tied largely to expanding financial markets, which draw on capital from around the world. The latter phenomenon has allowed, say, the owner of a small mail-order business to sell his enterprise for tens of millions instead of the hundreds of thousands that such a sale might have brought 15 years ago.
Three decades ago, compensation among occupations differed far less than it does today. That growing difference is diverting people from some critical fields, experts say. The American Bar Foundation, a research group, has found in its surveys, for instance, that fewer law school graduates are going into public-interest law or government jobs, and that filling all the openings is becoming harder.
Something similar is happening in academia, where newly minted Ph.D.’s migrate from teaching or research to more lucrative fields. Similarly, many business school graduates shun careers as experts in, say, manufacturing or consumer products for much higher pay on Wall Street.
And in medicine, where some specialties now pay far more than others, young doctors often bypass the lower-paying fields. The Medical Group Management Association, for example, says the nation lacks enough doctors in family practice, where the median income last year was $161,000.
“The bigger the prize, the greater the effort that people are making to get it,” said Edward N. Wolff, a New York University economist who studies income and wealth. “That effort is draining people away from more useful work.”
What kind of work is most useful is a matter of opinion, of course, but there is no doubt that a new group of the very rich have risen today far above their merely affluent colleagues.
Turning to Philanthropy
One in every 825 households earned at least $2 million last year, nearly double the percentage in 1989, adjusted for inflation, Mr. Wolff found in an analysis of government data. When it comes to wealth, one in every 325 households had a net worth of $10 million or more in 2004, the latest year for which data is available, more than four times as many as in 1989.
As some have grown enormously rich, they are turning to philanthropy in a competition that is well beyond the means of their less wealthy peers. “The ones with $100 million are setting the standard for their own circles, but no longer for me,” said Robert Frank, a Cornell University economist who described the early stages of the phenomenon in a 1995 book, “The Winner-Take-All Society,” which he co-authored.
Fighting AIDS and poverty in Africa are favorite causes, and so is financing education, particularly at one’s alma mater.
“It is astonishing how many gifts of $100 million have been made in the last year,” said Inge Reichenbach, vice president for development at Yale University, which like other schools tracks the net worth of its alumni and assiduously pursues the richest among them.
Dr. Glassman hopes to enter this circle someday. At 35, he was making $150,000 in 1996 (about $190,000 in today’s dollars) as a hematology-oncology specialist. That’s when, recently married and with virtually no savings, he made the switch that brought him to management consulting.
He won’t say just how much he earns now on Wall Street or his current net worth. But compensation experts, among them Johnson Associates, say the annual income of those in his position is easily in the seven figures and net worth often rises to more than $20 million.
“He is on his way,” said Alan Johnson, managing director of the firm, speaking of people on career tracks similar to Dr. Glassman’s. “He is destined to riches.”
Indeed, doctors have become so interested in the business side of medicine that more than 40 medical schools have added, over the last 20 years, an optional fifth year of schooling for those who want to earn an M.B.A. degree as well as an M.D. Some go directly to Wall Street or into health care management without ever practicing medicine.
“It was not our goal to create masters of the universe,” said James Aisner, a spokesman for Harvard Business School, whose joint program with the medical school started last year. “It was to train people to do useful work.”
Dr. Glassman still makes hospital rounds two or three days a month, usually on free weekends. Treating patients, he said, is “a wonderful feeling.” But he sees his present work as also a valuable aspect of medicine.
One of his tasks is to evaluate the numerous drugs that start-up companies, particularly in biotechnology, are developing. These companies often turn to firms like Merrill Lynch for an investment or to sponsor an initial public stock offering. Dr. Glassman is a critical gatekeeper in this process, evaluating, among other things, whether promising drugs live up to their claims.
What Dr. Glassman represents, along with other very rich people interviewed for this article, is the growing number of Americans who acknowledge that they have accumulated, or soon will, more than enough money to live comfortably, even luxuriously, and also enough so that their children, as adults, will then be free to pursue careers “they have a hunger for,” as Dr. Glassman put it, “and not feel a need to do something just to pay the bills.”
In an earlier Gilded Age, Andrew Carnegie argued that talented managers who accumulate great wealth were morally obligated to redistribute their wealth through philanthropy. The estate tax and the progressive income tax later took over most of that function — imposing tax rates of more than 70 percent as recently as 1980 on incomes above a certain level.
Now, with this marginal rate at half that much and the estate tax fading in importance, many of the new rich engage in the conspicuous consumption that their wealth allows. Others, while certainly not stinting on comfort, are embracing philanthropy as an alternative to a life of professional accomplishment.
Bill Gates and Warren Buffett are held up as models, certainly by Dr. Glassman. “They are going to make much greater contributions by having made money and then giving it away than most, almost all, scientists,” he said, adding that he is drawn to philanthropy as a means of achieving a meaningful legacy.
“It has to be easier than the chance of becoming a Nobel Prize winner,” he said, explaining his decision to give up research, “and I think that goes through the minds of highly educated, high performing individuals.”
As Bush administration officials see it — and conservative economists often agree — philanthropy is a better means of redistributing the nation’s wealth than higher taxes on the rich. They argue that higher marginal tax rates would discourage entrepreneurship and risk-taking. But some among the newly rich have misgivings.
Mark M. Zandi is one. He was a founder of Economy.com, a forecasting and data gathering service in West Chester, Pa. His net worth vaulted into eight figures with the company’s sale last year to Moody’s Investors Service.
“Our tax policies should be redesigned through the prism that wealth is being increasingly skewed,” Mr. Zandi said, arguing that higher taxes on the rich could help restore a sense of fairness to the system and blunt a backlash from a middle class that feels increasingly squeezed by the costs of health care, higher education, and a secure retirement.
The Federal Reserve’s Survey of Consumer Finances, a principal government source of income and wealth data, does not single out the occupations and professions generating so much wealth today. But Forbes magazine offers a rough idea in its annual surveys of the richest Americans, those approaching and crossing the billion-dollar mark.
Some routes are of long standing. Inheritance plays a role. So do the earnings of Wall Street investment bankers and the super incomes of sports stars and celebrities. All of these routes swell the ranks of the very rich, as they did in 1989.
But among new occupations, the winners include numerous partners in recently formed hedge funds and private equity firms that invest or acquire companies. Real estate developers and lawyers are more in evidence today among the very rich. So are dot-com entrepreneurs as well as scientists who start a company to market an invention or discovery, soon selling it for many millions. And from corporate America come many more chief executives than in the past.
Seventy-five percent of the chief executives in a sample of 100 publicly traded companies had a net worth in 2004 of more than $25 million, mainly from stock and options in the companies they ran, according to a study by Carola Frydman, a finance professor at the Massachusetts Institute of Technology’s Sloan School of Management. That was up from 31 percent for the same sample in 1989, adjusted for inflation.
Chief executives were not alone among corporate executives in rising to great wealth. There were similar or even greater increases in the percentage of lower-ranking executives — presidents, executive vice presidents, chief financial officers — also advancing into the $25 million-plus category.
The growing use of options as a form of pay helps to explain the sharp rise in the number of very wealthy households. But so does the gradual dismantling of the progressive income tax, Ms. Frydman concluded in a recent study.
“Our simulation results suggest that, had taxes been at their low 2000 level throughout the past 60 years, chief executive compensation would have been 35 percent higher during the 1950s and 1960s,” she wrote.
Trying Not to Live Ostentatiously
Finally, the owners of a variety of ordinary businesses — a small chain of coffee shops or temporary help agencies, for example — manage to expand these family operations with the help of venture capital and private equity firms, eventually selling them or taking them public in a marketplace that rewards them with huge sums.
John J. Moon, a managing director of Metalmark Capital, a private equity firm, explains how this process works.
“Let’s say we buy a small pizza parlor chain from an entrepreneur for $10 million,” said Mr. Moon, who, at 39, is already among the very rich. “We make it more efficient, we build it from 10 stores to 100 and we sell it to Domino’s for $50 million.”
As a result, not only the entrepreneur gets rich; so do Mr. Moon and his colleagues, who make money from putting together such deals and from managing the money they raise from wealthy investors who provide much of the capital.
By his own account, Mr. Moon, like Dr. Glassman, came reluctantly to the accumulation of wealth. Having earned a Ph.D. in business economics from Harvard in 1994, he set out to be a professor of finance, landing a job at Dartmouth’s Tuck Graduate School of Business, with a starting salary in the low six figures.
To this day, teaching tugs at Mr. Moon, whose parents immigrated to the United States from South Korea. He steals enough time from Metalmark Capital to teach one course in finance each semester at Columbia University’s business school. “If Wall Street was not there as an alternative,” Mr. Moon said, “I would have gone into academia.”
Academia, of course, turned out to be no match for the job offers that came Mr. Moon’s way from several Wall Street firms. He joined Goldman Sachs, moved on to Morgan Stanley’s private equity operation in 1998 and stayed on when the unit separated from Morgan Stanley in 2004 and became Metalmark Capital.
As his income and net worth grew, the Harvard alumni association made contact and he started to give money, not just to Harvard, but to various causes. His growing charitable activities have brought him a leadership role in Harvard alumni activities, including a seat on the graduate school alumni council.
Still, Mr. Moon tries to live unostentatiously. “The trick is not to want more as your income and wealth grow,” he said. “You fly coach and then you fly first class and then it is fractional ownership of a jet and then owning a jet. I still struggle with first class. My partners make fun of me.”
His reluctance to show his wealth has a basis in his religion. “My wife and I are committed Presbyterians,” he said. “I would like to think that my faith informs my career decisions even more than financial considerations. That is not always easy because money is not unimportant.”
It has a momentum of its own. Mr. Moon and his wife, Hee-Jung, who gave up law to raise their two sons, are renovating a newly purchased Park Avenue co-op. “On an absolute scale it is lavish,” he said, “but on a relative scale, relative to my peers, it is small.”
Behavior is gradually changing in the Glassman household, too. Not that the doctor and his wife, Denise, 41, seem to crave change. Nothing in his off-the-rack suits, or the cafes and nondescript restaurants that he prefers for interviews, or the family’s comparatively modest four-bedroom home in suburban Short Hills, N.J., or their two cars (an Acura S.U.V. and a Honda Accord) suggests that wealth has altered the way the family lives.
But it is opening up “choices,” as Mrs. Glassman put it. They enjoy annual ski vacations in Utah now. The Glassmans are shopping for a larger house — not as large as the family could afford, Mrs. Glassman said, but large enough to accommodate a wood-paneled study where her husband could put all his books and his diplomas and “feel that it is his own.” Right now, a glassed-in porch, without book shelves, serves as a workplace for both of them.
Starting out, Dr. Glassman’s $150,000 a year was a bit less than that of his wife, then a marketing executive with an M.B.A. from Northwestern. Their plan was for her to stop working once they had children. To build up their income, she encouraged him to set up or join a medical practice to treat patients. Dr. Glassman initially balked, but he was coming to realize that his devotion to research would not necessarily deliver a big scientific payoff.
“I wasn’t sure that I was willing to take the risk of spending many years applying for grants and working long hours for the very slim chance of winning at the roulette table and making a significant contribution to the scientific literature,” he said.
In this mood, he was drawn to the ad that McKinsey & Company, the giant consulting firm, had placed in the New England Journal of Medicine. McKinsey was increasingly working among biomedical and pharmaceutical companies and it needed more physicians on staff as consultants. Dr. Glassman, absorbed in the world of medicine, did not know what McKinsey was. His wife enlightened him. “The way she explained it, McKinsey was like a Massachusetts General Hospital for M.B.A.’s,” he said. “It was really prestigious, which I liked, and I heard that it was very intellectually charged.”
He soon joined as a consultant, earning a starting salary that was roughly the same as he was earning as a researcher — and soon $100,000 more. He stayed four years, traveling constantly and during that time the family made the move to Short Hills from rented quarters in Manhattan.
Dr. Glassman migrated to Merrill Lynch in 2001, first in private equity, which he found to be more at the forefront of innovation than consulting at McKinsey, and then gradually to investment banking, going full time there in 2004.
Linking Security to Income
Casey McCullar hopes to follow a similar circuit. Now 29, he joined the Marconi Corporation, a big telecommunications company, in 1999 right out of the University of Texas in Dallas, his hometown. Over the next six years he worked up to project manager at $42,000 a year, becoming quite skilled in electronic mapmaking.
A trip to India for his company introduced him to the wonders of outsourcing and the money he might make as an entrepreneur facilitating the process. As a first step, he applied to the Tuck business school at Dartmouth, got in and quit his Texas job, despite his mother’s concern that he was giving up future promotions and very good health insurance, particularly Marconi’s dental plan.
His life at Tuck soon sent him in still another direction. When he graduates next June he will probably go to work for Mercer Management Consulting, he says. Mercer recruited him at a starting salary of $150,000, including bonus. “If you had told me a couple of years ago that I would be making three times my Marconi salary, I would not have believed you,” Mr. McCullar said.
Nearly 70 percent of Tuck’s graduates go directly to consulting firms or Wall Street investment houses. He may pursue finance later, Mr. McCullar says, always keeping in mind an entrepreneurial venture that could really leverage his talent.
“When my mom talks of Marconi’s dental plan and a safe retirement,” he said, “she really means lifestyle security based on job security.”
But “for my generation,” Mr. McCullar said, “lifestyle security comes from financial independence. I’m doing what I want to do and it just so happens that is where the money is.”
November 27, 2006
Gilded Paychecks
Very Rich Are Leaving the Merely Rich Behind
By LOUIS UCHITELLE
A decade into the practice of medicine, still striving to become “a well regarded physician-scientist,” Robert H. Glassman concluded that he was not making enough money. So he answered an ad in the New England Journal of Medicine from a business consulting firm hiring doctors.
And today, after moving on to Wall Street as an adviser on medical investments, he is a multimillionaire.
Such routes to great wealth were just opening up to physicians when Dr. Glassman was in school, graduating from Harvard College in 1983 and Harvard Medical School four years later. Hoping to achieve breakthroughs in curing cancer, his specialty, he plunged into research, even dreaming of a Nobel Prize, until Wall Street reordered his life.
Just how far he had come from a doctor’s traditional upper-middle-class expectations struck home at the 20th reunion of his college class. By then he was working for Merrill Lynch and soon would become a managing director of health care investment banking.
“There were doctors at the reunion — very, very smart people,” Dr. Glassman recalled in a recent interview. “They went to the top programs, they remained true to their ethics and really had very pure goals. And then they went to the 20th-year reunion and saw that somebody else who was 10 times less smart was making much more money.”
The opportunity to become abundantly rich is a recent phenomenon not only in medicine, but in a growing number of other professions and occupations. In each case, the great majority still earn fairly uniform six-figure incomes, usually less than $400,000 a year, government data show. But starting in the 1990s, a significant number began to earn much more, creating a two-tier income stratum within such occupations.
The divide has emerged as people like Dr. Glassman, who is 45, latched onto opportunities within their fields that offered significantly higher incomes. Some lawyers and bankers, for example, collect much larger fees than others in their fields for their work on business deals and cases.
Others have moved to different, higher-paying fields — from academia to Wall Street, for example — and a growing number of entrepreneurs have seen windfalls tied largely to expanding financial markets, which draw on capital from around the world. The latter phenomenon has allowed, say, the owner of a small mail-order business to sell his enterprise for tens of millions instead of the hundreds of thousands that such a sale might have brought 15 years ago.
Three decades ago, compensation among occupations differed far less than it does today. That growing difference is diverting people from some critical fields, experts say. The American Bar Foundation, a research group, has found in its surveys, for instance, that fewer law school graduates are going into public-interest law or government jobs, and that filling all the openings is becoming harder.
Something similar is happening in academia, where newly minted Ph.D.’s migrate from teaching or research to more lucrative fields. Similarly, many business school graduates shun careers as experts in, say, manufacturing or consumer products for much higher pay on Wall Street.
And in medicine, where some specialties now pay far more than others, young doctors often bypass the lower-paying fields. The Medical Group Management Association, for example, says the nation lacks enough doctors in family practice, where the median income last year was $161,000.
“The bigger the prize, the greater the effort that people are making to get it,” said Edward N. Wolff, a New York University economist who studies income and wealth. “That effort is draining people away from more useful work.”
What kind of work is most useful is a matter of opinion, of course, but there is no doubt that a new group of the very rich have risen today far above their merely affluent colleagues.
Turning to Philanthropy
One in every 825 households earned at least $2 million last year, nearly double the percentage in 1989, adjusted for inflation, Mr. Wolff found in an analysis of government data. When it comes to wealth, one in every 325 households had a net worth of $10 million or more in 2004, the latest year for which data is available, more than four times as many as in 1989.
As some have grown enormously rich, they are turning to philanthropy in a competition that is well beyond the means of their less wealthy peers. “The ones with $100 million are setting the standard for their own circles, but no longer for me,” said Robert Frank, a Cornell University economist who described the early stages of the phenomenon in a 1995 book, “The Winner-Take-All Society,” which he co-authored.
Fighting AIDS and poverty in Africa are favorite causes, and so is financing education, particularly at one’s alma mater.
“It is astonishing how many gifts of $100 million have been made in the last year,” said Inge Reichenbach, vice president for development at Yale University, which like other schools tracks the net worth of its alumni and assiduously pursues the richest among them.
Dr. Glassman hopes to enter this circle someday. At 35, he was making $150,000 in 1996 (about $190,000 in today’s dollars) as a hematology-oncology specialist. That’s when, recently married and with virtually no savings, he made the switch that brought him to management consulting.
He won’t say just how much he earns now on Wall Street or his current net worth. But compensation experts, among them Johnson Associates, say the annual income of those in his position is easily in the seven figures and net worth often rises to more than $20 million.
“He is on his way,” said Alan Johnson, managing director of the firm, speaking of people on career tracks similar to Dr. Glassman’s. “He is destined to riches.”
Indeed, doctors have become so interested in the business side of medicine that more than 40 medical schools have added, over the last 20 years, an optional fifth year of schooling for those who want to earn an M.B.A. degree as well as an M.D. Some go directly to Wall Street or into health care management without ever practicing medicine.
“It was not our goal to create masters of the universe,” said James Aisner, a spokesman for Harvard Business School, whose joint program with the medical school started last year. “It was to train people to do useful work.”
Dr. Glassman still makes hospital rounds two or three days a month, usually on free weekends. Treating patients, he said, is “a wonderful feeling.” But he sees his present work as also a valuable aspect of medicine.
One of his tasks is to evaluate the numerous drugs that start-up companies, particularly in biotechnology, are developing. These companies often turn to firms like Merrill Lynch for an investment or to sponsor an initial public stock offering. Dr. Glassman is a critical gatekeeper in this process, evaluating, among other things, whether promising drugs live up to their claims.
What Dr. Glassman represents, along with other very rich people interviewed for this article, is the growing number of Americans who acknowledge that they have accumulated, or soon will, more than enough money to live comfortably, even luxuriously, and also enough so that their children, as adults, will then be free to pursue careers “they have a hunger for,” as Dr. Glassman put it, “and not feel a need to do something just to pay the bills.”
In an earlier Gilded Age, Andrew Carnegie argued that talented managers who accumulate great wealth are morally obligated to redistribute their wealth through philanthropy. The estate tax and the progressive income tax later took over most of that function — imposing tax rates of more than 70 percent as recently as 1980 on incomes above a certain level.
Now, with this marginal rate at half that much and the estate tax fading in importance, many of the new rich engage in the conspicuous consumption that their wealth allows. Others, while certainly not stinting on comfort, are embracing philanthropy as an alternative to a life of professional accomplishment.
Bill Gates and Warren Buffett are held up as models, certainly by Dr. Glassman. “They are going to make much greater contributions by having made money and then giving it away than most, almost all, scientists,” he said, adding that he is drawn to philanthropy as a means of achieving a meaningful legacy.
“It has to be easier than the chance of becoming a Nobel Prize winner,” he said, explaining his decision to give up research, “and I think that goes through the minds of highly educated, high performing individuals.”
As Bush administration officials see it — and conservative economists often agree — philanthropy is a better means of redistributing the nation’s wealth than higher taxes on the rich. They argue that higher marginal tax rates would discourage entrepreneurship and risk-taking. But some among the newly rich have misgivings.
Mark M. Zandi is one. He was a founder of Economy.com, a forecasting and data gathering service in West Chester, Pa. His net worth vaulted into eight figures with the company’s sale last year to Moody’s Investor Service.
“Our tax policies should be redesigned through the prism that wealth is being increasingly skewed,” Mr. Zandi said, arguing that higher taxes on the rich could help restore a sense of fairness to the system and blunt a backlash from a middle class that feels increasingly squeezed by the costs of health care, higher education, and a secure retirement.
The Federal Reserve’s Survey of Consumer Finances, a principal government source of income and wealth data, does not single out the occupations and professions generating so much wealth today. But Forbes magazine offers a rough idea in its annual surveys of the richest Americans, those approaching and crossing the billion dollar mark.
Some routes are of long standing. Inheritance plays a role. So do the earnings of Wall Street investment bankers and the super incomes of sports stars and celebrities. All of these routes swell the ranks of the very rich, as they did in 1989.
But among new occupations, the winners include numerous partners in recently formed hedge funds and private equity firms that invest or acquire companies. Real estate developers and lawyers are more in evidence today among the very rich. So are dot-com entrepreneurs as well as scientists who start a company to market an invention or discovery, soon selling it for many millions. And from corporate America come many more chief executives than in the past.
Seventy-five percent of the chief executives in a sample of 100 publicly traded companies had a net worth in 2004 of more than $25 million mainly from stock and options in the companies they ran, according to a study by Carola Frydman, a finance professor at the Massachusetts Institute of Technology’s Sloan School of Management. That was up from 31 percent for the same sample in 1989, adjusted for inflation.
Chief executives were not alone among corporate executives in rising to great wealth. There were similar or even greater increases in the percentage of lower-ranking executives — presidents, executive vice presidents, chief financial officers — also advancing into the $25 million-plus category.
The growing use of options as a form of pay helps to explain the sharp rise in the number of very wealthy households. But so does the gradual dismantling of the progressive income tax, Ms. Frydman concluded in a recent study.
“Our simulation results suggest that, had taxes been at their low 2000 level throughout the past 60 years, chief executive compensation would have been 35 percent higher during the 1950s and 1960s,” she wrote.
Trying Not to Live Ostentatiously
Finally, the owners of a variety of ordinary businesses — a small chain of coffee shops or temporary help agencies, for example — manage to expand these family operations with the help of venture capital and private equity firms, eventually selling them or taking them public in a marketplace that rewards them with huge sums.
John J. Moon, a managing director of Metalmark Capital, a private equity firm, explains how this process works.
“Let’s say we buy a small pizza parlor chain from an entrepreneur for $10 million,” said Mr. Moon, who, at 39, is already among the very rich. “We make it more efficient, we build it from 10 stores to 100 and we sell it to Domino’s for $50 million.”
As a result, not only does the entrepreneur get rich; so do Mr. Moon and his colleagues, who make money from putting together such deals and from managing the money they raise from wealthy investors who provide much of the capital.
By his own account, Mr. Moon, like Dr. Glassman, came reluctantly to the accumulation of wealth. Having earned a Ph.D. in business economics from Harvard in 1994, he set out to be a professor of finance, landing a job at Dartmouth’s Tuck Graduate School of Business, with a starting salary in the low six figures.
To this day, teaching tugs at Mr. Moon, whose parents immigrated to the United States from South Korea. He steals enough time from Metalmark Capital to teach one course in finance each semester at Columbia University’s business school. “If Wall Street was not there as an alternative,” Mr. Moon said, “I would have gone into academia.”
Academia, of course, turned out to be no match for the job offers that came Mr. Moon’s way from several Wall Street firms. He joined Goldman Sachs, moved on to Morgan Stanley’s private equity operation in 1998 and stayed on when the unit separated from Morgan Stanley in 2004 and became Metalmark Capital.
As his income and net worth grew, the Harvard alumni association made contact and he started to give money, not just to Harvard, but to various causes. His growing charitable activities have brought him a leadership role in Harvard alumni activities, including a seat on the graduate school alumni council.
Still, Mr. Moon tries to live unostentatiously. “The trick is not to want more as your income and wealth grow,” he said. “You fly coach and then you fly first class and then it is fractional ownership of a jet and then owning a jet. I still struggle with first class. My partners make fun of me.”
His reluctance to show his wealth has a basis in his religion. “My wife and I are committed Presbyterians,” he said. “I would like to think that my faith informs my career decisions even more than financial considerations. That is not always easy because money is not unimportant.”
It has a momentum of its own. Mr. Moon and his wife, Hee-Jung, who gave up law to raise their two sons, are renovating a newly purchased Park Avenue co-op. “On an absolute scale it is lavish,” he said, “but on a relative scale, relative to my peers, it is small.”
Behavior is gradually changing in the Glassman household, too. Not that the doctor and his wife, Denise, 41, seem to crave change. Nothing in his off-the-rack suits, or the cafes and nondescript restaurants that he prefers for interviews, or the family’s comparatively modest four-bedroom home in suburban Short Hills, N.J., or their two cars (an Acura S.U.V. and a Honda Accord) suggests that wealth has altered the way the family lives.
But it is opening up “choices,” as Mrs. Glassman put it. They enjoy annual ski vacations in Utah now. The Glassmans are shopping for a larger house — not as large as the family could afford, Mrs. Glassman said, but large enough to accommodate a wood-paneled study where her husband could put all his books and his diplomas and “feel that it is his own.” Right now, a glassed-in porch, without book shelves, serves as a workplace for both of them.
Starting out, Dr. Glassman’s $150,000 a year was a bit less than that of his wife, then a marketing executive with an M.B.A. from Northwestern. Their plan was for her to stop working once they had children. To build up their income, she encouraged him to set up or join a medical practice to treat patients. Dr. Glassman initially balked, but he was coming to realize that his devotion to research would not necessarily deliver a big scientific payoff.
“I wasn’t sure that I was willing to take the risk of spending many years applying for grants and working long hours for the very slim chance of winning at the roulette table and making a significant contribution to the scientific literature,” he said.
In this mood, he was drawn to the ad that McKinsey & Company, the giant consulting firm, had placed in the New England Journal of Medicine. McKinsey was increasingly working among biomedical and pharmaceutical companies and it needed more physicians on staff as consultants. Dr. Glassman, absorbed in the world of medicine, did not know what McKinsey was. His wife enlightened him. “The way she explained it, McKinsey was like a Massachusetts General Hospital for M.B.A.’s,” he said. “It was really prestigious, which I liked, and I heard that it was very intellectually charged.”
He soon joined as a consultant, at a starting salary roughly the same as his researcher’s pay — and soon $100,000 more. He stayed four years, traveling constantly, and during that time the family moved to Short Hills from rented quarters in Manhattan.
Dr. Glassman migrated to Merrill Lynch in 2001, first in private equity, which he found to be more at the forefront of innovation than consulting at McKinsey, and then gradually to investment banking, going full time there in 2004.
Linking Security to Income
Casey McCullar hopes to follow a similar circuit. Now 29, he joined the Marconi Corporation, a big telecommunications company, in 1999 right out of the University of Texas in Dallas, his hometown. Over the next six years he worked up to project manager at $42,000 a year, becoming quite skilled in electronic mapmaking.
A trip to India for his company introduced him to the wonders of outsourcing and the money he might make as an entrepreneur facilitating the process. As a first step, he applied to the Tuck business school at Dartmouth, got in and quit his Texas job, despite his mother’s concern that he was giving up future promotions and very good health insurance, particularly Marconi’s dental plan.
His life at Tuck soon sent him in still another direction. When he graduates next June he will probably go to work for Mercer Management Consulting, he says. Mercer recruited him at a starting salary of $150,000, including bonus. “If you had told me a couple of years ago that I would be making three times my Marconi salary, I would not have believed you,” Mr. McCullar said.
Nearly 70 percent of Tuck’s graduates go directly to consulting firms or Wall Street investment houses. He may pursue finance later, Mr. McCullar says, always keeping in mind an entrepreneurial venture that could really leverage his talent.
“When my mom talks of Marconi’s dental plan and a safe retirement,” he said, “she really means lifestyle security based on job security.”
But “for my generation,” Mr. McCullar said, “lifestyle security comes from financial independence. I’m doing what I want to do and it just so happens that is where the money is.”
NYTimes
November 26, 2006
One Spoonful at a Time
By HARRIET BROWN
On a sweltering evening in July of last year, I sat at the end of my daughter Kitty's bed, holding a milkshake made from a cup of Häagen-Dazs coffee ice cream and a cup of whole milk. Kitty (the pet name we've used since she was a baby) shivered, wrapped in a thick quilt. "Here's your milkshake," I said, aiming for a tone that was friendly but firm, a tone that would make her reach for the glass and begin drinking. Six hundred ninety calories — that's what this milkshake represented to me.
But to Kitty it was the object of her deepest fear and loathing. "You're trying to make me fat," she said in a high-pitched, distorted voice that made the hairs on the back of my neck stand up. She rocked, clutching her stomach, chanting over and over: "I'm a fat pig. I'm so fat."
That summer, Kitty was 14. She was 4-foot-11 and weighed 71 pounds. I could see the angles and curves of each bone under her skin. Her hair, once shiny, was lank and falling out in clumps. Her breath carried the odor of ketosis, the sour smell of the starving body digesting itself.
I kept my voice neutral. "You need to drink the milkshake," I repeated. She lifted her head, and for a second I saw the 2-year-old Kitty, her mouth quirked in a half-smile, her dark eyes full of humor. It was enough to keep me from shrieking: Just drink the damn milkshake! Enough to keep me sitting on the end of the bed for the next two hours, talking in a low voice, lifting the straw to her lips over and over. The milkshake had long since melted when she swallowed the last of it, curled up in bed and closed her eyes. Her gaunt face stayed tense even in sleep.
Kitty's anorexia was diagnosed a few weeks before, at the end of that June. My husband and I had known something was wrong for several weeks; we just didn't know what. She'd started reading Gourmet and planning lavish dinner parties. She called me at work several times a day, needing to know what dinner would be the next night and the next. She exercised for hours each night, doing situps and push-ups in her room. On Mother's Day she worried that she might have obsessive-compulsive disorder, because she couldn't stop thinking about meals and food.
My husband and I told ourselves, She's 14, we can't be overprotective. We said to each other, I wouldn't be that age again for anything. Kitty didn't want to see a therapist; we didn't want to insist. Yet.
She was thin, too thin. She ate fruit and vegetables, turkey and low-fat yogurt — healthful choices. But as she crossed the floor at her eighth-grade graduation, we saw that something had changed; suddenly she looked emaciated. I called the pediatrician the next morning.
The day anorexia was diagnosed, the doctor told Kitty to eat more and told us to find her a therapist. Two weeks later we met with an eating-disorders specialist who talked to Kitty as if she were 3 years old. That's when we panicked; we'd been pinning our hopes on the therapist, but clearly she was not going to save the day. So we tried to get Kitty to eat: we encouraged, we reasoned, we yelled. Kitty cried, said she wasn't hungry, her stomach hurt; she would eat at her friends' houses, at camp, tomorrow.
On a hundred-degree day that July, she spent hours frying chicken and baking carrot cake, then ate almost none of it. I begged her to drink water; she swore she wasn't thirsty. Late that night, she put her hand on her chest. "My heart feels funny," she said. The emergency room doctor admitted her with an abnormal EKG; she was dehydrated, and her resting heart rate had slowed to 31 beats a minute (normal is 60 to 80). When she didn't eat, they moved her to the I.C.U., where a frazzled doctor ordered a feeding tube. Kitty wept. "I won't be able to taste my food!" she cried.
I wanted to shout, "But you're not eating anything!" The doctor gave her a choice: eat a protein shake and a small bowl of spaghetti in half an hour, or he would order the tube. She did it — and she kept eating, three tiny hospital meals a day, more than she'd eaten in weeks.
That first night in the hospital, we asked Kitty's pediatrician where her other anorexic patients went for treatment. "When they're this sick, they go away," she said, referring to inpatient eating-disorder clinics, where people often stay for two or three months. The nearest was an hour away and cost $1,000 a day, most of which would not be covered by our HMO. Kitty was terrified at the prospect. "Don't make me leave you," she cried. It would have been easier on one level to send her away to some place that could help her. But we couldn't send her off when she was so frightened.
We visited an adolescent day program at a local psychiatric hospital; it felt like the set of "One Flew Over the Cuckoo's Nest." On every subject except food, Kitty was completely rational; how would rehashing eighth grade in the hospital's "school" help her?
Had the diagnosis been, say, diabetes, we would have been given a list of guidelines and medications — a road map for recovery. We would have looked at research and treatment protocols. Look anorexia up on Amazon, and you'll find hundreds of titles, but we couldn't sort the useful books from the flaky ones. And in terms of treatment, there isn't much systematic scientific research on the disease. No one could tell us exactly how to make our daughter well. All they could say for sure was that the odds weren't good. Anorexia is one of the deadliest psychiatric diseases; it's estimated that up to 15 percent of anorexics die, from suicide or complications related to starvation. About a third may make some improvement but are still dominated by their obsession with food. Many become depressed or anxious, and some develop substance-abuse problems, like alcoholism. Almost half never marry. It is thought that if anorexia is not treated early on, during adolescence, it tends to take an average of five to seven years for the person to recover — if it happens at all. I pictured Kitty, starved and weak, at 16 and 18 and 21, and felt sick.
I went home and started researching, hoping to find another option. Among the few studies done on anorexia treatment, I came across one from 1997, a follow-up to an earlier study on adolescents that assessed a method developed in England that was still relatively unknown in the United States: family-based treatment, often called the Maudsley approach. This treatment was created by a team of therapists led by Christopher Dare and Ivan Eisler at the Maudsley Hospital in London in the mid-1980s, as an alternative to hospitalization. In a hospital setting, nurses sit with anorexic patients at meals, encouraging and calming them; they create a culture in which patients have to eat. The Maudsley approach urges families to essentially take on the nurses' role. Parents become primary caretakers, working with a Maudsley therapist. Their job: finding ways to insist that their children eat.
The two studies showed that 90 percent of the adolescents recovered or made significant gains; five years later, 90 percent had fully recovered. (Two other studies confirmed these results.) In the world of eating disorders, I was coming to understand, this was a phenomenally high success rate.
The idea that parents should be intimately involved in the refeeding of their children can be quite controversial, a departure from the conventional notion that the dynamic between parent and child causes or contributes to the anorexia. Many therapists advocate a "parentectomy," insisting that parents stay out of the treatment to preserve the child's privacy and autonomy. They say that a child must "choose" to eat in order to truly recover. Maudsley advocates see the family as the best chance a child has for recovery; no one else knows the child as well or has the same investment in the child's well-being. That felt right to us.
Over the last few years, most eating-disorders researchers have begun to think that there is no single cause of anorexia, that maybe it's more like a recipe, where several ingredients — genetics, personality type, hormones, stressful life events — come together in just the wrong way. Maudsley practitioners say that focusing on the cause is secondary, ultimately, because once the physiological process of starvation kicks in, the disease takes on a life of its own, unfolding with predictable symptoms, intensity and long-term consequences. Anorexics become almost uniformly depressed, withdrawn, enraged, anxious, irritable or suicidal, and their thinking about food and eating is distorted, in part because the brain runs on glucose, and when it has been deprived over a long period of time, when it's starved, it goes haywire. It's important to get the patient's weight up, fast, because the less time spent in starvation, the better the outcome. Adult anorexics who have been chronically ill for years have much poorer prognoses than teenagers.
I called Daniel Le Grange, an associate professor of psychiatry at the University of Chicago, who directs the eating-disorders program there. Le Grange spent five years training at Maudsley Hospital in England, and he and James Lock, a professor of child and adolescent psychiatry and pediatrics at Stanford, have written Maudsley treatment manuals for physicians and therapists and a book for parents. The two are in the middle of a $4 million N.I.H.-financed study designed to measure the effectiveness of the Maudsley approach. Le Grange compared anorexia to cancer. "If you leave it, it's going to metastasize," he said. "You need to figure out an aggressive way to eradicate it as quickly as you can. You're not going to hear an oncologist say, 'Oh, it's Stage 0 cancer, let's wait till it becomes Stage 3.' "
I asked Le Grange what he thought about a critique of Maudsley: that it violates the usual boundaries between child and parent, derailing the adolescent work of separation and individuation. "If your child has diabetes and doesn't check her blood sugar often enough, you'd make sure she did," Le Grange reassured me. "What we're trying to achieve is taking anorexia away so the child can go on her way unencumbered by the eating disorder. What could be more respectful of adolescent development?"
There were no local Maudsley therapists, so my husband and I lined up a pediatrician (in whose office Kitty was weighed weekly), a psychiatrist (whom she saw weekly, then twice a month), a therapist (weekly) and a nutritionist (two or three visits). We didn't know if Maudsley would work. We didn't know if it was, objectively speaking, the best choice. But anything was better than watching Kitty disappear, ounce by ounce, obscured by the creature who spoke with her voice and looked out through her eyes. Anything.
On Day 2 of refeeding Kitty, our younger daughter, Lulu (also her nickname), turned 10. We had cake, a dense, rich chocolate cake layered with raspberry filling — one of Kitty's favorites. Of course she refused it. I told her that if she didn't eat the cake, we'd go back to the hospital that night and she would get the tube. I hated saying this, but I hated the prospect of the hospital more. The tube felt like the worst thing that could happen to her, though of course it was not. Five minutes after Kitty was born, I fed her from my own body. Now the idea of forcing a tube down her throat, having a nurse insert a "bolus" every so often, seemed a grotesque perversion of every bit of love and sustenance I'd ever given her.
She sat in front of the cake, crying. She put down the fork, said her throat was closing, said that she was a horrible person, that she couldn't eat it, she just couldn't. We told her it was not a choice to starve. We told her she could do nothing until she ate — no TV, books, showers, phone, sleep. We told her we would sit at the table all night if we had to.
Still, I was astonished when she lifted the first tiny forkful of cake to her mouth. It took 45 minutes to eat the whole piece. After she'd scraped the last bit into her mouth, she laid her head on the table and sobbed, "That was scary, Mommy!"
At age 4, Kitty went for a pony ride and was seated on an enormous quarter horse. When the horse reared, she just held on. Afterward I asked if she'd been scared. "Not really," she said. "Can I go again?"
This was the child who was now terrified by a slice of chocolate cake.
That night, when I checked on her in bed, she mumbled, "Make it go away." I now knew what "it" was. It seemed as if she were possessed by a vicious demon she must appease or suffer the consequences. I pictured its leathery wings and yellow fangs inside her. Each crumb Kitty ate was an act of true bravery, defiance snatched from its curved talons. I've heard women joke, "I could use a little anorexia!" They have no idea.
This demon was described nowhere in the books I was frantically reading. It wasn't until I stumbled on a 1940s study led by Dr. Ancel Keys, a physiologist at the University of Minnesota, that I began to understand. During World War II, Keys recruited 36 physically and psychologically healthy men for a yearlong study on starvation. For the first three months they ate normally, while Keys's researchers recorded information about their personalities, eating patterns and behavior. For the next six months their rations were cut in half; most of the men lost about a quarter of their weight, putting them at about 75 percent of their former weight — about where Kitty was when she was hospitalized. The men spent the final three months being refed.
Keys and his colleagues published their study in 1950 as "The Biology of Human Starvation," and his findings are startlingly relevant to anorexia. Depression and irritability plagued all the volunteers, especially during refeeding. They cut their food into tiny pieces, drew meals out for hours. They became withdrawn and obsessional, antisocial and anxious. One volunteer deliberately chopped off three of his fingers during the recovery period. The demon, I thought.
"Starvation affects the whole organism," Keys wrote. Given what I'd seen of Kitty, that made sense to me. But I wondered why — if starvation triggers the cognitive, emotional and behavioral changes that are so uniform in anorexia — the Minnesota volunteers did not develop the intense fear of eating and gaining weight that characterizes the disease. And what about the millions of people around the world who are starving because they don't have enough food — why don't they develop anorexia?
Once more I turned to Le Grange, who explained that at the core of anorexia is the notion of starvation in the midst of plenty; starvation when food isn't available doesn't usually trigger the same response. As for the Minnesota volunteers, he said, they were males (most anorexics are female), and they were beyond adolescence, outside the developmental window when anorexia tends to strike. More important, the volunteers ate about half their caloric requirements for six months; most anorexics eat far less, over a much longer period of time. "We're talking about a 14-year-old who is profoundly starved for 12 months," he said. "These guys were semistarved for a relatively brief period." It's not just the weight; it's the pattern of behavioral reinforcement. Each time an anorexic restricts what she's consuming, the anorexic thoughts ("I'm so fat, I'm such a pig") and behaviors (constant exercising, for example) are strengthened. Which is why it takes not just weight gain but the experience of eating meal after meal after meal to truly cure the disease.
Of course this brings up the question: which comes first, physiological starvation or the mental and emotional changes of anorexia? "You or I would earn the Nobel Prize if we figured that out," Le Grange said. "It's a bit of both, probably, and the two impact each other. So if you are constitutionally slender and it's easy for you to diet, and you like ballet, and you live in the United States, and you're 13, and your personality is perfectionist, your chance of developing this illness is very, very high."
Switch gymnastics for ballet, and Le Grange had just described Kitty. I used to hope she'd get a B in school so she'd see that the world didn't come to an end. Clearly, she wasn't going to be O.K. in a week or a month or six months. We were embarking on a long journey, one that would change us all.
A week into refeeding, I'd become an expert in high-calorie cooking. I made macaroni and cheese with butter and whole milk, chicken breasts dredged in egg, rolled in bread crumbs, fried in butter. Carrot cake with cream-cheese icing. Thousand-calorie milkshakes and muffins. When a body is in a state of starvation, it isn't enough to simply eat a normal diet, Dr. Walter H. Kaye, director of the eating-disorders program at the University of California at San Diego, explained to me. The body requires huge numbers of calories to gain weight and maintain it. Every few days we added 300 calories; by Day 9, Kitty was eating 2,100 calories a day. Still, she'd lost another half pound, which panicked me until the pediatrician explained that Kitty's metabolism, slowed by starvation, was now revving high. It's not unusual to lose weight at first, she said; just keep feeding her.
A heating pad helped with the stomachaches and bloating that followed each meal. But nothing helped with the thoughts and feelings. Faced with a plate of food, the demon inside my daughter bargained, cried, lashed out. Her anxiety was so great that there was no reward that could motivate her to eat. Her fear of the tube was what kept her eating in those first few weeks. I wondered what would happen when she'd gained a few pounds and the tube was no longer a possibility.
Meanwhile, the demon sat at our table and spewed venom: "I'm a lazy pig. You're trying to make me fat." And, one night, terrifyingly: "I just want to go to sleep and never wake up."
With that comment, Kitty's younger sister, Lulu, looked up from her plate, her face full of anguish, and bolted from the table. I found her in the basement. "I don't want to go to my sister's funeral!" she cried. "Neither do I," I told her.
Later that night, when Kitty and Lulu were asleep, I stood in the middle of the kitchen and thought of how our lives had shrunk to the confines of these four walls. The counter and sink were piled high with dirty plates, ice cream tubs, glasses and pans. Between shopping, cooking, eating with Kitty, spending time with Lulu and going to work, my husband and I had no time for cleaning, much less anything else. Suddenly I was filled with fury. I grabbed a dish and smashed it on the linoleum, where it broke into half a dozen pieces. I broke another, and another, and another. There were so many things I couldn't fix or make right, so many feelings I couldn't handle. I swept the pieces into a bag and carried them outside. Tomorrow we would eat off paper plates.
Three weeks into refeeding, Kitty was consuming 3,000 calories a day; she'd gained about eight pounds. My husband or I would sit with her while she ate three meals and two snacks each day; we needed to know she was eating, and she needed us to compel her to eat, to get past the demon's grip on her. One of us brought her to work, as we had when she was an infant. In many ways this process felt like reparenting as well as refeeding, taking her back to a time when she was totally dependent on us.
Some parents don't want to or can't go backward like this. Some don't have flexible work schedules and can't be home for every meal and snack. Some are overwhelmed by the relentless and exhausting work of refeeding. For any of these parents, Maudsley may be impossible. It works best when two parents are involved — so they can take turns losing it, offstage — and when those parents agree that their top priority is refeeding. I heard stories from other families about anorexics who slipped meals into the trash when softhearted Dad was in charge, or about weight-conscious mothers who couldn't bring themselves to serve their daughters that much food. When we started refeeding Kitty, my husband had never thought much about nutrition, and the idea took some getting used to. By late August, though, he could tell you how many calories were in a pat of butter, a chicken breast, a glass of milk. And he was often far more patient with Kitty than I.
During that first month, Kitty smiled once or twice, which made us feel hopeful for the first time since the spring. We watched movies together and took walks around the block — the only exercise she was allowed. We had moments that seemed almost normal.
But the night before she was set to start high school, four weeks in, the demon re-emerged. This time it was far worse than anything we'd experienced, maybe because Kitty was stronger now. At the dinner table, she put her matchstick arms around herself and shouted, "I don't want to go to high school and have everyone say, 'Look at Kitty, look how fat she got over the summer!' "
She refused to eat anything. We cajoled and begged and threatened. She wept and flailed and lashed out. I left messages for the psychiatrist, the therapist, the pediatrician. I told her we'd have to go back to the hospital, though I suspected she now weighed too much to be admitted. Finally I reached a psychiatrist on call, who suggested that we give her a tranquilizer and put her to bed. "If she won't eat in the morning, bring her in," advised the psychiatrist. I was relieved, and also terrified: What if this was the start of a new downward slide?
But the next morning she ate breakfast as usual. After school, she came home with a couple of friends she hadn't seen since spring. As I made milkshakes for all of them, I was surprised to hear Kitty say jokingly, "We know all the ice creams with the most calories!"
One friend said, "We want to know which ones have the least!"
"Yeah," chimed in another, "because my butt is huge!"
Another girl said, "I hate my thighs!" There was a chorus of agreement.
I offered, "You girls are beautiful and healthy and strong." But I felt incredibly sad. Even face to face with the devastating effects of this disease, they were criticizing their bodies.
I've heard the arguments that media depictions of unrealistic female bodies are what drive girls to starve themselves — the Kate Moss syndrome. And it's tempting to see anorexia as a metaphor, a result of a cultural crisis in the zeitgeist. If this were true, though, millions of American girls and women would become anorexic instead of the roughly 1 to 3 percent who do. Clearly there are other factors involved.
My nightly Internet prowling turned up some interesting research by Kaye, the director of the eating-disorders program at the University of California. While Kaye suspects that social and cultural factors contribute to anorexia, he says that recent studies suggest that genetics is the most significant factor for anorexia and bulimia. He has found chromosomal abnormalities in anorexics, as well as irregular levels of the neurotransmitters dopamine and serotonin. The National Institutes of Health is currently spending $10 million on a five-year study to look at the genetic links of the disease.
I grew up in a household where disordered eating was the norm. My aunt was bulimic; my mother enrolled us both in Weight Watchers when I was 15. She recorded her weight each morning on a chart and went on to become a Weight Watchers lecturer, delivering weekly pep talks to a roomful of people who were engaged in an ongoing war with their own bodies. You had to stay vigilant, lest your appetite betray you and the pounds creep back on. I'd tried to teach my daughters to enjoy good food and to love their bodies, but maybe I hadn't gotten over my dieting-obsessed childhood. Or maybe I'd passed along a genetic predisposition that triggered Kitty's illness. The deeper into refeeding we got, though, the less I worried about causes. We could figure that out later. The important thing was to get Kitty to eat and gain weight.
By October, we'd settled into a pattern. My husband, whose work schedule is flexible, ate lunch with Kitty most days; I covered that meal when he couldn't. Kitty gained another six pounds and, encouragingly, grew an inch. But she hadn't felt hungry since before the diagnosis. I worried that anorexia had permanently short-circuited her brain-body connection; how would she ever regulate her own eating?
The rough days were predictable only in the sense that they kept coming. One night she sat at the table, hands over her eyes, in front of a plate of salmon and squash. "I'm bad! I'm bad!" she said, sobbing. "I won't eat, I won't!"
Calmly I said, "Food is your medicine and you've got to take it." Long minutes ticked by. Eventually she said: "I want to eat, but I can't. If I eat now I'll be a total failure!" The anorexia talk spilled out of her, on and on and on. I wanted to wrap her in my arms and say, "Of course you don't have to eat, poor baby." But I couldn't give the disease an inch. If I did, the same thing would happen the next day and the next day. We had to sit there until she ate, no matter how long it took.
By the first week of November, Kitty was up to 90 pounds, 10 pounds short of her target weight. More important, her mood improved significantly. But later in the month, she developed an upset stomach, which isn't unusual during refeeding, and began refusing the daily milkshakes. She complained of dizziness, wanted to know what I would be serving her, then argued for something else. She raged at me and at herself. One afternoon she cried so much she "accidentally" threw up her lunch. Back in September she tried to make herself throw up a few times; about half of all anorexics do become bulimic. Luckily, Kitty was never able to do it. I hoped she hadn't learned how.
In December, Kitty gained and lost the same pound over and over. At the end of the month she was still, frustratingly, at 90 pounds, still deeply in the grip of the disease. We boosted her intake to 4,000 calories a day. In mid-January, finally, her weight went up four pounds. It was astonishing, how much food she needed, but not unusual. Anorexics become metabolically inefficient; their temperatures rise, and they tend to burn off calories rather than put on flesh. "That's one reason for the high rate of relapse," Dr. Kaye told me. It's hard to gain enough weight to truly recover, and even harder to maintain it.
As went the fall, so went the spring. The pounds came on, very slowly, and Kitty's spirits continued to lift. More and more, she hung out with boys. "They don't talk about how fat they are," she explained. And they didn't make her feel self-conscious about eating. "Yo, Kitty, you done with your 10,000-calorie milkshake yet?" one boy said one afternoon in March, and she actually giggled.
In April, Kitty grew another two inches, which meant that her target weight went up, too. I felt despair at the thought that it would take longer now for her to gain enough weight, longer for her to get well. I told myself that her health was more complex than a number on the scale, that she was recovering. But I couldn't forget the sunken, unspeakably sad look in her eyes that past summer and fall.
One day in May, she came home from school grinning. "Guess what?" she said. "Sue brought cake to school, and I ate a piece. Aren't you proud of me?" All year she'd avoided parties, potlucks, lunches — any get-together that involved food. The fact that she ate a piece of cake, one of her "scary" foods, meant that she gave up, for a moment, being the anorexic at the back of the room. She became one of the group.
But the next day I made her and Lulu bagels with melted cheese, and Kitty complained, "You know I don't like sesame bagels." "You used to," I said. I knew what was behind this, and I wanted her to say it. I wanted it out in the open.
"They have more calories than plain bagels!" she burst out. But she calmed down quickly. "I know that was an eating-disordered thing to say. I couldn't help it," she said quietly, and ate the bagel.
I felt proud of her ability to name the demon and defy it. I wished we could just yank it out of her, unhook the claws that tormented her body and mind.
On a morning this past June, she called me at work to say the two most beautiful words in the English language: "I'm hungry!"
"I'm so happy!" I blurted out.
"I'm happy, too, Mom," she said.
She reached her target weight a few weeks later, and maintained it through the summer and early fall. Maudsley therapists say that true recovery entails weight restoration and functioning well psychologically and socially. Kitty would continue to see a therapist from time to time, to work on perfectionism and other issues. For now, though, she seemed happy and whole. The phone rang for her; friends trooped through the house. Life seemed normal again.
But one night recently I dreamed I was running through a strange house, looking for my daughter. I found her — though it wasn't really her — and grabbed her by the arm. "You didn't eat dinner last night, did you?" I shouted. "What did you have for breakfast?" Not-Kitty smiled. "A teaspoon of air," she said sweetly. I woke with my heart pounding, full of rage and hatred for Not-Kitty, the demon who lived on air, who wore my daughter's face and spoke with her voice.
The Maudsley approach advocates separating the disease from the sufferer, the anorexia from the adolescent. And this helps, especially on the worst days. But it's also true that the demon is part of our family now, lurking in the shadows. We will never forget it. We don't know if or when it will re-emerge — two months from now, two years, five years. We don't know if we've done the right thing. Is Kitty cured? Will she ever be cured? There are so many questions I can't answer.
That morning, I got out of bed in the gray predawn and went down to the kitchen. I pulled out eggs, milk, butter and raspberry jam and set to work making crepes. It was all I could think of to do.
November 26, 2006
One Spoonful at a Time
By HARRIET BROWN
On a sweltering evening in July of last year, I sat at the end of my daughter Kitty's bed, holding a milkshake made from a cup of Häagen-Dazs coffee ice cream and a cup of whole milk. Kitty (the pet name we've used since she was a baby) shivered, wrapped in a thick quilt. "Here's your milkshake," I said, aiming for a tone that was friendly but firm, a tone that would make her reach for the glass and begin drinking. Six hundred ninety calories — that's what this milkshake represented to me.
But to Kitty it was the object of her deepest fear and loathing. "You're trying to make me fat," she said in a high-pitched, distorted voice that made the hairs on the back of my neck stand up. She rocked, clutching her stomach, chanting over and over: "I'm a fat pig. I'm so fat."
That summer, Kitty was 14. She was 4-foot-11 and weighed 71 pounds. I could see the angles and curves of each bone under her skin. Her hair, once shiny, was lank and falling out in clumps. Her breath carried the odor of ketosis, the sour smell of the starving body digesting itself.
I kept my voice neutral. "You need to drink the milkshake," I repeated. She lifted her head, and for a second I saw the 2-year-old Kitty, her mouth quirked in a half-smile, her dark eyes full of humor. It was enough to keep me from shrieking: Just drink the damn milkshake! Enough to keep me sitting on the end of the bed for the next two hours, talking in a low voice, lifting the straw to her lips over and over. The milkshake had long since melted when she swallowed the last of it, curled up in bed and closed her eyes. Her gaunt face stayed tense even in sleep.
Kitty's anorexia was diagnosed a few weeks before, at the end of that June. My husband and I had known something was wrong for several weeks; we just didn't know what. She'd started reading Gourmet and planning lavish dinner parties. She called me at work several times a day, needing to know what dinner would be the next night and the next. She exercised for hours each night, doing situps and push-ups in her room. On Mother's Day she worried that she might have obsessive-compulsive disorder, because she couldn't stop thinking about meals and food.
My husband and I told ourselves, She's 14, we can't be overprotective. We said to each other, I wouldn't be that age again for anything. Kitty didn't want to see a therapist; we didn't want to insist. Yet.
She was thin, too thin. She ate fruit and vegetables, turkey and low-fat yogurt — healthful choices. But as she crossed the floor at her eighth-grade graduation, we saw that something had changed; suddenly she looked emaciated. I called the pediatrician the next morning.
The day anorexia was diagnosed, the doctor told Kitty to eat more and told us to find her a therapist. Two weeks later we met with an eating-disorders specialist who talked to Kitty as if she were 3 years old. That's when we panicked; we'd been pinning our hopes on the therapist, but clearly she was not going to save the day. So we tried to get Kitty to eat: we encouraged, we reasoned, we yelled. Kitty cried, said she wasn't hungry, her stomach hurt; she would eat at her friends' houses, at camp, tomorrow.
On a hundred-degree day that July, she spent hours frying chicken and baking carrot cake, then ate almost none of it. I begged her to drink water; she swore she wasn't thirsty. Late that night, she put her hand on her chest. "My heart feels funny," she said. The emergency room doctor admitted her with an abnormal EKG; she was dehydrated, and her resting heart rate had slowed to 31 beats a minute (normal is 60 to 80). When she didn't eat, they moved her to the I.C.U., where a frazzled doctor ordered a feeding tube. Kitty wept. "I won't be able to taste my food!" she cried.
I wanted to shout, "But you're not eating anything!" The doctor gave her a choice: eat a protein shake and a small bowl of spaghetti in half an hour, or he would order the tube. She did it — and she kept eating, three tiny hospital meals a day, more than she'd eaten in weeks.
That first night in the hospital, we asked Kitty's pediatrician where her other anorexic patients went for treatment. "When they're this sick, they go away," she said, referring to inpatient eating-disorder clinics, where people often stay for two or three months. The nearest was an hour away and cost $1,000 a day, most of which would not be covered by our HMO. Kitty was terrified at the prospect. "Don't make me leave you," she cried. It would have been easier on one level to send her away to some place that could help her. But we couldn't send her off when she was so frightened.
We visited an adolescent day program at a local psychiatric hospital; it felt like the set of "One Flew Over the Cuckoo's Nest." On every subject except food, Kitty was completely rational; how would rehashing eighth grade in the hospital's "school" help her?
Had the diagnosis been, say, diabetes, we would have been given a list of guidelines and medications — a road map for recovery. We would have looked at research and treatment protocols. Look anorexia up on Amazon, and you'll find hundreds of titles, but we couldn't sort the useful books from the flaky ones. And in terms of treatment, there isn't much systematic scientific research on the disease. No one could tell us exactly how to make our daughter well. All they could say for sure was that the odds weren't good. Anorexia is one of the deadliest psychiatric diseases; it's estimated that up to 15 percent of anorexics die, from suicide or complications related to starvation. About a third may make some improvement but are still dominated by their obsession with food. Many become depressed or anxious, and some develop substance-abuse problems, like alcoholism. Almost half never marry. It is thought that if anorexia is not treated early on, during adolescence, it tends to take an average of five to seven years for the person to recover — if it happens at all. I pictured Kitty, starved and weak, at 16 and 18 and 21, and felt sick.
I went home and started researching, hoping to find another option. Among the few studies done on anorexia treatment, I came across one from 1997, a follow-up to an earlier study on adolescents that assessed a method developed in England and was still relatively unknown in the United States: family- based treatment, often called the Maudsley approach. This treatment was created by a team of therapists led by Christopher Dare and Ivan Eisler at the Maudsley Hospital in London, in the mid-1980s, as an alternative to hospitalization. In a hospital setting, nurses sit with anorexic patients at meals, encouraging and calming them; they create a culture in which patients have to eat. The Maudsley approach urges families to essentially take on the nurses' role. Parents become primary caretakers, working with a Maudsley therapist. Their job: Finding ways to insist that their children eat.
The two studies showed that 90 percent of the adolescents recovered or made significant gains; five years later, 90 percent had fully recovered. (Two other studies confirmed these results.) In the world of eating disorders, I was coming to understand, this was a phenomenally high success rate.
The idea that parents should be intimately involved in the refeeding of their children can be quite controversial, a departure from the conventional notion that the dynamic between parent and child causes or contributes to the anorexia. Many therapists advocate a "parentectomy," insisting that parents stay out of the treatment to preserve the child's privacy and autonomy. They say that a child must "choose" to eat in order to truly recover. Maudsley advocates see the family as the best chance a child has for recovery; no one else knows the child as well or has the same investment in the child's well-being. That felt right to us.
Over the last few years, most eating-disorders researchers have begun to think that there is no single cause of anorexia, that maybe it's more like a recipe, where several ingredients — genetics, personality type, hormones, stressful life events — come together in just the wrong way. Maudsley practitioners say that focusing on the cause is secondary, ultimately, because once the physiological process of starvation kicks in, the disease takes on a life of its own, unfolding with predictable symptoms, intensity and long-term consequences. Anorexics become almost uniformly depressed, withdrawn, enraged, anxious, irritable or suicidal, and their thinking about food and eating is distorted, in part because the brain runs on glucose, and when it has been deprived over a long period of time, when it's starved, it goes haywire. It's important to get the patient's weight up, fast, because the less time spent in starvation, the better the outcome. Adult anorexics who have been chronically ill for years have much poorer prognoses than teenagers.
I called Daniel Le Grange, an associate professor of psychiatry at the University of Chicago, who directs the eating-disorders program there. Le Grange spent five years training at Maudsley Hospital in England, and he and James Lock, a professor of child and adolescent psychiatry and pediatrics at Stanford, have written Maudsley treatment manuals for physicians and therapists and a book for parents. The two are in the middle of a $4 million N.I.H.-financed study designed to measure the effectiveness of the Maudsley approach. Le Grange compared anorexia to cancer. "If you leave it, it's going to metastasize," he said. "You need to figure out an aggressive way to eradicate it as quickly as you can. You're not going to hear an oncologist say, 'Oh, it's Stage 0 cancer, let's wait till it becomes Stage 3.' "
I asked Le Grange what he thought about a critique of Maudsley: that it violates the usual boundaries between child and parent, derailing the adolescent work of separation and individuation. "If your child has diabetes and doesn't check her blood sugar often enough, you'd make sure she did," Le Grange reassured me. "What we're trying to achieve is taking anorexia away so the child can go on her way unencumbered by the eating disorder. What could be more respectful of adolescent development?"
There were no local Maudsley therapists, so my husband and I lined up a pediatrician (in whose office Kitty was weighed weekly), a psychiatrist (whom she saw weekly, then twice a month), a therapist (weekly) and a nutritionist (two or three visits). We didn't know if Maudsley would work. We didn't know if it was, objectively speaking, the best choice. But anything was better than watching Kitty disappear, ounce by ounce, obscured by the creature who spoke with her voice and looked out through her eyes. Anything.
On Day 2 of refeeding Kitty, our younger daughter, Lulu (also a nickname), turned 10. We had cake, a dense, rich chocolate cake layered with raspberry filling — one of Kitty's favorites. Of course she refused it. I told her that if she didn't eat the cake, we'd go back to the hospital that night and she would get the tube. I hated saying this, but I hated the prospect of the hospital more. The tube felt like the worst thing that could happen to her, though of course it was not. Five minutes after Kitty was born, I fed her from my own body. Now the idea of forcing a tube down her throat, having a nurse insert a "bolus" every so often, seemed a grotesque perversion of every bit of love and sustenance I'd ever given her.
She sat in front of the cake, crying. She put down the fork, said her throat was closing, said that she was a horrible person, that she couldn't eat it, she just couldn't. We told her it was not a choice to starve. We told her she could do nothing until she ate — no TV, books, showers, phone, sleep. We told her we would sit at the table all night if we had to.
Still, I was astonished when she lifted the first tiny forkful of cake to her mouth. It took 45 minutes to eat the whole piece. After she'd scraped the last bit into her mouth, she laid her head on the table and sobbed, "That was scary, Mommy!"
At age 4, Kitty went for a pony ride and was seated on an enormous quarter horse. When the horse reared, she just held on. Afterward I asked if she'd been scared. "Not really," she said. "Can I go again?"
This was the child who was now terrified by a slice of chocolate cake.
That night, when I checked on her in bed, she mumbled, "Make it go away." I now knew what "it" was. It seemed as if she were possessed by a vicious demon she must appease or suffer the consequences. I pictured its leathery wings and yellow fangs inside her. Each crumb Kitty ate was an act of true bravery, defiance snatched from its curved talons. I've heard women joke, "I could use a little anorexia!" They have no idea.
This demon was described nowhere in the books I was frantically reading. It wasn't until I stumbled on a 1940s study led by Dr. Ancel Keys, a physiologist at the University of Minnesota, that I began to understand. During World War II, Keys recruited 36 physically and psychologically healthy men for a yearlong study on starvation. For the first three months they ate normally, while Keys's researchers recorded information about their personalities, eating patterns and behavior. For the next six months their rations were cut in half; most of the men lost about a quarter of their weight, putting them at about 75 percent of their former weight — about where Kitty was when she was hospitalized. The men spent the final three months being refed.
Keys and his colleagues published their study in 1950 as "The Biology of Human Starvation," and his findings are startlingly relevant to anorexia. Depression and irritability plagued all the volunteers, especially during refeeding. They cut their food into tiny pieces, drew meals out for hours. They became withdrawn and obsessional, antisocial and anxious. One volunteer deliberately chopped off three of his fingers during the recovery period. The demon, I thought.
"Starvation affects the whole organism," Keys wrote. Given what I'd seen of Kitty, that made sense to me. But I wondered why — if starvation triggers the cognitive, emotional and behavioral changes that are so uniform in anorexia — the Minnesota volunteers did not develop the intense fear of eating and gaining weight that characterizes the disease. And what about the millions of people around the world who are starving because they don't have enough food — why don't they develop anorexia?
Once more I turned to Le Grange, who explained that at the core of anorexia is the notion of starvation in the midst of plenty; starvation when food isn't available doesn't usually trigger the same response. As for the Minnesota volunteers, he said, they were males (most anorexics are female), and they were beyond adolescence, outside the developmental window when anorexia tends to strike. More important, the volunteers ate about half their caloric requirements for six months; most anorexics eat far less, over a much longer period of time. "We're talking about a 14-year-old who is profoundly starved for 12 months," he said. "These guys were semistarved for a relatively brief period." It's not just the weight; it's the pattern of behavioral reinforcement. Each time an anorexic restricts what she's consuming, the anorexic thoughts ("I'm so fat, I'm such a pig") and behaviors (constant exercising, for example) are strengthened. Which is why it takes not just weight gain but the experience of eating meal after meal after meal to truly cure the disease.
Of course this brings up the question: which comes first, physiological starvation or the mental and emotional changes of anorexia? "You or I would earn the Nobel Prize if we figured that out," Le Grange said. "It's a bit of both, probably, and the two impact each other. So if you are constitutionally slender and it's easy for you to diet, and you like ballet, and you live in the United States, and you're 13, and your personality is perfectionist, your chance of developing this illness is very, very high."
Switch gymnastics for ballet, and Le Grange had just described Kitty. I used to hope she'd get a B in school so she'd see that the world didn't come to an end. Clearly, she wasn't going to be O.K. in a week or a month or six months. We were embarking on a long journey, one that would change us all.
A week into refeeding, I'd become an expert in high-calorie cooking. I made macaroni and cheese with butter and whole milk, chicken breasts dredged in egg, rolled in bread crumbs, fried in butter. Carrot cake with cream-cheese icing. Thousand-calorie milkshakes and muffins. When a body is in a state of starvation, it isn't enough to simply eat a normal diet, Dr. Walter H. Kaye, director of the eating-disorders program at the University of California at San Diego, explained to me. The body requires huge numbers of calories to gain weight and maintain it. Every few days we added 300 calories; by Day 9, Kitty was eating 2,100 calories a day. Still, she'd lost another half pound, which panicked me until the pediatrician explained that Kitty's metabolism, slowed by starvation, was now revving high. It's not unusual to lose weight at first, she said; just keep feeding her.
A heating pad helped with the stomachaches and bloating that followed each meal. But nothing helped with the thoughts and feelings. Faced with a plate of food, the demon inside my daughter bargained, cried, lashed out. Her anxiety was so great that there was no reward that could motivate her to eat. Her fear of the tube was what kept her eating in those first few weeks. I wondered what would happen when she'd gained a few pounds and the tube was no longer a possibility.
Meanwhile, the demon sat at our table and spewed venom: "I'm a lazy pig. You're trying to make me fat." And, one night, terrifyingly: "I just want to go to sleep and never wake up."
With that comment, Lulu looked up from her plate, her face full of anguish, and bolted from the table. I found her in the basement. "I don't want to go to my sister's funeral!" she cried. "Neither do I," I told her.
Later that night, when Kitty and Lulu were asleep, I stood in the middle of the kitchen and thought of how our lives had shrunk to the confines of these four walls. The counter and sink were piled high with dirty plates, ice cream tubs, glasses and pans. Between shopping, cooking, eating with Kitty, spending time with Lulu and going to work, my husband and I had no time for cleaning, much less anything else. Suddenly I was filled with fury. I grabbed a dish and smashed it on the linoleum, where it broke into half a dozen pieces. I broke another, and another, and another. There were so many things I couldn't fix or make right, so many feelings I couldn't handle. I swept the pieces into a bag and carried them outside. Tomorrow we would eat off paper plates.
Three weeks into refeeding, Kitty was consuming 3,000 calories a day; she'd gained about eight pounds. My husband or I would sit with her while she ate three meals and two snacks each day; we needed to know she was eating, and she needed us to compel her to eat, to get past the demon's grip on her. One of us brought her to work, as we had when she was an infant. In many ways this process felt like reparenting as well as refeeding, taking her back to a time when she was totally dependent on us.
Some parents don't want to or can't go backward like this. Some don't have flexible work schedules and can't be home for every meal and snack. Some are overwhelmed by the relentless and exhausting work of refeeding. For any of these parents, Maudsley may be impossible. It works best when two parents are involved — so they can take turns losing it, offstage — and when those parents agree that their top priority is refeeding. I heard stories from other families about anorexics who slipped meals into the trash when softhearted Dad was in charge, or about weight-conscious mothers who couldn't bring themselves to serve their daughters that much food. When we started refeeding Kitty, my husband had never thought much about nutrition, and the idea took some getting used to. By late August, though, he could tell you how many calories were in a pat of butter, a chicken breast, a glass of milk. And he was often far more patient with Kitty than I.
During that first month, Kitty smiled once or twice, which made us feel hopeful for the first time since the spring. We watched movies together and took walks around the block — the only exercise she was allowed. We had moments that seemed almost normal.
But the night before she was set to start high school, four weeks in, the demon re-emerged. This time it was far worse than anything we'd experienced, maybe because Kitty was stronger now. At the dinner table, she put her matchstick arms around herself and shouted, "I don't want to go to high school and have everyone say, 'Look at Kitty, look how fat she got over the summer!' "
She refused to eat anything. We cajoled and begged and threatened. She wept and flailed and lashed out. I left messages for the psychiatrist, the therapist, the pediatrician. I told her we'd have to go back to the hospital, though I suspected she now weighed too much to be admitted. Finally I reached a psychiatrist on call, who suggested that we give her a tranquilizer and put her to bed. "If she won't eat in the morning, bring her in," advised the psychiatrist. I was relieved, and also terrified: What if this was the start of a new downward slide?
But the next morning she ate breakfast as usual. After school, she came home with a couple of friends she hadn't seen since spring. As I made milkshakes for all of them, I was surprised to hear Kitty say jokingly, "We know all the ice creams with the most calories!"
One friend said, "We want to know which ones have the least!"
"Yeah," chimed in another, "because my butt is huge!" Another girl said, "I hate my thighs!" There was a chorus of agreement.
I offered, "You girls are beautiful and healthy and strong." But I felt incredibly sad. Even face to face with the devastating effects of this disease, they were criticizing their bodies.
I've heard the arguments that media depictions of unrealistic female bodies are what drive girls to starve themselves — the Kate Moss syndrome. And it's tempting to see anorexia as a metaphor, the product of a cultural crisis. If this were true, though, millions of American girls and women would become anorexic instead of the roughly 1 to 3 percent who do. Clearly there are other factors involved.
My nightly Internet prowling turned up some interesting research by Kaye. While he suspects that social and cultural factors contribute to anorexia, he says that recent studies suggest genetics is the most significant factor in anorexia and bulimia. He has found chromosomal abnormalities in anorexics, as well as irregular levels of the neurotransmitters dopamine and serotonin. The National Institutes of Health is currently spending $10 million on a five-year study of the genetic links of the disease.
I grew up in a household where disordered eating was the norm. My aunt was bulimic; my mother enrolled us both in Weight Watchers when I was 15. She recorded her weight each morning on a chart and went on to become a Weight Watchers lecturer, delivering weekly pep talks to a roomful of people who were engaged in an ongoing war with their own bodies. You had to stay vigilant, lest your appetite betray you and the pounds creep back on. I'd tried to teach my daughters to enjoy good food and to love their bodies, but maybe I hadn't gotten over my dieting-obsessed childhood. Or maybe I'd passed along a genetic predisposition that triggered Kitty's illness. The deeper into refeeding we got, though, the less I worried about causes. We could figure that out later. The important thing was to get Kitty to eat and gain weight.
By October, we'd settled into a pattern. My husband, whose work schedule is flexible, ate lunch with Kitty most days; I covered that meal when he couldn't. Kitty gained another six pounds and, encouragingly, grew an inch. But she hadn't felt hungry since before the diagnosis. I worried that anorexia had permanently short-circuited her brain-body connection; how would she ever regulate her own eating?
The rough days were predictable only in the sense that they kept coming. One night she sat at the table, hands over her eyes, in front of a plate of salmon and squash. "I'm bad! I'm bad!" she said, sobbing. "I won't eat, I won't!"
Calmly I said, "Food is your medicine and you've got to take it." Long minutes ticked by. Eventually she said: "I want to eat, but I can't. If I eat now I'll be a total failure!" The anorexia talk spilled out of her, on and on and on. I wanted to wrap her in my arms and say, "Of course you don't have to eat, poor baby." But I couldn't give the disease an inch. If I did, the same thing would happen the next day and the next day. We had to sit there until she ate, no matter how long it took.
By the first week of November, Kitty was up to 90 pounds, 10 pounds short of her target weight. More important, her mood improved significantly. But later in the month, she developed an upset stomach, which isn't unusual during refeeding, and began refusing the daily milkshakes. She complained of dizziness, wanted to know what I would be serving her, then argued for something else. She raged at me and at herself. One afternoon she cried so much she "accidentally" threw up her lunch. Back in September she tried to make herself throw up a few times; about half of all anorexics do become bulimic. Luckily, Kitty was never able to do it. I hoped she hadn't learned how.
In December, Kitty gained and lost the same pound over and over. At the end of the month she was still, frustratingly, at 90 pounds, still deeply in the grip of the disease. We boosted her intake to 4,000 calories a day. In mid-January, finally, her weight went up four pounds. It was astonishing, how much food she needed, but not unusual. Anorexics become metabolically inefficient; their temperatures rise, and they tend to burn off calories rather than put on flesh. "That's one reason for the high rate of relapse," Dr. Kaye told me. It's hard to gain enough weight to truly recover, and even harder to maintain it.
As went the fall, so went the spring. The pounds came on, very slowly, and Kitty's spirits continued to lift. More and more, she hung out with boys. "They don't talk about how fat they are," she explained. And they didn't make her feel self-conscious about eating. "Yo, Kitty, you done with your 10,000-calorie milkshake yet?" one boy said one afternoon in March, and she actually giggled.
In April, Kitty grew another two inches, which meant that her target weight went up, too. I felt despair at the thought that it would take longer now for her to gain enough weight, longer for her to get well. I told myself that her health was more complex than a number on the scale, that she was recovering. But I couldn't forget the sunken, unspeakably sad look in her eyes that past summer and fall.
One day in May, she came home from school grinning. "Guess what?" she said. "Sue brought cake to school, and I ate a piece. Aren't you proud of me?" All year she'd avoided parties, potlucks, lunches — any get-together that involved food. The fact that she ate a piece of cake, one of her "scary" foods, meant that she gave up, for a moment, being the anorexic at the back of the room. She became one of the group.
But the next day I made her and Lulu bagels with melted cheese, and Kitty complained, "You know I don't like sesame bagels." "You used to," I said. I knew what was behind this, and I wanted her to say it. I wanted it out in the open.
"They have more calories than plain bagels!" she burst out. But she calmed down quickly. "I know that was an eating-disordered thing to say. I couldn't help it," she said quietly, and ate the bagel.
I felt proud of her ability to name the demon and defy it. I wished we could just yank it out of her, unhook the claws that tormented her body and mind.
On a morning this past June, she called me at work to say the two most beautiful words in the English language: "I'm hungry!"
"I'm so happy!" I blurted out.
"I'm happy, too, Mom," she said.
She reached her target weight a few weeks later, and maintained it through the summer and early fall. Maudsley therapists say that true recovery entails weight restoration and functioning well psychologically and socially. Kitty would continue to see a therapist from time to time, to work on perfectionism and other issues. For now, though, she seemed happy and whole. The phone rang for her; friends trooped through the house. Life seemed normal again.
But one night recently I dreamed I was running through a strange house, looking for my daughter. I found her — though it wasn't really her — and grabbed her by the arm. "You didn't eat dinner last night, did you?" I shouted. "What did you have for breakfast?" Not-Kitty smiled. "A teaspoon of air," she said sweetly. I woke with my heart pounding, full of rage and hatred for Not-Kitty, the demon who lived on air, who wore my daughter's face and spoke with her voice.
The Maudsley approach advocates separating the disease from the sufferer, the anorexia from the adolescent. And this helps, especially on the worst days. But it's also true that the demon is part of our family now, lurking in the shadows. We will never forget it. We don't know if or when it will re-emerge — two months from now, two years, five years. We don't know if we've done the right thing. Is Kitty cured? Will she ever be cured? There are so many questions I can't answer.
That morning, I got out of bed in the gray predawn and went down to the kitchen. I pulled out eggs, milk, butter and raspberry jam and set to work making crepes. It was all I could think of to do.