
endeavors Spring 2008

Research and Creative Activity  •  The University of North Carolina at Chapel Hill

We’ve all heard about the latest, greatest, whiz-bang discoveries: The universe is accelerating. Stem cells may save us. T. Rex was related to chickens. Our planet is getting warmer.

Fine and dandy.

There’s just one thing bugging us:

What Don’t We Know? page 9

Victor Kozin

Very often in these pages, we probably leave the impression that the value of research lies in what we have already learned. We describe a study, summarize its findings, and speculate about their meaning and utility. We don’t talk very much about what really gets us going—the scary-vast unknown.

A few years ago, I interviewed Christian de Duve for a program at the Morehead Planetarium. He had won the Nobel Prize for his work in cell biology, and we were talking about the origins of life. At the end of the interview, I asked him how we should go about attracting young people to science. Here is what he said:

The challenges that face young people are the same as the challenges that have been facing scientists ever since science began. It’s the attraction of the unknown, the need to understand what is not understood. It would be a tremendous mistake to assume that we understand everything today. We just understand a few things, and it gives us the feeling, “Oh yes, we’ve solved the problem,” but no. There are huge problems waiting to be solved, many important discoveries waiting to be made. We should tell young people, “Go into science because the unknown is waiting for you, and what is not known and not understood today is bound to be more exciting and more interesting than what we understand already.”

This reminded me of some advice I’d read from the poet and critic Randall Jarrell. I cannot quote him verbatim, but the gist of what he said was this: Go out to the edge of your understanding, and write about what you don’t know about what you do know.

Yes, scientists and poets have to know things. They master a great many facts and equations, methods and mechanics. This is their steep, rocky trail to the summit, the edge, and the limits of what is known. They study hard and learn because they want to go beyond.

So if you want to turn your children on to science or poetry or any other risky, exciting, obsessive endeavor, reveal to them how little we adults really know. Tell your children that the great unknown is waiting. And there will always, always be enough to go around.
—The Editor

endeavors Spring 2008 • Volume XXIV, Number 3 Endeavors engages its readers in the intellectual life of the University of North Carolina at Chapel Hill by conveying the excitement of creativity, discovery, and the rigors and risks of the quest for new knowledge. Endeavors (ISSN 1933-4338) is published three times a year by the Office of the Vice Chancellor for Research and Economic Development at the University of North Carolina at Chapel Hill.

Send comments, requests for permission to reprint material, and requests for extra copies to: Endeavors Office of Information and Communications CB 4106, 307 Bynum Hall University of North Carolina at Chapel Hill Chapel Hill, NC 27599-4106 phone: (919) 962-6136 e-mail:

James Moeser, Chancellor Bernadette Gray-Little, Provost and Executive Vice Chancellor Tony Waldrop, Vice Chancellor, Research and Economic Development

contents Spring 2008

2 overview

TV therapy for schizophrenia, preemies benefit from Epsom salts, weight-wise women, housing the homeless, a suspect gene for faulty sperm, everyone should count, brain size and autism, thinking before drinking, and rerouting blood.

cover story

9 What Don’t We Know?

Dark matter, holes in our genes, climatic calamity, population overload, and other questions that keep us up at night. by Jason Smith

features

17 The Wizardry of Green

To confound their enemies, plants conjure all kinds of tricks. by Jessica McCann

20 Murder in Moscow

A guy walks into a bar and finds his wife with another man... by Mark Derewicz

22 A Bit of Salt, a Trace of Life

What Jack Griffith unearthed near Roswell, New Mexico. by Mark Derewicz

26 What Opened this Gigantic Hole?

Laura Mersini-Houghton predicted a giant void in the sky, and NASA found it. The sign of a parallel universe? by Prashant Nair

28 The Wide, Tangled Net

Slavers prey on Ghana’s children. by Mark Derewicz

30 Virtual Danger, Safer Places

Computer simulations can help people survive war and disaster. by Margarite Nathe

34 A Fight to the Death

Making cancer kill itself. by Prashant Nair

37 Forever Hung Over

The brain, long after the drinking stops. by Mark Derewicz

38 A Girl’s Life and a Glass Half-Full

Sugar and spice, but not everything’s nice. by Margarite Nathe

40 Where G-force and Gray Matter Meet

Heads up: bad tackles are bad news. by Mark Derewicz

42 Living in a State of Thirst

We need more than just rain to prevent future water shortages in North Carolina. by Sheila Read

47 in print

Shooting the Barefoot Farmer.

49 endview

Going to the Galapagos.

Editor: Neil Caudle, Associate Vice Chancellor, Research and Economic Development Associate Editor: Jason Smith Writers: Kelly Chi, Mark Derewicz, Susan Hardy, Jessica McCann, Beth Mole, Prashant Nair, Margarite Nathe, Deborah Neffa, Sheila Read, and Meagen Voss Design: Neil Caudle and Jason Smith Print production, online design: Jason Smith

©2008 by the University of North Carolina at Chapel Hill in the United States. All rights reserved. No part of this publication may be reproduced without the consent of the University of North Carolina at Chapel Hill. Use of trade names implies no endorsement by UNC-Chapel Hill.

overview

People skills can break a vicious cycle

Portraying schizophrenia: The painting at right is by Ramell Moore, who studied abstract visual art at Carolina. “This painting shows a woman who was blinded by gases released at her workplace,” Moore said. “It represents the danger people are exposed to working in hazardous conditions for low wages. It also conveys the onset of psychosis.”


If you e-mail some friends and they don’t respond immediately, do you assume they are very angry with you, or that there’s a conspiracy of people who have decided not to pay attention to you? Do you withdraw or lash out? You might, if you have schizophrenia.

Social problems such as these matter because they can interfere with recovery, says David Penn, professor of psychology. Penn says that people who have good relationships with family and friends have less trouble living with schizophrenia. Most public attention on treatment of schizophrenia focuses on antipsychotic medications and the problems with cost, side effects, and many patients’ refusals to keep taking them. Even though medications do lessen some of schizophrenia’s worst symptoms, there’s little evidence that meds help people with the disease to develop better relationships, Penn says.

At John Umstead Hospital in Butner, North Carolina, graduate student David Roberts had been struggling to figure out how to engage people with schizophrenia in treatment groups that taught basic social skills. There were problems. It is notoriously difficult to capture the attention of someone who is hallucinating, as many patients are when they arrive at psychiatric hospitals. And group leaders found that the standard curriculum on social skills was falling flat. That curriculum assumed people had profound deficits in social skills, Roberts says.

One day when Roberts was watching the comedy Friends with patients, something clicked. He noticed that they were laughing at the same scenes he was. Maybe, he thought, the patients understood more about social situations than clinicians had thought. After trial and error, Roberts and Penn hit on a creative prescription for the treatment groups: Turn on the HBO hit Curb Your Enthusiasm. Watch Larry David bumble his way through embarrassing social encounters. Laugh. Discuss.

“A lot of the episodes show people getting social cues wrong,” Penn says. “These are problems all of us can relate to. All of us jump to conclusions. All of us at one point or another will misread a social cue.” But schizophrenia often magnifies small miscommunications. People with schizophrenia tend to have difficulty identifying other people’s emotions and reading between the lines to infer what someone means, Penn says. Schizophrenia has a wide range of symptoms and severity, but it is characterized by profound alteration of thinking, feeling, and ability to communicate. At its worst, it may involve symptoms such as auditory and visual hallucinations, delusions, and disorganized behavior.

A vicious circle often occurs in which someone with schizophrenia withdraws from an awkward situation, Roberts says. Then others respond by avoiding them. “They think, ‘That’s a weird person,’ and they don’t engage with him or her anymore,” Roberts says. That leads to fewer social opportunities for the person with schizophrenia, and social skills deteriorate further.

Penn and Roberts have high hopes for the Social Cognition and Interaction Training program. They initially showed episodes of Curb Your Enthusiasm and Monk to spark discussion about sticky social situations, but Penn and Roberts eventually produced their own videos, using their own scripts and UNC actors.

The six-month program aims to improve the ability of people with schizophrenia to identify others’ emotions, develop reasonable explanations for other people’s behavior, and—most importantly—apply these skills to their lives. Roberts and his colleagues have completed a clinical trial of about forty outpatients at the Schizophrenia Treatment and Evaluation Program at UNC Hospitals and Caramore, a structured residential program in Carrboro, North Carolina, for adults with mental illnesses. After participants completed the training program, independent raters judged how well the participants handled situations such as making a request of a landlord or meeting a new neighbor. Roberts says the researchers did see a significant difference in the people who received the training compared to those who received standard treatment.

After an article on the program appeared in The New Yorker in October 2007, clinicians and community organizations from around the world expressed interest in it, Penn says. —Sheila Read

Sheila Read is a master’s student in journalism at Carolina. David Penn is a professor of psychology and associate director of clinical training at Carolina. David Roberts is a graduate student in psychology at Carolina.

Epsom salts help tiny babies


In the Old Clinic, just past a row of archaic, closet-sized delivery rooms marked with stenciled numbers, John Thorp and his colleagues showed that common, old-fashioned Epsom salts can drastically reduce the risk of cerebral palsy in premature babies.

“I think one of the ironies and maybe one of the tragedies of modernity—and all the biological knowledge that has been accumulated—is that nobody really understands what causes premature birth,” Thorp says. The rate of premature birth has been rising since the 1980s.

One in ten premature babies will develop a permanent disability. The more premature the baby, the higher the risk of disability. One of the most common is cerebral palsy, which is caused by damage to a motor control center in the brain and can lead to a range of physical disabilities. Cerebral palsy has huge societal costs too, Thorp says. The lifetime cost per patient runs into the millions of dollars, and malpractice suits associated with cerebral palsy can result in fifteen-million-dollar awards. Each year ten thousand babies develop the condition.

Thorp and his colleagues showed that treating women entering preterm labor with intravenous magnesium sulfate—Epsom salts—reduced the risk of cerebral palsy by 50 percent. The researchers found no harmful side effects in the 2,241 births in the study. Because of the safety, accessibility, and low cost of Epsom salts—just $3 at the local pharmacy—Thorp hopes the practice will be widely adopted. “But,” he adds, “it would be better to not be born prematurely.”

Thorp hopes that the medical community will increase research in women’s health and invest in preventing preterm births. “Historically, people don’t spend a lot of money on women’s health,” he says. “But we were all once in-utero and all had to get ex-utero—at the right time and healthy.” —Beth Mole

Beth Mole is a doctoral student in the Department of Microbiology and Immunology in the School of Medicine at Carolina. John Thorp is the Hugh McAllister Distinguished Professor of Obstetrics and Gynecology and division director of Women’s Primary Healthcare in the Department of Obstetrics and Gynecology. The National Institute of Child Health and Human Development and the National Institute of Neurological Disorders and Stroke provided funding for his study.

New use for an old-fashioned remedy: Epsom salts, which are magnesium sulfate, cost about $3 at the local pharmacy and can help reduce the risk of cerebral palsy in premature babies.


Weight loss that works

Carmen Samuel-Hodge (below) helps busy women drop the pounds. Jason Smith

Anyone who has ever tried to lose weight knows how difficult, time-consuming, and expensive it can be. But a UNC study shows that weight-loss programs can be fun, cheap, and effective. For more than two years the university’s Center for Health Promotion and Disease Prevention studied weight loss in 143 low-income and mid-life women at a community health center in Wilmington, North Carolina. Fifty-four percent of the women in the group lost an average of 9.5 pounds or more during the 16-week weight-loss phase of the study. “That is a really good outcome,” says Carmen Samuel-Hodge, who led the study. “That’s over four percent of their body weight.” The average participant’s weight at the beginning of the study was 201 pounds.

In the study, which was called Weight-Wise, researchers worked with women to find affordable and practical ways to improve eating habits, increase daily physical activity, and manage stress and busy schedules. The researchers gave the participants low-calorie, healthful recipes that incorporated affordable ingredients and seven daily servings of fruits and vegetables. Researchers also gave the women videotapes of indoor walking exercises and encouraged the women to incorporate more movement and exercise into their daily routines. After the weight-loss phase came a year during which the participants worked to maintain their new, lower weight.

Last fall Samuel-Hodge received another three-year grant from the Centers for Disease Control and Prevention to take the study a step further. Researchers will expand Weight-Wise to include 240 women in 6 North Carolina health departments, and then measure how much weight the women lose in a five-month period. Even a modest eight-pound weight loss can lead to healthier hearts, improved blood pressure, and lower risk of heart disease, Samuel-Hodge says. She believes the program worked because the sessions fit busy women’s schedules. “The women liked it because they could come at times that were good for them,” she says, “and they were in an atmosphere that was conducive to learning how to change their behaviors.” —Deborah Neffa

Deborah Neffa is a junior majoring in journalism at Carolina. Carmen Samuel-Hodge is a research assistant professor in the Department of Nutrition in the School of Public Health. The Weight-Wise study was funded by the U.S. Centers for Disease Control and Prevention and the North Carolina Department of Health and Human Services.

Having no home is costly, too

Many homeless people go through a series of revolving doors as they navigate some combination of hospitals, shelters, jails, prisons, and the streets. They can’t afford housing or are denied it because of their backgrounds. The cycle continues, and the costs to homeless people and communities accumulate. Researchers in the School of Social Work have found that giving homeless people small apartments and charging rent based on income resulted in cost savings for the community. The researchers interviewed twenty-one residents of Lennox Chase, the first apartment complex built in Wake County to provide permanent, supportive housing for homeless people.

A third of residents the researchers interviewed had been in residential drug- or alcohol-treatment centers. Resident William Dickens has lived at Lennox Chase since November of 2003. “When I was on drugs and alcohol, I was spending no quality time with my daughter like a father should be,” he remembers. He made promises to his family he couldn’t keep, and after getting clean at The Healing Place recovery center in Raleigh, one of his first goals was to rebuild his family ties. Dickens remembers being on shaky ground then, trying to find a job, manage his money, and find a place to stay. A friend from The Healing Place told him about newly built Lennox Chase, which has a social worker and three dozen 500-square-foot studio apartments.

The total cost of services for residents in the two years before entry into Lennox Chase was $377,142. In the two years after, it was $265,785. The cost for incarceration went from $3,500 to zero. Inpatient substance abuse treatment—$128,000 to zero. Not all costs fell. Medical treatment costs went from about $111,000 to $202,000. Two people with long-term health concerns, which included chronic obstructive pulmonary disorder and asthma, accounted for most of the increase.

“These initial findings are very hopeful,” says Dean Duncan, who led the study and is leading similar studies of developments in Asheville, Greensboro, and Durham. “I think that the next report we generate will show that it’s much more cost-effective to move directly off the street into supportive housing.” Duncan says the chronically homeless population ranges from individuals who sleep on the streets to families whose parents are working but can’t afford housing, to multiple families or individuals staying in overcrowded homes. The N.C. Interagency Council for Coordinating Homeless Programs estimates that on the night of January 24, 2007, nearly 11,000 people in North Carolina were without homes. In 2005 the state’s Department of Health and Human Services drafted a ten-year plan to end homelessness, and Orange County is one of many communities drafting its own version. The plans include the goal of introducing more permanent supportive housing.

It’s often difficult to identify service needs for homeless people who are not in permanent housing, says Adam Walsh, a School of Social Work graduate student who coauthored the report. Even Lennox Chase residents, Walsh found, didn’t know certain services were available. “Once they were in a stable place like Lennox Chase, they could talk about what they needed and coordinate services. That was really important,” he says.

Some benefits cannot be measured in dollars. “I think there was a transformative effect of getting your own place and of being able to open the door and shut the door,” Walsh says. The residents regularly check up on one another and greet new residents, he says. Dickens, who has been clean since 2002 and now works at Fords Produce in Raleigh, says it’s difficult to express his gratitude for having the housing and stability to get on his feet and rebuild his relationship with his daughter. He remembers her first overnight visit to his apartment. “I got up in the middle of the night and sat in one of my chairs to watch her sleep,” he says. “It brought tears to my eyes. There’s nothing like that feeling.” —Kelly Chi

Dean Duncan is a research associate professor and Adam Walsh is a fourth-year doctoral student in the School of Social Work. They received funding from the North Carolina Interagency Council for Coordinating Homeless Programs.

William Dickens in his apartment at Lennox Chase. Jason Smith

A faulty gene for defective sperm?

One faulty gene may cause infertility in some men, says Carolina researcher Yi Zhang. When Zhang deleted a gene in laboratory mice that helps package DNA into the heads of maturing sperm, he found that the mice produced immature sperm with little chance of successfully penetrating an egg.

Sperm cells’ multistage maturation process turns a tiny blob of genetic material into a sex cell with a whip-like tail that propels it forward on its path to the egg and a compact head that penetrates the egg. Genes in the immature sperm head produce proteins that help the sperm mature. Proteins in the sperm head called histones hug the DNA and wrap it around themselves. When the time is right, the enzyme JHDM2A removes methyl groups from the histones. Then the histones release the DNA, allowing sperm to make another set of proteins, called protamines, which package the DNA in the sperm head tightly enough for penetration, Zhang says. Because the histones mask the genes until their methyl groups are removed, scientists call this process “silencing by methylation.”

Zhang created mice that lacked the enzyme JHDM2A. He found that the mutant mice had smaller testes than normal mice. “We found almost no mature sperm in the testes of the mutants,” Zhang says. Although the mutation didn’t prevent the mice from having sex, they were infertile. Normal mouse sperm have hook-like heads. The sperm of the mutated mice had round heads and were mostly immotile. Deleting the JHDM2A enzyme affected packaging in the sperm head because the sperm could not produce protamines, Zhang adds.

Many other gene defects cause infertility in mice, but most of those genes have turned out to be normal in infertile men. Zhang says JHDM2A is another potential candidate, and infertility may be caused by a single-gene defect in at least some men. Haifan Lin, director of the Yale Stem Cell Center, says Zhang’s work “represents a landmark contribution to reproductive biology.” Lin adds that although scientists have long known the role of protamines in sperm maturation, Zhang identified the enzyme that instructs genes to produce protamines. Zhang is now collaborating with infertility researchers to study the enzyme in humans. “I’m almost certain that if there’s a mutation in this gene, it will cause male infertility,” he says. —Prashant Nair

Prashant Nair is a master’s student in medical journalism at Carolina. Yi Zhang is a professor in the Department of Biochemistry and Biophysics, and a member of UNC’s Lineberger Comprehensive Cancer Center. His study was published in the November 1, 2007 issue of Nature.

Microscopic images of sperm development show the presence of the JHDM2A protein (stained brown). The protein appears in the nucleus of round spermatids. No JHDM2A is detected in spermatozoa (last image on right). Yuki Okada

Dan Williams

People count


n the United States we take for granted that government agencies will record the vital events in our lives: births, marriages, deaths. But throughout the developing world, systematic records of vital events are still incomplete or almost nonexistent. Epidemiologist Philip Setel and his colleagues call this failure to register vital events a “scandal of invisibility.” An unregistered person can be denied access to government services and basic legal rights. For example, a woman who doesn’t have documentation proving her marriage may not be able to inherit her husband’s property after he dies. Worse, Setel says, we don’t know what we don’t know about uncounted people’s lives. “If they are forced into child labor or service as child soldiers, if they’re lost in a natural disaster or subject to human trafficking— who’s going to know? No one. No one who will be in a position to help them.” Setel and an international group of scholars wrote about the problem in a series of papers published in Lancet. The papers emphasize one of the crucial elements missing from developing countries’ records: death records stating a cause of death. Twenty-five percent of people worldwide live in countries that don’t report cause-of-

An interviewer with Mozambique’s National Statistics Institute conducts a verbal autopsy.

death data to the World Health Organization (WHO). Another fifty percent live in countries that report data WHO classifies as low-quality or limited-use. Without good data, it’s hard to effectively fight world health crises. “The tendency right now is toward focused perspectives on different diseases,” Setel says. “We have the Global Fund to Fight AIDS, TB, and Malaria. We have the President’s Emergency Plan for AIDS Relief.” The different groups have a shared need for cause-of-death data, but so far there’s been little collaboration among aid organizations and governments to start registering deaths. One of the exceptions is a method Setel

Bigger brains can mean big problems


t six months, the babies were smiling and laughing, even with strangers. At twelve months—nothing. “It was like someone pulled down a curtain,” says psychiatrist Joseph Piven. “The babies were not interested in social interaction. There was no back-and-forth, no eye contact.” When Piven saw a video of these children, who were later diagnosed with autism, he also saw a possible shift in his field. What if people aren’t born with autism? What if babies develop the disorder during the second half of their first year of life? Piven then considered another fact: autistic kids have brains that are 5–10 percent 6 endeavors

larger than normal. His team studied fifty autistic two-year-olds and found that the toddlers’ heads were indeed larger than normal. Then Piven studied the head circumferences of 113 autistic kids from birth to three years old. Head circumference is a good indicator of brain volume in young children. During the first year of life, the babies who eventually developed autism had normal-sized heads. But beginning at twelve months, the heads of autistic kids grew at an accelerated rate. “Presumably this brain overgrowth originated just prior to twelve months, or at the end of the first year of life,” Piven says. So what’s going on?

and his colleagues have been working on called Sample Vital Registration with Verbal Autopsy (SAVVY). The method has interviewers use a set of WHO standard forms to talk to people from a representative sample about deaths in their households. Doctors then assign causes of death using the interview data. Last fall Mozambique’s National Statistics Institute started using elements of SAVVY in conjunction with its 2007 national census. The institute will release preliminary results of the survey in summer 2008. Setel hopes Mozambique’s example will show that methods exist to help countries with very poor records to start registering vital events. “The regions that have no functional registration systems can record events for at least a representative sample of the population,” he says. “Over time, these sample systems can expand. The technological and administrative tools are at hand to do this.” The missing ingredients, he says, are money and political will. —Susan Hardy Philip Setel is Deputy Director of MEASURE Evaluation, a thirty-million-dollar program funded by the U.S. Agency for International Development. Much of the data for the “Who Counts?” Lancet papers came from the U.S. Census Bureau and WHO’s Health Metrics Network.

No one knows, but there’s lots of speculation. As a baby ages, everyday experiences help remodel any synaptic connections in the brain that aren’t necessary anymore. This remodeling helps shape the mind. But in autistic babies, remodeling might not happen correctly, and that could lead to the overgrowth of neural connections and a larger-than-normal cerebral cortex. All this could bog down the brain’s communication network, causing autistic behavior. Piven cautions that this theory oversimplifies complex brain functions and development. Although faulty remodeling might account for larger brains and cause autistic behavior, it might not. The problem could be in some other process in development. Piven is now leading a six-university

Learning to think beyond the drink


fter arguing with his ex-girlfriend, Michael was so distraught that he got in his car and headed to the liquor store. As he drove, he remembered what his therapist had said: it was exactly this reaction to stress that had led to ruined relationships and nights in jail. Michael took a deep breath, turned the car around, and went home. Did Michael muster some sort of mysterious willpower? Maybe. But behavioral neuroscientist Charlotte Boettiger says Michael may have trained his brain to think longer and harder about longterm consequences—something alcoholics struggle to do, she says. “We know how valuable this sort of cognitive behavioral therapy is,” Boettiger says. “But we don’t understand how the brain tends us to make short-sighted decisions or decisions that are good for the long term.” She designed a study to try to find out. She asked sober alcoholics (people who are addicted but no longer drink) and people with no history of alcohol abuse to choose between immediate rewards and larger rewards that would come later on. For instance, in one trial participants had to decide between taking seventy-five dollars right away or one hundred dollars in two weeks. Boettiger controlled for factors such as economic status and I.Q.

research project to study brain development in children at risk for autism: kids whose older siblings have autism. Using magnetic resonance imaging, researchers will scan the brains of children at six months, twelve months, and twenty-four months to show exactly how the brains of autistic babies and toddlers change over time. “Right now we’re starting to measure particular fiber tracts in the brain, the different neural circuitry,” he says. “We want to see what parts of the brain are associated with what sorts of autistic behavior.” —Mark Derewicz Joseph Piven, the Sarah Graham Kenan Professor of Pediatrics and Psychiatry in the School of Medicine, is director of the Autism Centers of Excellence, a ten-million-dollar program funded by the National Institutes of Health.

While the participants were making decisions, Boettiger used functional magnetic resonance imaging to measure activity in different parts of their brains. She found that sober alcoholics chose the immediate reward much more often than did people with no history of alcohol abuse, and that four parts of the brain in sober alcoholics apparently react to decision-making differently. For instance, the orbitofrontal cortex, which is directly above the eyes, was less active in sober alcoholics. “This was most exciting for me because of all the areas in the brain, it’s the one most commonly associated with addiction and related disorders such as obsessive compulsive disorder,” Boettiger says. “And clinicians have already linked damage to the orbitofrontal cortex to personal management problems— handling money and relationships, and acting on long-term consequences.” She says the orbitofrontal cortex may be

involved in contemplating abstract consequences to make them more concrete. So sober alcoholics who showed less orbitofrontal cortex activity in Boettiger’s test may have had trouble forming mental representations of important but abstract long-term consequences—for example, the abstract benefit that may come from having an extra twenty-five dollars in two weeks. Boettiger says that this may be one reason why cognitive interventions can work; the therapy helps addicts visualize what will happen after they take that first drink. In Michael’s case, he could turn his car around because the long-term consequences weren’t so abstract—and driving to the liquor store didn’t seem like a good idea anymore. —Mark Derewicz Charlotte Boettiger is an assistant professor of psychology in the College of Arts and Sciences. She received funding from the U.S. Department of Defense to study cognition and alcohol addiction.

Charlotte Boettiger

Tracking choices in the brain: When a person chose to wait for a delayed reward, the orbitofrontal cortex, indicated by an orange-and-red splotch, was more active (1). When a person chose the immediate reward, the parahippocampal gyrus (2), the posterior parietal cortex (3), or the dorsal prefrontal cortex (4) lit up.

endeavors 7

Better blood flow


Blood vessels are the highways of the human body, and blood cells are like cars streaming to different destinations. When vessels are blocked, the circulatory system builds alternate routes so blood-cell traffic can flow smoothly until the blockage is cleared. These new blood vessels are formed through a process called angiogenesis. Mohamed Zayed, a medical student researcher in the lab of biochemist Leslie Parise, has discovered a new function for a protein involved in angiogenesis. “I had a personal interest in vascular biology,” Zayed says. “During my first semester of undergrad my father had a heart attack.” Scientists in the Parise lab were already investigating the protein, called CIB1, as a potential factor in heart disease.

The researchers had generated a mouse that lacked CIB1. Zayed thought CIB1 might have functions beyond heart disease, and so he began to study the process of angiogenesis in the mutant mice. With help from collaborators Mary Elizabeth Hartnett and Jim Faber, Zayed determined that blood vessels could not form correctly without CIB1. Parise explains that CIB1 might be a useful target for treating diseases in which discouraging or encouraging blood vessel growth would benefit patients. For example, tumors generate blood vessels to sustain themselves; removing CIB1 might starve the tumors. Stopping blood vessel formation could also help individuals with diabetic retinopathy, an eye condition that results from malformation of blood vessels and eventually leads to blindness. On the other hand, promoting blood vessel formation could aid people

at high risk for heart attacks by providing alternate paths for blood, in case an artery to the heart becomes blocked. Now scientists in Parise’s lab are searching for other proteins that interact with CIB1. The researchers hope that this study will lead to a treatment that can regulate angiogenesis, giving doctors an additional tool to sidestep traffic jams and maintain blood flow in their patients. —Meagen Voss

Meagen Voss is a doctoral student in the Interdisciplinary Program in Biomedical Science at the School of Medicine at Carolina. Mohamed Zayed is a graduate student in the School of Medicine. Leslie Parise is a professor in and chair of the Department of Biochemistry and Biophysics. Zayed’s research was funded by the National Institutes of Health, and his study was published in the November 26, 2007, issue of Circulation Research.

Mohamed Zayed

Rerouting the flow of blood: Leslie Parise’s lab studies angiogenesis, the formation of new blood vessels. One technique is to restrict blood flow in the leg of a mouse by ligation. The drawing at left represents the leg of a normal mouse. At right, the ligation (black area indicating a blockage of the artery) cuts off blood flow, causing new vessels to form and supply blood to the muscles (arrow).

Mohamed Zayed/Circulation Research

Mohamed Zayed/Circulation Research

These microscopic images show blood vessels stained red. Left: When blood flow to muscles is restricted in a normal mouse, new vessels form to supply the tissue with blood. Right: A mouse lacking the protein CIB1 shows little or no new formation of blood vessels.


Before we get started,

I’d like to thank Margarite, who’s sitting right over there, for helping me pull all this together. It’s not easy to get so many UNC faculty in one place at the same time. How many did you say we have here today, Margarite? Fifty-three? Cool. And I know you’re all busy, too, so thanks for showing up. Any questions before we start? Yes, Professor Rial? Coffee? In the corner right back there. Mind the cord. Others? Okay.

So. A while back, I asked all of you to help me suss out the biggest, most mysterious, and most challenging puzzles of knowledge that we face today. I wanted you to tell me about the most interesting gaps in our understanding of the world. I wanted to hear about what we don’t know. You told me, all right. You gave me a list so long that we could never talk about everything on it here today. So I asked you to vote to narrow the list down to ten of the biggest, most interesting, unsolved science problems. And today I want to tell you about those ten.

Now obviously, you all know a lot more about this stuff than I do. So feel free to chime in and help me out. Everybody ready? Good. Let’s start with number ten.

10. How can we be so genetically similar to, yet so different from, chimpanzees?


This question was sent to me by Brian Hogan in the Department of Chemistry. He’s sitting right over there in the back row. Depending on which study you read, human DNA and chimpanzee DNA seem to be somewhere between 96 percent and 98.5 percent identical. Yet you and I don’t sleep in trees, or groom each other compulsively, or—yes, Brian? You want to explain it? Okay. Go for it.

“Thanks, Jason. Here’s what I’m getting at: what is it about that 1–2 percent difference that allows humans to mathematically determine the relative position of an electron in an atom, but makes chimps sling poop at passersby?”

Um, okay. Thanks, Brian. Now, correct me if I’m wrong: you’re not angling for a discussion on evolution versus creation science, right? You just want to know how the genes work. Right. Humans are strikingly similar to a bunch of different critters—genetically, at least. Sixty percent of human genes are fundamentally the same as fruit fly genes, and somewhere around ninety percent of our genes are the same as mouse genes. We don’t have tails, but we have the same genes that mice have for building them. (Humans almost never switch on our tail-building genes.) Mice have a certain gene they use to build eyes. Fruit flies are more than happy to use that mouse gene to build their own eyes. So our genes are flexible, and at least some of them are generalists. But small genetic variations can make for big differences between species, and even among members of the same species. All of us humans are around 99.9 percent genetically identical—that’s according to large-scale genome sequencing projects. So, genetically, you’re only about one-tenth of one percent different from Einstein and Shakespeare. Mom would be proud.

You’ve also got something called noncoding DNA. About six feet of DNA are wrapped up in each of your cells. We used to think that only about one inch of it actually did anything; the other five feet eleven inches seemed to be junk. Now scientists are starting to think that our one inch of working DNA, along with all the proteins it produces—and RNA, and some other cellular cousins that we don’t have time to get into today—are just our nuts and bolts, and that what we once called junk DNA actually holds our blueprints and the systems that control how we’re built.

So it’s starting to look like genes are a lot more complicated than we thought. A given gene may do a whole lot more than make one protein or control one process. Scientists once speculated that we’d need at least a hundred thousand genes to build a human. But we’ve only got about twenty-two thousand. Some worms have more than that. A lungfish has forty times more DNA than you or I have. In a given gene, different combinations of coding DNA can become active at different times to produce different proteins, and that helps explain how humans can be so complex with relatively few genes. But we still have no idea how our cells “know” which parts of a gene to pay attention to at any given time, and we really don’t know what determines which genes get switched on or off at specific times and in specific places.

Researchers have identified a few genes related to brain development and speech that look like they took one evolutionary road in humans and another in chimps. It wouldn’t be ethical to remove one of those genes in a human or chimpanzee to see what happens. The best we can do is to keep working on big-picture comparisons of human and ape genes. But our next question involves some science that might one day help us figure this chimp question out.
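Those percentages are easier to grasp with a little arithmetic. The sketch below assumes a human genome of roughly three billion base pairs (a standard figure, though not one quoted here) and takes the six-feet-of-DNA comparison literally:

```python
# Back-of-the-envelope arithmetic for the figures above.
# Assumption (not from the article): the human genome holds
# roughly 3 billion base pairs.
GENOME_BP = 3_000_000_000

# A 1-2 percent difference from chimpanzees still spans
# tens of millions of bases.
chimp_diff_low = GENOME_BP // 100   # 1 percent: 30 million bases
chimp_diff_high = GENOME_BP // 50   # 2 percent: 60 million bases

# Being 99.9 percent identical to one another still leaves about
# 3 million bases of variation between any two people.
human_diff = GENOME_BP // 1000      # 0.1 percent

# If one inch out of six feet (72 inches) of DNA "did anything,"
# that working fraction is only about 1.4 percent.
coding_percent = 100 / 72

print(f"{chimp_diff_low:,} to {chimp_diff_high:,} bases differ from a chimp")
print(f"{human_diff:,} bases vary between any two people")
print(f"{coding_percent:.1f} percent of DNA once seemed to matter")
```

So even the tiny-sounding one-tenth of one percent leaves millions of bases free to vary, which is part of why small genetic differences can matter so much.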


9. How does development evolve?


This one was sent to me by Bob Goldstein from over in the biology department—hey there, Bob; thanks for coming—and I’ll admit that at first, I didn’t get it. But Bob is patient, and the gist is this: all organisms—from butterflies to wombats to you and me—develop, which means that we begin life as, say, a fertilized egg cell, and end up as a fully formed insect, or an Australian marsupial, or the quarterback for the Green Bay Packers. And all organisms have also evolved over time: our forms have changed, often quite a bit, compared to those of our ancestors from billions of years ago.

Throw the study of development and evolution together and you get a relatively new branch of biology called evo-devo. I know it sounds like an 80s band, but bear with me. According to evo-devo, development helps steer evolution: if the history of life on earth is a long road, development has spent a lot of time in the driver’s seat.

Evo-devo research is starting to explain how evolution happens. It turns out that a complex, completely new structure—say, a butterfly’s wing—doesn’t sprout up just because a bunch of ancestral insects arbitrarily happened to have just the right genetic mutations at just the right time and place. In other words, it didn’t take new genes, or radical new developmental plans, to make wings where there were no wings before. Instead, those ancestral insects only had to slightly modify their developmental plans and tweak a few genes that they already had. Wings didn’t appear overnight, of course. But evo-devo is starting to show that new structures such as wings may have arisen from an organism’s existing genetic potential.

Another big point: we don’t need a whole gaggle of genes to make new stuff. For example, a single gene called BMP4 helps determine which parts of an embryo will become the back and belly. But BMP4 also determines whether a finch has a broad beak or a long one. It also specifies whether a cichlid fish is going to have a short, thick jaw for crushing mollusks, or a longer one for sucking algae. One gene, switched on at different times and places, may build flippers in whales and arms in football players.

Evo-devo is changing some thinking about major evolutionary steps, too. Take the transition from life in water to life on land. It didn’t necessarily happen because fish suddenly grew hands and feet.
It turns out that fish had the potential to grow those appendages long before any fish dragged itself onto dry land. Evo-devo suggests that it took both the right environment and key mutations to a few existing genes to make fish take that one giant step. So, Bob, have I done your question justice? I know evo-devo is still a fairly new field. Anything you want to add?

“Well, very few living biologists have been trained thoroughly in both of these fields. Some come from the evolution side: people who actually get to do some of their science in beautiful outdoor locations. Others come from the development side: nerds like me who stay in the lab working mostly at the minute level with molecules. So that’s a challenge. But it’s starting to change as interest in evo-devo grows.

“We don’t have a lot of general rules of evo-devo yet, but we have the tools to make the important discoveries. And the answers will fill a big gap in our understanding of how the incredible diversity of organisms we find outside came about.”

Thanks, Bob. What’s that, Dr. Rial? We’re out of coffee? Okay. We’re on it. Let’s move on to number eight.


8. What caused so many abrupt changes in the climate of the past? Are we experiencing something similar now?


Well, this is technically two questions, but you all voted for it, so I guess that’s okay. Jose Rial sent this to me. He’s in the geology department, and one of the things he studies is climate change as it relates to the history of the earth. Oh, and he’s waiting for coffee. Let me see if I can stumble through this one: up until about twenty years ago, we were sure that Earth’s climate changed gradually. It got pushed slowly one way—by, say, regular, predictable changes in Earth’s orbit—and then gradually pulled back the other way, in a nice, neat cycle. Sure, there might have been some unpredictable hiccups along the way, but the point is that the climate was supposed to react very slowly. But then paleoclimatologists started finding climate patterns, both in the ancient and the more recent past, that didn’t jibe with the conventional wisdom. They found a bunch of abrupt climate swings—big changes in average temperature, or puzzling patterns of storms, floods, and droughts, over a large part or parts of the world—that happened fast: sometimes within a decade. So fast

that the earth, and most things on it, had trouble adapting. We don’t know exactly why these abrupt changes happened, but we have a few suspects. The oceans store a lot of heat and move it around. Then there’s the atmosphere and its soup of temperature, humidity, cloudiness, and wind. Ice and snow reflect sunlight and heat, and sea ice keeps the oceans from giving heat back to the atmosphere. The output of the sun fluctuates. Any and all of these are fair game.

In 2002 the National Academy of Sciences issued a book on abrupt climate change. They subtitled it Inevitable Surprises. They theorized that abrupt climate changes happen when the earth’s climate system gets pushed across a threshold, either by a sudden trigger—say, a massive volcanic eruption—or by combinations of more gradual forces. It all works, they said, like a canoe: you can lean out over the side a bit, but keep leaning further and eventually you’re going to turn the boat over.

Professor Rial tells me there’s something called the Younger Dryas cold interval. Had you been living in the Northern Hemisphere around eleven thousand years ago, you would have been fairly happy to find that things were warming up. The earth was thawing out from an ice age. But for some reason, things got cold again, fast. Within about twenty years, the average temperature in the arctic regions had dropped by about twenty-five degrees Fahrenheit. It was almost cold enough to kick-start the glaciers again. This cold interval—which climatologists sometimes call the Big Freeze—lasted about a thousand years, and then it was gone as fast as it had come. But it may have sealed the doom of some of North America’s biggest land mammals. Saber-toothed cats, mammoths, and mastodons all went extinct around the end of the Big Freeze. A drought that came out of nowhere and lasted for decades may have been partly responsible for the end of the classic Mayan civilization in the ninth century.
And your grandparents may have lived through an

abrupt climate change: the Depression-era Dust Bowl drought in the United States. Now, I’m not trying to be sensational, but any of this stuff could happen again. And some scientists believe we may be experiencing another abrupt climate change right now. This time, some of them are blaming global warming. That’s right: something we’ve all begun to think of as a climate disaster may actually cause another, separate climate disaster. Not all climatologists think that global warming will trigger an abrupt period of flooding, or drought, or even another ice age, but the record shows that past abrupt climate changes have tended to happen when some force was altering the climate itself—just like we seem to be doing right now.

Professor Rial has done some research of his own on this stuff. He proposed a mathematical model to account for a particular kind of abrupt climate change that shows up repeatedly throughout Earth’s climate history: sudden and fast warming followed by much slower cooling. I’m not going to try to explain his model here today, mostly because I don’t understand a word of it. But I will ask Professor Rial to stand up and give us the bottom line. Would you mind, Dr. Rial? Thanks.

“Okay. Well, since many of these abrupt warming events have happened often in the recent past, it is reasonable to expect them in the near future. But what’s different right now are the unprecedented concentrations of CO2 and methane we have managed to produce, and the unprecedented rates at which we’re still emitting these gases. Nothing like this has happened in at least the last million years. As paleoclimatologist Wally Broecker would say, we are poking a sleeping dragon in the eye. Get ready for the blast!”

Thanks again, Professor Rial. Next question:


7. Are there too many humans for one Earth?


I got this question, stated a slightly different way each time, from four different people, in biostatistics, biology, environmental sciences, and religious studies. It seems that many of you worry that we may run out of some of the things we rely on to live. Is Jonathan Boyarin here today? Oh, hi. Good to finally meet you face to face. Dr. Boyarin is a cultural anthropologist from the Department of Religious Studies. He’s convinced that we’ve reached or maybe already surpassed Earth’s capacity to sustain the number of humans we now have at our current levels of resource use. He told me that we have no idea how we could scale our consumption back with the least possible amounts of suffering and injustice. He insists that just thinking about local sustainability won’t do the trick: we need a game plan for global sustainability, but we don’t know how best to organize that.

Dr. Boyarin, I think I’m in over my head here. These questions are almost beyond the scope of what we can talk about today. Can you help me out?

“Well, I’ll try. You’re right in that nobody can take on this whole question, and it’s not just a question for academic specialists. But we can’t shrink from it, because there is a real world outside us that puts real constraints on us.”

Okay. So where do we start?

“Once upon a time we believed that the major differences between human groups were national differences—differences of language and other aspects of culture—and that these differences could be managed by giving each nation sovereignty within its own territorial state. But now it seems that that organizational scheme is inadequate when it comes to many of our most pressing problems. Group differences aren’t neatly contained within states, and human problems need to be dealt with at levels both larger and smaller than states.”

Can you give us an example?

“Sure. Here’s one that I’ve studied and which I care about deeply: Israel and Palestine.
The population is conventionally divided into two national camps, cast in a struggle to the death over a single territory. No one reasonably expects those two strong

collective identities to disappear or even weaken any time soon—nor would that necessarily be a good thing. But all life in that region is threatened by overuse of highly limited supplies of fresh water and other basic resources. Short of dissolving all group boundaries, can we find ways to mobilize people to resolve this shared threat?

“That’s just one example. But I think that if there’s to be any future that includes us, we have to learn to communicate better now. So over the next few decades, I hope we’ll come to share a realistic vision of our common predicament without trying to dissolve our cultural differences.”

Thanks, Dr. Boyarin. Did someone have a question just a minute ago? Oh. Hi, Woody! Woody Chambless, folks, from over in Biostatistics. He’s my father-in-law. But let’s hear him out anyway. Go ahead, Woody.

“Well, the Scottish philosopher Adam Smith once wrote that the striving of all individuals for their own private good would bring about general well-being. But when I think about war, overpopulation, and pollution, Smith’s idea starts to seem pretty flawed. So I would reword your question like this: how do we balance the wants and actions of the individual versus the needs of the common so that all might survive and thrive?”

Okay, good. Thanks, Woody. This is pretty sobering stuff. We’ve got a rapidly growing world population, increasing rates of consumption, and resources that are dwindling. When the United Nations Environment Programme released a global environmental outlook last year, the bottom line was that “the bill we hand our children may prove impossible to pay.”


6. How did the universe come into being and assume its present form?


Laurie McNeil and Laura Mersini-Houghton from over in Physics and Astronomy sent me this question. But they framed it with two more questions: “What was there before the Big Bang? And does that question have any meaning?” Now, if you already have trouble sleeping at night, don’t start studying theoretical

cosmology. It’s full of questions that just lead to more questions. And the math is really hard, too. The ancient Hindus believed that the universe was a cosmic egg that expanded from a concentrated form. That’s not too different from what we think we know now. The Big Bang, so the theory goes, was the explosion of a very hot, very dense, and very small body of matter that gave rise to our universe some thirteen-odd billion years ago.

We’re used to thinking of the Big Bang as ancient history. But in a sense, we’re all living in the Big Bang right now: the universe is expanding as we speak. So it must have been smaller yesterday. Last week it was smaller still. Follow this logic far, far back, says classical Big Bang theory, and you’ll see that everything in the universe—including all matter, gravity, electromagnetism, and the forces that hold atoms together—must have once been squeezed into a point so infinitesimally small that it had no dimensions at all. And that little twist of logic is where most of us quickly ask our server for the check.

But most of us aren’t physicists. Physicists have faith. They have to. They know some of their theories, the Big Bang among them, aren’t perfect. But the best theories in cosmology are the ones that can’t be easily disproved by what we already know—or, in some cases, by what we already assume. The Big Bang plays nice with general relativity, with the cosmological principle, and with our other observations—prerequisites for any theory of the origin of the universe. And it may be that when we learn enough about something called quantum gravity, we can patch up any holes in our understanding of the Big Bang—and what, if anything, came before it.

That’s not to say that the Big Bang is the only plausible origin-of-the-universe theory. Professor Mersini-Houghton told me that all it would take to challenge faith in the Big Bang is for another theory to come along that explains the observed universe just as effectively.
The second-place horse right now involves a cyclic universe—one that expands and contracts continually. Whatever we learn about quantum gravity may eventually make the cyclic universe theory the front-runner. But for now, most physicists feel that the Big Bang is the best horse. Professor Mersini-Houghton, are you here today? Hi there. Am I getting any of this right?

“Hi, Jason. You’re right that the theory of Big Bang inflation agrees exquisitely with our observations. But just remember that nearly perfect agreement with the data does not by any means constitute proof.” So physicists have faith, but they aren’t necessarily sentimental? “Right. The best science usually comes out of scrutinizing our most cherished principles and theories.” There you have it. Some of these cosmological questions border on the religious and the spiritual. And in a way, so does our next one.

5. What is the basis of consciousness?


Here’s another question that was sent to me by more than one person—Edward Perl in cell and molecular physiology, and Jesse Prinz in philosophy. That’s a good sign that it’s a humdinger. It’s an old question to philosophers, but a newer one to scientists. Plato thought about it. In the 1600s René Descartes said that body and mind must be made of different stuff: the body exists in time and space; the mind exists only in time. But today science says no: body and mind are different aspects of the same thing. Consciousness comes from the way the brain’s neurons work and the way they’re organized. But we have no idea how consciousness works. Historically, Science magazine has said, studying consciousness has been “a dubious career move” for anyone who didn’t already have tenure (and maybe a Nobel, to boot). But that’s starting to change.

Let me point out Dr. Perl: he’s sitting right over here at the end of the second row on the left. His background is in medicine, and he’s been studying this stuff for far longer than I’ve been alive. And over there near the back corner is Dr. Prinz. Hi, Jesse. I like the Ramones shirt. Dr. Prinz is one of a relatively new crop of philosophers who are applying hard scientific evidence to the way they think about philosophy. He’s done a good bit of thinking about consciousness, too.

Somehow, when you smell a tiger lily or look at the Mona Lisa or listen to “Sheena Is A Punk Rocker,” certain neurons and glial

cells in your brain “fire” and help you recognize and make sense of what you’re smelling, seeing, or hearing. When you see something red, you somehow experience the color red, rather than some other color or no color at all. You somehow turn tiny electrochemical impulses into your experience of the world—and in a sense, into you. We can use tools such as MRI and PET scans to pinpoint areas of the brain that are involved in sensory experiences. We see which parts of the brain light up, so to speak. But as Dr. Perl told me, we still have no idea what’s behind that process inside the brain. How could electrochemical activity produce consciousness?

Dr. Perl’s research involves an aspect of consciousness that we’d like to think we could do without—pain. All of us have neurons that are involved in sensing it. Somehow, when a toothache or a tetanus shot causes those neurons to light up, we hurt. We don’t know for sure whether our brains have neurons specifically dedicated to pain, but Dr. Perl tells me that we’re beginning to think that our brains process pain information in several regions that operate in parallel.

Lucky for us, consciousness isn’t all pain—it includes wakefulness, perception, reflection, simulation, our sense of personal identity, anything running through our thoughts at any given moment, and maybe lots more. It may be that different regions of the brain generate different aspects of consciousness. People who have injured their brains can have consciousness-related problems that seem limited to the functions controlled by the injured brain area.

Dr. Prinz described consciousness to me like this: your senses are always taking in lots of information. You only pay attention to a small subset of what you take in—your brain broadcasts that small subset to its short-term storage centers. But that’s only part of the picture—right, Dr. Prinz?

“Right. Essentially the same kind of thing happens in my laptop, but I doubt my laptop is conscious.
Consciousness occurs only when attention is underwritten by specific kinds of neural processes.” Okay. So what do we think those neurons actually do?

“Well, I think the best current account goes something like this: when we attend—that’s psychological jargon for ‘when we pay attention’—certain kinds of neurons, called fast-spiking neurons, become more active. They send out signals that cause some of the cells in our perceptual pathways to start firing in synchrony. This synchronous firing allows those cells to send signals that can be received by the brain’s working memory—it allows one area of the brain to talk to another. And it may be that this synchrony is essential for consciousness. If it is, we’re within reach of identifying the neural correlates of consciousness.”

Sounds like that might have some pretty big philosophical implications.

“Well, maybe. Philosophers are more interested in why brain events—which are changing patterns of neurochemical activity in populations of neurons—feel like anything at all, rather than nothing. I don’t think that finding the neural correlate answers that question, but it may help us answer why we ask that question in the first place. What is it about the way that we introspect our conscious states that prevents us from discerning that they’re just brain states? Typically, people have a problem with the idea that minds are just brains.”

Okay. Good stuff. Thank you, Dr. Perl and Dr. Prinz. Next question:

4. Can we find a sustainable energy source?


This one comes from Tom Meyer in the Department of Chemistry. Most likely, the electricity that runs everything in your house comes from coal that was formed three to four hundred million years ago. U.S. coal supplies may be good for another 100 to 250 years, which just might give us enough time to find a replacement. Ideally, our new energy sources would be renewable, efficient, and would cause no long-term damage to the earth. There are a few different candidates—solar, wind, geothermal, oceanic, and maybe nuclear.

(There’s some debate about whether nuclear energy is sustainable.) But right now we know two things: we don’t have a magic-bullet solution, and no matter what we do, we’re going to have some growing pains.

Solar energy may end up being a major player. Sunlight is plentiful, and humans will be long gone before the sun burns out. Solar really became viable in the 1970s, when oil and energy crises coincided with technological improvements that lowered the cost of converting sunlight to electricity. When oil prices dropped again, funding for solar research dried up, and the public largely forgot about it. Now it’s the world’s fastest-growing energy technology, and it supplies about one half of one percent of the world’s total primary energy supply. But solar energy costs more to produce than almost any other electrical source. It’s obviously only available in the daytime, so solar power has to be stored, and it tends to work best as a complement to another power system. And about 4–12 percent of solar’s energy is lost in the conversion from direct current, which solar cells produce, to alternating current, which is the way we use it.

Wind power is another fast grower. It accounts for just over 1 percent of the world’s total primary energy supply, but many European countries are now getting a good bit of their electricity from the wind: Spain and Portugal rely on it for 9 percent of their production; Denmark for 19 percent. Much of the cost of wind power is up front—those huge wind turbines aren’t cheap. But some argue that because wind power doesn’t pollute or contribute to global warming, it’s cheaper in the long run than almost any other energy source. Right now, wind can be a good complementary source to supply up to about 10 percent of total electrical demand. One study found that the wind could produce over five times more power than what we now use, worldwide, from all energy sources.
Whether we adopt it on a larger scale depends on what the economy and environment will allow.

Geothermal energy taps steam, hot water, or heat from deep within the earth. It’s relatively clean, safe, and sustainable. MIT researchers calculated that with a few technological improvements, geothermal energy could someday have the potential to supply all of the world’s energy needs. But funding for research and development has been relatively low. The trick is to

figure out how to get high production out of a particular site without cooling that site off too rapidly. Right now, geothermal supplies less than 1 percent of the world’s total primary energy.

The ocean’s surface waves, tides, and currents may hold some of our future power. No one has put any large-scale production into place, though Portugal, Scotland, and England have plans for wave farms, and there are a handful of tidal energy test sites around the planet. The U.S. government has issued almost fifty permits for ocean-based energy projects, mainly granting the right to study feasibility, but not to build anything. But the power-generating equipment would be expensive to build and to maintain, and there are environmental concerns, too: much of the technology is new and relatively untested, so there may be long-term consequences beyond the obvious impacts to areas that contain equipment. Researchers hope that ocean power will eventually be as cost-effective as fossil fuels. The Electric Power Research Institute thinks that, at best, oceanic energy could only supply about 6.5 percent of our current needs.

Many scientists argue that nuclear power is sustainable: we can get scads of energy out of a relatively small amount of fuel; nuclear power plants don’t produce much in the way of CO2 emissions; and the waste is minimal and well-contained. But nuclear waste doesn’t sit too well with most folks. Then there’s the potential for power plant accidents and sabotage and for an increase in nuclear weapons as more countries adopt the technology. Thorium may be a better source of nuclear energy than the uranium we use now. It produces less waste—no waste, in some reactor designs—and there’s more thorium than uranium in the earth. But as long as uranium remains relatively easy to mine, we probably won’t put much effort into developing thorium-based reactors.
Fusion, which is power generated by joining two atomic nuclei, has been a kind of pie-in-the-sky energy source for fifty years or so. (Nuclear power plants produce energy by fission, which splits atomic nuclei.) The largest planned fusion reactor, called ITER, could go online as early as 2016, in a multinational effort that will try to heat plasma to ten times the temperatures found in the core of the sun. But industry people say we’re still at least a hundred years away from having commercially viable fusion energy. Or as Dr. Rial told me, fusion has been the energy of the future for many years, and it probably will be for many more.

But there are other possibilities. Tom Meyer has done a lot of green chemistry research here at Carolina. He wants to use sunlight to force water to give up its hydrogen, which could then power a fuel cell. He thinks this will probably play a small role in our energy future. That brings up an important point: whatever we do, it’s unlikely that we’ll be able to rely on only one or two energy sources. As far as we can tell, we’re probably going to need to hitch our energy cart to a few different horses if we want to keep living the way we do today.

How did I do, Dr. Meyer? Can you help me wrap this one up? “Sure. Finding new energy sources and learning how to use energy sustainably without enhancing global warming will arguably be the central issue for mankind in the near-to-medium-term future. How do we learn how to use what we know about technology, economics, and public policy to reach this brave, new, sustainable world?” Thanks, Dr. Meyer.


What is most of the universe made of?


This one was sent to me by Laurie McNeil and Reyco Henning in Physics. The stuff we can directly observe is fairly easy: we’ve nailed down the compositions of planets, stars, faraway galaxies, you name it. But about 95 percent of the universe is made of stuff we can’t see—some kind of matter and energy that’s not playing by the rules. It’s invisible to every kind of telescope. But we know it’s there. We know because the universe doesn’t contain enough visible mass—that is, stars and other objects—to account for the amount of gravity it seems to have. If our theory of gravity is right, then the universe contains more mass than we’re able to see. To help explain this, physicists came up with something they called dark matter. Dark matter, as physicists now realize, can’t be made of the same atomic stuff—protons, electrons, and the like—that makes up you and me and everything we can see.

Many physicists suspect dark matter will turn out to be some bizarre kind of particle. Some are even trying to create it in the lab. And they’re optimistic that we’ll soon know what it is. Dr. Henning studies experimental particle astrophysics. He tells me that there are three possible explanations for dark matter. Dr. Henning, if I run through them one by one, can you give us the lowdown on each?

First, our understanding of gravity might be incomplete. If it is, dark matter might just be a misinterpretation. Is that a fair summary, Dr. Henning?

“Not bad. The problem with that idea is that nobody has formulated a reasonable alternative theory of gravity that explains all our experimental and observational data. And our best theory of gravity, general relativity, passes all the experimental tests we’ve thrown at it.”

Okay. Possibility number two: our universe might be relatively full of things that we just can’t see, even though they’re made of normal matter—cold gas, dead stars, and other objects that don’t emit or reflect enough light to be visible.

“That sounds like the simplest solution, but there are serious problems with it. Some of our most verified theories in cosmology—such as our theory about the origin of primordial elements—would fall apart if dark matter turned out to be normal matter.”

And now possibility number three: maybe some kinds of particles don’t interact with electromagnetic radiation. For example, some physicists hypothesize the existence of Weakly Interacting Massive Particles—or WIMPs—that may have been produced in huge amounts during the Big Bang. What about WIMPs, Dr. Henning?

“WIMPs are, arguably, the simplest explanation of dark matter. Explaining them requires the least amount of new or exotic physics, and there are some strong theoretical arguments that support them, too. But we still need to resolve this issue experimentally.
No matter what the explanation of dark matter is, we will learn something truly new and exciting by resolving this mystery.” Dr. Henning also tells me that dark matter isn’t all that’s bugging physicists. Until about ten years ago, we believed that the pull of gravity would cause the expansion of the universe to gradually slow down. But when we discovered that the universe is actually expanding faster and faster, we realized

something must be counteracting gravity—something must be pushing everything outward. Physicists called that something dark energy.

If dark matter has left us for the moment in the dark, at least we have some ideas about where to look for the light switch. With dark energy, we’re not yet sure there is a light switch. Is dark energy simply the cost of having space? That is, does a given volume of space have some intrinsic, fundamental energy? Is dark energy some sort of stuff? Did we get gravity wrong? Is there some other fundamental property of space-time that we just don’t understand? There are even scientists who theorize that the universe could be made of information: the yes-no, on-off “bit” of computer science may actually be, in a sense, the fundamental particle of the universe, which may itself be one gigantic computer. This sounds weird until you understand quantum physics—which I don’t.

Anyway, back to the big picture: current thinking is that stars and the like account for only about one half of one percent of our universe. Another three and a half percent is intergalactic gas. About twenty-three percent is dark matter, and the rest—about seventy-three percent—is dark energy. Hundreds of years ago, mapmakers wrote “terra incognita”—unknown land—on any place they didn’t know about yet. Ninety-five percent of our universe is terra incognita. Next time you meet a cosmologist, buy him a drink.
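The cosmic bookkeeping here is simple enough to check yourself. A quick Python sketch, using the rounded percentages quoted in the text (which is why the dark share comes out at 96 rather than exactly 95 percent):

```python
# Rough cosmic inventory, using the rounded percentages quoted above.
inventory = {
    "stars and other visible objects": 0.5,  # percent of the universe
    "intergalactic gas": 3.5,
    "dark matter": 23.0,
    "dark energy": 73.0,
}

total = sum(inventory.values())
dark_share = inventory["dark matter"] + inventory["dark energy"]

print(f"accounted for: {total}%")
print(f"dark (unexplained) share: {dark_share}%")
```

Run it and the four rounded figures add up to a full 100 percent, with the two dark components claiming roughly nineteen-twentieths of everything there is.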


We’ve got the genes... now what?


Kelly Hogan in Biology sent me this question. In 2003 researchers finished sequencing the

human genome. They created a map of our entire genetic material. Much hand shaking, back patting, and confetti ensued. Okay, probably not all that much confetti. But sequencing our genome was a landmark moment. Science magazine called it a “trip-to-the-moon logistical effort.” Now we’re starting to realize that maybe the party was premature. Sure, we have a map, but it’s like having one that shows only the general outline and topography of the United States, when what we want is a road map to Aunt Mildred’s house. As Kelly told me, we have the genome sequenced, but we don’t know what much of it means for us.

Remember what we talked about earlier: genes are more complicated than we thought. They interact with each other, and with other cellular workhorses such as RNA. But we still don’t know how genes and all these other players interact, even when it comes to some of our most common diseases. If there’s a silver lining in this, it’s that understanding genes may turn out to be a lot easier than understanding the proteins that our genes crank out.

Proteins, see, do most of biology’s heavy lifting. Before a protein can do any cellular work, it has to assemble, or “fold,” itself into one of an almost infinite number of shapes. Let’s take a very short and very simple protein made of a one-hundred-amino-acid chain. Now let’s say there are three different ways each link in the chain can fold. To find the total number of possible configurations this protein could end up in, we’d have to multiply three by itself one hundred times. We’d get a forty-eight-digit number, far more configurations than the protein could ever try one by one. But somehow our little protein already knows what shape to fold into, and how to fold, and then it does fold—in microseconds.

But before cells can produce proteins, they have to transcribe DNA into something called pre-messenger RNA.
Pre-messenger RNA molecules are then spliced—little sections are cut out and then recombined in different ways—to create mature messenger RNA. This mature stuff then gets translated into a protein. One RNA can be spliced several different ways to yield different proteins, and different variants of the same protein are

often found in different tissues, where their subtly different sequences allow them to do specific jobs. In fact, it’s impossible for us to count how many different proteins there are in humans. And if all that isn’t complicated enough, sometimes the little buggers pair up and work in teams. Remember the non-coding DNA we talked about earlier, the stuff that scientists once thought was junk? Did I mention that it makes up about 99 percent of our DNA? We’re learning how to read it, but we’ve just started. So you can see how we’re a long way from having a truly useful genetic map.

I’m going to put Kelly on the spot here and ask her to stand up and tell me what a map like that would let us do. Kelly?

“Okay. Let’s see: some day you may be able to get a readout of the exact variants you have for different disease-causing genes. And we may someday interpret that readout as a risk value for susceptibility to complex diseases that involve multiple genes. But don’t forget that your environment will always play a role. So will we also be able to pinpoint environmental conditions that add or take away risk?”

Thanks, Kelly. Let me see if I can sum this up. If we can come up with a set of rules to explain how our genes can create different proteins, in different amounts, in different circumstances, then our map may be a little more complete. But it’s going to be a while before we can use it to get anywhere.

Now for the question that you all voted the most interesting and most pressing of all. Everybody ready? Everybody still awake? Here’s our number-one question:

1. What will replace petroleum?


Jose Rial sent this to me. Cheap oil has spoiled us. In not much more than a hundred years, we’ve burned through a quantity of oil that took millions of years to accumulate. Almost all of our transportation here in the United States depends on oil. It’s looking like we’re going to have to make a choice: either come up with a different liquid fuel, or start figuring out how to power all our vehicles with electricity. Some people think that biofuels—made

from corn, sugar cane, and other plants—may help wean us from petroleum. But as Dr. Rial told me, relying too heavily on biofuels would create a new dilemma: do we want to move, or do we want to eat? If we’re going to stick with liquid fuel, Dr. Rial says he might put his money on gas hydrates—methane gas molecules that are essentially trapped in ice under ocean-floor sediments. There’s a lot of gas hydrate on Earth—up to ten times the amount of known reserves of natural gas. But it’s difficult to extract, and methane and global warming go together like matches and dynamite.

Right now, the world is using about 82 million barrels of oil a day, and that number seems to keep going up. Experts project that we’ll use 100 million barrels a day by 2020, and CEOs of some of the biggest oil companies have already announced that they’re going to have trouble meeting that demand. How long till the last drop? It depends on who you ask, and on how fast developing countries continue to grow. It’s not unreasonable to think we’ll be out of oil in forty-five years. Pessimists say thirty.

There are immense reserves of a few unconventional sources of oil: tar sands, heavy oil, and oil shales, for example. But it takes a lot of energy and money to extract and refine them; it’s difficult to ramp up their production to high volumes; and their greenhouse costs are high: up to three times more emissions per barrel than refining conventional oil.

If we started powering our cars, planes, ships, and trains with electricity, then nuclear energy, solar and wind power, and battery power may become good bets. But imagine the work it would take to make all our transportation run on electricity.

So, Dr. Rial, this sounds pretty bleak. Any good news for us? “Well, this is a huge, fundamental problem, and only a few decision-makers are paying it the attention it deserves.
“At the same time, if we’re really optimistic, maybe the best way to look at this is to say that the Stone Age didn’t end because we ran out of stones. That is, some new technology will come along that will render our carbon-age technology obsolete. The internal combustion engine is the machine from hell. We’re smarter than that—I hope. “Right now, we don’t know the answer. But wasn’t that what you wanted anyway?” Fair enough. Thanks, Dr. Rial.
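That forty-five-year figure is easy to sanity-check with back-of-the-envelope arithmetic. Here is a rough Python sketch; note that the reserve estimate below is my assumption (proven reserves were commonly put near 1.2 trillion barrels around 2008), since the article itself gives no reserve number, while the daily consumption comes from the text. Because consumption is rising rather than flat, the real horizon would be shorter:

```python
# Back-of-the-envelope estimate of how long conventional oil might last.
# ASSUMPTION: roughly 1.2 trillion barrels of proven reserves (a figure
# commonly cited circa 2008); the article itself gives no reserve number.
reserves_barrels = 1.2e12
barrels_per_day = 82e6               # world consumption, from the article
barrels_per_year = barrels_per_day * 365

years_left = reserves_barrels / barrels_per_year
print(f"about {years_left:.0f} years at today's consumption rate")
```

Holding consumption flat gives a horizon of about forty years, squarely between the optimists’ forty-five and the pessimists’ thirty; letting consumption grow toward 100 million barrels a day pulls the answer down toward the pessimists.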

And thanks to all of you who thought about these problems, and sent me questions, and put up with my questions—and after all that, even showed up today. I’ll wrap up with this: back around 1880, a lot of scientists felt the big questions were pretty much answered. Some physicists thought all that was left was to sharpen their ideas to slightly finer points—to measure certain constants “to another place of decimals,” as they put it. Seems quaint, doesn’t it? And in a 1997 book called The End of Science, John Horgan interviewed a lot of big-time scientists, who lamented that the best and most exciting discoveries had already been made—that all the big questions were behind us. Now I don’t know about you, but I’d love to come back a hundred years from now to see what we don’t know then. Good afternoon. And good luck.

Brian Hogan is a research assistant professor of chemistry. Bob Goldstein is an associate professor of biology. Jose Rial is a professor of geological sciences. Jonathan Boyarin is the Leonard and Tobee Kaplan Distinguished Professor of Religious Studies. Woody Chambless is a research professor of biostatistics. Laurie McNeil is the department chair and a professor of physics and astronomy. Laura Mersini-Houghton is an assistant professor of physics and astronomy. Edward Perl is the Sarah Graham Kenan Professor of Cell and Molecular Physiology. Jesse Prinz is the John J. Rogers Distinguished Term Professor of Philosophy. Tom Meyer is the Arey Distinguished Professor of Chemistry. Reyco Henning is an assistant professor of physics and astronomy. Kelly Hogan is a lecturer in biology.

What else don’t we know?

Just a few of the questions that didn’t make the top ten:
• How complex is life beyond Earth?
• Is morality just something we learn, or do we have an innate sense of right and wrong?
• How did life begin on Earth?
• What are the implications of artificial life?
• How do we prevent a fat world from becoming fatter and more diabetic?
• What is the ultimate fate of the universe?

Images: Sergey Shlyaev (conference room); Claudio Baldini (coffee cup); Stefan Klein (pencil); Alexandr Stepanov (glasses); Mr. Mtfk (coffee stain)

A pathogenic bacterial protein was injected into these epidermal plant cells and redirected to the cell membranes (green). The larger red spots are the nuclei of the plant cells. Laser confocal microscope image by Erica Washington.

The Wizardry of Green

Plants combat their enemies with potions, charms, and tricks, and Jeff Dangl is watching. By Jessica McCann


Jason Smith

Left: Jeff Dangl in the greenhouse. Last year, he was elected to the National Academy of Sciences for his work on how plants defend themselves from disease. “When you go to the NAS induction ceremony in Washington, they have you sign ‘the book’—a list of every academy member, past and present. Do you know who the first person to sign that book was? Abraham Lincoln. Watson and Crick have signed the book. Oliver Smithies— Carolina’s own Nobel Laureate—signed it. I am incredibly honored to sign the book.” Right: Dangl has made a weed called thale cress into a model of plant immunology. In the microscopic image, jigsaw-shaped cells that cover the outer layer of a leaf are invaded by the parasite that causes downy mildew disease. The long blue lines are the parasite’s growing extensions, or hyphae. The small blue spots are parasite feeding structures that suck nutrients from the plant cell.

Soft rot. Fire blight. Stem canker. Leaf spot. These plant diseases are caused by bacteria that don’t infect people, says biologist Jeff Dangl. But that doesn’t mean we’re off the hook. Plants do more for us than produce the oxygen we breathe. Dangl writes in the preface of a report from Plant Genomics: “As you read this, you are likely digesting your last meal, made up largely of plants, or something that ate a plant as its last meal before you ate it! And you probably got to work today using transportation that runs on fossil fuel made from plants that lived a couple of hundred million years ago. Without plants, we’d be in serious trouble.” Nearly 30 percent of the world’s harvest is lost to disease each year, and reliance on pesticides is not diminishing. Protecting our most important food crops—such as corn, wheat, and soy—is also an issue of biosecurity. But when plants are teeming with millions of bacteria, both good and bad, how do they know which microorganisms to fight?


Dangl admits that chance played a big role in his decision to study the immune systems of plants. He began his science career as an undergrad at Stanford University. Leonard Herzenberg, a celebrated immunologist, put Dangl to work at a summer job. He ended up staying for grad school. “I owe an incredible debt to the Herzenberg lab,” Dangl says. “They allowed me to see what science is like when it’s practiced with joy and dedication and creativity.”

By the time he finished, Dangl had become a dedicated immunologist himself, interested in how living things recognize pathogens and protect themselves from disease. Toward the end of his graduate work, he went on what turned out to be a fateful errand: to the library to find an article. “The journal I held literally fell open to a paper on defense response in plants to microbial infection,” he says. The paper described the work of Klaus Hahlbrock, who showed that when plants encounter a pathogenic fungus, they turn on genes required to fight the infection within minutes. The response is fast—faster than scientists expected.

Soon after reading the paper, Dangl sent Hahlbrock a letter and asked for a postdoctoral job. He joined the lab just as it was moving to the Max Planck Institute in Cologne, Germany. Dangl’s postdoc with Hahlbrock then morphed into a position as an independent research group leader at the neighboring Max Delbrueck Institute. Dangl’s position was well-funded, and he was given the freedom to explore risky new ideas in plant genetics. At that time, in the

late 1980s, scientists were just beginning to work out the genes involved in plant recognition of pathogens. Science funding during this period was hard to come by in the States. “In Cologne, we had the luxury of reliable money and were able to do pretty far-reaching stuff we wouldn’t have been able to get funded for in the U.S.,” Dangl explains. One of Dangl’s forward-thinking projects was developing Arabidopsis thaliana, a weed known as thale cress, into a model for plant immunology. Arabidopsis had been a staple of scientific research because of its small and malleable genome, so lots of tools were available for manipulating the biology of thale cress. But Dangl was among the first to use those tools to study how plants respond to pathogens.


Unlike animal cells, plant cells have thick walls made of cellulose that are nearly impossible for germs to penetrate. But millions of years of evolution have provided plant bacterial pathogens with a unique tool: the type III secretion system, a deadly needlelike device built from around twenty different proteins. Once a type III needle punctures the cellulose wall, the pathogen can inject proteins directly into the plant cell. These injected proteins are called effectors. They turn off the plant’s antibacterial defense mechanisms

Ben F. Holt, III

and can cause nutrients to leak out of the plant’s cells. This triggers a disastrous chain reaction. New pathogens waiting outside the cells immediately feed on the released nutrients before finding and attacking new plant cells. While infection does not always mean death for the entire plant, the destruction of large leaf areas, for example, shuts down photosynthesis, weakening the plant and lowering its productivity.

But because germs and plants have evolved together over millions of years, plants have had time to develop an arsenal against bacterial attack. Unlike people, plants don’t have immune cells that circulate throughout their bodies, surveying for signs of infection or disease. Instead, every plant cell must be ready to recognize pathogen infection and send signals to neighboring healthy cells, telling them to kill off the infected cells and bar nutrient flow in and out of the affected zones. The blocked-off cells will die along with the resident germs, but the rest of the plant will live.

Plant biologists have thought that in order for this quarantine response to occur, an effector had to meet up directly with a corresponding plant resistance protein in the cell. This one-on-one interaction would then set off a cascade of molecular events that leads to death of the infected tissue. Scientists thought every effector had a disease-resistance-protein counterpart.

Dangl’s lab was among the first to isolate these resistance proteins in plants, shortly after Dangl came to Carolina in 1995. But Dangl says there are just too many bacterial effectors out there and not enough plant disease resistance proteins to match them all. He guessed that the plant resistance proteins don’t actually recognize specific effectors directly but instead find parts of their own cells that have been damaged or modified by the effectors. He says when plants put up a fight, they are responding to “modified self.”

In a series of experiments published in 2002, 2003, and 2005, the Dangl lab was able to prove this hypothesis. In one example, the plant pathogen Pseudomonas syringae secretes an effector enzyme into the plant cell. This enzyme chews up several proteins, disrupting processes that might kill the bacteria. But when the enzyme breaks up a protein called RIN4, a specific plant disease resistance protein recognizes the damage and goes to work to prevent further destruction.

Disease-resistance proteins in plants look a lot like proteins involved in human immune defenses, and are found in most animals from sea urchins on up. Dangl and others think that our “innate immunity” probably evolved soon after multicelled organisms appeared on Earth. Because this family of proteins looks the same in most species, Dangl argues that Arabidopsis is a good model organism for studying questions about diseases that directly affect humans. Following work done by the Dangl lab and others, a French group found that a mutated form of a disease-resistance-like protein is partially to blame for Crohn’s disease, an illness that affects nearly 600,000 people in the United States and Canada. But let’s get back to plants.


Plant breeders have known for centuries that if you cross a disease-resistant plant with a susceptible one, you can end up with offspring that are mostly resistant. But this process is messy. Many genes get transferred

during the crossing, not just the ones that code for disease resistance. You might end up with a more disease-resistant tomato that is easy to grow, but it might not be as tasty or as pretty as the one that always gets sick. Dangl and plant genomicists around the globe are hard at work sequencing the genomes of plants and plant pathogens in hopes of identifying the disease-resistance genes of plants that humans rely on. Dangl asks, “What if you could take the disease-resistance gene of a wild tomato and drop it in an heirloom tomato, which are invariably disease susceptible?” Moving genes between different varieties of the same species might be a safe way to engineer plants that are disease-resistant and more tolerant of our changing climate. And it may let us cut back on the pesticides we use to grow fragile fruits and veggies.


There are some nasty plant pathogens out there, but they are far outnumbered by the billions of beneficial microbes plants depend on for growth and health. Dangl says that the next big frontier in plant science will be deciphering the complex relationships among groups of beneficial bacteria and fungi. He and his colleagues are setting up an ultra-fast DNA-sequencing facility on campus that will be used to identify and study the microbial communities living on and around plants. Knowing which bugs keep plants healthy is important as we cultivate more plants to sustain our ever-growing population.

But even though plants provide us with nearly all of our basic needs—air, clothes, food, and our food’s food—we also depend on them for more than the bare necessities. “It is a simple but profound truth,” Dangl writes in Plant Genomics, “that plants and plant communities add beauty, flavor, fragrance, and tranquility to our existence.”

Jessica McCann is a doctoral student in the Department of Microbiology and Immunology in the School of Medicine at Carolina. Jeff Dangl is the John N. Couch Professor of Biology in the College of Arts and Sciences.

For more information about this research, go to
For more on Arabidopsis,
For a short history of the National Academy of Sciences,
The NRC report, Achievements of the National Plant Genome Initiative and New Horizons in Plant Biology (2008) can be found at:

Jason Smith

Murder in Moscow by Mark Derewicz

Vasilii Prasolov skulked into a Moscow restaurant where his estranged wife Zina was drinking brandy with male companions. He asked her to step outside, but she refused. He pulled out a gun and shot her dead. Waiters rushed to the scene as Vasilii cried, “Don’t hit me! I’ll turn myself in!” If only Court TV had been around in 1911 to film what happened next. Louise McReynolds describes a Russian’s crime of passion.


Louise McReynolds has loved tsarist Russian society ever since reading Anton Chekhov’s short stories in high school. She related more to nineteenth-century Russia than to twentieth-century America. Later, as a historian, she wrote books on Russian newspapers and leisure activities. While researching these books she was drawn to the outlandish crimes and court trials that Russia’s middle class went gaga over, especially the sensationalistic murder cases. McReynolds says that well-to-do women would flock to courthouses. They’d jeer defendants and cheer verdicts, affecting juries and giving reporters a feel for public sentiment. “These court ladies, as they were called, sometimes brought food for defendants and collected money for the accused,” she says. “They were participants in the whole court drama.”

McReynolds, who found reams of court documents and newspaper articles while digging through archives in a library in St. Petersburg, says that the Prasolov trial is a good example of how tradition and pop culture intermingle and produce the sort of justice that today we might not find all that just.

Vasilii’s defense was simple. Zina’s promiscuous behavior crushed his spirit and he snapped. Vasilii’s lawyer painted the picture for the jury: he was an anonymous factory manager working for his beloved family. She was a social butterfly, hardly a proper wife and mother. Never mind that the couple hadn’t lived together for a year. He still loved her, the defense said, and continually tried to reconcile with her. Zina had cavorted with Nikolai Riabushinskii, who’d been present at the murder.

“Nikolai was the equivalent of one of the Rockefeller boys,” McReynolds says, “the black sheep of a famous merchant clan.” Zina had cozied up to opera star Dmitrii Smirnov at a Crimean resort. Vasilii, who happened to be vacationing there at the same time, confronted her. Zina told him that she needed help from Smirnov to launch a singing career. Vasilii didn’t believe her, and his lawyer made sure the jury got an earful of innuendo. Forced to play the same game, lawyers for the prosecution—essentially the Russian state—found out that Vasilii had been shacking up with a famous nightclub singer named Frumson. One night, Frumson exploded in a rage at Vasilii because, witnesses said, he had been leeching off of her for months, and she couldn’t take it anymore. Police had to calm her down.

Even uglier, Vasilii objected to Zina’s pregnancy because it spoiled her figure; he wound up soliciting Zina’s younger sister. The defense retorted: While the couple’s baby underwent surgery, Zina went to the theater. The baby died. Then the prosecution pointed out that Vasilii didn’t bother attending the funeral.

The press reported every wretched detail, and Muscovites ate it up. Judges typically issued tickets to people who wanted to see a trial, but McReynolds says that the Prasolov trial was so popular that people would scalp their tickets outside the courthouse for a tidy profit. Publishers sold a booklet of trial transcripts. The cover read “The Prasolov Affair” and had a portrait of the Prasolovs during happier times. Almost immediately, their story was made into a film called In Moscow’s Golden Spiderweb. McReynolds says that the word “golden” referred to Moscow’s overindulged youth. Even before the film was made, theaters put promotional photos of the Prasolovs in the coming attractions, though Vasilii’s parents put a stop to that with a court injunction.

Meanwhile, other films told similar stories. In the 1913 film Children of the Age, the heroine was married to a banker and was mother to a young child. But she caroused with older men at chic restaurants. Unlike Vasilii, the husband did not kill his wife; he killed himself. In the 1914 film Child of the Big City, the husband married a simple seamstress who fell prey to urban decadence. In the end, the husband killed himself in front of his wife who then, without remorse, stepped over his corpse to hail a cab to a fancy club.

To courtgoers, Vasilii was a sympathetic character like the men in those films, a victim of a fast-changing culture that saw young women flout convention and leave hapless men in their wake. McReynolds doesn’t say that these films, or others, directly influenced the jury.
But she does say that jury trials were part of pop-culture entertainment, and that trials, movies, and culture all influenced each other. Exhibit A: lawyers often treated courtrooms like theater stages, writing their own scripts and acting them out, McReynolds says. Here’s a line from the Prasolov defense: “When Vasilii asked her to go outside, and she refused, it was not his hand which shook, or even his heart, but the ground upon which

he stood! It opened up before him, and a shot rang out!”

McReynolds says that Vasilii’s lawyer ingeniously victimized Zina to serve his client’s cause: “Now, I’m not going to criticize Zinaida Ivanovna because that would only upset the defendant, who loved her so. The path of a beautiful woman can be difficult because of the people who take advantage of her. I’m not a hypocritical moralist; I know that a woman can slip without falling.”

The prosecution relied on sarcasm, trusting the jury to use common sense: “We have an anecdote in our courts—why did you kill your wife? Because I loved her, of course!” The prosecution also lamented, “Yesterday, Vasilii was known only to waiters, but today all of Russia knows him. Unfortunately, people know more about his biography than they do those of the inventors of the telegraph and steam engine.”

Vasilii’s lawyer, meanwhile, continued to refer to pop culture. He alluded to the celebrated book Keys to Happiness by Anastasia Verbitskaia when he blamed outsiders for “stealing the keys to someone else’s happiness.” In other words, the lawyer blamed rich and powerful men for corrupting Zina and ruining Vasilii’s life.

Defense witnesses testified about Vasilii’s emotional frailty, painting him as a dispirited, sympathetic man-child. “His mother testified about his cocaine use and what a weak mama’s boy he had always been,” McReynolds says. Hearing this, Vasilii burst into tears and had to be escorted out of the courtroom. In fact, Vasilii cried throughout the trial, sitting there alone in a cage opposite the jury, as was customary in Russia at the time. Despite all the clear-cut evidence, the jury saw Vasilii as a wronged man who went temporarily berserk, and when the verdict was not guilty, the courtroom crowd cheered.

“People believed that Vasilii was genuinely sorry,” McReynolds says. “They were sympathetic—he’s weak. He’s sorry. They believed that. And in one sense, it’s enough that the prosecution is the tsarist state, and finding him not guilty was one way to get back at it.” Another reason is orthodoxy—Zina was not a proper wife and mother. She drove her husband mad. This defense would never fly in a modern jury trial. But it did in tsarist Russia, as well as in pre-war America. (Search online for “girl in the red velvet swing,” an American equivalent to the Prasolov trial.)


Vasilii was released into the care of his father, a character witness who impressed newspaper reporters so much that they compared him to a movie star. “A lively witness in the case, a rarity on this particular stage,” one reporter wrote. “He’s a veritable Max Linder.” Linder, McReynolds says, was a wildly popular movie star in Europe. He often played a comical but charming character, a “man-about-town” in the lingo of the day.

The defense made it clear that Vasilii, if found not guilty, would be placed in the care of his upstanding father. And so he was. The prosecution appealed the verdict, there was a second trial, and Vasilii was found not guilty once more before fading quickly back into the anonymity whence he came. “If you’re looking for the logical explanation for the verdict—don’t,” McReynolds says. “You’ll just waste your time. This is just the way it was.”

Louise McReynolds is a professor of history in the College of Arts and Sciences. She published a book in 2003 called Russia at Play, and she is now writing a book on sensationalistic murder in tsarist Russia.

In tsarist Russia, sensationalist murder trials were big entertainment. This booklet contains trial transcripts of the Prasolov case, in which Vasilii Prasolov murdered his estranged wife Zina. The booklet cost 25 kopecks, or one-quarter of one ruble, back in 1911.


Jack Griffith

A bit of salt, a trace of life

Jack Griffith and Smaranda Willcox used a sterilized needle to penetrate salt crystals 250 million years old.

Deep under the desert, Jack Griffith found something too old and almost too good to be true. By Mark Derewicz

IN A TUNNEL two thousand feet below the desert near Roswell, New Mexico, a lone beam of light from Jack Griffith’s headlamp struck a thick wall of ancient salt. Griffith had no idea what was trapped inside this 250-million-year-old crystallized formation. Ancient fossils? Bacteria? Nothing? He didn’t dream of finding an organic molecule that might help scientists find life on other planets. That sort of idea usually belongs above ground in Roswell, de facto home of ufology and alien conspiracy theories. But Griffith’s science project turned out to be almost as surreal as Roswell, and a whole lot more provable than a UFO.

Jack Griffith isn’t known to traipse into the wild searching for odd and ancient matter. He is, though, something of a legend for his work photographing the tiniest of things with his electron microscope. He figured out a way to see the finer details of DNA, and he took the first photo of DNA bound to a known protein. Such work has helped biochemists analyze macromolecules of all shapes and sizes. (See Endeavors, Fall 2004, “Seeing Things Jack’s Way.”)

One such biochemist is Bonnie Baxter. Several years ago, she asked Griffith to photograph bacteria she had found in the Great Salt Lake. He agreed, and while peering at Baxter’s samples he saw surprisingly large amounts of bacterial viruses in the background. “We didn’t expect to see that,” Griffith says. “Scientists had seen it in other salt environments but never in the Great Salt Lake. The viruses looked like ones that grow in people.”

Curious but focused on his main research, Griffith had two high-school interns spend the summer studying the bacterial viruses. But Griffith’s curiosity eventually got the best of him. He knew that really old halite formations exist around the world; what if bacteria and their viruses were trapped inside these ancient salt crystals? Baxter sent Griffith salt crystals from an old mine in Utah, but he couldn’t find many with inclusions, the pockets of water that might contain very old organic material. Geologists told him that surface water had continually leaked into that salt formation, redissolving the salt over and over and casting doubt on the age of anything encased in the crystals. “That salt deposit was geologically trashed,” Griffith says. “So then it became a matter of, well, should we drop this, or should we get a little more serious?”

Griffith read up on ancient halite formations and found out that the most promising, undisturbed salt deposit lies two thousand feet below the desert thirty miles southwest of Roswell. Last summer, Griffith flew to El Paso, Texas, and drove two hundred miles into the middle of the desert until he came upon an inconspicuous mining operation. “Unless you knew what this was you’d drive right past,” he says. “There are no signs. It has a very low profile, yes, except for all the security.”

Turns out, this is no salt mine. It’s a dump for nuclear waste. Or, as the U.S. Department of Energy (DOE) calls it, a Waste Isolation Pilot Plant. Either way, it’s where the DOE buries transuranic waste from old nuclear warheads. But Griffith says this is no pilot plant. “There’s almost an entire city cut into that gigantic salt deposit a half-mile underground.”

In the 1990s, the DOE dug a mine shaft through 2,000 feet of rock to reach the Permian Salado Formation, a 250-million-year-old conglomeration of halite—crystallized sodium chloride, also known as salt. The Salado is 2,000 feet thick and extends for miles underground.
Using a gigantic drill, the DOE has hollowed out miles of tunnels and dozens of rooms as big as football fields to store thousands of barrels full of nuclear waste. The 100-gallon barrels are stacked three high, wrapped tight, and then sealed behind a twelve-foot cement wall.

Geologists say that over time, the salt will act like a glacier, slowly covering the barrels and encasing them permanently. Griffith says that most geologists who have studied the Salado are confident the formation is too deep under rock to have ever been penetrated by surface water. This means that the Salado very likely had remained unperturbed for 250 million years, since the continents were clumped together in one landmass called Pangaea. When the continents began drifting apart, a large pool of oceanic salt water was trapped inland near the equator. The water eventually evaporated, leaving an enormous salt deposit that crystallized and was covered by sedimentary rock. That part of Pangaea is now southeastern New Mexico. Most macromolecules are thought to degrade well before 250 million years have passed. DNA definitely isn’t supposed to last that long. But few people have looked for it in such a strange place.


Griffith pulled up to the outer fence at the nuclear waste site, watching a conveyor belt dump tons of large salt crystals onto enormous trucks. There, he met geologist Dennis Powers, a DOE consultant and Salado formation expert who handed Griffith a hard hat with a headlamp. They stepped inside an elevator and whooshed down through two thousand feet of darkness. “Whether you’d like this depends on whether you’re claustrophobic and like insects or not,” Griffith says. The elevator let them off near the intake shaft that sucks in air—and bugs—from the surface. “There’s a colony of black widow spiders, thousands of them, just hanging out by the intake shaft waiting for insects to be blown into their nests,” Griffith says. “We did our sampling elsewhere.”

Powers and Griffith hopped on an electric-powered cart and navigated their way through the eerie caverns, passing the many rooms full of nuclear waste until they came upon a freshly cut wall of halite that glowed when lit. There they found a perfect chunk of salt that Powers chipped away at, searching for inclusions. He found a lot. For two days Griffith and Powers, along with Bonnie Baxter and DOE physicist Roger Nelson, collected chunks of halite and stuffed them into Ziploc bags. All told they hauled out more than one hundred pounds of salt, packed the bags into black fiberglass camera cases, and FedExed the lot to Chapel Hill.

Back at his lab, Griffith and graduate student Smaranda Willcox searched for inclusions. They sterilized the surfaces of the crystals to kill any bacteria that might have contaminated them since they were removed from the Salado. Then, while peering through a regular microscope, Griffith and Willcox clamped a crystal to a drill press and, with a very fine needle, drilled into the salt until they reached the trapped water. Then they used a glass microcapillary to remove the salt-saturated water. “This was a complete nuisance,” Griffith says. “And we didn’t get much material.” They did this eighty times before Griffith found a way around it. He whittled the crystals to their pristine cores before dissolving them in ultraclean water.

Using the bulk-dissolved crystals and the salt water recovered from the tiny inclusions, Griffith and Willcox prepared the samples for viewing in the simplest way possible, in order to avoid contamination. They applied a drop of the sample to a three-millimeter round copper screen coated with carbon, washed it with water, air-dried it in a vacuum, and finally coated the sample with tungsten so that the microscope would contrast whatever was in the sample with the background. Then Griffith and Willcox popped samples into the electron microscope and searched for the remnants of ancient life.

It didn’t take long to find something. And at first they weren’t sure what it was, because they had never seen anything like it. Griffith searched through dozens of samples and saw this same strange substance. Then he found another substance he didn’t expect. It was DNA. And it had to be 250 million years old. But other scientists had found supposedly ancient organic matter, only to face questions about its true age. Is the DNA Griffith found truly ancient, or is it a modern contaminant of ancient samples?


Ever since scientists figured out how to study the tiniest of things, they’ve been trying to find remnants of the oldest life forms on Earth. They’ve dug up skull fragments of bears and Neanderthals from 100,000 years ago. They’ve tracked down dormant bacteria in glacial ice that dates back 750,000 years. And they’ve unearthed 11-million-year-old cellulose—the chief component of a plant’s cell wall—in Canada’s arctic forest. Griffith says this work is pretty much accepted as fact within the scientific community. Other findings, though, are kind of murky. In the 1990s, for example, researchers found bacteria in amber—fossilized sap—that was between twenty-five million and forty-five million years old. “That’s the Jurassic Park stuff,” Griffith says.

Bonnie Baxter

Bill Willcox

Griffith and Smaranda Willcox prepare a sample of ancient salt for his electron microscope.

Jack Griffith stands in a cavern cut out of an ancient salt deposit two thousand feet below the New Mexico desert.


But Griffith and others say that such amber might not be the best stuff for this kind of test. The bacteria inside might not be very old. Also, scientists didn’t find a colony of bacteria to study under a microscope. They put a sample from amber on a Petri dish coated with nutrients and hoped that this new environment would be conducive for a dormant bacterium to multiply. “You do this a couple hundred times and see what happens,” Griffith says. The colony is definitely not ancient, he says. It might have grown from something ancient. Or maybe not. “This bacterium could’ve been something that floated in from your hair or anything else,” he says. “You just don’t know.”

Facing page: Dense mats of cellulose fibers as seen through Griffith’s electron microscope. He found them in tiny water inclusions trapped inside ancient salt crystals. At 250 million years old, the fibers are the oldest macromolecules ever found.

Jack Griffith

Last year, NC State researchers reported finding fragments of proteins from dinosaur eggshells dating back sixty-eight million years. Griffith says their findings are certainly valid, but to detect ancient DNA, researchers normally rely on the polymerase chain reaction, or PCR, to amplify their samples. And PCR, like growing colonies on a plate, is an amplification method. “You do a PCR and out comes a test tube full of DNA,” Griffith says. “But only one DNA molecule is needed to start that reaction. Does the PCR truly amplify something that was an ancient molecule, or something that was stuck to the tube?” Griffith says PCR tests have a bad habit of amplifying present-day contaminants. The only thing an electron microscope amplifies is an image.

Still, contamination could play a role. Griffith did see DNA trapped inside ancient halite. And he thinks it’s the remains of a 250-million-year-old organism, which would make it the oldest macromolecule ever found. “This flies in the face of biochemical experiments that show that DNA probably shouldn’t last that long,” Griffith says. “But the high-salt environment of the Salado is probably quite protective to DNA.”

He wrote his paper and sent it to journals, but the biochemists who reviewed it were skeptical because Griffith found only a trace amount of DNA. “When you see things only occasionally with an electron microscope, you worry,” Griffith says. His lab is now using biochemical assays to confirm or disprove his findings. But he’s confident he found ancient DNA, and one reason is that he found so much of that other strange substance. It looked a lot like cellulose, only no type he had ever seen. “Modern cellulose is clumped together,” he says. “This ancient stuff was untangled. We were tripping over it, there was so much.”

Biologist Ann Matthysse, also at UNC, helped Griffith figure out if it was really cellulose. Matthysse told Griffith to douse the ancient material in a mixture of sodium hydroxide and sodium borohydride at sixty-five degrees Celsius. This stuff will eat through leather, disintegrate insects, and burn away dead skin. But it won’t harm cellulose, and nothing happened to Griffith’s sample. Certain enzymes, Matthysse told him, will chew up a lot of things, but they won’t harm cellulose. These enzymes didn’t harm Griffith’s sample either. Then Matthysse told Griffith that a protein enzyme called cellulase chops up cellulose so it can be degraded. And that’s what happened to Griffith’s substance.

He snapped pictures with the high-resolution digital camera inside his electron microscope, and then he remembered that Malcolm Brown, a former UNC biologist now at the University of Texas, had taken some of the only photos of cellulose microfibers with an electron microscope. “His photos looked very similar to ours,” Griffith says. It’s cellulose. Griffith has no doubt. And it’s at least 250 million years old, by far the oldest native macromolecule ever found. Griffith says that analyzing this cellulose may reveal more details about Earth’s ancient biosphere. It might even give us some clues about places other than Earth.
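Griffith’s earlier point about PCR comes down to doubling arithmetic: each cycle of the reaction roughly doubles every DNA template present, ancient and contaminant alike. A minimal sketch of that idealized math (illustrative only; the function name and the thirty-cycle run length are this writer’s assumptions, not figures from the study):

```python
# Illustrative arithmetic only: why PCR makes contamination so dangerous.
# In an idealized reaction, every template molecule doubles each cycle.

def pcr_copies(initial_molecules: int, cycles: int) -> int:
    """Idealized PCR yield: each template doubles once per cycle."""
    return initial_molecules * 2 ** cycles

# A single stray modern molecule after a typical thirty-cycle run:
print(pcr_copies(1, 30))  # prints 1073741824: over a billion copies
```

Under this ideal doubling, one contaminating molecule yields more than a billion copies in thirty cycles, which is why an amplification-free method like electron microscopy sidesteps the worry.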


When Griffith sent his findings to journal editors, they gave the paper to geologists. The geologists not only questioned the likelihood of finding DNA in halite, but a few of them also wondered if the Permian Salado Formation is really that old and undisturbed. Griffith rewrote the paper, adding nearly everything known about the Salado so that the geologists would be satisfied. He also removed his findings about DNA and focused on proving the cellulose side of the story. The presence of ancient cellulose in Griffith’s samples led him to an interesting conclusion: when scientists go seeking life on other planets, cellulose microfibers—not DNA—may be the best thing to look for. He sent his paper to the journal Astrobiology, which published it in April 2008.

Griffith’s theory, which he admits is a bit philosophical, is that life on other planets would likely be carbon-based. The six-carbon glucose molecule is the fundamental energy currency of most known carbon-based life forms, including the most primitive bacteria that existed 1.6 billion years ago, before there was an oxygen-rich atmosphere. Cyanobacteria are the modern-day descendants of those ancient primitive bacteria. And cyanobacteria create cellulose out of glucose molecules. “It’s very likely that any of the earliest life forms on other planets would learn how to stick these glucose units together to make this semi-crystalline five-nanometer cellulose microfiber,” he says. “It’s stiff and rigid, and very few things will break it down.”

When cells die, cellulase enzymes typically are not around to break up cellulose. That means that cellulose won’t degrade quickly, as Griffith’s findings prove. And even if cellulase is present, it cannot completely digest the thirty-six glucan chains that make up a cellulose microfiber. DNA and proteins, on the other hand, rapidly degrade when enzymes are released after cells die.

Cellulose microfibers exist in a semi-dehydrated state. This means that cellulose might cope better than other macromolecules with the dry conditions found on other planets in our solar system. Cellulose would probably also withstand radiation levels like those on Mars better than other macromolecules would. Scientists already suspect that Mars and other planets have evaporites—mineral sediments created when surface water evaporates. Griffith says that scientists should explore halite in these evaporites. “Who knows?” he says. “Cellulose might just be the electron microscoper’s version of little green men.”

Jack Griffith is the Kenan Distinguished Professor of Microbiology and Immunology in the School of Medicine and a professor of biochemistry. Ann Matthysse is a professor of biology in the College of Arts and Sciences. Griffith used a stipend from his Kenan Distinguished Chair award to fund this research. He is now seeking a grant from the National Science Foundation to research older salt deposits around the world; there’s a 400-million-year-old halite formation thousands of feet below Detroit.


What opened this gigantic hole?

The outlandish answer: a parallel universe and cosmic tug-of-war. By Prashant Nair

This heat map of the infant universe draws on three years of data from NASA’s WMAP satellite. Temperature fluctuations (shown as color differences) correspond to the seeds that grew to become galaxies billions of years ago. Using a model that assumes the existence of a parallel universe, Laura Mersini-Houghton predicted the giant void (see the arrow) before it was observed. The void was inconsistent with standard cosmological theory, she says, “so no one expected to see such a thing.” Image from the NASA/WMAP Science Team.

When cosmologist Laura Mersini-Houghton first looked at the evidence for a giant void in space, what she saw left her cold. It was a gaping hole nearly a billion light-years across, containing virtually no galaxies or matter and showing up as a cold spot in a heat map put together by NASA’s WMAP satellite. The heat map is a plot of the cosmic microwave background—remnant heat left over from the Big Bang. The void represents a volume of space with temperatures between 20 and 45 percent lower than the average for the rest of the skies. The void is somewhere in the range of six to ten billion light-years away from Earth, and it’s forty times bigger than the void formerly known as the biggest in space. Voids in space are not a novelty, but one this big was just what Mersini-Houghton needed. To her, the void is evidence in favor of a theory that borders on the outlandish—the existence of a parallel universe.

In August 2007 Lawrence Rudnick, a physicist at the University of Minnesota in Minneapolis, uncovered the first experimental evidence of the void using the Very Large Array radio telescope in New Mexico. The telescope located the void in the direction of the constellation Eridanus, in the same place where scientists had previously noted the cold spot picked up by WMAP.

It all started with a hunt for the origin of our universe, Mersini-Houghton says. A universe that begins with very high energy, such as our own at the Big Bang, would have very low entropy. Entropy is the measure of a system’s disorder and is inversely related to energy. A universe at very low entropy would not have stood a chance of being born. Entropy explains how our fledgling universe managed to escape its own gravitational attraction by stretching space out and flinging matter far enough apart to prevent the universe from coalescing back into the singularity it was when it began. Cosmologists call this the theory of inflation. “Cosmologists have shown that the probability that our universe would come to be is basically zero,” Mersini-Houghton says. And yet, it happened. If an event so unlikely as the birth of our universe can happen, Mersini-Houghton says, virtually anything is possible. Including multiple universes.

Cosmologists turned to string theory to find out why such an unlikely windfall came to pass. “String theory continues to be the leading candidate for a solution to the problem of our existence,” Mersini-Houghton says. But string theorists predicted 10⁵⁰⁰ possible universes, whereas many scientists had hoped that string theory would predict a unique universe. “That kind of dashed hopes,” she adds. Until recently, these results have been considered a major crisis in string theory, Mersini-Houghton says. Rather than solving cosmic problems, applying the theory created more challenging ones. Most of the cosmological community, at least in the United States, responded to the dilemma by taking an anthropic approach. They suggest that we may have ended up with this universe because it supports life.

Mersini-Houghton, who never took anthropic reasoning very seriously, looked for a more straightforward explanation. “Something was missing in the picture,” she says. She decided to solve the problem by upending it. Mersini-Houghton reasoned that any good theory that is meant to explain the origin of our universe must indeed predict the possibility of multiple universes. “We’re asking, ‘Why did I start with this universe?’ But the question does not make sense if all you have is one sample,” she says. “That question immediately begs another: Compared to what else?”

That’s when Mersini-Houghton hypothesized multiple universes, each with its own physical properties and constants. Each of those baby universes, she says, would have started as a tiny patch in the fabric of space, distinct from others in the amount of matter and energy it contained. Matter, which tends to clump, vies with energy, which tends to cause matter to expand. Depending on the outcome of this cosmic tug-of-war, some of those tiny patches would have survived to become universes, while others would have collapsed into cosmic obscurity. This theory explains how a universe that started with very high energy might have survived the fatal pull of matter.

Mersini-Houghton began to look for testable signatures of her theory. “That’s where the void comes into the picture. We predicted the void in 2006, and in fact, we were lucky that it was discovered a mere eight months after we predicted it,” she says. In a multiverse scenario, our neighboring universes exert a gravitational tug on our universe, causing matter to shift toward the attracting universes. This would create a hole in our universe, Mersini-Houghton says. It’s akin to creating a hole in a piece of stretched fabric while trying to pull at it from one point. Mersini-Houghton’s calculations of this gravitational force predicted a void of the exact scale that Rudnick’s team observed a few months later.

When the WMAP satellite recorded the cold spot in its heat map in 2004, NASA put it down to an instrumental snafu or an experimental artifact, because such a humongous void could not be explained by standard cosmology. But Rudnick’s observations with the radio telescopes confirmed the void’s presence. “It was in the exact place in the sky where WMAP had seen the cold spot. Then we knew for sure,” Mersini-Houghton says. “That was the first real test of our model.”

Leonard Parker, a cosmologist at the University of Wisconsin-Milwaukee, says, “The idea of a very large metauniverse, of which our universe is a relatively small part, goes all the way back to the nineteenth century. What is new in Mersini-Houghton’s work is the possibility that regions of the metauniverse with which we cannot communicate (what you could call parallel universes) may still have influence on our universe because of correlations between our universe and other ‘parallel’ universes.”

But Mersini-Houghton’s dissenters have come up with alternative explanations for the void. Some say that the void may represent a giant knot in space called a topological defect, while others say the void results from textural aberrations in the fabric of space. “It’s madly courageous to even contemplate the possibility of addressing the beginnings of our universe,” Mersini-Houghton says. She adds that we may never be sure of how it all started, but testing more predictions could make the theory stronger. She is now working on predictions of more voids. And another of her theory’s predictions will be tested this year by the Large Hadron Collider in Switzerland, where scientists can recreate some of the conditions present in the early universe.

Mersini-Houghton says her approach is based on the Copernican view of nature. “No, we are not at the center,” she says, “and our whole universe is not at the center.”

Prashant Nair is a master’s student in medical journalism at Carolina. Laura Mersini-Houghton is an assistant professor in the Department of Physics and Astronomy.

The first stars in the universe turned on about 400 million years after the Big Bang. Image from the NASA/ WMAP Science Team.


©2005 Kay Chernush

On Lake Volta in Ghana, these boys are slaves, chosen because their small and nimble hands can pick fish from the nets.

The wide, tangled net

In Ghana, a UNC student tries to fathom the slavery of children. By Mark Derewicz


There’s a museum in Ghana that shows visitors where slaves were kept before boarding ships to the New World. You can touch the old chains and latches and whipping posts. You can read about the evils. Or you can walk down the road and see a different kind of slavery. You can talk to children born in Ghana, some as young as five, who were sold into bondage by their own relatives. Undergraduate Angela Harper spent six weeks talking to these families, searching for reasons why a parent would do such a thing. At the end she took a break, visiting that museum. “It was awful and depressing,” Harper says. “I thought, ‘The world is fooled that slavery is something we can put in a museum, something historical.’”

Harper, twenty-two, became interested in child trafficking during her freshman year when she joined Free the Slaves, a UNC student group that tries to raise awareness and money to free child slaves around the world. The more she learned, the more she wanted to see these children for herself. After a two-week internship with Anti-Slavery International, a nonprofit in London, Harper flew to Ghana. There she met her guide Priscilla, a twenty-year-old college student and volunteer for the Association of People for Practical Life Education (APPLE), a group that works on humanitarian issues, including child trafficking.

From Accra, Ghana’s capital, Priscilla drove Harper three hours to Atitekpo, the first of three villages Harper visited during her six-week stay in the Volta region of southeastern Ghana. Priscilla helped Harper settle into her new home where a large farming family lived in a mud hut with no electricity or plumbing. This family had never sold a child to traffickers, but they knew all about the problem. Harper says that 20 percent of Volta’s child population is sold into slavery. That’s thousands of children.

Harper says that the poorest adults are forced to make a stark choice: keep all of their children—sometimes as many as eight or ten—and go hungry, or sell a child to feed those who remain. “There are lots of widows trying to work and raise kids,” she says, “or grandparents who take care of kids whose parents either died or jumped the bush, which is what they call it when someone abandons their family. It’s a situation of desperation. It’s not like these people are evil or don’t love their kids. It’s more like they’re extremely poor and have tons of kids.”

Harper says that traffickers, wanting to maximize profits, go to the poorest villages and offer parents a deal. “Traffickers are known to offer a hundred dollars when they take the kids, and then another hundred when they return the child.” Sometimes the slave owners say they’ll return the child in two years, sometimes four. “But they never return them,” Harper says. No one with APPLE has heard of a single case in which a child was returned.

A lot of parents told Harper that they were honest with their children, telling them they had to go work for a few years. In Ghana, child labor is common and so is apprenticeship. “But I spoke to a mother who tricked her kids into going with a child trafficker,” Harper says. “She told her boys that her uncle was coming and that they’ll get to go play with their cousins. Then the trafficker came in the middle of the night and took the kids to Lake Volta.” There was no playing at the beach.

Children at work

Lake Volta, the largest man-made reservoir in the world, was created in 1965 when the government dammed the Volta River to generate electricity, little of which makes it to surrounding villages. The lake was once abundant with fish and provided jobs to millions of people. But stocks dwindled because of overfishing, and fishermen started using large mosquito nets with tiny holes to catch more small fish. Fishermen use children, whose hands are small and nimble, to pick fish out of these nets. Harper says that the nets get tangled on tree trunks and branches that lurk in the brown muck just beneath the surface. Children are forced to dive in and untangle them. Many children are injured this way or contract waterborne diseases such as bilharzia and guinea worms.
“So these kids are tired from these parasites living in them,” Harper says, “and then they have to work sixteen, seventeen hours a day.” This can go on for years if no one intervenes.

APPLE staffers go to the villages to ask parents if any of them have sold their children and if they want them back. Staffers collect names and then search Lake Volta. When they find a child, they negotiate a deal with slave owners, typically paying money for a child’s freedom. Slave owners are less likely to release older children because they can work longer, harder hours. APPLE workers then take the children—about a hundred each year—to a shelter for medical attention and counseling. The children remain there for two months before returning home. These are the children Harper spoke to. “They are all traumatized,” she says. “They say they don’t know why they were sent away and they don’t know why they came back home. Parents don’t explain what’s going on.”

Untold stories

For Harper, a typical day in Ghana began at dawn with breakfast over an open fire. The meal would include cassava, a starchy root vegetable central to the Ghanaian diet. After breakfast, Harper biked to villages to speak with parents and their rescued children. The stories she heard made her sick.

One girl told Harper, “I cannot count how many times my master raped me.” For girls of a certain age, if they’re sold into slavery, rape is a fact of life. Harper met another girl; this one was six when APPLE rescued her from Yeji, the major fishing village on Lake Volta. She was the youngest ever rescued. Her job had been to pick out small fish from nets all day.

An eight-year-old boy did the same job. “He saw a boy drown,” Harper says. “So he was scared of untangling the nets in the water. He refused to dive in, and his master beat him so bad that he couldn’t move for two days. Then he agreed to go in the water.” His eleven-year-old brother was in charge of smoking the fish. Both boys told Harper that they longed for home. Their mother, who sent them away because she earns just five dollars a week, visited them and was shocked at how poorly her sons were treated. She sought help and APPLE rescued the boys in 2006. These children and the many others that APPLE has interviewed often went hungry and slept little.
Still, Harper heard a common refrain from parents in the villages: “The masters can feed them, while I cannot.” Harper spoke to a sixty-year-old woman who had worked hard to support several grandchildren until she fell ill. She sent three grandchildren with a trafficker, who agreed to pay her a box of yams and a box of fish every six months. He sent only the yams. Another woman, unable to feed her children, gave them away without recompense.

At first, Harper was surprised at how unemotional the children and parents seemed. “I guess the parents had already been through all this,” she says, “but if you’re hearing your child talk about all this stuff that you put them through, it seems like you’d get emotional hearing them talk about it. But it was pretty emotionless. I didn’t expect that.” And Harper didn’t expect that she, too, would become numb to the stories. She never cried, not in front of the families. During her six-week trip, her shock and sadness turned to anger and pessimism. “There are thousands of kids that APPLE will never save,” she says. “I’m pessimistic because it’s hard to see a solution to end this quickly. Ghana’s government is more interested in economic development. Development is a good thing, but the benefits are not trickling down to the poorest people.”

So far the steps Ghana has taken against child slavery seem meaningless. Ghana was the first country to sign the United Nations Convention on the Rights of the Child. It signed a regional pact with other African nations, and passed antitrafficking laws of its own. But Harper found that there’s been only one arrest. The only noticeable thing the government has done is put up posters saying child trafficking is illegal. “Ironically, there’s a child labor office right outside one of the villages,” Harper says. “And there’s a man there in charge of making sure child trafficking doesn’t happen. I told him, ‘Go outside!’ He told me that child trafficking is not a priority in Ghana. He was frank about it. He was totally against slavery, but he said there are lots of things that go into this problem.”

Police aren’t trained to identify trafficked children or to know what to do with them, and traffickers pay off police and border guards. Because parents have to pay for school, not all children go. Those who do go eventually drop out so they can work.
If there’s no work and relatives can’t help, sending a child away to work on Lake Volta starts to look like an option. Even if that option is slavery. e

Angela Harper graduated in May with a dual degree in sociology and international studies from the College of Arts and Sciences. She will attend the UNC School of Law this fall to focus on international law and human rights.

endeavors 29


The technology behind video games could come in handy in a war zone. Or even at the crosswalk.

by Margarite Nathe

Smoke from a car bomb rises behind a U.S. soldier, who stands guard near the entrance of a Baghdad hotel.

Virtual danger, safer places

Say you’re a U.S. soldier on patrol in Baghdad. Your job: walk the perimeter line, maintain the peace, keep your eyes open. You’re strapped into heavy-duty body armor and carrying an M16 in the crook of your arm. The streets are busy, the weather is warm, and the monotony of it all is making you a little sleepy. Then the shooting starts. You have no idea who’s doing it or where it’s coming from. Before the smoke clears, the shooter vanishes, blending so seamlessly into the hysteria on the streets that you’ll never know where he came from or where he went. What were you doing in those few seconds? And what will you do in the moments to come? Really think about it. You could say that when the firing started you would have ducked and covered, or that you would’ve immediately returned fire, or that you would have done any number of things. But

you never know for sure until you’ve been through it. Training that uses virtual reality can help soldiers imagine themselves in situations like this before they ever happen. And simulation technology—the nuts and bolts software behind virtual reality—is at the core of everyday civilian lives, too: the patterns of our highways, the engines in our cars, the layouts of our doctors’ offices. Being able to model how we do things as a society and as individuals can keep us safe, whether it’s at home, at work, or in enemy territory.


In terms of sheer numbers of military personnel, North Carolina ranks fourth in the country. And for the past ten years, the U.S. Army has been investing billions of dollars in developing more advanced training for the thousands of troops it sends to the Middle East. The Army put those two facts together and in 2003 approached Carolina computer scientists Ming Lin and Dinesh Manocha. By then, Lin and Manocha had already been working on algorithms for new kinds of simulation technology for at least a decade. Now, through a modeling and simulation program called OneSAF (which stands for One Semi-Automated Forces), officials at over six hundred sites across the country have used Lin and Manocha’s defense technology. This kind of simulation training can teach soldiers how to stay alive in the combat zone. “When we say defense, we’re not talking about hardware, like missiles or nuclear weapons,” Lin says. “Rather, it’s about simulation and training technologies that can actually train our military personnel so they can get away from their attackers or defend themselves.”

“The nature of war has changed,” Manocha says. “The kind of war they have in the Middle East and other parts of the world today is what we call urban warfare.” The violence of today’s wars takes place in the cities—in buildings and on streets—more often than in open battlefields where the armies can see each other coming. And while some war-making takes place from far afield—explosions caused by the push of a button thousands of feet above or hundreds of miles away from the resulting boom—most U.S. soldiers are at eye level with the action. Knowing how to respond quickly and calmly in a tight situation can save their lives. “Before the Army sends someone from Fort Bragg to Iraq, they need to know how to train them for a very difficult and hostile environment,” Manocha says. “So the Army has been using what we call training environments to give soldiers a feel for what it will be like there.” Most of these training exercises involve powerful computers (or even supercomputers) that are paired with some sort of display, whether it’s a computer monitor or a huge projector. Some even use head-mounted displays. (One of Lin and Manocha’s goals is to develop powerful software that could be used with hardware that’s easier to get hold of—everyday desktop or laptop computers that you can buy in any electronics department.)

But how can a simulation be anything like the real thing? Well, the more realistic it is, Lin and Manocha say, the better. Take shadows. You can create a digital image of a potted plant at the end of a hallway, and most of us would be able to look at the monitor and tell what it’s supposed to be. But if you manipulate the image to add in detailed, realistic shadows, we’ll be able to tell what the lighting is like in the hallway, how close the plant is to the wall, and how close it is to where we’re standing. The simulation of sound in these virtual or training environments is especially important to get right, Lin says. “Soldiers use sound cues all the time. A lot of the sounds you hear in the movies are actually prerecorded, and they can be very expensive and dangerous to record—for example, explosions. And so we’re using computers to simulate those sounds.” By designing new algorithms, they try to get realistic sound propagation: Is that voice coming from a tiny room or a deep chamber? How big was that piece of glass that just shattered, and from how high did the pieces fall?
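Two of the cues Lin describes fall out of textbook acoustics. As a rough illustration (this is not the team’s propagation code; the constants are standard physics values), a renderer can at least get the arrival delay and the free-field loudness falloff of a distant sound right:

```python
import math

# Two simple acoustic cues (a rough illustration, not the researchers'
# propagation algorithms): how late a sound arrives, and how much
# quieter it gets with distance in a free field.

SPEED_OF_SOUND = 343.0  # meters per second in air at about 20 °C

def arrival_delay(distance_m):
    """Seconds between the event and the listener hearing it."""
    return distance_m / SPEED_OF_SOUND

def attenuation_db(distance_m, reference_m=1.0):
    """Free-field level drop relative to the reference distance:
    inverse-square spreading loses about 6 dB per doubling."""
    return 20.0 * math.log10(distance_m / reference_m)

print(arrival_delay(686.0))         # a shot 686 m away arrives 2.0 s late
print(round(attenuation_db(10.0)))  # and is 20 dB down at 10 m vs. 1 m
```

Reverberation—the “tiny room or deep chamber” question—takes far more work than this, which is why the researchers design dedicated algorithms for it.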


To make all this happen in a virtual environment—and happen fast—the software designer has to develop very efficient algorithms. An algorithm is a set of rules to make the software do certain things: first the foot moves forward, then it touches the ball, then there is a sound and the ball rolls away. Those instructions come in the form of a lengthy mathematical equation. “That’s what computer science is all about, you know,” Manocha says. “Designing efficient algorithms.”

One of the hardest things about creating a realistic virtual training environment, Manocha says, is that it has to be very fast, just like a video game. “If you have a joystick in your hand, or a mouse, and you take an action, the game responds to you right away.” Here’s an example: You’re a soldier inside a bombed-out building. It’s dark, shadowy, and there’s smoke in the air. There are several pounding footsteps on the floor above you, gunshots in the street outside, and the whispery sound of cloth rubbing against cloth—a stealthy movement—coming from just around the corner. If this were a real-life scenario and the lurker just ahead of you jumped out from the shadows, your mind would give your body the command to defend itself—jump to the side, roll forward, take aim. In order to survive, your body would have to respond to that command instantaneously, in a fraction of a second at most. That’s what makes a realistic virtual training environment; if all of your senses are engaged and the environment is truly interactive—that is, things happen in real time—you can have a completely immersive training session. Things have to be detailed, have to look and feel real.
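The foot-and-ball rules above can be written down almost literally as a fixed-timestep update loop, the skeleton of most game and simulation engines. This is a toy sketch with made-up numbers, not code from OneSAF or Lin and Manocha’s group:

```python
# A toy fixed-timestep loop (hypothetical numbers, not OneSAF code):
# a foot moves forward, contacts a ball, and the ball rolls away
# while rolling friction slows it to a stop.

DT = 1.0 / 60.0   # simulate 60 updates per second, like a game engine
FRICTION = 0.5    # assumed deceleration of the rolling ball, m/s^2

def step(foot_x, foot_v, ball_x, ball_v):
    """Advance the world by one timestep and return the new state."""
    foot_x += foot_v * DT
    if foot_v > 0 and foot_x >= ball_x:        # rule: foot touches ball
        ball_v = foot_v                        # ...and the ball rolls away
        foot_v = 0.0                           # the foot stops on contact
    ball_v = max(0.0, ball_v - FRICTION * DT)  # rule: friction slows it
    ball_x += ball_v * DT
    return foot_x, foot_v, ball_x, ball_v

foot_x, foot_v, ball_x, ball_v = 0.0, 2.0, 0.5, 0.0
for _ in range(600):                           # ten seconds of simulated time
    foot_x, foot_v, ball_x, ball_v = step(foot_x, foot_v, ball_x, ball_v)

print(ball_v)   # 0.0 -- the ball has rolled to a stop
print(ball_x)   # well past the contact point at 0.5 m
```

Real engines run a loop like this sixty or more times a second, which is what makes the environment feel interactive: every input is folded into the very next step.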

Knowing how groups of people move with and around each other can help computer scientists develop realistic simulations for evacuation procedures, pedestrian walkways, and even sports arenas. Images courtesy of GAMMA Research Group.


Knowing how people and things move, interact, and function means that you can use different sets of algorithms to design cities, power plants, and commercial airplanes—Lin and Manocha’s team worked closely with Boeing to make algorithms to model the 777 and 787 airplanes. And medical students are beginning to use virtual environments to learn how to perform surgeries; they can rehearse everything from incisions to suturing this way. “Surgeons need to have their hands practiced,” Lin says. “And you don’t want to be the one they practice on.”


Dinesh Manocha (left) and Ming Lin. Photo by Steve Exum.


Some of the most engrossing simulations out there are a product of the videogaming industry. Today’s video and computer games can be so real, so intense, that more than one of them has given players nightmares. And gamers love that. Software developers designing algorithms for simulation love it, too, and that’s one of the reasons they sometimes borrow concepts from the video game industry. “The whole concept of a video game is that it runs at an interactive rate, and users are immersed in it because the display looks real and the behaviors feel real,” Lin says. “And so we’re thinking that if we develop equally efficient engines, like those that are behind the design of all these games—the graphics engine, the display engine, the physics engine, the behavior engine—we could start modeling all types of behavior.” And Lin and Manocha do model different types of behavior; their work involves what they call simulation technology for dual use, which means that it can be used for dozens of things beyond defense.

Aside from stitches and open-heart surgeries, there are plenty of things you don’t want to have to learn on the fly. One of them is how to evacuate your collapsing office building. Manocha points to a computer screen where dozens of colored cylinders are bumping and scooting around inside what looks like a maze. It’s a floor plan, Lin says. The breaks in the lines are doorways, and the cylinders are people. (“In modeling these,” Manocha says sheepishly, “we don’t worry about looking realistic like in the Pixar movies. We want to perform these simulations in real time, unlike Hollywood studios that take minutes to generate a single frame.”) Then he says, “Say you’re designing a new building, and there are four floors and a hundred people on each floor. If there were two emergency exits, how long would it take to evacuate this building? How long if there are four exits? What is the hallway like leading to the exit? How large is the average person? These are the kinds of questions you want to ask before you design the building.” He pushes a button and the cylinders mob the doorway at the bottom of the screen. “You know how people go through exits,” Manocha says. “The whole crowd gathers. There’s a little bit of pushing going on. So the question is, if I made this doorway twice as wide, would my evacuation time go from four minutes to two minutes?” And with just a few tweaks to the algorithm, he and Lin can play with the layout of the floor plan, change the number of exits, make the cylinder-shaped people slimmer or fatter.

“Similar kinds of training technology can be used for training first responders, emergency response, police—things that are really important to what people call homeland security,” Lin says. “It’s not necessarily about going out there and fighting with another country, but even defending our own country, being able to respond to natural disasters. Some of the work we do with simulating crowds is targeted toward that area.”

A good understanding of mob psychology, crowd flow, and transportation issues can help computer scientists develop simulations to predict how huge numbers of people—we’re talking hundreds of thousands, entire cities’ worth—will react in emergency situations. Where do people go during a crisis? How do they get there? How well do they drive when they’re panicking and in heavy traffic? How do crowds bottleneck to squeeze onto a subway car? How do people weave through crowded sports arenas without bumping into each other? Whether it’s during a terrorist attack or a natural disaster, knowing the answers can help emergency responders do their jobs more quickly and effectively. For example, officials in cities that are at high risk for flooding or storm damage could use simulated models of pedestrian flow—another project Lin and Manocha are working on—to design better roads and freeways. For that, the programmers need to know such things as how quickly most people cross the street, and how much wider the crosswalk would have to be to shave fifteen seconds off the crossing time.
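Manocha’s doorway question can be roughed out even without simulating individual cylinders. In the crude sketch below, each exit is treated as a pipe whose discharge grows with door width; the flow constant is an assumed rule-of-thumb value, not a number from Lin and Manocha’s work:

```python
# A first-order evacuation estimate (a deliberately crude sketch, not
# the researchers' crowd simulator): treat every exit as a pipe whose
# discharge grows with door width. The flow constant is an assumption.

FLOW_PER_METER = 1.3  # people per second per meter of door width (assumed)

def evacuation_time(people, exits, door_width_m, walk_time_s=30.0):
    """Seconds to clear the building: time to reach an exit, plus the
    time the doorways need to pass everyone through."""
    capacity = exits * door_width_m * FLOW_PER_METER  # people per second
    return walk_time_s + people / capacity

# Four floors, a hundred people on each floor, 0.9 m doors:
two_exits = evacuation_time(400, exits=2, door_width_m=0.9)
four_exits = evacuation_time(400, exits=4, door_width_m=0.9)
wide_doors = evacuation_time(400, exits=2, door_width_m=1.8)

print(round(two_exits))   # about 201 s with two standard doors
print(round(four_exits))  # about 115 s with four
print(round(wide_doors))  # doubling door width matches doubling the exits
```

The estimate captures the intuition behind the doorway question—queueing time scales inversely with total exit width—while an agent-based simulation like theirs models what this sketch ignores: pushing, bottleneck jams, and people choosing the wrong exit.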

“Surgeons need to have their hands practiced. And you don’t want to be the one they practice on.” —Ming Lin

As the start of the hurricane season approaches, Lin and Manocha are getting ready to work with Carolina’s Renaissance Computing Institute to tackle a big question: Exactly how long would it take to evacuate the coastal city of Wilmington? No one really knows yet, Manocha says, and although a lot of their research is still in its infancy, crowd simulation technology may provide the answer. “People think games are fun, that games are for our kids,” Manocha says. But simulation technology—whether it’s designed to score points or to keep you safe—is a serious industry. The field is growing, too, especially in North Carolina. In October of 2007 in the Triangle area alone, Lin says, gaming companies reported roughly two hundred open positions—these are high-paying software development gigs. It’s the same with defense contractors—companies and even the government can’t fill the jobs fast enough. e

Dinesh Manocha is the Phi Delta Theta/Matthew Mason Distinguished Professor and Ming Lin is the Beverly W. Long Distinguished Professor, both in the Department of Computer Science in the College of Arts and Sciences. Their work is funded by the Defense Advanced Research Projects Agency; the U.S. Army Research Office; the U.S. Army Research, Development, and Engineering Command; the Intelligence Advanced Research Projects Activity; the Office of Naval Research; the National Science Foundation; the Intel Corporation; the Walt Disney Company; the Microsoft Corporation; and the U.S. Department of Defense. Other Carolina researchers working on realistic virtual training environments include Fred Brooks, Henry Fuchs, Greg Welch, Mary Whitton, and Herman Towles. Their teams include more than fifty undergraduate and graduate students.

A virtual epidemic

Bodies of characters killed by an epidemic called Corrupted Blood cover the ground in Ironforge, a city in the popular online game World of Warcraft.


In September 2005 an epidemic called Corrupted Blood ran rampant in the online role-playing game World of Warcraft. It killed so many characters that the fantasy world’s major cities were uninhabitable. As players mourned the deaths of their characters, an epidemiology student saw an opportunity.

Popular games such as World of Warcraft could allow epidemiologists to improve their predictions of how people behave during epidemics, Eric Lofgren says. Epidemiologists use computer simulations in their planning for pandemic influenza. “We make all of these assumptions about human behavior,” Lofgren says. “What if none of them are true?” But online gaming could help.

World of Warcraft is the largest of what are known as massively multiplayer online role-playing games, in which players create virtual characters who go on quests, battle monsters, and interact with other characters. Players take the game seriously. Lofgren, who admits he spent “too much” time playing World of Warcraft as an undergraduate, says, “there are people who treat it like a full-time job.” Because players invest so much effort in developing their virtual characters, the game offers opportunities for the study of human behavior, Lofgren says.

Game designers created Corrupted Blood as a challenge to powerful players fighting a winged serpent named Hakkar in a remote region of the game world. They did not foresee the spread of the disease to weaker players, who died in droves. The disease spread rapidly in part because characters can transport themselves instantly to other locations—similar to how people traveling by airplane spread disease.
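The influenza simulations Lofgren mentions are built on compartment models. The simplest is the classic SIR model, sketched here with illustrative parameters (not fitted to Corrupted Blood or to any real disease); its rigid assumption that everyone mixes and behaves identically is exactly what game data might help relax:

```python
# The classic SIR compartment model, the simplest of the epidemic
# simulations the article alludes to. Parameters are illustrative,
# not fitted to Corrupted Blood or to influenza.

def sir(population, infected0, beta, gamma, days):
    """Euler-step the susceptible/infected/recovered fractions one
    day at a time and return the trajectory."""
    s = (population - infected0) / population
    i = infected0 / population
    r = 0.0
    history = [(s, i, r)]
    for _ in range(days):
        new_infections = beta * s * i  # contacts between S and I
        new_recoveries = gamma * i     # infected recover at rate gamma
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        history.append((s, i, r))
    return history

# beta = 0.5 contacts/day, 5-day infectious period (gamma = 0.2): R0 = 2.5
run = sir(population=10_000, infected0=10, beta=0.5, gamma=0.2, days=200)
peak_infected = max(i for _, i, _ in run)
final_s = run[-1][0]
print(round(peak_infected, 2))  # a sizable share of the city sick at once
print(round(final_s, 2))        # some susceptibles never get infected
```

Altruistic healers, quarantine breakers, and deliberate “super spreaders” all change the contact rate in ways a fixed beta cannot express, which is why observed player behavior would be so valuable.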

The Corrupted Blood epidemic provided several insights into behavior. “The amount of altruistic behavior was very surprising to us,” Lofgren says. Many players deliberately went to infected areas to attempt to heal others who were ill, only to become infected themselves. Other players violated the quarantine set up by game creator Blizzard Entertainment to try to contain the disease. And a few players acted as “super spreaders,” deliberately trying to infect others, Lofgren says.

Unfortunately for epidemiologists, no one was collecting data during the epidemic that could have allowed for detailed analysis of how characters spread the disease. That’s what Lofgren and his former advisor, Nina Fefferman of Tufts University, hope to change before the next online epidemic emerges. The challenge will be to get game-makers’ cooperation. —Sheila Read

Sheila Read is a master’s student in journalism at Carolina. Eric Lofgren is a doctoral student in epidemiology in the School of Public Health. He coauthored an article, “The Untapped Potential of Virtual Game Worlds to Shed Light on Real World Epidemics,” which appeared in the September 2007 issue of The Lancet Infectious Diseases.

George Plym has survived eleven brain tumors and has lived with cancer for more than forty years. Aside from some problems with his memory—which doctors attribute to his radiation treatments—he says he’s doing very well. Photo by Coke Whitworth.

A fight to the death

Can science trick brain tumor cells into committing suicide?

by Prashant Nair


On a baseball field one hot summer evening in 1967, eleven-year-old George Plym watched a ball sailing toward him. As the moment approached to make the catch, he couldn’t make up his mind which ball to reach for—Plym saw two balls where there should have been one. His parents took him to an eye doctor, who found nothing wrong with his eyes and prescribed rest. But Plym’s double vision worsened, and he started getting blinding headaches and bouts of nausea. A neurological exam confirmed his parents’ worst fears. A ball of brain tissue the size of an orange was pushing against Plym’s optic nerve.

Plym’s neurologist diagnosed a grade-three oligodendroglioma—a late-stage, recurrent brain tumor. When doctors diagnosed Plym’s first brain tumor, they thought he had about a year and a half left to live. Today, at age fifty-three, Plym is a retired sports-car technician who lives in Asheville, North Carolina. He has survived eleven relapses of his brain tumor, and he’s determined to fight his cancer to the death. Most brain tumor patients are not as lucky as Plym. Many die within a year of being diagnosed with late-stage glioma. According to the National Cancer Institute, 20,500 people were newly diagnosed with a brain tumor in the United States in 2007. Of those, 12,740 died that same year.


Treatment for glioblastoma involves surgically removing the tumor and treating the area with a beam of radiation to kill tumor cells that the surgery missed. Chemotherapy follows for several weeks. Radiation and chemotherapy keep Plym alive, although the treatment actually triggered some of his tumors. As Plym fights to stay alive, hearing loss, balance problems, seizures, headaches, numbness in his limbs, and double vision punctuate his day-to-day life. Radiation therapy has taken away his energy along with the tumor cells. “I would wake up feeling tired already, no matter how much rest I had,” Plym says. After each dose of radiation, he’s so tired that he can’t stay awake for more than two hours at a stretch. “That has continued over the years, and I’m still dealing with that,” Plym says.

Matthew Ewend treats hundreds of brain tumor cases every year at UNC Hospitals. Although the treatment works for some, it does not dramatically improve survival time, Ewend says. Most patients must endure the side effects of radiation and chemotherapy during the extra time the treatment allows them to live. “It’s always difficult to tell a forty-five-year-old patient with grade-four glioma that the treatment might prolong their life by twelve to fifteen months,” Ewend says.

A major challenge of the therapy is targeting tumor cells with the radiation and chemotherapy, Ewend says. A specialized form of radiotherapy called brachytherapy delivers the radiation as close to the tumor as possible. In brachytherapy, radioactive seeds are surgically placed at the tumor site, or a catheter that delivers a radioactive liquid to the site is inserted into the brain. Ewend says brachytherapy cannot be performed on all patients and can cause brain swelling. The surgery involved increases the risk of a stroke, he adds, although surgery rarely causes one. The biggest problem with current therapies is that the tumors almost always come back, Ewend says.
About 90 percent of patients with high-grade glioblastomas die within eighteen months of therapy.

Ewend says that molecular therapies aimed at manipulating tumor cells to trigger their own death hold great promise. But there’s an obstacle—a membranous shield surrounds the brain and regulates the flow of molecules carried by blood into the brain. Ewend cautions that it would be no easy task to get a “magic bullet” treatment across the blood-brain barrier. The question of how to treat brain tumors looms large, but the answer may lie hidden within the tumor cells themselves.

Plym in the hospital in 1967, at age eleven, five days after the first surgery to remove his brain tumor. Photo courtesy of George Plym.


Mohanish Deshmukh, a neuroscientist at Carolina, is looking for answers inside the tumor cell. Deshmukh is trying to exploit the innate vulnerability of brain tumor cells to a process called apoptosis. Cells in the brain, as everywhere else in the body, are programmed to die when the time is right. Age, stress, environmental agents, and microorganisms can turn on apoptosis, or programmed cell death, causing cells to die off and be replaced by new ones. Cancer can also trigger apoptosis.

“Apoptosis is an inherent mechanism of cell suicide that gets turned on when our cells detect abnormalities in themselves,” Deshmukh says. Tumor cells manage to override apoptosis before they go on an often fatal rampage, he adds. If they couldn’t override apoptosis, cancer would not be the scourge it is. “This also makes many cancer cells resistant to death by chemotherapy because chemotherapy essentially tries to activate cell suicide,” Deshmukh says. Chemotherapy does not specifically target cancer cells and often fails because cancer cells override apoptosis while normal cells succumb to it. Doctors do a tricky balancing act when they decide just how much chemotherapy a patient can take.

The holy grail of molecular oncology has been to find a way to target cancer cells for death without affecting normal cells, Deshmukh says. Apoptosis destroys cells through a systematic, multistep process. Traditional chemotherapy targets an early stage in apoptosis. A molecule called cytochrome c that occurs naturally in human cells targets a later stage of the process.

Deshmukh knew from previous research that several types of brain cells become resistant to apoptosis when they mature. Neurons need this resistance partly because they are difficult for the brain to replace and often must last until a person dies. Deshmukh’s team had found that normal neurons are sensitive to death by chemotherapy, but resistant to death by cytochrome c. “For most normal cells, injecting cytochrome c means the game’s over. Not so for mature neurons. We were very surprised when we found out,” Deshmukh says. This difference between neurons and most other cell types is attributed to complex molecular cascades involving proteins that regulate cell division. But the million-dollar question was whether brain tumor cells were also resistant to apoptosis by cytochrome c. Deshmukh sought to find out by testing cells from mouse models of two commonly occurring, late-stage brain tumors.

Yolanda Huang

At right, malignant cells from a human brain tumor, or neuroblastoma, show marked structural differences from the healthy neurons shown at left, which are sympathetic neurons from a mouse. Neuroblastomas occur most often in infants and young children.


Deshmukh and his graduate student Yolanda Huang tested several different cell types from brain tumors and found that all were sensitive to cytochrome c. Around the same time, Sally Kornbluth, a Duke cell biologist who had been working on brain tumors and cell death, saw the same thing. Kornbluth and Deshmukh joined forces. Kornbluth’s team implanted three types of human brain-tumor cells in mice, extracted the tumors, and tested them for sensitivity to cytochrome c. All three tumors revealed the signature of cytochrome c-induced death, while normal human brain cells remained resistant.

To mimic human brain tumors more closely, the two teams asked whether cytochrome c could induce cell suicide in mouse models of glioblastoma and medulloblastoma. These mice had spontaneously formed brain tumors that closely resemble human brain tumors in their molecular characteristics. Once again, extracts from normal brain cells were resistant to cytochrome c, while those from the tumors showed distinct signs of cell death. The results were rock-solid, and the researchers had indeed hit upon the Achilles’ heel of brain tumors.

But a big part of the picture is still missing. Deshmukh does not know if cytochrome c can shrink tumors. All he has are molecular signatures of death in the tumors uncovered in a series of test-tube experiments. “We don’t have a way of introducing cytochrome c into brain tumor cells,” Deshmukh says. Cytochrome c comes attached to a chemical structure called heme—the same chemical compound that colors blood hemoglobin red. Heme-attached cytochrome c

protein is too bulky to get across the cell membrane. “We’re screening for small molecule compounds that are cell-permeable and would have the same activity as heme-attached cytochrome c,” Deshmukh says. His team is also trying to modify cytochrome c to get it across the membrane. A third option may be to use nanoparticle technology to deliver cytochrome c to cells. The team is collaborating with Carolina nanotechnologist Joseph DeSimone to investigate that possibility. Getting cytochrome c into tumor cells may not be Deshmukh’s only concern. Just getting the molecule into the brain—and only the brain—is a mighty challenge. Cytochrome c cannot be administered as an intravenous injection or taken as a pill because it targets actively dividing cells. The vast majority of normal neurons are non-dividing and therefore resistant to cytochrome c. But cells elsewhere in the body are in various stages of division, and just one systemic hit by cytochrome c could wreak havoc. Eugene Johnson, a neuroscientist who works on neuronal apoptosis at Washington University in St. Louis, says that in principle, a small molecule that mimics cytochrome c could be administered by techniques such as convection-enhanced delivery to reach the bulk of the tumor and destroy it. “The difficulty is that I am not aware of any such compounds. But this may be an approachable problem,” he adds. Deshmukh says he does not believe that a chemical compound would be ready for a first phase of clinical trials for at least another ten years. Kornbluth says one possible application of

the findings would be to use cytochrome c, engineered to permeate the cell membrane, instead of a chemotherapeutic agent in the wafers placed adjacent to brain tumors during surgical removal. “This wouldn’t affect the normal brain cells because they’re not sensitive to cytochrome c,” she adds.

Brain cancer research holds promise for slowing the march of the disease one day. Until then, survivors pin much of their hope on the will to live. “When I was young,” Plym says, “I had a big space between my teeth, and I was conscious of it. We thought about getting braces to get that straightened out, but the doctors said that I wouldn’t live long enough to have the braces off. Now, in my fortieth year of being a brain tumor survivor, I’m glad I went and got those braces. When I do die, I’ll die with a smile on my face.” e

Prashant Nair is a master’s student in medical journalism at Carolina. Matthew Ewend is the Chief of Neurosurgery and a Distinguished Associate Professor of Surgery. Mohanish Deshmukh is an associate professor of cell and developmental biology. George Plym is the founder and president of the Western North Carolina Brain Tumor Support Group.

Plym at home in Asheville, North Carolina. “I am so appreciative of life,” he says. Photo by Coke Whitworth.


hen I was in a band, our drummer would show up to practice drunk before passing out on his snare. He seemed to be forever hung over. After checking out Fulton Crews’ research on the effects of binge drinking, I’m not sure “forever” is an exaggeration. Crews studied the effects of alcohol on rats, mice, and humans, and found that binge drinking can cause the kind of brain damage that impairs decision-making long after the drinking stops. In one experiment, Crews showed that adult rats that had been exposed to binge levels of alcohol three weeks prior could learn things effectively but had trouble relearning. Crews’ researchers placed a control rat in a tub of water. The rat easily learned to find a platform just below the water’s surface. The other rat—now sober for three weeks—also easily learned to find the platform. When the platform was moved to a different place, the control rat swam to the old location, couldn’t find the platform, spun around once, and then searched briefly before finding the platform. The alcohol-exposed rat swam to the old location and couldn’t find the platform, but instead of searching elsewhere, it circled the old location repeatedly. Eventually the rat gave up and swam slowly along the side of the tub until researchers removed it. This behavior suggests that relearning processes are disrupted long after a binge-drinking episode. Some of Crews’ other studies have shown that binge drinking can cause loss of neurons and inhibition of neurogenesis— the process by which stem cells form new neurons. Such changes in brain structure cause subtle behavioral changes long after binge drinking. “Repeated behaviors and difficulty relearning can happen when you have damage to the frontal cortex,” Crews says. “You can’t pay attention and you don’t think through decisions. You don’t think, ‘Where’s that platform, I just looked where I thought it was?’ You keep doing the same thing repeatedly without thinking about it.”

Forever hung over by Mark Derewicz

In other studies on the effects of binge drinking, Crews found that cytokines—a group of proteins—were released in the liver and traveled to the brain, where they are toxic to neurons and stem cells. Crews found that cytokine levels in blood serum and the liver returned to normal after one day, whereas cytokine levels in the brain stayed elevated for months. This long-lasting increase in brain cytokines could result in long-lasting changes in brain function, similar to the disrupted learning of the rats exposed to binge levels of alcohol, Crews says.

But mice and rats aren’t men, so Crews studied human postmortem brains that he got from a donor program in Australia. He found that cytokine levels were much higher in the brains of alcoholics than in moderate drinkers. Alcoholics are known to have smaller brains due to chronic drinking, but Crews is the first to discover increased cytokines in the brains of alcoholics. Cytokines regulate anxiety, depression, mood, and drinking behavior, all of which are associated with alcoholism. His lab is still studying how binge drinking, cytokines, and neurotoxicity might change behavior.

Crews says that binge drinking might disrupt the parts of the brain that normally inhibit impulses. These brain regions typically promote thoughtful reflection on the consequences of actions. Crews says, “It’s possible that binge drinking changes the cerebral cortex and with each binge a person becomes more impulsive, and then compulsive, and then he’s stuck. He’s addicted.”

Fulton Crews is the director of the Bowles Center for Alcohol Studies.

Brain sections from control rats (top row), rats that were binge-treated with ethanol for four days (middle row; T=0), and rats treated with ethanol for four days with three days abstinence (bottom row; T=72). The binge-treated rats in the middle row show extensive brain damage (dark sections) during intoxication. This kind of damage is still visible in the binge-treated rats that had a three-day abstinence. The loss of these neurons may cause lasting changes in brain structure and behavior.

endeavors 37

A girl’s life and a glass half-full

Novelist Marianne Gingher isn’t apologizing. by Margarite Nathe

Every four or five years when Marianne Gingher was a child, her father (nicknamed Rabbit) would buy a new car without telling her mother. When Bunny (her mother’s nickname) saw the unfamiliar car in the driveway, she’d say, “I wonder whose car that is?” Rabbit would look up from his newspaper and say, a twinkle in his eye, “Oh, Bunny, I don’t know whose car that is.” “And then she wouldn’t speak to him for five days,” Gingher says. After one such episode, little Marianne crept downstairs in her pajamas. Her father was eating dinner at the kitchen table, and when he caught sight of her peeking out at him, he grinned. “Daddy’s in the doghouse,” he said.

Naturally, a murder in the Rabbit family was the topic of Gingher’s first short story, which she composed as a series of crayon drawings when she was six. Mrs. Rabbit lies dead on the floor in a pool of gore, a knife buried to the hilt in her back. Mr. Rabbit stands over her. His expression is unreadable. “Promptly after Mrs. Rabbit’s funeral—at which even the old car cries tears—Mr. Rabbit, who was never implicated in the crime, goes out and buys a new car,” Gingher says. “And he’s smiling, and there’s a lipstick print of a new rabbit on his cheek. That’s how the story ends.”

The drama seemed dire when she was a little girl, Gingher says. But over the years her parents turned the business with the cars into a sort of a game. “My parents were in a very happy marriage,” she says. And that’s one of the reasons Gingher and her brothers had such a happy childhood. Her first book about her life was called A Girl’s Life: Horses, Boys, Weddings, and Luck, if that tells you anything. In spite of all those well-adjusted years, though, A Girl’s Life is full of sharp edges, hawk-eyed character assessments, and a girl’s wish to give in to her inner ruffian: I once stepped into my mother’s car—she was transporting a freshly baked pie to my grandmother—and planted one sludgy boot in the middle of the pie. It was clumsy of me, ruinous and knavish, but I laughed. What did a bit of nastiness matter? I was immune to the nitpickery of cleanliness and caution. I remember thinking that my mother was fussy not to try and salvage the pie. I would have eaten around the bootprint.

Unlike Dorothy Allison, Mary Karr, and other memoir writers from the South, Gingher was at ease in her hometown. She was never neglected. She didn’t suffer at the hands of her family. In an interview with Cellar Door magazine, she said, “I didn’t have to worry about when Daddy came home like so many kids do, because when Daddy came home that was a cause for celebration. I didn’t have to worry that Mama would be drunk when I came home from school. No, Mama would be there and she’d probably made cookies. And she would sit down and play cards with me if I asked her to. It was a privilege. I didn’t do anything to earn it; it was just there. And I don’t want to have to apologize for it.”

So, you might ask, where’s the story in that? It was Eudora Welty who said, “A sheltered life can be a daring life as well. For all serious daring starts from within.” It was the serenity in Gingher’s home that grew her into such a daring, big-eared pitcher, as her mother called her. “I was a bit of an eavesdropper,” Gingher says. “I wanted to be a spy.”

When she was growing up in the 1950s, all of Gingher’s friends loved to play at her house, in part because of the even tempers and her mother’s fragrant baking. “My home life was fairly predictable,” she says. The walls and carpet were innocuous, neutral colors; soda bottles wore little sweaters to keep them from dribbling on the furniture. “But as a kid, I had a built-in hunger for conflict. I didn’t have it at home in my beige household, so I went out and found it.” Her favorite place to look was in the homes of her friends, many of whom had less-than-sunny family lives. She was enthralled by the hiding, the tiptoeing, the bullying fathers, the absent mothers, even the shouting matches and hair-trigger punishments. “I loved going over to the ferocious stew of their houses, and listening, observing,” Gingher says. “I had to experience what I didn’t know about any way I could, even if it was just as a fly on the wall.”

One such friend lived in a mansion so massive it could have doubled as Gone with the Wind’s Tara, and carried the distinction of a former owner who was jailed for being a member of the Communist Party. The friend’s mother had suffered a stroke that left her almost mute and marooned in a wheelchair. The father, a college professor and businessman, ate formal dinners with the children before slipping out to attend to his secretive social life. The older children in the family were supposed to check in on their mother periodically, but they made a haphazard job of it. They were generally too busy scrambling over the roof, perching themselves on the gables, and skylarking about the ballroom.
Gingher writes about the woes of pronated ankles and adolescent awkwardness in A Girl’s Life, as well as her more sobering early encounters with death and racism. But the book is mostly a celebration of youthful simplicity—selling horse-drawn sled rides to other kids, the celebrity in being elected classroom monitor (and all the crimes she recorded in her Blue Horse notebook during her reign).

In fact, the idea to write A Girl’s Life came from a book reviewer’s letter in The New York Times lamenting that there were virtually no memoirs written about happy, trauma-free childhoods. Gingher’s glass is still half-full in her new memoir, Adventures in Pen Land. “But I gave it some ballast,” she says, “so it wouldn’t sail off into the rosy sunset.” The chapters cover her life from 1953—the year of the murdered Mrs. Rabbit—to 1986, when she published her first book. A lot happened in that time, she says, and it wasn’t all happy. It was between those years, particularly when she first started teaching, that Gingher had a series of crippling panic attacks. “They were awful,” she says. She never thought she’d put those moments on paper—the terror, the nausea before every class session she taught. Same thing with her separation from her husband, and her subsequent single-motherdom.

One chapter of Adventures in Pen Land is an exposé of some of the loonier students she’s taught in her creative writing classes since 1975. Early in her teaching career, she says, her sunny classroom atmosphere was a beacon to eccentric personalities within the major—including some who thought her workshops wouldn’t burden them with honest criticism. “One student went back to his dorm after creative writing class,” Gingher says, “and was so upset by the criticism he got in class that he set his dorm on fire.” Whether he dropped out or was expelled, she’s not sure, but he never showed up in class again.

Criticism can cast a decades-long shadow on a writer, or anyone, Gingher says. She laughs off some of the tougher reviews she’s gotten—the big red C (for “Crummy”) that her high school nemesis on the yearbook staff scrawled across her work, the writing teacher who took one look at her batch of poems and told her she was “no poet” (“He probably did both the world and me a huge favor,” Gingher says).

Facing page, center: As fifth-grade class president, Marianne Gingher faced impeachment when the boys in her class discovered that, whenever their teacher left the classroom for a smoke, Gingher documented only the misbehaving boys’ names in her classroom monitor notebook—the girls never made it into the record. Facing page, left and right: A new car almost always caused a fight in the Buie household. These shots show Marianne Gingher’s parents, Rabbit (Roderick Mark Buie, Jr.) and Bunny (Betty Jane Buie) in 1955 with the new family Buick. Images courtesy of Marianne Gingher. This page: Gingher in her office with a cardboard cutout of Star Trek’s Dr. McCoy.
But here she is, many years and dozens of stories beyond the Mrs. Rabbit murder mystery. “Somebody is always standing on the periphery of your artistic jubilee, wanting to jump in and stomp it flat, at any age,” Gingher says. “And you either rise above it, or you get so depressed by what you’re told that you give up. You have to get on the horse you fell off of and you carry on. Or you get on a different horse.”

Marianne Gingher won the Johnston Teaching Excellence Award in 2007. She’s an associate professor of English and codirector of the Thomas Wolfe Scholarship program at Carolina. Adventures in Pen Land (illustrated by Daniel Wallace) will be published in 2008 by University of Missouri Press. Some segments appeared in serial form as “A Woman at Work and Play” in The Rambler. Gingher also edited GRAM-O-RAMA, a maverick grammar book published in 2007 about the similarities between grammar and music, written by Daphne Athas of the Department of English. Gingher is now compiling an anthology of short-short stories by North Carolina writers, which will be published by University of North Carolina Press.

Jason Smith


g-force and gray matter meet

High-tech helmets pinpoint the forces that damage a brain. by Mark Derewicz

Melik Brown threw a typical block during a football practice. It didn’t look like much, but he left the field with a concussion. A few weeks later Brown suffered another concussion while making a routine tackle during a game. This tackle also wasn’t one of those crushing blows seen on SportsCenter. Kevin Guskiewicz, director of Carolina’s Sports Medicine Research Laboratory, videotaped both plays and showed them to then-head-coach John Bunting, who immediately noticed that Brown lowered his head before impact. Bunting and Guskiewicz showed Brown the evidence, trying to hammer home a long-standing lesson—don’t lower your head during a hit. In the second-to-last game of the season, Brown slammed into an NC State player during a kickoff return. It was a much harder hit than the ones that caused his concussions. But after the play Brown trotted to the sidelines unscathed.

Jason Mihalik and Kevin Guskiewicz with one of the test helmets.

“The video shows him rotating his head out of the way just before impact,” Guskiewicz says. “His shoulder pads absorbed most of the blow. This is just one case, but it shows that there’s a lot more to concussions than just force of impact.”

Guskiewicz has been studying concussions ever since he was an athletic trainer for the Pittsburgh Steelers. He started UNC’s Center for the Study of Retired Athletes, researching how football injuries have long-term consequences, including depression and dementia from concussions. (See Endeavors, Fall 2001, “Play Now, Pay Later.”)

Concussions occur when the force of impact to a person’s head causes the brain to move violently against the opposite side of the skull. Headaches and dizziness are the most common symptoms. But concussions can also cause nausea, light-headedness, balance problems, light sensitivity, ringing in the ears, disorientation, blurred vision, amnesia, and loss of brain function.

Guskiewicz’s lab is coming to the end of a five-year study of head trauma in college football players. It’s the first research project in the nation to use state-of-the-art technology to study head trauma during live practices and games.


Sixty Carolina players use helmets fitted with accelerometers, which are soft, spring-like devices situated between the cushions inside the helmets. When a player’s helmet is struck, the accelerometer records the g-force (a measure of acceleration against the earth’s gravitational pull). In a roller coaster or fighter jet, a person withstands about 4.5g. In a car crash at 25 miles per hour, a test dummy hits the windshield at 100g. In football the majority of impacts fall between 20g and 25g. But hits of 50g to 120g are common, and some approach 200g.

Four years ago the National Football League commissioned research in which biomechanists created a simulation experiment with crash test dummies. They estimated that any hit above 85g would likely result in a concussion. Anything below, probably not.

“That’s not what we found,” Guskiewicz says. His lab recorded thirteen concussions over five seasons. Six concussions came from hits at or below 85g. The other seven ranged from 100g to 169g.

“People see massive hits and think, ‘That’s the one,’ and ignore more trivial blows,” he says. But those less spectacular tackles and blocks can be deceptive. They’re part of what Guskiewicz calls a hidden epidemic. Some players walk off the field feeling a little dazed. If a coach or athletic trainer doesn’t notice, the player might not say anything. But the player could be concussed, and would need to take a breather, and possibly miss several practices and games.

During the study, Guskiewicz’s researchers recorded 104,714 helmet impacts. And they measured many hits above 98g that did not cause concussions. In fact, only one-third of one percent of all hits above 80g resulted in concussions. Other players sustained impacts of 60g and 63g and did suffer concussions.

“I have videos of players, their helmets and mouth guards flying out,” Guskiewicz says. “These hits look like really dramatic injuries, but they’re not. It’s important to know the g-force, but we can’t neglect someone who comes in at sixty g; we can’t assume he’ll be fine.”

In fact, one Carolina running back who was concussed at 60g had some of the worst symptoms recorded during the study. One probable reason, Guskiewicz says, is that the player took the brunt of the impact on the top of his helmet.

Accelerometers also measure the location of each hit—top of the helmet, side, front, or back—and the angle of each impact. The more severe hits are direct, but a lot of tackles and blocks come from various angles.

“We expected about five or six percent of all the impacts we recorded to be to the top of the head,” Guskiewicz says. “But it was much higher: nineteen percent.” Six of the thirteen concussions came from hits to the top of the head. And two other hits that caused concussions—one to the side and one to the front—were awfully close to the top of the head, too. The average top-of-head impact was 29g, much higher than the average g-force of impacts to the front, sides, and back of the helmet.

A player’s lowered head turns a helmet into a battering ram, which can lead to concussions. Guskiewicz also says that helmets were redesigned several years ago to give more padded support to the sides of the head because manufacturers thought players sustained more hits there. But doctoral student Jason Mihalik found that players sustained hits to the sides of the head 14 percent of the time, far less than Guskiewicz expected. They want to study this further, but another round of helmet redesigns might be in the offing.

The accelerometers cannot measure symptoms of a concussion. So before the study, Guskiewicz conducted clinical tests on each player to create baseline measurements of things such as balance, reaction time, posture stability, and cognitive function. He also created a way to score the severity of symptoms. The worse the concussion, the higher the score. The four concussed players with the lowest g-force impacts were tested within forty-eight hours of the hits. Each player failed to meet his baseline scores. Oddly, the players with the four highest impacts did not show measurable deficits from their preseason baselines. And their concussion symptoms were not nearly as bad as the running back who was injured during an impact of 60g.

All of this tells Guskiewicz and Mihalik that the reasons for head trauma are a lot more varied than researchers previously thought. Athletic trainers and doctors need to be on the lookout for the not-so-huge hit, too. Already, Guskiewicz and Mihalik’s research has had real-life implications not only for Melik Brown but for several other Carolina players as well. During the heat of preseason camp last season some players left the field feeling woozy, weak, and nauseous. But those symptoms can point to dehydration, too. “We looked at the data from the accelerometers to see if any hits to these players exceeded thirty-five or forty-five g,” Guskiewicz says. “If not, they probably didn’t have a concussion.” The accelerometers didn’t register any hits above 45g, so athletic trainers gave the players fluids to see if they would recover. “Sure enough, they did,” Guskiewicz says. “On the field, we’ve actually been able to rule out concussions more than we’ve been able to identify them.”

Kevin Guskiewicz chairs the Department of Exercise and Sports Science and is the director of the Sports Medicine Research Laboratory in the College of Arts and Sciences. He is also research director for the Center for the Study of Retired Athletes. Jason Mihalik is a doctoral student in the Curriculum in Human Movement Science. Graduate student Meghan McCaffrey also contributed to this study. The researchers coauthored three articles for the December 2007 issue of the journal Neurosurgery. The National Center for Injury Prevention and Control funds their study, which is administered through UNC’s Injury Prevention Research Center.

Inside the helmet: small white discs between the larger cushions are accelerometers, spring-like devices that record data from head impacts. Researchers use a sideline computer to retrieve and analyze the data. Jason Smith





North Carolina, once awash in water, faces future shortages. Rains won't save us. What will?

In early August 2002, a city water manager near Statesville, North Carolina, walked out of church and noticed something unusual. The South Yadkin River was only inches deep and so narrow that he could jump across it in his Sunday clothes. The next day, the river was gone. Water managers were stunned; they had never heard of a river disappearing overnight. Pilots flew along the dry riverbed in search of the water. But the city never figured out where it went.

North Carolina has long been considered rich in water. But in just the last ten years, the state has suffered two droughts that have forced cities across North Carolina to implement ever-increasing water restrictions. Unlike in the West, where water is seen as a precious resource in short supply, institutions, businesses, and residents in the Southeast are unprepared to deal with recurring water scarcity. Many local politicians seem to view the record-setting drought of 2007–2008 as a problem to be solved by heavy rains. “You have a lot of people who see drought as a short-term issue,” says Richard Whisnant, an expert in water resources law. “In their view, as soon as it’s over—which could be any day now—we’ll be back to business as usual.”

But even when the drought ends, the problem of future water scarcity in the state will remain. The reason is the double whammy of climate change and a rapidly growing population. Climatologists predict that global warming will result in a hotter southeastern United States, with more frequent heavy storms but also more droughts. Mix in high demand for water from a public that doesn’t yet understand the need to conserve, and it’s a recipe for trouble.

Signs of water scarcity have already appeared throughout the state. In 2005 the growing cities of Concord and Kannapolis sparked a water war with counties to the west when they asked to pipe water from the Catawba and Yadkin Rivers to their own Rocky River. Last summer, South Carolina sued North Carolina, arguing that the transfer of water will leave the Catawba dry, harming electricity generation, river recreation, and the state’s economy. The case is pending with the U.S. Supreme Court. In February drought-stricken Raleigh began fining people $1,000 for using water outdoors. And last fall Rocky Mount nearly ran out of water before Wilson agreed to pipe it to the thirsty city.


Greg Characklis: One way to encourage water conservation is to raise the price.

The problem is one of supply and demand. The state’s population is just over eight million, already enough to strain water supplies during droughts. By 2030 these same water resources will have to slake the thirst of a projected population of twelve million. Population growth not only means more people—and more lawns—competing for the same water supply. It also deteriorates water quality, especially when growth is sprawling. “This state has been sprawling at an unbelievable rate, but there’s no leadership on planning,” says Philip Berke, professor of city and regional planning. More impervious surfaces—such as roads, parking lots, and driveways—cause more runoff of pollutants into drinking water supplies. Stormwater runoff occurs when rainfall on the ground picks up contaminants such as fertilizers, herbicides, and insecticides from residential yards and farms; oil, grease, and toxic chemicals from urban and residential areas; and sediment from construction sites. The runoff then carries those contaminants into waterways. As a result, the state is struggling to curb algae growth from excess nutrients in many major reservoirs, including Falls Lake, Raleigh’s main water supply. And while building new reservoirs may seem like a long-term solution to water shortages, that’s unlikely to happen on a large scale. Reservoirs are very costly, take twenty to thirty years from planning to completion, and also raise environmental concerns because they destroy wild habitat.

Water supply is only half of the problem. The other is excessive demand for water for nonessential uses such as green lawns. In 2007, the driest year the state has recorded in 118 years, North Carolina received on average 34 inches of rain. That’s more rainfall than half the country receives in a typical

year. If the state’s water policy and systems were restructured to better regulate the use of water, there would be enough to go around, Whisnant says. We tend to throw away, or use carelessly, what we don’t value. “We take water for granted because there’s historically been so much of it in the East,” Whisnant says.

Martin Doyle, an expert on rivers, refers to the diamond-water paradox. Although water is essential for life, and diamonds have no practical use, people pay exorbitant sums for diamonds and almost nothing for water. “The last thing you need in your day-to-day life is a diamond,” Doyle says. “In the end you absolutely have to have water. It’s worth any price when you’re at the end of it.”

How cheap is local water? In North Carolina the median charge per gallon of water for a customer using 6,000 gallons per month is four-tenths of a cent, according to a report from UNC’s Environmental Finance Center. If you buy that same gallon of water at a major grocery store, they’ll charge you about $1, or 250 times more. For a 20-ounce bottle of water from a vending machine, you’ll pay $1.25, or 2,000 times more. Here’s another way of looking at it. For the price of a Starbucks latte, you could buy 855 gallons of water. For 5 bucks, you could buy a 6-pack of beer or 1,250 gallons of water.

“We want people to conserve, but there’s no reason to conserve if you’re charging them a distorted price for water,” Doyle says. Greg Characklis, a specialist in environmental engineering and economics, says, “One way to encourage conservation is to raise the price. But politically, that’s not a popular option.” Indeed, most water utilities have prided themselves on delivering water at the lowest possible price. “There are ways of pricing water so you’re not imposing serious hardship on folks without the means to pay for it,” Characklis says. For example, the Orange Water and Sewer Authority of Chapel Hill and Carrboro (OWASA) encourages conservation through a tiered pricing structure, with rates that increase sharply once a household uses more than the average amount of water.


Martin Doyle: Water is essential, and diamonds have no practical use. But people pay exorbitant sums for diamonds and almost nothing for water.

Philip Berke: We can improve water quality by creating compact urban centers instead of suburban sprawl.

Another issue with North Carolina water policy is the right to tap into water resources. North Carolina, like most eastern states, has what is known as a regulated riparian approach to the ownership of water. Basically, that means that if you own property alongside water, you have the right to use that water. David Moreau, director of the Water Resources Research Institute, likens state water law to the big straw theory: “You can get whatever you can suck out.”

North Carolinians are accustomed to ample water, and the state has placed few restrictions on access. If you own land and want to build a car wash, or a water bottling plant, or a power plant, you get a permit to put an intake pipe into a river or stream and you’re free to draw hundreds of thousands—even millions—of gallons of water. For example, Duke Energy has proposed building a coal-fired power plant at Cliffside on the Broad River, while Progress Energy just announced proposals to build two nuclear reactors on the Cape Fear River. “Neither one of those is going to require any kind of permit to withdraw water, despite the fact that the amounts of water they will withdraw are enormous,” Whisnant says.

State legislators have enacted a few water laws, but the state has rarely attempted to use them to regulate water use, Moreau says. One law allows the state to create “capacity use areas,” or areas in which it considers water to be scarce. That designation allows for greater government regulation and coordination of water use. The only part of the state that the Environmental Management Commission has designated a capacity use area is a fifteen-county region on the Central Coastal Plain, east of Greenville and New Bern. Several cities and a large phosphate mine have pumped out large amounts of groundwater from the counties’ aquifers, placing them at risk of intrusion of saltwater and permanent damage to their water storage capacity.

Historically, local governments have

been responsible for public water supplies in North Carolina. They are responsible for everything from building reservoirs to treating wastewater to developing plans for drought response. But some experts now say that North Carolina—and many other southeastern states—have outgrown the old, localized approach to water supply. Looking at water supply from the community level doesn’t work anymore, says Francis DiGiano, professor of environmental sciences and engineering. “That community depends on water from somewhere else, and discharges water somewhere else,” he says.


Carolina researchers in a number of disciplines are working on solutions to the state’s water problems. Some are looking at helping cities respond to the current drought by analyzing ways to add to the water supply or ways to help local governments prepare to enact tougher conservation measures.

Some want to analyze options to inform the coming debate on water policy. Others have more far-reaching ideas, such as figuring out how to reuse wastewater for irrigation and toilet flushing, or changing the way we plan and build new neighborhoods.

To assist local governments in responding to the drought, Moreau has analyzed the success of several water-conservation initiatives. He’s also been trying to help politicians understand the need for stricter water-conservation measures. “None of the cities are prepared to go beyond cutting off outdoor water use, and making public appeals for people to limit use of water,” Moreau says. “That has limited effect.” Last fall Governor Mike Easley called for citizens to reduce water use by 50 percent. But even the cities doing the best in conserving water haven’t achieved more than a 20 percent reduction, Moreau says. What if the drought continues and communities really need to cut water use by 50 percent? “It’s a problem no one wants to talk about,” he says.

In the short term, some cities are considering connecting pipes to tap into the water supplies of nearby communities that have more water. Characklis has been analyzing the feasibility and cost of options for allowing Durham and Chapel Hill-Carrboro to get access to the water supply in Jordan Lake. The town of Cary would treat the water before piping it to those cities.

Whisnant expects to spend the next year or more developing policy options to help state legislators considering changes to water policy. Although it’s too early to predict what policy changes may occur, Whisnant expects legislators to consider requiring a permit before major users of water tap into public water supplies. “We’re different from other states in not having any way to regulate how much water somebody’s withdrawing,” he says. To encourage public debate and a deeper understanding of our state’s water issues, Whisnant and Bill Holman of Duke’s Nicholas Institute for Environmental Policy Solutions created a water wiki (http://water.

Francis DiGiano: Looking at water supply from the community level doesn't work anymore.


ith an eye to the future, engineers and architects have ideas about how to better use existing water supplies. The challenge will be winning public acceptance, they say. Water reuse is one such idea. Communities use treated drinking water to fight fires, to cool water in power plants, to flush toilets, and to irrigate lawns, golf courses, and public spaces. But to save water, future communities could install dual water systems that separate water for use in drinking and washing from water for other uses that don’t require such high quality, DiGiano says (see “Reclaiming Water at Carolina North,” page 46). In a process known as water reclamation, communities would treat wastewater to a high level and use it for nonpotable purposes such as irrigation and toilet flushing. If communities used reclaimed water, they could save up to 50 percent of the demand for fresh water supplies, DiGiano says. UNC has developed a reclaimed-water system in partnership with OWASA that

David Moreau: Our state's water law follows the big straw theory: you can get whatever you can suck out.

will save an estimated 210 million gallons of water per year. Beginning in March 2009, Carolina will use treated wastewater in the cooling towers at four chilled-water production plants that provide air conditioning on the main campus. The cooling towers are the largest consumers of water on campus. Eventually the university may use reclaimed water in landscaping and to irrigate athletic fields.

The biggest barrier to implementing water reuse is public skepticism. Opponents of the idea raise concerns about people coming into contact with reclaimed water and being exposed to microbes. Such risk is very low if the reuse system is engineered and managed properly, DiGiano says. A study by Mark Sobsey, an expert in environmental microbiology, concluded that the level of pathogens in OWASA's reclaimed wastewater would be negligible and that exposure would be unlikely. People who are not accustomed to the idea of water reuse tend to fear it. But people who live in cities downstream

Richard Whisnant: Because there’s historically been so much of it in the eastern United States, people here take water for granted.

on major rivers, such as Washington, DC, and Wilmington, North Carolina, already experience the de facto reuse of water, DiGiano says. They drink water that has been excreted and treated many times before it reaches them.

Architects also see the potential to save water—and preserve water quality—if we change the way we design our cities. Creating compact urban centers instead of suburban sprawl may be one answer, Berke says. One local example is Southern Village on the outskirts of Chapel Hill. That development features a core of businesses and shops intermixed with residences at higher densities than in typical suburban areas. Because people can walk to restaurants and shops, the development needs fewer large parking lots, Berke says. And since residences are close together, streets can be shorter. Less pavement equals less water runoff and pollution during rains. Denser developments also allow planners more opportunity to protect open spaces, Berke says. By avoiding sprawl, communities can protect sensitive lands, such as wetlands and land adjacent to streams, that buffer watersheds from pollutants.

Doyle's research on water quality dovetails with Berke's work. Doyle's geography students are collecting water samples from different parts of Jordan Lake, with the aim of analyzing how different types of

land use affect the lake's water quality. In conjunction with the Department of City and Regional Planning, Doyle is projecting where people are likely to move and what types of neighborhoods they are likely to build. Then he will estimate the impact of those developments on water quality in the lake. Doyle hopes to figure out how to offset the effects of development on water quality by reducing pollutants elsewhere in the watershed. North Carolina policymakers have expressed interest in expanding the use of water-quality markets, in which large polluters of waterways pay others in the same watershed to cut back their discharges, so that there is no net increase in water pollution, Doyle says.

Solutions to water scarcity, such as policy reform, water engineering, and city planning, all face the same big barrier: the status quo. But whether we like it or not, the period of abundant water in the Southeast appears to be over. "Water is not an infinitely available resource," Characklis says. "What have traditionally been very abundant supplies are not going to be so in the future if we continue to develop as we have been. We're going to have to come up with better systems for managing this resource."

Carolina water by the numbers

Gallons saved per year by installing dual-flush toilet valves in campus buildings:

Gallons saved per year by discontinuing irrigation of campus landscape:
12 million

Gallons to be saved per year when UNC begins using reclaimed wastewater in cooling towers in 2009:
210 million

Average number of gallons used per day by students in UNC dorms before the start of a water-conservation competition with NC State:

Average at the end of the conservation competition:

Percentage of Orange County, North Carolina, water used by UNC-Chapel Hill:
Sheila Read is a master's student in journalism at Carolina. Richard Whisnant is a professor of public law and government in Carolina's School of Government. Philip Berke is a professor in the Department of City and Regional Planning and director of the Center for Sustainable Community Design. Martin Doyle is an assistant professor of geography in the College of Arts and Sciences and director of the Center for Landscape Change. Greg Characklis is an associate professor in the Department of Environmental Sciences and Engineering. David Moreau is a professor in the Department of City and Regional Planning and director of the Water Resources Research Institute. Francis DiGiano is a professor in the Department of Environmental Sciences and Engineering. Mark Sobsey is a Distinguished Kenan Professor of Environmental Sciences and Engineering. Photos by Jason Smith.

Reclaiming water at Carolina North

If environmental engineers have their way, UNC's planned research campus at Carolina North will serve as a model for water reuse, says environmental engineer Francis DiGiano. A dual-use water system would separate highly treated drinking water from reclaimed wastewater that would be used in landscape irrigation, air-conditioning systems, and toilets, among other things. Using reclaimed water at Carolina North could decrease fresh-water use by up to 50 percent, DiGiano says. The system would be the first of its kind on a college campus. The Orange Water and Sewer Authority would build a satellite wastewater-treatment plant on site. The reclaimed-water system would also be linked with a One Hydrosphere Center dedicated to research, education, and outreach on water reuse, water quality, and water management. The name One Hydrosphere refers to the fact that water goes through a continuous cycle involving both natural processes and human uses.

The One Hydrosphere Center would offer public tours of the water reclamation plant and show how reclaimed water is used throughout campus, including for landscape features such as decorative streams. "Reclaimed wastewater is not a new idea, but it still lacks public acceptance—except in emergency situations created by droughts," DiGiano says. But the idea is gradually catching on. A new wastewater reclamation plant in Duluth, a wealthy suburb of Atlanta, was designed with a barn motif that blends in with its surroundings. The water now irrigates golf courses and farmlands, DiGiano says, and has no odor or other obvious detriment. —Sheila Read

Francis DiGiano is a professor in the Department of Environmental Sciences and Engineering.

in print

The simple life
Threefold Sun. By Taj Forer. Charta Art Books, 135 pages, $59.95.


It was two thirty in the morning and photographer Taj Forer's pickup truck was bumping along the back roads of western Tennessee. He'd gotten a late start, and he was sure that his host, a local farmer, would have gone to sleep hours ago. The road changed from asphalt to gravel to dirt before it finally ended at a little house, where a warm, friendly glow shone through the windows. The music coming from inside stopped when Forer knocked on the door. Jeff Poppen, known to his friends and neighbors as The Barefoot Farmer, answered. "You must be Taj," he said. "Welcome! Come on in." Inside, several of Poppen's friends sat around a blazing woodstove, each holding a musical instrument. They'd been picking out bluegrass tunes all night long.

Poppen's Long Hungry Creek Farm was the first stop on Forer's trek across the United States; he was looking for images of schools, people, landscapes, and other subjects that in some way embodied the things he'd learned at his childhood Waldorf school in rural New Jersey. For over a month, with only his camera to keep him company, Forer rambled along in his biodiesel pickup, stopping occasionally to refuel from grease traps at fast-food restaurants.

Above: A boy with a bloody nose at his Waldorf school in San Diego. Left: A playground tree soaks up mop water at the Enchanted Desert School in Tucson, Arizona. Top: The Barefoot Farmer at Long Hungry Creek Farm. Folks in Red Boiling Springs, Tennessee, know it's got to be deep winter for Jeff Poppen to put on a pair of shoes. Photos by Taj Forer.

There are only about a hundred Waldorf schools in the United States, and their curricula are based on the teachings of early-twentieth-century philosopher and agriculturalist Rudolf Steiner, who wanted young students to have the freedom and experience to be "childlike enough." So teachers in Waldorf schools take a holistic approach to education, and often combine tactile lessons with academic ones. For example, along with reading and math, kids might learn how to tend a garden, grind wheat for bread, or build a waterproof shelter in the woods.


Steiner also developed an agricultural method known as biodynamic farming. Everything that's produced on a biodynamic farm is also somehow consumed, so the farm becomes a sort of self-nourishing organism. Steiner wanted not only to heal farm soil that had been sucked dry of its nutrients, but also to connect the ecology of the living farm to the workings of the entire cosmos. Imagine preparing your fertilizer by plugging a hollow cow-horn with a mishmash of manure, soil, and other farm miscellany, and then burying the concoction until the stars align just right. "These things can sound a little hokey," Forer says. "But they work."

Forer made camp at Long Hungry Creek Farm because Poppen has worked his land biodynamically for the past twenty years. And he's been incredibly successful. Aside from his vast vegetable crops, fifteen-year-old strawberry patch, and small herd of cattle, Poppen has over a hundred subscribers to his produce through community-supported agriculture. "He doesn't eat anything he doesn't produce," Forer says, "and his skin just glows." The morning after Forer arrived at Poppen's place, they toured around the three-hundred-acre farm, which Poppen manages

with the help of one farmhand. They worked together for hours in the January cold, Poppen without shoes the entire time. After that Forer set up his Hasselblad camera.


One of Forer's photographs from Long Hungry Creek Farm shows Poppen with one bare foot propped comfortably on a stack of cedar logs, looking for all the world as if he's just sprouted from the cold ground himself. Another is of Poppen's stepson: he sits loosely on a tree stump in jeans too big for him; even though it's January, his arms and feet are bare, and he looks quizzically at the camera. The child in that photo is just a little younger than Forer was when, after finding an old camera in his parents' basement, he became obsessed with color photography. "One of the great struggles for me as a photographer, though, is the fact that I do work with color," he says. The chemicals necessary

for making film, developing it, and creating prints all eventually wind up in the water table, he says, and that really weighs on him. "At the same time, I think there's something to be said for using the medium to explore issues and raise questions in people's minds about our relationship with the natural environment."

After a few days at Poppen's farm, Forer moved on to Waldorf schools and biodynamic farms across the country. Most of the resulting photographs in his new book, Threefold Sun, were made during the winter, and many of the plates depict wintry landscapes. A hill beneath a swing set where the snow has been worn away by too many sleds. Two young girls bundled up and nestled in a fort made of hay. A single corn husk angel lifted on a reedy stalk against a flat, stony sky. But it's in many of the portraits and interiors—warm kitchens, Valentine-strewn cafeterias, a schoolboy with a bloody nose—that the color in Forer's photographs is indispensable.

"My dream is to live on a small farm and grow most of my own food," Forer says. "Live a little more simplistically and emphasize agriculture as part of my daily life." So he does the best he can to find a balance for his two passions—a simple pastoral existence on the one hand and a highly technical, mechanized art form on the other.

In one of Forer's photographs from a Waldorf school in Tucson, Arizona, two chalkboards hang side by side on a painted cinder block wall. In neat yellow cursive, one reads:

We will buy 2 trios of rabbits
$60.00 each = $120.00
Each child will earn and bring $6.10.

And beside it: Everyone does the best they can. —Margarite Nathe

At the Rudolf Steiner College in Fair Oaks, California, farming students learn the biodynamic method. While Taj Forer was visiting the college, he spotted this man working out in the field, and the two talked for a while. The young farmer was working on an internship there. Photo by Taj Forer.


Taj Forer is a graduate student in the Master of Fine Arts program at Carolina. He received funding for his work in Threefold Sun from the Rudolf Steiner Foundation. He is an Artist-in-Residence at the North Carolina Contemporary Art Museum and editor of the photography journal Daylight Magazine.

endview

The old and the new

Neil Caudle

Steve Walsh, professor of geography, conducts research in the Galapagos Islands (see "Defending the Galapagos," Endeavors, winter 2007). His work led to a collaboration of scientists from UNC-Chapel Hill and from the Universidad San Francisco de Quito (USFQ), a private university in Ecuador. In February 2008 a team from Carolina traveled to the Galapagos to learn how new research could help preserve the islands' ecosystems, which are under pressure from tourism and development.

Right: A giant tortoise, probably over 100 years old, at the Charles Darwin Research Station on Santa Cruz Island. A Galapagos giant tortoise can measure up to five and a half feet across and weigh up to 650 pounds. Tortoises on different islands have differently shaped carapaces that probably evolved as adaptations to each island's environment. Saddle-back types are raised at the front to allow the tortoises' long necks to reach for higher vegetation on drier islands. Dome-shaped tortoises, such as the one in this photo, live on moist islands with lower vegetation. The dome shape probably helps them push through dense growth.

Below: A sea lion pup seems right at home with young marine iguanas on a beach near Isabela Island.

Holden Thorp


Office of Information & Communications Research & Economic Development CB 4106, 307 Bynum Hall Chapel Hill, NC 27599-4106



Page 38: Writer Marianne Gingher won't apologize for having a happy childhood. But she also knows that life ain't always so grand. Photo by Jason Smith.

Page 22: Jack Griffith thinks he has found the most ancient DNA ever discovered on Earth, trapped inside a chunk of salt. Is he right? Photo by Jack Griffith.

Page 34: George Plym is determined to fight his brain cancer to the death. Scientists are now wondering if they can trick brain tumor cells into committing suicide. Photo by Coke Whitworth.


Spring 2008