
Research/Penn State
A Chronicle of Research and Discovery
Volume 33 | 2013

Research/Penn State is published annually by the Office of the Vice President for Research at The Pennsylvania State University. The magazine samples the diversity and drama of Penn State’s $805-million-a-year research program as a public service to inform, entertain, and inspire the University community. Opinions expressed do not reflect the official views of the University. Use of trade names implies no endorsement by Penn State. ©2013 The Pennsylvania State University. For permission to reprint text from Research/Penn State (U.Ed. RES 14-02), contact the editor: phone 814-865-3478. Publisher: Neil Sharkey, Interim Vice President for Research

Research/Penn State is now available in e-magazine format. Scan this code with your smartphone to view the interactive version, complete with videos on student and faculty research.

Director of Research Communications: Michael Bezilla
Editor: David Pacchioli
Production Manager: Joan Scholton
Designer: Heather Reese

On the Cover: Illustration of a hybrid conduit developed by a research team including Mohammad Reza Abidian, assistant professor of biomedical engineering. The design incorporates a soft hydrogel external wall and a conducting polymer as a supporting internal wall, creating a tunnel that guides the regrowth and reconnection of severed nerve endings. The eventual goal is to aid nerve regeneration in patients who have suffered spinal cord injury or other extensive nerve trauma. See story on page 27.

Credit: Mohammad Reza Abidian






3  A Message From the Vice President
   Neil Sharkey

4  A New Approach to Intellectual Property Management and Industrially Funded Research

10 Energy and the Environment
   Deep Trouble for Deep-Water Corals?
   Brown dwarf star spotted
   Invasive grass fuels increased fire activity in the West
   Wild plants are infected with many viruses and still thrive
   Engineering Biofilms
   Microbes make ‘clean’ methane
   Climate change had political, human impact on ancient Maya
   Jets’ contrails contribute to heat-trapping high-level clouds

22 Materials
   Fold Up that Television and Put it Away?
   Mussels inspire innovative new adhesive for surgery
   Smart fiber
   Acoustic cell-sorting chip may lead to mini med labs
   Hybrid tunnel may help guide severed nerves back to health
   A New Frontier

32 Social and Behavioral Sciences
   The Mexican Children of Immigrants Project—Interpreting the numbers
   Exposure to violence has long-term stress effects among adolescents
   Reactions to everyday stressors predict future health
   Time with parents is important for teens’ well-being
   An Ounce of Prevention
   Study shows bullying affects both bystanders and target

44 Health and Life Sciences
   A Cure for Leukemia?
   Gypsy moth caterpillars hormonal slaves to virus gene
   More virulent malaria parasites evolve when vaccine is used
   New method of resurfacing bone improves grafts
   In Touch With…Peter Hudson
   Compound stimulates tumor-fighting protein in cancer therapy
   New model of how brain functions are organized may revolutionize stroke rehab
   Overweight pregnant women not getting proper weight-gain advice
   Transforming health care through personalized medicine

58 Arts and Humanities
   Richards Center prepares next generation of Civil War scholars
   Gettysburg guidebook adds new research perspective to historic battle
   Religions play positive role in African AIDS crisis
   In Touch With…Helen O’Leary
   Indomitable Will
   Humanities mini-courses for doctors sharpen thinking and creativity
   Evolution helped turn hairless skin into a canvas for self-expression
   Labor into poetry

68 Cyberscience and Information Technology
   Red cell lab: on the cutting edge of security and risk analysis
   Technology only a tool in search for solutions to poverty
   Factors identified that influence willingness to use new information technology
   No LOL matter: tweens texting & language skills
   Technology convergence may widen digital divide
   AMON—An Eye on the Universe

74 Industry Partnerships
   Dressed to Kill (Cancer)
   From the Earth to the Moon, via Penn State
   Chevron is industrial partner of the year

A Message from the Vice President

This issue of Research/Penn State surveys the breadth and depth of the research enterprise at one of the nation’s premier land-grant universities. While by no means comprehensive, it provides a good snapshot of the far-reaching activities of our faculty and students.

It is an honor to serve Penn State as its Interim Vice President for Research. My thanks to Hank Foley, my predecessor and now Executive Vice President for Academic Affairs for the University of Missouri System. During Hank’s tenure as Vice President for Research and Dean of the Graduate School, research expenditures and graduate school enrollment continued to increase to record levels, helping make Penn State the largest non-governmental driver of economic impact in Pennsylvania. The University opened world-class research facilities, highlighted by the Millennium Science Complex. Foley led the effort to invigorate Penn State intellectual property, enabling us to become the nation’s first major research university to transfer ownership of IP rights to industry sponsors of research. Hank’s article detailing these changes and the thinking behind them appears in this issue.

Throughout my Penn State career I have worked with people leading some of the most creative and technically advanced research projects undertaken anywhere on the globe. The University is home to hundreds of internationally distinguished scientists conducting cutting-edge research aimed at improving the human condition. Through large multidisciplinary investigations and smaller discipline-specific studies, Penn State aims to increase our ability as human beings to meet the challenges of a rapidly changing planet. With researchers in the arts and humanities, life and biomedical sciences, social and behavioral sciences, and materials and engineering sciences, we bring together innovative thinkers who can find answers for the spectrum of societal problems.
I hope this issue of Research/Penn State, which also appears online as an e-book complemented by video clips relating to many of the stories in print, is to your liking. I look forward to sharing more ways that our University’s research activities impact the Commonwealth, nation, and world in future issues of Research/Penn State, the Penn State Research and Discovery Newswire, and my office’s internet homepage. Thank you!

Neil A. Sharkey

Interim Vice President for Research



A New Approach to Intellectual Property Management and Industrially Funded Research at Penn State

Michelle Bixby


by Henry C. Foley, Former Vice President for Research and Dean of the Graduate School


Today, land-grant universities such as Penn State are called upon to be engines for national and regional innovation. To fulfill that mission, these universities need all innovation avenues to be wide open for the two-way traffic that is translational research and development. At Penn State we are responding to that call by seeking to transform our culture from that of a traditional research-intensive, public land-grant university to one that is more dynamic and nimble and better able to drive the transfer of science into technology. As a part of this effort, in addition to fostering entrepreneurship, we seek to spur the growth and development of industrial research partnerships. The goal is to make the University a model for open innovation in the twenty-first century, while at the same time bringing us back to our core historical mission. To do all of this we have developed a seven-point plan to reinvigorate our culture, to be implemented over the next two years.


Of these seven points, none is more important than the second—to spur growth in research by taking a more flexible approach to IP ownership—nor has any garnered more attention from industry and from academia. After a thorough analysis, Penn State has concluded that it is no longer viable to maintain the long-held position that we must own all intellectual property that derives from any and all research that we do, even that which is the product of industry-funded research. This change in approach arises directly from a renewed engagement with our core mission to benefit students and society. It is our view that it is to the benefit of society, and to our students and faculty, to let the ownership of IP developed with industrial funds flow back to the sponsor. This, we believe, will catalyze more commercialization of new technology, help the University build stronger ties to practitioners, and create new adjacencies between theory and practice from which both students and faculty can learn. In this article, I lay out the factors that led the University to make this change and try a wholly new approach to IP developed at the University.

Penn State’s Seven-Point Plan for Reinvigorating the Research Culture

In late 2010, after much analysis of our past performance and in recognition of growing expectations that research will have economic impact and of our aspirations for the future, Penn State created a seven-point plan to foster a shift in our research culture. The goal is to reinvigorate and to reenergize those entrepreneurial members of our research community who are excited by the prospect of moving more Penn State research from the laboratory to the marketplace.

1. Create an Office of Technology Management, uniting functions performed by the Industrial Research Office and the Intellectual Property Office.
2. Spur growth in industry-funded research with more flexible intellectual property policies.
3. Manage master agreements in a way that provides real value to the industry partner and to the University by building end-to-end partnerships.
4. Create a culture of entrepreneurship by creating more trust, ownership, and excitement among the faculty.
5. Raise revenue by selling off existing university-owned intellectual property.
6. Rename and explain the conflict of interest policy to encourage participation and better protect faculty members and the University.

Background

Land-grant universities were established by the Morrill Act of 1862, which allocated land grants to each state and specified that all moneys derived from the sale of the lands aforesaid by the States, “…the interest of which shall be inviolably appropriated, by each State, …to the endowment, support, and maintenance of at least one college where the leading object shall be, without excluding other scientific and classical studies, and including military tactics, to teach such branches of learning as are related to agriculture and the mechanic arts, in such manner as the legislatures of the States may respectively prescribe, in order to promote the liberal and practical education of the industrial classes in the several pursuits and professions in life.”

This is the historical basis for Penn State’s mission, its raison d’être. As the nation’s first land-grant institution, Penn State’s core purpose has always been to do research with practical value and to disseminate that new knowledge for the betterment of society. For most of its history, the University did exactly that and did it well. The creation of new inventions and innovation was in the core mission, but we did not call it intellectual property, nor did we think in terms of its market value. For many decades, the university did not even seek to protect its inventions and innovations. Rather, as an institution well supported by public funding, Penn State simply disseminated findings as effectively and as quickly as possible, with little thought of institutional commercial gain.

This changed, for Penn State and for many universities, approximately three decades ago, circa 1980. In a significant departure from the practice of over 100 years, university inventions and technological innovations that were the products of our research, whether funded publicly or privately, were to be protected with patents and held as the institution’s intellectual property for license to industry. The shift had two drivers: the Bayh-Dole Act of 1980 and the recognition of the success of a few schools—for example Florida, Wisconsin, Michigan State—with blockbuster inventions that lifted them

Figure 1. Penn State’s research expenditures, federal and non-federal, over two decades, rising to $804 million.


7. Create the Techcelerator innovation center by co-locating the new Small Business Development Center, the new Office of Technology Management, the Office of Sponsored Programs, and the Ben Franklin Technology Partners’ Center with the New Business Incubator, the Innovation Park Management Office, and the Centre County Chamber of Business and Commerce.


to new levels of financial success. The Bayh-Dole Act had as its goal to push more of the products of federally funded research to the marketplace. To do so, it transferred IP rights from the funding agency to the university where the research was done. With this transfer of rights came the expectation that the university would show due diligence in seeking to license the intellectual property for commercialization.

The second driver—the realization that the products of university research that had been given away for decades could actually have very high value in the marketplace—led to the formation of university IP offices charged with protecting university research with patents and then licensing the technology for commercialization. It was thought at that time that the kind of market success that a few schools had seen could be systematized and duplicated elsewhere. Doing so, it was reasoned, would create a significant revenue stream, helping the institutions while serving to fulfill the goal of the Bayh-Dole Act.

While this thinking seemed sound, it had a few hidden assumptions beneath it: 1) that the percentage of university research that was of value to industry was large enough to justify the added costs of managing IP in this new way; and 2) that universities, which had never been in the IP business, would learn how to do this and would do it well. It also did not account for the very different perceptions of value and risk held by industry and academia. Universities tend to think that the subject of a patent is much closer to market than industry does, and so there is a built-in, significant difference in the perception of value.
Even more importantly, this new thinking did not account for the very real change the shift made in the unwritten compact with society that public land-grant universities had subscribed to for over a century—namely that they would serve as public institutions for the betterment of all, not for the betterment of the institution itself. One can argue that the seeds of today’s trend toward the privatization of public higher education were planted with this subtle shift. With the benefit of hindsight, it seems that such a significant change should have been approached more carefully, with a fuller analysis of unintended consequences. Today, it is easier to see how this shift in IP management could affect the mission of a public institution, but then it was viewed as simply pragmatic.

The Research and IP Experience at Penn State

Penn State’s experience over the last 30 years illustrates how the hidden assumptions in IP management have played out over three decades. Research expenditures have grown significantly at Penn State since 1990 as the efficacy of the University’s research enterprise has improved (Figure 1). As research has grown, so too has the activity in intellectual property—from just 66 invention disclosures in 1990 to over 200 in 2000 (Figure 2). At the peak of disclosure filings, about one-third of disclosures were converted into issued patents and this number was rising, as were expenditures for this activity.

It was already evident by 2001 that a problem was developing: expenditures to create and manage IP were very much outpacing revenues. By 2002, Penn State was expending about $1.9 million on patent costs each year, but the licensing revenue, including patent cost reimbursement, was well below that (Figure 3). To control this imbalance, the then-Vice President for Research concluded that fewer invention disclosures would be patented. An invention disclosure review committee was created to select those disclosures deemed to have enough promise to merit patent protection. This had the predictable effect of throttling the number of disclosures filed and the number of patents issued to Penn State each year. From almost 70 patents issued in the peak years of 2001 and 2002, the

number settled to about 35 to 40 per year. Disclosures that were not patented were abandoned. However, even as the number of disclosures filed and patents issued dropped over the course of the next decade, IP management costs continued to escalate. By 2010, the gap between expenditures and reimbursements had grown to nearly $1,000,000 per year.

Cost recoveries were never 100 percent and were never expected to be; however, the University’s intellectual property enterprise was expected to be self-supporting, and overall revenues were expected to exceed expenditures. Yet at no time in the last 30 years has this happened; expenditures continue to mount and always exceed revenues, as we wait for that one invention lucrative enough to compensate for this accumulation of losses. At this point, Penn State holds 579 active patents, the great majority of which have not garnered any interest, let alone produced revenue-generating licenses.

Data such as these must lead to serious questioning of the assumptions that have undergirded the last 30 years of university IP management. While I cannot state that Penn State’s experience has been the same as others’, I can say unequivocally that only a few schools have netted outstanding IP revenues; these cases are well documented and few in number. It seems likely that more than a few other public and private institutions have had relatively modest or little gain from their IP enterprises.

How, then, can we change this dynamic? We must begin by reexamining the motivations behind current practice. The Bayh-Dole Act prescribes how we handle IP derived from federally funded research. It leaves open, however, the handling of IP derived from industry-funded research. Nevertheless, Penn State, like other universities, persisted in applying the same standard to all research, insisting that if we did the research, we own the IP. Several reasons have been proffered for such rigidity. Bayh-Dole is often cited, as are requirements associated with the institution’s not-for-profit tax status. But driving the entire thought process is the notion that the IP developed in the course of industry-funded research is potentially more valuable because industry-funded research will be more applied, and thus any resulting invention will be closer to commercialization. The assumption is that the institution should share in this value.

None of these arguments stands against a logical approach. Bayh-Dole does not apply unless the industry funding is a pass-through of federal dollars or the research is based substantially on previous work done with federal dollars. The argument that the tax-free status of the institution could be jeopardized is one that needs to be considered at each institution, but tax regulations are a set of issues to be managed, not insurmountable barriers. Institutions routinely manage them, and manage them well, in other spheres of their business. Thus, the only argument that stands is that of the potential loss in license revenue if one were to let the ownership of the IP flow to the sponsor, and this argument is one that we can test with data.

Industry-funded research comprises at most 12 percent of research expenditures each year at Penn State. But of the 1,197 inventions disclosed at Penn State between 2000 and 2007, only 92 disclosures—less than 8 percent—resulted from de novo industry-funded research, which is to say that over 90 percent of the University’s IP is derived from other forms of research funding, namely that from federal and state agencies. This figure stands in stark contrast to expectations.
Given the argument for the higher value of industry-funded research, we had expected more disclosures per dollar expended than from federally supported research. On this basis, we expected the number of disclosures based on industry-funded research to have been 20 to 25 percent, or more like 300–400 disclosures.

Nor did those disclosures lead to a higher licensing rate, and commensurately higher revenues, than did disclosures from other research. Of the 92 invention disclosures resulting from industry-funded research, 30 led to 18 license agreements. Worse, of those 18 licenses, only 4 generated any revenue—a total of $92,000 between 2000 and 2007, or $13,000 per year. Even in the absence of detailed calculations, it is clear that both the return on investment and the cost-benefit ratio are markedly negative.

Clearly, this analysis does not support the argument that it would be financially irresponsible to let IP ownership flow to the sponsor. In fact, quite the opposite: it shows just how costly IP ownership is in general and how little return there is on investment in owning the IP from industry-funded research. Yet, because this analysis had never been done, Penn State had been negotiating vigorously with industry for ownership of IP resulting from sponsored research. We had done this for over 30 years, in a manner consistent with what we knew other universities were doing and in a way that led to much tension.

Figure 2. Invention disclosures received and U.S. patents issued over two decades.

Lost Opportunities to Innovate Openly and Collaboratively

This analysis raises some provocative questions: If Penn State had not insisted on retaining the ownership of IP that resulted from industry-funded research over the last 30 years, what would have changed? Would we have better served our mission as a public land-grant university? Would there have been more national and regional innovation had we yielded on this point? These are hard questions to answer and thoughtful people can come to quite different conclusions. However, we can all agree that doing research leads to new products and innovations and that doing more research, whatever the source of funding, generally increases the number of inventions and innovations. Within that framework, it becomes clear that lost opportunities—in the form of failed negotiations with potential research partners—likely did result in less innovation. Penn State does not keep records of unsuccessful research negotiations per se, but we do have an intuitive feel for the probabilities of success. In practice, there are actually two hurdles to be cleared in

negotiating research contracts with industry. The first is IP ownership. The second is the University’s licensing practices: under previous practice, the University would not establish a license cost until the invention had been reduced to practice so that it could be valued, which was unacceptable to many potential industry partners. Nor would Penn State promise not to license the invention to another company, should licensing negotiations with the sponsor fail. This presents potential research sponsors with the daunting risk that a successful research outcome could end up in the hands of their competitors.

If we assume that 50 percent of would-be sponsors accept the IP ownership provisions and that 50 percent of those will proceed even after the licensing arrangements are detailed (and 50 percent likely overestimates the real number of negotiations that succeed when these two issues arise), that means only 25 percent of potential research agreements would be successfully concluded. Removing the first hurdle by definition removes the second hurdle. So we can conclude that at least four times more industry-funded research could have been done over the last 30 years, had Penn State not followed the rest of academia in insisting on IP ownership.

It seems reasonable to conclude that if Penn State, as a top-tier research institution, could have done at least three to four times more research with industry, then it is probably true that other top research institutions could have as well. This is an enormous amount of research that was never done—at least in partnership with these universities. It represents an enormous loss to universities in terms of opportunities for both faculty members and students to acquire new learning, apply their knowledge in industry contexts, and develop new relationships and new interests. It is also a loss to industry and to the nation, in that new innovations with real commercial success and economic benefit could have emerged from such research. For these reasons we felt a new approach to this portion of our research enterprise was warranted.

Figure 3. Growth in IP expenditures (patent costs) versus licensing revenues over two decades; patent costs climbed to nearly $2 million per year while licensing revenues remained well below them.
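The negotiation arithmetic above can be made explicit. A minimal sketch, using the article’s own assumed 50 percent pass rates and its reported revenue figures (the pass rates are the author’s stated assumptions, not measured data):

```python
# Two sequential negotiation hurdles, each assumed (per the article)
# to be cleared by 50% of prospective industry sponsors.
p_ip_ownership = 0.50  # sponsors who accept university IP ownership
p_licensing = 0.50     # of those, sponsors who accept the licensing terms

p_agreement = p_ip_ownership * p_licensing
print(f"Share of negotiations concluded: {p_agreement:.0%}")  # 25%

# Removing the IP-ownership hurdle removes the licensing hurdle too,
# so potential industry-funded research scales by 1 / 0.25.
multiplier = 1 / p_agreement
print(f"Potential increase in industry-funded research: {multiplier:.0f}x")  # 4x

# Revenue side of the same analysis: $92,000 in total license revenue
# from industry-funded disclosures over 2000-2007, about seven years.
total_revenue = 92_000
print(f"Average revenue per year: ${total_revenue / 7:,.0f}")
```

The fourfold figure is simply the reciprocal of the 25 percent success rate, which is why the article treats it as a lower bound on the research foregone.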

A New Approach to Industry Partnerships

From this analysis, Penn State has dispassionately concluded that its insistence on ownership of the IP resulting from industry-funded research is not beneficial to the institution, to our students, or to the public that the University was established to serve. We are now resolved to engage with prospective industry partners from a more open, flexible stance, captured in four simple principles intended to guide negotiations with industry:

1. The value to Penn State of industry-sponsored research lies in the research itself, in the support of that research, and in the relationship with the partner, not in the creation and ownership of IP.
2. The best agreement is the simplest form of agreement that is necessary and sufficient to meet the needs of the program and reduce negotiation to a minimum.
3. If industry funding is a pass-through of federal dollars, or if industry funding is matching federal funding, then Penn State must retain ownership of IP by law. In such cases, Penn State will offer flexible licensing options.
4. When there must be an exception to the above principles, Penn State will strive to explain it fully and clearly to the industry sponsor and then seek the best way to handle the exception to the benefit of both partners.

The first principle gets at our core mission, which is to do research and teaching for the benefit of society and of our students. Therefore, the real value of industry-sponsored research lies in the value of the research itself, in the economic benefit it may provide to society, and in the deeper relationships it fosters between faculty members and students and their counterparts in industry. Indeed, engagement in industry-sponsored research benefits even students who are not directly involved. Faculty members who engage with practitioners develop a deeper appreciation of the connections between theory and practice, which informs their teaching, opens up new channels of thinking, and makes them better able to prepare students for the working world.

Conclusions

At Penn State, we have always been mindful of the core and strategic aspects of our mission, and we take these very seriously. Key among these is student success; Penn State’s first-place ranking in the 2010 Wall Street Journal survey of corporate recruiters reinforced this for us. The first priority in the University’s strategic plan is to ensure student success, and this new approach will help us to do this. Although the law makes clear that we cannot change our responsibilities in the management of IP resulting from research funded with federal dollars, the fact that there is so much more research that can be done with industry makes this change well worth pursuing.

In 2012, Penn State and other land-grant institutions marked the one hundred fiftieth anniversary of the Morrill Act. At Penn State, 2013 is also the 150th anniversary of graduate education and of research. Though many things have changed markedly since our founding, our role is not that different today than it was at the University’s inception: to drive progress for the nation and to help young people make their way to a better life. Both from the practical and the historical vantage points, taking this new approach to intellectual property management at Penn State makes good sense, and it is consistent with our principles.

Reprinted with permission from Research-Technology Management, September–October 2012.


Energy and the Environment ›

Deep Trouble for Deep-Water Corals?

At the bottom of the Gulf of Mexico, in the vicinity of the Deepwater Horizon oil spill, biologist Charles Fisher discovered previously unseen impacts on coral communities.


By Sara LaJeunesse

Billions of dollars.

That’s what’s at stake for BP as a result of the damage caused to ecosystems in the Gulf of Mexico by the Deepwater Horizon oil spill. News of that spill—which began on April 20, 2010, with an explosion onboard the Deepwater Horizon drilling rig that killed 11 people and injured 17—dominated the media for weeks. Millions watched with a feeling of helplessness as the rig sank and, over the next 86 days, over 200 million gallons of oil spewed out of the Macondo well and into the ocean. Five months after the spill was capped, the federal government estimated the marine animal death toll at 6,104 birds, 609 sea turtles, and 100 mammals, including dolphins. But what of the deep-water corals that provide habitat and reproductive grounds for numerous species of fish, shrimp, and crabs?


According to Charles Fisher, professor of biology at Penn State, these corals and the organisms they support are important components of a healthy deep sea and open-ocean ecosystem. That’s why both BP and the government are closely collaborating with him on his investigation of the disaster’s impact. “It’s a new experience for me to conduct research that could have such a dramatic financial impact and also to have so many people involved in everything we do,” says Fisher. “You have to be very careful to document all the details and be very sure that you’re right with your interpretations. We’re always careful, but every little comment we make could be misinterpreted, so we’re being extra conservative with this data set.”

Calling on a World Expert

It was the middle of May, about a month after the oil spill began. With classes over, Fisher was looking forward to spending a little extra time on his farm, located 25 miles east of State College. But that was before the calls started to come in from federal agencies.

Over a period of about a week, Fisher was contacted independently by program officers from the National Science Foundation, the National Oceanic and Atmospheric Administration (NOAA), and the Bureau of Ocean Energy Management. All had financially supported Fisher’s research in the Gulf in the past, and all were now calling on him to help assess the impact and damage of the oil spill to the deep-sea ecosystems he knows so well.

Fisher “was selected as an expert based on his extensive and unique experience working on the ecology of the cold seep and deep-sea coral communities in deep-sea, hard-bottom habitats in the Gulf of Mexico,” says Robert Ricker, southwest region branch chief of NOAA’s Office of Response and Restoration. “He is a recognized leader in his field, and we pick leaders.”

Fisher agreed to help. After all, he already was leading another big research program that had overlapping goals—to locate, describe, and study deep-water coral communities throughout the Gulf of Mexico that could potentially be impacted by energy company activities.

For nearly three decades, Fisher has been studying the physiology and the ecology of the communities of animals that inhabit cold seeps—areas of the ocean floor where methane and other hydrocarbon-rich fluids seep out—and hydrothermal vents—underwater fissures in the Earth’s surface that emit geothermally heated water rich in reduced chemicals—in the deep sea. Marine invertebrates such as clams and tubeworms live in these dark places, surviving the lack of sunlight by forming symbiotic associations with bacteria. The bacteria use the reduced chemical compounds contained in the water as an energy source and, in turn, supply nutrition to their animal hosts.

Fisher has visited these deep places in submarines some 120 times. “When you’re down there, you feel like you’re on another planet because the landscape is like nothing you’ll see on the surface of the Earth,” he says. “You’re oftentimes in a place where nobody has been before, so you have in the back of your mind that you may see something that nobody has ever seen. Every once in a while you do.”

Charles Fisher Professor of Biology

Among his accomplishments are the discovery of ice worms living on methane-rich ice at the bottom of the Gulf of Mexico and the unraveling of the complex physiological ecology of giant hydrocarbon-seep tubeworms, among the longest-lived animals on Earth. The bizarre two-meter-long tubeworms use their buried roots to suck up toxic hydrogen sulfide that lies deep in the sediments of the seafloor. They then pass the hydrogen sulfide to symbiotic bacteria living inside their bodies. These bacteria, in turn, oxidize the sulfide and provide nutrition back to the worms. The end product is sulfuric acid, which the tubeworms pump back into the sediments, where yet other bacteria use methane to remake the sulfide and supply it back to the worms.

Whenever possible, he works with Jim Brooks, president and CEO of TDI Brooks International, a company that specializes in conducting offshore surface geochemical exploration for petroleum producers. “Jim’s group discovered seep communities in the Gulf of Mexico in the 1980s when he was on the faculty at Texas A&M University,” says Fisher. “I’ve been involved in multiple projects with him over the years. In addition to his expertise in oil geochemistry and prospecting, his company can handle all the administration, travel, budgets, and reporting, and I get to just concentrate on the science.”

So in October 2010, with TDI Brooks International managing the expedition, Fisher and his colleagues set out for the Gulf of Mexico on board the NOAA ship Ronald H. Brown.

Discovering Damaged Corals

For nearly a month, the team revisited deep-sea coral sites all over the northern Gulf of Mexico that they had discovered the year before during a previous project. Each time they stopped, they used Jason II—a remotely operated vehicle (ROV), or submersible, designed for scientific investigation of the deep ocean and seafloor—to sample and study corals and associated animals.

“We revisited all of the sites for which we had good baseline data,” says Fisher. “We were all quite pleased to find that there was no obvious damage to the deep-water coral communities at any of these sites.”

Although they had covered a 400-mile span east to west and a depth range from 1,300 feet to almost 6,500 feet, Fisher and his colleagues had observed only a couple of coral sites close to the Macondo well. So, on the last dive of the expedition they decided to check out a very promising area they had identified about seven miles southwest of the well and 45 miles from shore.

The research vessel coasted to a stop with nothing but the occasional seabird in flight to break the monotony of the view. Six hours into the ROV’s dive, Fisher was working in the ship’s laboratory, glancing up every now and then at the 36-inch screen through which video was streaming from the vehicle’s camera, now positioned 4,500 feet below the ocean’s surface. As the ROV moved across the seabed, the camera recorded scenes of mud, mud, and more mud, he remembers. Then, all of a sudden, a coral popped into view, and another and another. But something was wrong. The animals were not brightly colored as they are supposed to be. Fisher recalls jumping up and sprinting across the deck of the ship to the control van. “Stop!” he warned. “Don’t touch anything!”

Sea Floor Explorer: Watch Charles Fisher discuss his deep-ocean experience in the Gulf of Mexico.


The ROV pilots were about to take a sample, but he asked them instead to zoom in with the camera. What he saw were corals covered in dark gunk and dripping snot. “When a coral is physically insulted, it reacts by exuding mucus,” he explains. “It’s a normal stress reaction. It helps to clear the surface if there’s something irritating or sticking on it.” To avoid stressing the animals further, the team decided to minimize sampling.


“Normally we would take little pieces of lots of different corals for genetic identification and population genetic studies,” Fisher says, “but we decided to back off on that and try to do our sampling around the edges, taking only samples of corals that we didn’t recognize. We also collected one of the impacted corals so we could take a closer look at the gunk and what was underneath and determine whether the coral branch was dead or alive.”

By the end of the cruise, the team had visited 14 sites, all but one of which were at distances greater than nine miles from the Macondo well. Only corals at that last site, just under seven miles southwest of the well, had clearly been impacted.

As the researchers headed home with their samples, they began to discuss future expeditions. They knew that impact to at least some corals could be readily identified visually and, since the organisms are attached to rocks and don’t swim or float away when impacted, they provide a record of past events. Their next steps would be to discover the full extent of the oil spill’s reach with regard to corals, and to determine the animals’ ultimate fate. Would they live or would they die?

The Impact

On five subsequent cruises over the next two years, Fisher and his team explored for additional sites and revisited the established ones to check the corals’ status. They have carefully monitored about 50 of the corals that they first discovered in November 2011. Those that were not too heavily impacted seem to be recovering. “When I say recover,” notes Fisher, “I don’t mean that tissue died and the coral got better. I mean they were covered with slime, but they never died. These corals still do not look as healthy as corals at other sites, and we may have to monitor them for several years before we will know their ultimate fate.”

The corals that were heavily impacted, on the other hand, are largely not recovering. “We are seeing absolute proof of total death of parts of them,” says Fisher. Since corals are colonial, branching animals, parts of them can die while other parts remain alive. Specifically, at the first damaged site they witnessed—the last site of the October cruise—the researchers have discovered that 86 percent of the coral colonies show signs of damage, with 46 percent exhibiting impact to more than half the colony, and 23 percent displaying more than 90 percent damage.

At each site visited, the researchers deployed markers and set up permanent monitoring stations with a goal of returning to them again and again to monitor both natural processes and, potentially, long-term effects. “At that depth and at those temperatures in the deep sea, life passes at a slow pace,” notes Fisher. “These are animals that often live 500 years. They live slow; they die slow. We’ll have to monitor the sites for a decade before we’ll have very much confidence we know the full extent of the impact.”

What’s Next?

The team’s second cruise, which took place in December 2010 and made use of the Alvin deep-diving submarine, included Helen White, a geochemist from Haverford College. White used state-of-the-art oil fingerprinting technology and determined that the brown muck on the corals did, indeed, include oil from the Macondo well.

Fisher’s research to date has demonstrated that the Deepwater Horizon oil spill killed some corals. As a result, BP is going to have to pay. But how much and to whom?

Chuck Fisher/Timothy Shank

“People have asked me how much a dolphin is worth, and there is no clear-cut answer,” says Timothy Zink, spokesperson for NOAA, the organization that oversees natural resource damage assessments performed by researchers like Fisher, tabulates the check for the parties responsible, and formulates and carries out a plan for restoring the ecosystem.

The Deep Submergence Vehicle Alvin working at the coral site found to be impacted by the oil spill from the Macondo well in the Gulf of Mexico.


“The public needs to be compensated for its losses, and not just for the resource itself, but for the human use of the resource—such as recreational fishing, bird watching, and going to the beach—as well,” says Zink. “The final price that BP will pay will be based on the full cost of restoring the environment back to what it was on the day the oil spill happened.”

Unfortunately for deep-water corals, the full effects of the spill may not be felt for many years, too late for any near-term settlement to fully cover them. “I believe everyone involved would like to settle as soon as we can,” says Fisher. “However, the full extent of damage to deep-sea ecosystems may not manifest itself until after a settlement is reached. If corals all over the deep gulf start dying, and we thought only those very close to the Macondo well would die, then we have to reassess the situation.” In that case, Zink says, the investigation could be reopened.

BP has already paid over $20 billion to cover some of the damages from the spill, and in a November 2012 settlement with the Justice Department, agreed to pay $4 billion in criminal fines. The company has also committed hundreds of millions to research into understanding the effects of oil spills on ecosystems and preventing future disasters.

Despite the trouble the oil spill caused for deep-sea ecosystems, Fisher says he’s not against deep-water drilling for oil. “As much as I love the ocean, there are a lot of resources in the ocean, and as long as I drive a car, it would be pretty hypocritical of me to say that we shouldn’t obtain those resources for human use,” he notes. “I’m conflicted in the way I feel about it, but I don’t think this means we should stop accessing oil in the marine environment.

“I think, in general, oil companies try pretty damn hard to be responsible,” Fisher adds. “It’s in their best interest to be responsible. This has cost BP billions of dollars; they don’t want it to happen again. In a way, this oil spill has been a beneficial wake-up call in that it tells us that the unthinkable can happen. I think a result of it will be better oversight by oil companies and the federal government.”

Charles R. Fisher, Ph.D., is professor of biology. —SL

How Does Oil Affect Corals?

By the time a coral that has been exposed to oil shows signs of sickness, it may already be close to death. That’s why Iliana Baums, assistant professor of biology at Penn State, is investigating the use of molecular tools to detect signs of stress in corals before they become ill, and also to determine just how much oil is lethal to corals.

“Our goal is to understand how deep-sea corals react to oil and dispersant exposure so that ecosystem managers can be better prepared to respond during and after oil spills,” says Baums, who is using funds from a grant she and Fisher recently received from BP’s $500-million Gulf of Mexico Research Initiative to do the work.

“The surface of corals’ thin tissue layer is covered in mucus, in a way that is similar to the mucus lining of our body cavities,” says Baums. “Just like us, this lining contains a great number and diversity of bacteria. When the environment changes, for example by adding a lot of oil to the water, so will the bacterial community in the mucus, thus providing us with something we can observe in corals exposed to oil.”

Baums plans to grow live corals in aquaria in her laboratory, expose them to different quantities of oil and dispersant, and monitor which genes get switched on, which proteins are made, and how the bacteria change in the mucus layer. She will compare the results to those of animals sampled in situ in the Gulf of Mexico that were not affected by the oil spill. “These data will help us understand how oil spills and the dispersants used to clean up those spills affect the corals that build the three-dimensional structure of deep- and shallow-water reefs,” Baums says.

Iliana Baums, Ph.D., is assistant professor of biology.


Brown dwarf spotted

Astronomers pick up record-breaking radio waves from ultra-cool star


Astronomers using the world’s largest radio telescope, at Arecibo, Puerto Rico, have discovered flaring radio emissions from an ultra-cool star, shattering the previous record for the lowest stellar temperature at which radio waves were detected and possibly boosting the odds of discovering life elsewhere in the universe.

The team, led by Alex Wolszczan, Evan Pugh Professor of Astronomy and Astrophysics at Penn State, has been using the 305-meter (1,000-foot) telescope to look for radio signals from a class of objects known as brown dwarfs: small, cold stars that bridge the temperature gap between Jupiter-like giant planets and hotter, more massive, hydrogen-fusing stars. The astronomers hit the jackpot with a star named J1047+21, a brown dwarf 33.6 light-years away in the constellation Leo.

“This object is the coolest brown dwarf ever detected emitting radio waves—it’s half the temperature of the previous record holder, making it only about five times hotter than Jupiter,” according to team member Matthew Route, a graduate student at Penn State. That means it’s much smaller and colder than our Sun, Route adds. With a surface temperature not much higher than that of a giant planet, and a size comparable to Jupiter’s, it is scarcely visible in optical light. Yet the radio flares detected at Arecibo show it must have a strong magnetic field, implying that the same could be true of other similar stars.

The possibility that young, hot planets around other stars could be detected in the same manner—because they still maintain strong magnetic fields—has implications for the chances of finding life elsewhere in our Milky Way galaxy, Wolszczan explains. “The Earth’s field protects life on its surface from harmful particles of the solar wind. Knowing whether planetary magnetic fields are common or not throughout the Galaxy will aid our efforts to understand chances that life may exist beyond the solar system.” —Robert Minchin and Barbara Kennedy



Above: An artist’s impression shows the relative sizes and colors of the Sun, a red dwarf (M-dwarf), a hotter brown dwarf (L-dwarf), a cool brown dwarf (T-dwarf) similar to J1047+21, and the planet Jupiter. Credit: NASA/IPAC/R. Hurt (SSC) Below: An artist’s impression of a brown dwarf similar to J1047+21. Credit: R. Hurt/NASA.

Invasive grass fuels increased fire activity in the West

An invasive grass species may be one reason fires are bigger and more frequent in certain regions of the western United States, say researchers from Penn State and several other institutions.

The research team used satellite imagery to identify cheatgrass, a plant species accidentally introduced by settlers in the West during the 1800s, in a disproportionately high number of fires in the Great Basin, a 600,000-square-kilometer arid area in Nevada, Utah, Colorado, Idaho, California, and Oregon. “Over the past decade, cheatgrass fueled 39 of the largest 50 fires,” notes Jennifer Balch, assistant professor in the Department of Geography and the Earth and Environmental Systems Institute. “That’s much higher than what it should be when you consider how much of the Great Basin that cheatgrass covers.”

The average size of the fires in cheatgrass grasslands, which dominate only about 6 percent of the Great Basin, was significantly larger than the average fire in most regions dominated by other vegetation, such as pinyon-juniper.

The researchers also found that cheatgrass may play a role in increasing the frequency of fires, adds Balch. “From 2000 to 2009, cheatgrass burned twice as much as any other vegetation,” she says. As a result, landscapes dominated by the grass have a fire-return interval—the time between fires in a region—of 78 years, compared to other species like sagebrush, which has a 196-year fire-return interval.

“What’s happening is that cheatgrass is creating a novel grass-fire cycle that makes future fires more likely,” says Balch, who started her investigations at the National Center for Ecological Analysis and Synthesis. “Fire promotes cheatgrass and cheatgrass promotes fires.” —Matthew Swayne

Mike Bellant, BLM

Wild plants are infected with many viruses and still thrive

Marilyn Roossinck
Professor of Plant Pathology and Environmental Microbiology and Biology

Researchers have studied viruses as agents of disease in humans, domestic animals, and plants, but a study of plant viruses in the wild may point to a more cooperative, benevolent role for the microbe, according to a Penn State virologist.

Marilyn Roossinck, professor of plant pathology and environmental microbiology and biology, has examined more than 7,000 individual plants for viruses. “Most of these wild plants have viruses,” Roossinck said. “But they don’t have any of the symptoms that we usually see in crop plants with viruses.”

In fact, studies indicate that viruses can be beneficial to some plants, making them hardier and helping them survive extreme temperatures and drought, said Roossinck. “When most people think of viruses, they think of serious diseases and death, such as the AIDS virus,” she said.

On a research trip in Costa Rica, a biodiversity hot spot in Central America, Roossinck observed that most of the approximately 10,000 species of wild plants at the study site appeared healthy. Commercial crops—melons, oranges, pineapple, and aloe—growing near the site, however, were not as healthy.

“When I went to the forest, the wild plants looked healthy and gorgeous,” Roossinck said. “Then, I went 10 kilometers away and the plants in the agricultural field were not looking so healthy. In the forest the plants are full of microbes: viruses, fungi, and bacteria, whereas in crops farmers try to eliminate the microbes. Perhaps there is a connection.” Indeed, one plant virus that was found frequently in the forest was also found in nearby melon crops. In the melons it was causing severe disease, while in the wild plants there were no symptoms. Analyzing the viruses suggested that they were moving from the crops into the wild plants, but somehow the wild plants remained healthy. Roossinck said she is curious about how the wild plants avoid disease, and whether there is a way this can be used in agriculture.



Engineering Biofilms

Understanding how bacteria function in communities could lead to a host of new applications.

By David Pacchioli



“Anywhere there’s a surface and water in the liquid state,” Tom Wood confirms, “you’re going to have biofilms.”

In riverbeds and showerheads. On the hulls of ships and inside pipelines. On contact lenses and joint prostheses and the gleaming white surfaces of your teeth.

Biofilms, says Wood, professor of chemical engineering and biochemistry at Penn State, “are communities of bacteria that have the ability to cement themselves to a solid surface, and then—if you picture them in a river, say—rather than going with the flow they anchor down to a rock, and as the river goes by they get the nutrients they need and they’re able to thrive.”

“Communities” is the operative word. The biofilm that coats your teeth harbors more than 300 species of bacteria, working in concert. Most of these microbes either do no harm or are actually beneficial, but the few bad actors can saddle you with tooth decay and gum disease.

Biofilms cause corrosion, a huge economic drain on industry and infrastructure, and are also increasingly recognized as a leading culprit in chronic disease, from childhood middle-ear infections to cystic fibrosis. Hospital infections are largely due to their ubiquitous presence. “In joint replacement surgery,” Wood says, “if an infection takes hold, there’s no drug they can add to get rid of it. They have to go back in, take out the original prosthesis, and put another one in—and hope the same thing doesn’t happen all over again.” Over 65 percent of all microbial infections are attributable to biofilms, according to the National Institutes of Health.


Tom Wood

Professor of Chemical Engineering and Biochemistry

These complex microbial communities, in short, cause a variety of problems, both inside the human body and out. But they also have the potential to do great good, from wastewater treatment to oil-spill clean-up to producing alternative fuels—if their biochemistry can be controlled. Wood believes that it can. “The whole idea of my lab,” he says, “is that if we can understand the genetic basis of biofilm formation, then we can either get rid of a biofilm, or promote it to do whatever we want.”

Sleeper Cells

The Dutch scientist Anton van Leeuwenhoek first noticed biofilms back in 1683. Placing a scraping of plaque from his own teeth under one of his first-generation microscopes, he spotted a host of “very little living animalcules, very prettily a-moving.” For most of the next 300 years, however, biofilms were largely ignored, as microbiology focused on individual organisms in their free-floating, or planktonic, state.

“But bacteria do have this desire to hunker down and form an attachment to a solid surface,” Wood says. “That’s the way they are in nature, primarily—living in communities. Over the last couple of decades, scientists have started to look at that state, and there’s been an exponential increase in biofilm literature and studies. There’s now even a mouthwash that talks about including anti-biofilm compounds—so the public’s waking up to it, too.”

Living in communities, bacteria are much hardier than when floating around free. They’re far more resistant to antibiotics—up to a thousand times more resistant, according to common estimates. “They’re much harder to kill,” Wood acknowledges, “but they’re even trickier than that.”

Standard antibiotic treatments, he notes, target bacteria that are growing, dividing, evolving. “But in a biofilm, up to 10 percent of the population is not actively metabolizing.” Under antibiotic attack, Wood explains, these bacteria in effect “put themselves to sleep” to avoid destruction. “If a cell is asleep, not dividing, the antibiotic has no effect,” he says. Then, when the coast is clear and the drug has run its course, these sleeper cells have the ability to wake themselves up and kick off a whole new infection. Appropriately, they’re called persisters.

Their discovery is fairly recent, and when and how they work are hot topics among researchers of infectious disease. “What’s really fascinating to me,” Wood says, “is that they don’t undergo genetic change at all. There’s no mutations, no change in the DNA. It’s the opposite of building up genetic resistance.”

Chemical Messages

Wood arrived at Penn State in January 2012 to fill the Endowed Biotechnology Chair in chemical engineering, with a joint appointment in biochemistry and molecular biology. “I’m a microbiologist in practice,” he likes to say, “but an engineer by training.” As such, he has always had an eye for down-the-road applications.

Complex microbial communities have the potential to resolve practical problems in ways that combine the laws of nature with engineering principles.

“Right out of college,” he remembers, “I was working in industry, making things that kill bacteria.” At Rohm and Haas, the chemical manufacturing giant headquartered in Philadelphia, Wood developed cosmetics. “Then I just got interested in understanding more about how bacteria live,” he says simply. He made the decision to go to graduate school at North Carolina State, focusing on biotechnology. “At first, I was just interested in trying to clean up the world—engineering bacteria to get rid of toxic waste,” Wood recalls. Then he started thinking more broadly, about sustainable practices for other types of chemical manufacture. “We got to wondering, how could we use these bacteria we were creating to do remediation, and also to do green chemistry? And we figured it would have to be in biofilm reactors”—engineered systems for growing and exploiting bacterial communities. But in order to build successful reactors, Wood knew, he first had to get a better handle on how biofilms form.

Biofilms are, he suggests, “basically the beginning of a tissue—the beginning of us. What I mean is that as these bacterial cells join together and grow, they differentiate, like the cells of higher organisms. It’s not like one group of cells becomes a tooth and another group becomes an ear, as in a developing mouse or a human. But they differentiate themselves by turning on different genes at different times, according to what’s needed.” Acting in the common interest requires communication, something bacteria achieve by cell-to-cell signaling. “Bacteria cells, whether free-floating or attached, are constantly secreting chemical signals,” Wood explains. As cells aggregate, however, held together by the slime that makes up the biofilm matrix, the concentration of signals increases. “The chemical will build up and up, and eventually you’ll reach a threshold,” he says. At that point, the signal crosses back into the cell and spurs it to act in some appropriate fashion. “It’s called quorum sensing. This is how the cell monitors what’s going on around it.” Experiments have shown that quorum sensing figures in a remarkable range of coordinated behaviors. As Wood notes, “Some cells in the disease state will hide until they reach critical numbers and realize they can overwhelm the immune system. Then they attack.” Other biofilms will “agree” to a division of labor: one group of cells will remove oxygen and another group will secrete building blocks for the community.

Cell-Cell Communication

Bacterial cells living in biofilm communities “talk” to each other through chemical signals. As more and more cells aggregate, the concentration of signals increases. “The chemical will build up and up, and eventually you’ll reach a threshold,” Tom Wood says. “This is how the cell monitors what’s going on around it.”

In still other cases, a cell may be programmed to attack itself. As Wood explains it, bacterial cells contain enzymes called toxins, and, typically, corresponding antitoxins that—under normal circumstances—hold the toxins in check. In E. coli, the most-studied of all bacteria, researchers have so far identified 37 of these toxin/antitoxin pairings.


Bacteria Bound by a Biofilm


Janice Haney Carr/CDC

An electron micrograph shows round Staphylococcus aureus bacteria, bound by a biofilm, on the surface of an indwelling catheter. Biofilms shield bacteria from antibiotic attacks and are responsible for most hospital infections.

Under certain stresses, however, the antitoxins can be eliminated, freeing the toxin to damage the cell. Intriguingly, in the case of persister cells this is not a calamity but a survival choice. “The cell is not trying to kill itself,” Wood explains, “but just to slow down its growth rate, or make itself go to sleep.” Figuring out the nuts and bolts of how a persister actually achieves this feat and then manages to ‘wake up’ again when the time is right, he adds, “has been one of the thrusts of my lab.”

When first approaching the problem, Wood says, he figured the cell’s options under the circumstances are somewhat limited. “If you look at the classic steps of protein synthesis, you go from DNA to messenger RNA, and then the messenger RNA becomes protein,” he notes. If you’re a toxin, and you want to stop the cell from producing a protein, “You could eat up all of your DNA, but if you do that you’re never going to wake up, because you’ve lost your genetic blueprint. You could gobble up the protein itself—but the protein is changing all the time anyway, to adapt to conditions.

“That leaves the thing in the middle—RNA. What the toxin will do is eat that intermediate message, and thereby put the cell to sleep.” A series of experiments confirmed his hunch: “We found the specific enzymes that eat the RNA.”

Harnessing the Potential

In 1999, at the University of Connecticut, Wood became one of the first researchers to engineer a biofilm for a real-world application, putting a protective film on mild steel to prevent corrosion. “We knew biofilms are going to form naturally on metal,” Wood remembers, “so we engineered the ‘good’ bacteria that were present to secrete small antimicrobial peptides that would inhibit the ‘bad’ bacteria—in this case sulfate-reducing bacteria that grab electrons from the metal and make it corrode. That was the beginning—the first inkling that we could control biofilm reactions.”

A decade later, he and colleagues discovered that fluorouracil, well known as an anticancer compound, could also be deployed to prevent quorum sensing. “There’s a company now in Canada that is using it to coat hospital catheters,” he says. The protective layer is intended to slow the inevitable formation of biofilm on the catheter surface, lessening the risk of infection.

Wood has studied another family of cell-signaling disruptors, called furanones, in a species of seaweed that lives off the coast of Australia. Marine biologists had noted that this species doesn’t have a slimy feel to it like most seaweed does. Subsequent work showed that the seaweed secretes furanones to turn off signaling between bacteria and prevent a biofilm from forming on its surface and interfering with its process of photosynthesis. “We figured out how it worked,” Wood says, “and we showed that it could decrease signaling in E. coli by tens of thousands of times.” Furanones are now being considered as a possible alternative to standard antibiotics in the aquaculture industry, where antibiotic resistance is a serious problem.

Dispersal on Command: See a visualization of bacterial strains engineered to dissolve “on command” via cell-signaling.


Engineering Biofilms

Janice Haney Carr/CDC

In a Nature Communications paper published in January 2012, Wood and Arul Jayaraman of Texas A&M reported still another important advance. After characterizing a previously unknown signaling protein called BdcA, they engineered it to make biofilms disperse on command. As Wood explains, the researchers first laid down a biofilm of a certain bacterial strain; then they introduced a second biofilm of a different strain, programmed to release a signal that would cause the first biofilm to break up. When it had done so, they activated another chemical signal to make the second biofilm dissolve. “What this experiment shows,” Wood says, “is that we can control bacteria in consortia—more than one at a time. That means we can hope to control biofilm formation for more complicated applications.”

The next hurdle is to be able to dictate the positions of individual bacteria within a given biofilm. “We’re trying to create biofilms with hundreds of different microenvironments,” he explains, “and different kinds of chemistries occurring at different positions. If we can pull that off, we will have gone a long way to show how biofilms could be used in a biorefinery.” Doing so would also bring Wood closer to his old graduate-school dream of a sustainable chemistry.

“In my profession,” he says, “we need to manufacture chemicals. With conventional chemistry, that often requires harsh, polluting processes and solvents, and results in lots of waste.

“But what’s becoming clear is that just about anything you can make by conventional means, you can also make with a bacterium, with enzymes, and you can do it all in water. At the end of the day, then, everything is biodegradable. In essence, you can make the same chemicals for the same price without hurting the environment.

“That’s green chemistry. And that’s the kind of thing that I envision.”

Microbial Cunning

“The most basic part of our research,” Tom Wood says, “is the toxin/antitoxin systems that are key to persistence with antibiotics. Twenty years ago, nobody knew why they were there. Why would the bacterial cell incorporate something that could hurt it?

“Our research has since been able to link these systems to stress resistance, to biofilm formation, even to a kind of molecular altruism. Under viral attack, some cells will actually kill themselves to save the cells around them.”

Amazingly, bacteria have apparently pilfered the machinery of toxin/antitoxin from their arch enemy—viruses. “There’s always been this war between viruses and bacteria,” Wood explains. “It’s been going on for at least two-and-a-half billion years—and the viruses are winning.”

When a virus invades a bacterial cell, it incorporates its DNA into the cell’s DNA, so that whenever the cell divides, a copy of the virus will be made. “That’s its whole reason for being: to reproduce itself,” Wood says. “If the cell stops dividing, the virus will jump out, kill its host—it’s not a very polite guest—and go off in search of other healthy cells to invade.

“Meanwhile, though, spontaneous mutations are always happening in the cell’s DNA—that’s part of the way the bacterium evolves. Well, every once in a while a mutation will occur within the region of the embedded virus, before that virus has a chance to jump out.” With its genetic instructions scrambled and thus disabled, the virus in effect is captured. Frozen in time.

“So,” Wood continues, “you have some viruses that are inside the bacterial chromosome that have been stuck there unchanged for 50 million years—we call them viral fossils. And when we look at the genes of these fossils, we find they code for toxin/antitoxins.

“This is the kind of thing that continues to fascinate me. The cell is clever. It takes and adapts the tools of its enemy in order to control its own metabolism.” —DP

Thomas K. Wood, Ph.D., is Endowed Biotechnology Chair and professor of chemical engineering and professor of biochemistry and molecular biology.

Magnified image of an untreated water specimen from a wild stream, with biofilm.

Energy and the Environment ›

Microbes make ‘clean’ methane

Microbes that convert electricity into methane gas could become an important source of renewable energy, according to scientists from Stanford and Penn State universities. Researchers at both universities are raising colonies of microorganisms—methanogens—with the remarkable ability to turn electrical energy into pure methane, the key ingredient in natural gas. The goal is to create large microbial factories to transform clean electricity from solar, wind or nuclear power into renewable methane fuel and other valuable chemical compounds for industry.

“Most of today’s methane is derived from natural gas, a fossil fuel,” says Alfred Spormann, professor of chemical engineering and civil and environmental engineering at Stanford. “And many important organic molecules used in industry are made from petroleum. Our microbial approach would eliminate the need for using these fossil resources.”

While conceptually simple, there are significant hurdles to overcome before electricity-to-methane technology can be deployed at a large scale, notes Bruce Logan, Evan Pugh Professor and Kappe Professor of Environmental Engineering at Penn State. “That’s because the underlying science of how these organisms convert electrons into chemical energy is poorly understood,” Logan says.

Methanogens cannot grow in the presence of oxygen. Instead, they regularly dine on atmospheric carbon dioxide and electrons borrowed from hydrogen gas. The byproduct of this microbial meal is pure methane, which methanogens excrete into the atmosphere.
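In chemical terms, that microbial meal amounts to a textbook reaction (a standard summary of methanogenesis, added here for reference; it does not appear in the original article). With hydrogen gas as the electron donor, and with electrons supplied directly from an electrode, the overall conversions are:

```latex
% Methanogenesis from carbon dioxide, with hydrogen gas as the electron donor:
\[ \mathrm{CO_2} + 4\,\mathrm{H_2} \;\longrightarrow\; \mathrm{CH_4} + 2\,\mathrm{H_2O} \]
% In the electrode-fed ("electromethanogenesis") scenario, the electrons
% come from the cathode rather than from hydrogen:
\[ \mathrm{CO_2} + 8\,\mathrm{H^+} + 8\,e^- \;\longrightarrow\; \mathrm{CH_4} + 2\,\mathrm{H_2O} \]
```

Either way, the inputs are carbon dioxide plus a source of electrons, and the outputs are methane and water.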


Bruce Logan, Evan Pugh Professor and Kappe Professor of Environmental Engineering (credit: Curtis Chan)


The researchers plan to use this methane to fuel airplanes, ships and vehicles. In the ideal scenario, cultures of methanogens would be fed a constant supply of electrons generated from emissions-free power sources, such as solar cells, wind turbines and nuclear reactors. The microbes would use these clean electrons to metabolize carbon dioxide into methane, which can then be stockpiled and distributed via existing natural gas facilities and pipelines when needed.

At Penn State, Logan’s lab is designing and testing advanced cathode technologies that will encourage the growth of methanogens and maximize methane production. The Penn State team is also studying new materials for electrodes, including a carbon-mesh fabric that could eliminate the need for platinum and other precious metal catalysts.

“Our ultimate goal is to create a cost-effective system that reliably and robustly produces methane from clean electrical energy,” Logan says. “It’s high-risk, high-reward research, but new approaches are needed for energy storage and for making useful organic molecules without fossil fuels.” —A’ndrea Elise Messer

Andrew Carleton, Professor of Geography

Jets’ contrails contribute to heat-trapping high-level clouds


Condensation trails that airplanes produce mean not only a white-streaked sky on some days, but an increase in the amount of high-level clouds and, by extension, warming temperatures, Penn State Professor of Geography Andrew Carleton has found.

By comparing National Oceanic and Atmospheric Administration satellite images showing contrail occurrence with data from eastern U.S. stations that record sky coverage for different levels in the atmosphere, Carleton was able to confirm that contrails contribute to the occurrence of high-level clouds.

Contrails form when jet engines emit sooty particles and moisture into cold air high in the troposphere. Water vapor already present in the atmosphere collects and freezes around those particles and forms linear ice-crystal clouds.

To probe the relation between contrails and trends in sky coverage, Carleton plotted the spatial occurrence of contrails identified on the satellite images for the periods 1977-79 and 2000-02. Satellite contrail and surface-observed sky-cover data were overlaid and separated according to high versus low frequencies of contrails.

Climate change had political, human impact on ancient Maya


The role of climate change in the development and demise of classic Maya civilization, ranging from AD 300 to 1000, has been controversial for decades because of a lack of well-dated climate and archaeological evidence. But an international team of archaeologists and earth science researchers has compiled a precisely dated, high-resolution climate record of 2,000 years that shows how Maya political systems developed and disintegrated in response to climate change.

“Unusually high amounts of rainfall favored an increase in food production and an explosion in the population between AD 450 and 660,” says Douglas Kennett, professor of anthropology at Penn State and lead author of the study. The researchers reconstructed rainfall records from stalagmite samples collected from Yok Balum Cave, located nearly three miles from the ancient city of Uxbenka, in the tropical Maya Lowlands in southern Belize. They compared their findings to the rich political histories carved on stone monuments at Maya cities throughout the region.

The increased food production led to the proliferation of cities across the Maya lowlands. The new climate data show that this salubrious period was followed by a general drying trend lasting four centuries, punctuated by major droughts that triggered a decline in agricultural productivity and contributed to societal fragmentation and political collapse. The cities’ population declined and Maya kings lost their power and influence.

“The linkage between an extended 16th-century drought, crop failures, death, famine, and migration in Mexico provides a historic analog, supported by the cave stalagmite samples, for the sociopolitical tragedy and human suffering experienced periodically by the Classic Period Maya,” Kennett notes. —A’ndrea Elise Messer

The findings showed that high frequencies of contrails didn’t equate to an increase in total cloud amount or an increase in low-lying clouds, but they did mean a significant increase in high-level cloudiness observed from the surface since about the mid-1960s.

Persisting contrails present the greatest impact on climate because instead of dissipating relatively quickly, they trap heat beneath them. While contrails do block the sun to some extent, when they persist they also spread and become thinner, which means they reflect less solar energy away while still trapping heat. The net effect tends to be to warm the earth’s surface, rather than to cool it.

—Anne Danahy

Douglas Kennett, professor of anthropology

Materials ›

Fold Up that Television and Put it Away?

by Alan S. Brown


Flexible electronics open the door to fold-away smartphone displays, solar cells on a roll of plastic, and advanced medical devices— if we can figure out how to make them.


Nearly everyone knows what the inside of a computer or a mobile phone looks like: a stiff circuit board, usually green, crammed with chips, resistors, capacitors, and sockets, interconnected by a suburban sprawl of printed wiring.

But what if our printed circuit board was not stiff, but flexible enough to bend or even fold?

It may sound like an interesting laboratory curiosity, but not to Enrique Gomez, an assistant professor of chemical engineering at Penn State. “It could transform the way we make and use electronic devices,” he says. Gomez is one of many scientists investigating flexible electronics at the University’s Materials Research Institute. Others are doing the same at universities and corporations around the world.

Flexible electronics are in vogue for two reasons. First, they promise an entirely new design tool. Imagine, for example, tiny smartphones that wrap around our wrists, and flexible displays that fold out as large as a television. Or photovoltaic cells and reconfigurable antennas that conform to the roofs and trunks of our cars. Or flexible implants that can monitor and treat cancer or help paraplegics walk again.

Second, flexible electronics might cost less to make. Conventional semiconductors require complex processes and multibillion-dollar foundries. Researchers hope

to print flexible electronics on plastic film the same way we print ink on newspapers. “If we could make flexible electronics cheap enough, you could have throwaway electronics. You could wear your phone on your clothing, or run a bioassay to assess your health simply by wiping your nose with a tissue,” Gomez says.

Before any of this happens, though, researchers have to rethink what they know about electronics.

Victim of Success

That means understanding why conventional electronics are victims of their own success, says Tom Jackson, Kirby Chair Professor of Electrical Engineering. Jackson should know, because he helped make them successful. Before joining Penn State in 1992, he worked on IBM’s industry-leading laptop displays. At Penn State, he pioneered the use of organic molecules to make transistors and electronic devices.

Modern silicon processors integrate billions of transistors, the semiconductor version of an electrical switch, on tiny slivers of crystalline silicon. Squeezing so many transistors in a common location enables them to handle complex problems. As they shrink in size, not only can we fit more transistors on a chip, but the chip gets less expensive to manufacture.

“It is hard to overstate how important this has been,” Jackson explains. “Remember when we paid for long-distance phone calls by the minute? High-speed switching drove those costs way down. In some cases, we can think of computation as free. You can buy an inexpensive calculator at a store for a dollar, and the chip doesn’t even dominate the cost. The power you get is amazing.”

That, says Jackson, is the problem. Semiconductor processors are so good and so cheap, we fall into the trap of thinking they can solve every problem. Sometimes, it takes more flexibility to succeed. Consider surgery to remove a tumor from a patient’s liver. Even after following up with radiation or chemotherapy, the

surgeon is never sure if the treatment was successful. “But suppose I could apply a flexible circuit to the liver and image the tissue,” Jackson says. “If we see a new malignancy, it could release a drug directly onto that spot, or heat up a section of the circuit to kill the malignant cells. And when we were done, the body would resorb the material.

“What I want is something that matches the flexibility and thermal conductivity of the body.” Conventional silicon technology is too stiff and thermally conductive to work.

Similarly, large, flexible sensors could monitor vibrations on a bridge or windmill blade and warn when they need maintenance. “If you want to spread 100 or 1,000 sensors over a large area, you have to ask whether you want to place all the chips you need to do that, or use low-cost flexible electronics that I can apply as a single printed sheet,” Jackson says.

None of the flexible electronics now under development would match the billions of transistors that now fit on silicon chips, or their billions of on-off cycles per second. They would not have to. After all, even today’s fastest televisions refresh their displays only 240 times per second. That is more than fast enough to image cancer in the body, reconfigure an antenna, or assess the stability of a bridge.

So how, exactly, do we make flexible electronics, and what kind of materials do we make them from?

Printing

To explain what draws researchers to printing flexible electronics, Jackson walks through the production of flat panel displays in a $2-3 billion factory. The process starts with a 100-square-foot plate of glass. To apply wires, the factory coats the entire plate with metal, then covers it with a photosensitive material called a resist. An extremely bright light flashes the pattern of the wires onto the coating, hardening the resist. In a series of steps, the factory removes the unhardened resist and metal under it. Then, in another series of steps, it removes the hardened resist, leaving behind the patterned metal wires.

Factories repeat some variant of this process four or five times as they add LEDs, transistors, and other components. With each step, they coat the entire plate and wash away unused materials. While the cost of a display is 70 percent that of a finished device, most of those materials get thrown away. “So it’s worth thinking about whether we can do this by putting materials where we need them, and reduce the cost of chemicals and disposal. It is a really simple idea and really hard to do,” Jackson says.

An ideal way to do that, most researchers agree, would be to print the electronics on long plastic sheets as they move through a factory. A printer would do this by applying different inks onto the film. As the inks dried, they would turn into wires, transistors, capacitors, light emitting diodes, and all the other things needed to make displays and circuits.

That, at least, is the theory. The problem, as anyone who ever looked at a blurry newspaper photograph knows, is that printing is not always precise. Poor alignment would scuttle any electronic device. Some workarounds include vaporizing or energetically blasting materials onto a flexible sheet, though this complicates processing.

And then, of course, there are the materials. Can we print them? How do we form the precise structures we need? And how do we dry and process them at temperatures low enough to keep from melting the plastic film?

Material World

Fortunately, there are many possible materials from which to choose. These range from organic materials, like polymers and small carbon-based molecules, to metals and even ceramics.

At first glance, flexible ceramics seem like a stretch. Metals bend, and researchers can often apply them as zigzags so they deform more easily. Try flexing a ceramic, though, and it cracks. Yet that has not deterred Susan Trolier-McKinstry, a professor of ceramic science and engineering and director of the W.M. Keck Smart Materials Integration Laboratory. Ceramics, she explains, are critical ingredients in capacitors, which store voltage on electronic circuits. In displays, transistors use capacitors to provide an instant jolt of voltage when they switch on a pixel, rather than waiting for power from a distant source.


Industry makes capacitors from ultrafine powders. The tiniest are 500 nanometers, 40 times smaller than a decade or two ago. Even so, there is scant room for them on today’s overcrowded circuit boards,

Patrick Mansell

Tom Jackson, Kirby Chair Professor of Electrical Engineering

especially in smartphones. Although industry can shrink them further, they become too fragile to place on circuit boards. Trolier-McKinstry thinks she can deposit smaller capacitors directly onto flexible sheets of plastic, then sandwich the plastic in the middle of a flexible circuit board. That way, the capacitors do not hog valuable surface area.

Yuanyaun Li, Ph.D. candidate in electrical engineering, working with a scanning electron microscope in one of the Electronics Research Group facilities.

As long as she can keep capacitor sizes 500 nanometers or smaller, Trolier-McKinstry need not worry about capacitor flexibility. She could easily fit 1 million evenly spaced rows of capacitors across a 1-meter-wide sheet of plastic. This is the equivalent of putting a gumdrop on a 15-mile-wide asteroid. No matter how radically the asteroid curves, the gumdrop will not flex. Nor will a capacitor.

Of course, not every element placed on a flexible substrate will be that small. So what happens if your transistors need to bend? One way to solve that problem is to make electronics from organic materials like plastics. These are the ultimate flexible materials. While most organics are insulators, a few are conductive.
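The gumdrop analogy holds up to a back-of-envelope check (an illustrative calculation, not from the article; the 2-centimeter gumdrop size is an assumption):

```python
# Back-of-envelope check of the gumdrop-on-asteroid analogy:
# is a 500 nm capacitor on a 1 m sheet like a gumdrop on a 15-mile asteroid?
capacitor = 500e-9          # capacitor size, meters
sheet = 1.0                 # width of the plastic sheet, meters
gumdrop = 0.02              # assumed gumdrop size, meters (about 2 cm)
asteroid = 15 * 1609.34     # 15 miles converted to meters

cap_ratio = capacitor / sheet        # size of capacitor relative to sheet
gumdrop_ratio = gumdrop / asteroid   # size of gumdrop relative to asteroid

print(f"capacitor/sheet  = {cap_ratio:.1e}")
print(f"gumdrop/asteroid = {gumdrop_ratio:.1e}")
# The two ratios land within a factor of two of each other, so at this
# scale each capacitor sees an essentially flat substrate no matter how
# the sheet bends.
```

Both ratios come out near one part in a million, which is why a sub-500-nanometer capacitor simply does not notice the curvature of the sheet it sits on.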




They may even have some potential advantages, says Gomez, who is trying to tame them for flexible electronic use. “Unlike silicon, which needs neighboring atoms to line up as a perfect crystal, organic molecules are less picky about the arrangement of their neighbors,” he explains. “My group’s goal is to turn these molecules into transistors and photovoltaic cells.”

Patrick Mansell


One approach is to print a ceramic-alcohol mixture on a plastic film and spot heat each capacitor with a laser to crystallize the ceramic into a capacitor. Another approach is to use a high energy beam to sandblast molecules off a solid ceramic and onto a plastic substrate.

Susan Trolier-McKinstry, Professor of Ceramic Science and Engineering

Easier said than done. Molecules may not be picky about their neighbors, but they still need to form the right type of structures to act as switches or turn light into electricity. Gomez attacks the problem by using a technique called self-assembly. It starts with block copolymers, combinations of two molecules with different properties bound together in the middle.

“Think of them as a dog and a cat tied together by their tails,” Gomez explains. “Ordinarily, they want to run away from each other, but now they can’t. Then we throw them into a room with other tied dogs and cats. What happens is that all the cats wind up on one side of the room and the dogs on the other, so they don’t have to look at each other.”

Gomez believes this process could enable him to build molecules programmed to self-assemble into electronic structures at very low cost. “The overarching problem,” he continues, “is figuring out how to design the molecule and then tickle it with pressure, temperature, and electrical fields

to form useful structures. We don’t really understand enough to do that yet.” Despite the challenge, flexible electronics promise changes that go beyond folding displays, inexpensive solar cells, antennas, and sensors. They could veer off in some unexpected directions, such as helping paraplegics walk again.

Mimicking Jell-O

That is the goal of Bruce Gluckman, associate director of Penn State’s Center for Neural Engineering. To get there, he must learn how the brain’s neurons collaborate. “Computations happen at the level of single neurons that connect to other neurons. Half the brain is made up of the wiring for these connections, and any cell can connect to a cell next to it or to a cell across the brain. It’s not local in any sense,” he explains.

Scientists measure the electrical activity of neurons by implanting silicon electrodes into the brain. Unfortunately, Gluckman says, the brain is as spongy as Jell-O and the electrode is as stiff as a knife. Plunging the electrode into the brain causes damage immediately. Every time the subject moves its head, the brain pulls away from the electrode on one side and makes better contact on the other. It takes racks of electronics to separate the signal from the noise of such inconsistent output.

“This is why we need something other than silicon,” Gluckman says. Flexible electronics would better match the brain’s springiness. While some researchers are looking at all-organic electrodes, Gluckman believes they are too large and too slow to achieve the resolution he needs. Instead, he has teamed with Jackson to develop a flexible electrode based on zinc oxide, a faster semiconductor that can be deposited on plastic at low temperatures. The work is still in its early stages, but Gluckman believes they can develop a reliable electrode that lasts for years and produces stronger, clearer signals.

Researchers have already demonstrated that humans can control computer cursors, robotic arms, and even artificial voice boxes with today’s problematic electrodes. Yet the results are often short-lived. “No one is going to let you operate on their brain twice,” Gluckman says. “If you want to directly animate limbs with an implant, the implant has to last the life of the patient. If we can do that, we can enable paraplegics to get around on their own.”

As Jackson notes, computers and smartphones may have powered silicon’s development, but the results are visible in everything from cars and digital thermometers to toys and even greeting cards. Displays and solar cells are likely to power the new generation of flexible electronics, but brain implants are just one of the many unexpected directions they may take. —ASB

Partnering with Dow Chemical

Penn State’s interest in flexible and printed electronics is not just theoretical. In October 2011, the University announced a multi-year research project with Dow Chemical Corporation. The project received $1 million in its first year, and will continue for at least five years. Tom Jackson, Kirby Chair Professor of Electrical Engineering, is the principal investigator. Other team members include electrical engineer Chris Giebink, chemist John Asbury, chemical engineers Enrique Gomez and Scott Milner, and materials science and engineering faculty Susan Trolier-McKinstry, Qing Wang, and Mike Hickner.

Dow is one of the world’s largest manufacturers of the sophisticated chemicals used to fabricate semiconductors and the flexible plastics that could replace the stiff circuit boards, silicon wafers, and other rigid substrates currently used for electronic devices and their displays.

“Some Dow research leaders believe that printing electronics on plastic film is the low-cost way to go for many applications, and the manufacturers in the supply chain want to understand where the technology is heading and what the critical material requirements will be,” explains Carlo Pantano, director of Penn State’s Materials Research Institute. Pantano had been keeping Dow up to date on Penn State’s materials capabilities for the past decade. “As they spoke with us, they realized we had people working on many of the relevant materials and processes, from organic and transparent semiconductors to low-temperature, low-cost nanofabrication,” he says.

The multidisciplinary skills and tools required to advance flexible electronics are brought together at the new Millennium Science Complex, where faculty researchers and their students interact with one another in state-of-the-art facilities for synthesis, fabrication, and evaluation of electronic materials and devices.

If the partnership sounds ambitious, so is the challenge. Organic molecules, for example, conduct electrons more slowly than silicon. To boost organic transistor performance, researchers must better understand the roadblocks and discover ways to speed things up. Then they must find a way to produce those organic semiconductors and integrate them with electrodes, capacitors, resistors, interconnects, and other circuit elements that can be manufactured reliably and at low cost.

Fortunately, Dow is a large, successful company, Jackson points out. “They are not telling us that they won’t be able to buy groceries if the research doesn’t come to fruition next year,” he says.

A flexible sample is placed on a plasma etching tool. (credit: Patrick Mansell)

Enrique Gomez, Ph.D., is assistant professor of chemical engineering. Thomas N. Jackson, Ph.D., is Kirby Chair Professor of Electrical Engineering. Susan Trolier-McKinstry, Ph.D., is professor of ceramic science and engineering and director of the W.M. Keck Smart Materials Integration Laboratory. Bruce Gluckman, Ph.D., is associate director of Penn State’s Center for Neural Engineering.



Mussels inspire innovative new adhesive for surgery

Mussels can be a mouthwatering meal, but the chemistry that lets mussels stick to underwater surfaces may also provide a highly adhesive wound closure and more effective healing from surgery.

In recent decades bio-adhesives, tissue sealants and hemostatic agents became the favored products to control bleeding and promote tissue healing after surgery. However, many of them have side effects or other problems, including an inability to perform well on wet tissue.

“To solve this medical problem, we looked at nature,” said Jian Yang, associate professor of bioengineering at Penn State. “There are sea creatures, like the mussel, that can stick on rocks and on ships in the ocean. They can hold on tightly without getting flushed away by the waves because the mussel can make a very powerful adhesive protein. We looked at the chemical structure of that kind of adhesive protein.”

Yang, working with colleagues at the University of Texas-Arlington, took the biological information and developed a wholly synthetic family of adhesives. They incorporated the chemical structure from the mussel’s adhesive protein into the design of an injectable synthetic polymer. The bio-adhesives, called iCMBAs, adhere well in wet environments, have controlled degradability, improved biocompatibility and lower manufacturing costs, putting them a step above current products such as fibrin glue and cyanoacrylate adhesives.

Tested on rats, the iCMBAs provided 2.5 to 8.0 times stronger adhesion in wet tissue conditions compared to fibrin glue. They also stopped bleeding instantly, facilitated wound healing, closed wounds without the use of sutures, offered controllable degradation, and were non-toxic. The iCMBAs could eventually be used in a wide range of surgical disciplines, from suture and staple replacement to tissue grafts to treat hernias, ulcers and burns. —Jennifer Swales

Smart fiber, electronics included

Today’s high-speed fiber-optic communications marry electronics and optics, requiring semiconductor chips to convert the light and optical fibers to carry the signal. But that technological merger can be cumbersome and inefficient. What if optical fibers could have the electronics built right in? Potential applications would include improved telecommunications and laser technologies, more accurate remote-sensing devices, and similar upgrades in other optical-electronic hybrids. That’s the promise of new crystalline materials developed by an international team of researchers led by John Badding, a professor of chemistry at Penn State.

“The optical fiber is usually a passive medium that simply transports light, while the chip is the piece that performs the electrical part of the equation,” Badding explains. A ‘smart fiber,’ by contrast, would have the electronic functions already included. That kind of integration, however, has been difficult to achieve. For one thing, Badding says, optical fibers are round and cylindrical, while chips are flat, so simply shaping the connection between the two is a challenge. “An optical fiber is 10 times smaller than the width of a human hair,” he adds, and light-guiding pathways built onto chips are up to 100 times smaller than that. “So imagine just trying to line those two devices up.”

Instead, Badding and his colleagues used high-pressure chemistry techniques to deposit semiconducting materials directly, layer by layer, into tiny holes in optical fibers. “The big breakthrough here is that we don’t need the whole chip as part of the finished product,” says Pier J. A. Sazio of the University of Southampton in the United Kingdom, another of the team’s leaders. “We have managed to build the junction—the active boundary where all the electronic action takes place—right into the fiber.”

“Moreover, while conventional chip fabrication requires multimillion-dollar clean-room facilities, our process can be performed with simple equipment that costs much less,” Sazio notes.

One of the key goals of research in the field is to create a fast, all-fiber network, he says. “If the signal never leaves the fiber, then it is a faster, cheaper, and more efficient technology. If we can actually generate signals inside a fiber, a whole range of optoelectronic applications becomes possible.” —Katrina Voss

Tony Jun Huang

Acoustic cell-sorting chip may lead to cell phone-sized medical labs

A technique that uses acoustic waves to sort cells on a chip may lead to miniature medical analytic devices that could make Star Trek’s tricorder seem a bit bulky in comparison, according to a team of researchers.

The device uses two beams of sound waves to act as acoustic tweezers and sort a continuous flow of cells on a dime-sized chip, explains Tony Jun Huang, associate professor of engineering science and mechanics. By changing the frequency of the acoustic waves, researchers can alter the paths of the cells. Since the new device can sort cells into five or more channels— compared with two allowed by current devices—it can allow more cell types to be analyzed simultaneously.

“Eventually, you could do analysis on a device about the size of a cell phone,” says Huang. Biological, genetic and medical labs could use the device for various types of analysis, including blood and genetic testing. “Today, cell sorting is done on bulky and very expensive devices,” Huang says. “We want to minimize them so they are portable, inexpensive and can be powered by batteries.” Using sound waves for cell sorting also is less likely to damage cells than current techniques.
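The steering principle described above—opposing transducers set up a standing wave, cells are pushed toward its pressure nodes, and retuning the frequency repositions those nodes—can be illustrated with a short calculation. This is not the team’s code; the wave speed and frequencies below are assumed values for illustration only.

```python
# Illustrative sketch: pressure-node positions of a 1-D standing acoustic
# wave between two opposing transducers. Cells are channeled toward the
# pressure nodes, so changing the drive frequency moves the nodes -- and
# with them, the cells.

SOUND_SPEED = 3900.0  # m/s, assumed surface-wave speed for the substrate


def node_positions(freq_hz, channel_len_m):
    """Pressure nodes of a standing wave, spaced half a wavelength apart."""
    half_wl = SOUND_SPEED / freq_hz / 2.0
    nodes = []
    x = half_wl / 2.0  # first node a quarter wavelength from the boundary
    while x < channel_len_m:
        nodes.append(x)
        x += half_wl
    return nodes


# In a 1 mm channel driven near 20 MHz, nodes sit roughly 100 microns apart;
# small frequency shifts slide every node sideways.
for f in (19.5e6, 20.0e6, 20.5e6):
    xs = node_positions(f, 1e-3)
    print(f"{f / 1e6:.1f} MHz: {len(xs)} nodes, first at {xs[0] * 1e6:.2f} um")
```

Because each node is a stable trapping position, a multi-node pattern naturally yields multiple output channels, which is consistent with the device’s ability to sort into five or more channels rather than two.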

Hybrid tunnel may help guide severed nerves back to health Building a tunnel made of hard and soft materials to guide the reconnection of severed nerve endings may be the first step toward helping patients who have suffered extensive nerve trauma regain feeling and movement, according to a team of biomedical engineers.

Mohammad Reza Abidian

Nerve injury in both the central and peripheral nervous systems is a major health problem. The National Spinal Cord Injury Statistical Center reports that 290,000 individuals in the U.S. suffer from spinal cord injuries, with about 12,000 new injuries occurring each year.

Illustration of a hybrid conduit aimed at aiding nerve regeneration.

Unfortunately, spontaneous nerve regeneration is limited to small lesions within the injured peripheral nervous system and is actively suppressed within the central nervous system. When a nerve in the peripheral nervous system is cut slightly, nerve endings can regenerate and reconnect. However, if the distance between the two endings is too great, the growth can go off course and fail to connect. Mohammad Reza Abidian, assistant professor of biomedical engineering at Penn State, is among researchers who recently developed a novel hybrid conduit consisting of a soft material called a hydrogel as an external wall, along with an internal wall made of an electrically active conducting polymer, which together serve as a tunnel that guides the regrowth and reconnection of the severed nerve endings. Abidian says the method could offer advantages over current surgeries, which take nerve from another portion of the body and graft it onto the injured nerve. The researchers used agarose, a hydrogel that is permeable and more likely to be accepted by the body, and added a conducting polymer to the design to form a wall that can mechanically support and reinforce the agarose. —Matthew Swayne

Huang’s team created the acoustic wave cell-sorting chip using a layer of silicone. Two parallel transducers, which convert alternating current into acoustic waves, were placed at the sides of the chip. As the acoustic waves interfere with each other, they form pressure nodes on the chip. As cells cross the chip, they are channeled toward these pressure nodes. The transducers are tunable, which allows researchers to adjust the frequencies and create pressure nodes on the chip. —Matthew Swayne


Materials ›

A New Frontier Task Force Identifies Unique Strengths for Biomedical Research

Research | Penn State 2013

Led by Tony Jun Huang, an associate professor of engineering science and mechanics, the task force identified seven areas in which Penn State has a critical mass of expertise to make important contributions to biomedicine.

Tony Jun Huang

1. Nanomedicine—Novel biomedical nanomaterials can be used within the body to deliver drugs, to track and treat diseases, as coatings for implants, in neural recording and stimulation, and in tissue regeneration. Nanoliposomes and carbon nanomaterials (e.g., carbon nanotubes, graphene, doped carbon nanotubes, fullerenes) are being developed as delivery vehicles for imaging agents and drugs.

By Walt Mills


Fifty years ago Penn State began its experiment in interdisciplinary science with the creation of the Materials Research Laboratory, founded on collaborations among physicists, chemists, geoscientists, mathematicians, and mechanical and electrical engineers. The field of materials science and materials engineering is inherently collaborative, drawing on the expertise of multiple disciplines, their tools and techniques.

Now a new collaboration is developing, this time between physical scientists and engineers and their counterparts in the life sciences and medicine. The Penn State Task Force on the Convergence of Materials Science and Life Sciences was created to identify opportunities for the University in this new field. The members represented a cross section of engineers and scientists whose research programs were already at the interface of life and materials science. They included two bioengineers, a neural engineer, a pharmacologist using nanomedicine, a chemist creating artificial cell models, and a materials physicist who synthesizes nanostructures for use in the body. Together they identified more than 90 Penn State faculty groups having the expertise necessary to contribute to convergence science and engineering.

An example of nanomedicine is the work of Mohammad Abidian (bioengineering), who uses conducting polymer nanotubes to deliver targeted anticancer agents to brain tumors. As a member of the Penn State Center for Neural Engineering, Abidian aims to create a microchip that can be implanted next to a brain tumor to deliver controlled doses of chemicals directly to the tumor while providing real-time monitoring of the therapy using biosensor nanodevices.

2. High-Throughput Micro/Nano Systems—These systems will be able to achieve single-cell/molecular resolution while analyzing at a speed (throughput) far beyond currently available technologies. Penn State faculty are developing devices for measuring neural communications, lab-on-a-chip analysis, blood pressure sensing, circulating tumor cell capture, and blood counting. A team of researchers led by Stephen Benkovic (chemistry) and Tony Jun Huang uses a device called acoustic tweezers to precisely manipulate cellular-scale living objects for study. (See previous page.) Benkovic would like to use the acoustic tweezers to study how living cells respond to pulses of chemicals and pressure that mimic similar processes taking place within the body. Recently, the researchers used acoustic tweezers to study the transparent roundworm known as C. elegans, a model system for diseases and development in higher animals, including humans. This is the first technology capable of touchlessly trapping and manipulating C. elegans. The same technology is capable of manipulating tens of thousands of cells at a time.

3. Ultrasound Medicine—Ultrasound has been used to image the human body for at least 50 years and has become one of the most popular diagnostic tools in modern medicine. Established medical applications include cardiac, abdominal, and fetal monitoring, blood flow mapping, disease diagnosis, tissue elastography, and osteoporosis testing. Ultrasound is also crucial in other biomedical applications such as focused ultrasound surgery, targeted drug delivery, and lab-on-a-chip devices. The Penn State electroceramics faculty has led the field of piezoelectrics for more than 35 years and has pioneered many groundbreaking technologies in the field, such as high-strain piezoelectric single crystals, new high transition-temperature morphotropic phase boundaries, high-strain polymer piezoelectrics, copper metallization for piezoelectric fuel injectors, and thin-film piezoelectrics for micro-electromechanical systems. Penn State researchers play a key role in a recently announced $18.5 million National Science Foundation Engineering Research Center focused on new piezoelectric materials and devices for health monitoring. The Penn State portion of the center will “utilize core Penn State strengths in materials, nanofabrication, low power circuits and biobehavioral health to advance human health,” according to Susan Trolier-McKinstry (ceramic science), one of the Penn State researchers. The Penn State team includes members from the Colleges of Engineering, Earth and Mineral Sciences, Education, and Health and Human Development and will develop miniaturized devices that are powered by body heat and motion.

Tony Jun Huang

Associate Professor of Engineering Science and Mechanics

Opposite page: An assembled flow cytometry chip created in Huang’s lab. The chip may soon enable inexpensive, portable devices that can rapidly screen cells for leukemia or HIV. Walt Mills

4. The Material Brain—Much of our insight into the functional properties of the brain, what we sense and how we model it, stems from cartoon models of structure that ignore the fine structure and the multiscale nature of the anatomy. Multiscale modeling of the brain, interfaces for brain sensing and actuating, materials for brain interfaces, and 3D tissue scaffolds are four proposed areas for exploration.

Leadership in this area lies in the Penn State Center for Neural Engineering, which coordinates the research of more than a dozen faculty and provides the bridge between University Park and Penn State Milton S. Hershey Medical Center clinicians and researchers. Under the direction of Steven Schiff (engineering science and mechanics, neuroscience), the center aims to model the brain’s material properties by constructing theoretical and computational models that account for the brain’s extreme inhomogeneity and multi-scale anatomy. Multiple imaging technologies will be employed that will allow researchers to reconstruct the brain’s complex anatomy at all relevant length scales.

Nanomedicine
High-Throughput Micro/Nano Systems
Ultrasound Medicine
The Material Brain
MultiModal Imaging
Tissue Engineering and Regenerative Medicine
Cells from Scratch

5. MultiModal Imaging—Multimodal imaging platforms such as CT and PET, CT and mass-based spectrometry, MRI and PET, or MRI and optical are presently being explored for their ability to enhance anatomical, functional, and molecular imaging methods. Uses include the ability to diagnose and image cancerous lesions early, to functionally assess the brain, and to image and potentially treat vulnerable plaque (the next heart attack).

Penn State has unique strengths in many of the technologies listed above. Many Penn State technologies and materials have novel properties that can be used simultaneously for more than one imaging modality. Recently, the design of theranostic applications (technologies engineered to treat and diagnose via imaging) has been coupled to the burgeoning field of nanomaterials and nanotechnology, another area of Penn State strength. A recent proposal to the National Institutes of Health by a group that includes James Adair (materials science and engineering), Thomas Neuberger (bioengineering), and Mark Kester, Michael Smith, and Qing Yang (medicine) would use nanocolloids (nanoparticles dispersed in solution) in combination with both near-infrared and magnetic resonance imaging to detect tumors.

Christine Keating

Images of a primitive artificial cell created with a lipid membrane and two large molecules. Christine Keating (chemistry) uses the tools of molecular self-assembly to construct primitive models of biological cells to learn how cells function and for medical applications, such as drug delivery and vaccine development.

6. Tissue Engineering and Regenerative Medicine—As the population continues to age, tissue engineering is a fast-emerging multidisciplinary field involving biology, medicine, and engineering that is likely to revolutionize the ways we improve the health and quality of life for millions by restoring, maintaining, or enhancing tissue and organ function. Several fields of regenerative medicine are being developed through collaborations among bioengineers, materials scientists, and medical researchers at Penn State. Examples include the Abidian group’s 3D scaffolding for neural tissue engineering; Justin Brown (bioengineering), who creates scaffolds for bone and musculoskeletal tissue regeneration; Sheereen Majd (bioengineering), whose group uses soft lithography and microcontact printing for 3D cell culture growth; and Erwin Vogler (materials science and engineering), whose bioreactor device grows highly complex bone tissue that can be used to study how cancer metastasizes in bone.

7. Cells from Scratch—Single cells are the simplest units of life. They perform carefully choreographed biochemical reactions that enable them to replicate themselves, adapt to new environments, and convert sunlight, organic molecules, or chemical redox reactions into usable energy. They routinely biosynthesize organic, inorganic, and hybrid materials of staggering complexity; these materials often display functional properties unmatched by man-made materials. Potential medical outcomes include artificial cells that could serve as replacements for failing biological cells in a human disease state, or that could be used for culture of pathogens such as the malaria parasite. Additional potential outcomes include new routes to constructing materials with desirable properties (increased strength, flexibility, reduced mass, etc.) based on materials formed by living cells, or new cell-inspired methods for energy production, collection, and/or conversion from environmental sources.

The origin of the first living cells and their early evolution to become recognizable progenitors of modern cells is an area of fundamental scientific interest that captures the imagination of scientists and nonscientists alike. Geoscientists such as Christopher House and James Kubicki study the abiotic synthesis of early macromolecules necessary for life. Christine Keating (chemistry) uses the tools of molecular self-assembly to construct primitive models of biological cells to learn how cells function and for medical applications, such as drug delivery and vaccine development.

Patrick Mansell

Fifty years after the emergence of multidisciplinary science at the University, the opening of the Millennium Science Complex, which joins the Materials Research Institute and the Huck Institutes of the Life Sciences, marks the emergence of collaborative convergence science.

Tony Jun Huang, Ph.D., is associate professor of engineering science and mechanics and served as head of the Penn State Task Force on the Convergence of Materials and Life Sciences.

A Place to Come Together: Take a video tour of Penn State’s Millennium Science Complex, where materials and life sciences converge for cutting-edge research.



Social and Behavioral Sciences ›

The Mexican Children of Immigrants Project

Interpreting Numbers

By Holly Swanson



Language barriers, fear of deportation, and community prejudice are problems often faced by Mexican immigrants that contribute to dramatic disparities in health and well-being between immigrants and the native-born U.S. population. These problems may be compounded by family and community characteristics, making it especially difficult for the children of immigrants to receive the basic services needed to promote a healthy future. Such complexities are the focus of a three-pronged program of investigation by a research team at Penn State’s Population Research Institute (PRI).

The numbers themselves are daunting—there are more than 40 million documented immigrants in the United States today, including 11.6 million from Mexico, according to the 2011 American Community Survey by the U.S. Census Bureau. The Pew Hispanic Center in Washington, D.C., estimates that an additional 6 million undocumented Mexicans are currently living in the United States.

Much of the media attention surrounding immigration focuses on how to stem the flow, and how to keep track of the immigrants who are already here. Much less attention is paid to the health and well-being of this population group, and to the fact that many children of immigrants were born within U.S. borders and as such are citizens from birth.

Led by Nancy Landale, Liberal Arts Research Professor of Sociology and Demography, the Mexican Children of Immigrants Program is an interdisciplinary project involving eight primary researchers that examines the health, development, and healthcare access of this growing group of children. Started in 2011, and funded through 2015 by a grant from the Eunice Kennedy Shriver National Institute of Child Health and Human Development, the study is already beginning to produce results.

The three branches of the project examine health and development, obesity, and healthcare access, focusing on Mexican children of immigrants from birth to 18 years of age, and accounting for migration patterns, legal residency status, and acceptance within the resident community. Most of the data are being amassed from national longitudinal studies such as the Survey of Income and Program Participation, a statistical survey by the Census Bureau; the Early Childhood Longitudinal Study, a national study focused on education conducted by the National Center for Education Statistics; and the Mexican Family Life Survey, a multi-investigator longitudinal database on children in Mexico. These surveys have already gathered the basic data; the project’s challenge is to identify information in the data sets that can be used creatively to address important unanswered questions about the health and health-related behaviors of Mexican children of immigrants. “There is a lot of information in these existing data sets, and we’d like to use this information to deliver the best possible answers to some important questions,” Landale says.

Charting health outcomes

Landale’s focus is on health and development in early childhood, specifically on how child health outcomes are related to family changes that come about with immigration and assimilation. Evidence from the Early Childhood Longitudinal Study Birth Cohort, which follows a nationally representative group of children from birth to kindergarten, shows that children of Mexican origin are similar to non-Hispanic white children in terms of their overall physical health. However, their cognitive skills rank lower, suggesting the potential for future developmental and educational problems.

The question, Landale says, is why this disparity exists. The most obvious answer is the language barrier: The tests themselves are administered in English, which may not be the primary language the child hears at home. But is language the only issue? To Landale, Mexican-origin children’s relatively low cognitive test scores point to additional socioeconomic factors, such as the limited education of their parents and a high poverty rate. On average, foreign-born Mexicans have completed eight and a half years of education, compared with about twelve years for native-born Mexicans and more than thirteen years for native-born whites. Combined with their limited English proficiency and frequently unauthorized legal status, the low education of Mexican immigrant parents limits their opportunities for stable, well-paid employment. Today, also, over one-third of Mexican children of immigrants are poor, compared with less than 10 percent of white children of natives. Together, these socioeconomic disadvantages contribute to relatively low levels of enrollment in preschool, an environment that generally enhances cognitive development and school readiness. Landale notes, “Mexican-origin children enter school with lower levels of school readiness than their non-Hispanic white peers, which creates real challenges in terms of their early academic performance.”

With team members Marianne Hillemeier and Sal Oropesa, Landale is also studying how migration and assimilation bring about changes in children’s family circumstances that may influence their health. Compared with Mexican children with native-born parents, Landale explains, Mexican children of immigrants are more likely to live with both parents at the time of their birth and less likely to transition out of a two-parent family to a single-parent family. But children of immigrants are more likely than children in Mexico to transition from a two-parent to a single-parent family and from an extended-family household to a simple household during the preschool years. In future research, the team will investigate how these changes in family structure that occur with immigration influence children’s daily circumstances and health. “What we are hoping for is a more accurate and complete portrait of Mexican children of immigrants,” Landale says. “We’d like to better understand the roles of families and communities in children’s early health and development.”

Examining the roots of obesity

Childhood obesity is a concern in most U.S. communities, but it poses particular challenges among Mexican immigrants. According to a 2012 article in the Migration Information Source by Penn State investigator Jennifer Van Hook and her colleagues Elizabeth Baker, Claire Altman, and Michelle Frisco, simply being exposed to American society increases the chances that a Mexican-origin child will become overweight. “Children in Mexico tend to be leaner than children in the United States, especially in regions of Mexico that send immigrants to the United States,” says Van Hook, who is PRI’s director and a professor of sociology and demography. “But there is a pattern where Mexican children who have been in the United States tend to weigh much more.” In fact, second-generation Mexican children, those born in the United States to immigrant parents, have the highest obesity rates of any ethnic group, she says.

“Mexican children of immigrants are more likely to live with both parents at the time of their birth and less likely to transition out of a two-parent family to a single-parent family. But children of immigrants are more likely than children in Mexico to transition from a two-parent to a single-parent family and from an extended family household to a simple household during the preschool years.”


Poor nutrition is one likely factor in this increase. The easy availability of high-calorie, low-nutrient foods in their adopted country is well documented, Van Hook notes. But low levels of physical activity also seem to play an important role in immigrant children’s weight gain.

Research | Penn State 2013

Van Hook says these children tend to be more sedentary than their non-Mexican counterparts. “The surveys we’ve reviewed show that these children are less likely to participate in after-school activities like soccer and they are less likely to play outside,” she says. “It could be partly due to poverty and the fact that their parents are working long hours. But children in Mexico aren’t particularly heavy, so this also has something to do with them coming to the United States.”

“About half of Mexican children of immigrants live in households that are food insecure. So this may be a psychological dynamic where they don’t limit their eating because they are worried about having enough food.”

In fact, weight gain in Mexican immigrant children begins soon after the family initiates the process of migrating to the United States, Van Hook has observed. She believes this may be related to the staged migration pattern of many families, where one or both parents leave Mexico ahead of the children in order to get established in the U.S. “We aren’t sure exactly what’s happening, but it may be that they have less supervision since the parent or parents are missing,” Van Hook says. “The children may also be receiving remittances that the parents are sending home. This money may go toward eating more food or buying preprocessed food. “About half of Mexican children of immigrants live in households that are food insecure,” she adds. “So this may be a psychological dynamic where they don’t limit their eating because they are worried about having enough food.”

Once the children are in the United States, television viewing could be yet another contributor to the weight gain. “I think that when Mexican children come here, they may watch a lot of television because they are new to the area and don’t know anybody,” Van Hook says. Immigrant families with working parents may also rely on the television as a way to entertain their children in place of childcare.

Documenting access to health care

Mexican immigrants have historically migrated to states such as California, Texas, New York, and Florida, drawn by job opportunities and a sense of community with earlier arrivals. These states have had time to establish the health and social services needed to effectively care for migrant populations. In California, for example, many health clinics acted proactively to head off a potential tuberculosis outbreak among its immigrants during the 1990s, says PRI research associate Deborah Graefe.

Changing 21st-century labor markets have created migration streams to areas unfamiliar with the unique challenges posed by large-scale immigration. In some cases, Mexican immigrants are being directly recruited to new areas where low-skill workers are needed, notably in northeastern and midwestern states, including Pennsylvania, notes Graefe. At other times, patterns change by word of mouth, as a small number of immigrants become familiar with a new area and communicate opportunities to distant family members. “When I talk about new destinations, I’m referring to areas that are going to be less familiar with the specific needs of immigrant families,” she says. “We are looking at the availability of low-cost clinics and physicians who are culturally sensitive and who speak Spanish,” she adds. “But we have little information on whether or how well the health care needs of Mexican immigrant children are being met. We hypothesize that children of undocumented parents will have a harder time getting access to health care, even for children who are born here. The family’s legal status can make a big difference.” Another important factor may be the resident community’s receptivity toward immigrants.
Graefe, with colleagues Gordon De Jong and John McCarthy, is working with a team of students on a content analysis, sampling local newspapers across the country and their reporting on immigration as a way of gauging community attitudes.


“Our hypothesis is that hostility toward immigrants sets up a situation where people might not be inclined to take advantage of routine medical care.”

Steve Tressler, Vista Professional Studios

Deborah Graefe, Nancy Landale, and Jennifer Van Hook (left to right) are key researchers for the Mexican Children of Immigrants Program at Penn State.

“We’re looking at whether a newspaper indicates a negative or positive climate,” she says. “Our hypothesis is that hostility toward immigrants sets up a situation where people might not be inclined to take advantage of routine medical care.” In cases of trauma, she explains, immigrant parents might be expected to take their children to the emergency room regardless of climate. But if they are not getting routine care, parents might not realize that a child is overweight or behind on immunizations. “Those kinds of things may be deterred by a negative attitude,” Graefe says.

Moving forward with the data

As the study progresses through its second year, Landale and her colleagues hope their findings will aid in developing an understanding of the needs of Mexican immigrants and the communities in which they settle.

Nancy Landale, Ph.D., is Liberal Arts Research Professor of Sociology and Demography. Jennifer Van Hook, Ph.D., is professor of sociology and demography and director of the Population Research Institute, jxv21@psu.edu. Deborah Graefe, Ph.D., is a research associate in the Population Research Institute.


“We are still early in the project, but we hope to contribute to new ways of looking at important questions,” she says. “Our main goal is to conduct innovative research and to publish our findings, but we have an agreement with the Migration Policy Institute to disseminate them beyond the academic journals. Their goal is to take these scientific research findings and use them to inform policy makers.”

Social and Behavioral Sciences ›

Exposure to violence has long-term stress effects among adolescents


New research shows that children who are exposed to community violence continue to exhibit a physical stress response up to a year after the exposure, suggesting that exposure to violence may have long-term negative health consequences. “We know that exposure to violence is linked with aggression, depression, post-traumatic stress symptoms and cognitive difficulties in the short term, but little is known about the long-term effects of such exposure,” says Elizabeth Susman, Jean Phillips Shibley Professor of Biobehavioral Health, one of the Penn Staters who worked on the study with colleagues from University College London.

Melissa Peckins, a biobehavioral health graduate student at Penn State, notes that most studies of the effects of exposure to violence look at children who live in inner cities and urban communities. “Our study is unique,” she says, “because we focused on children who live in small towns, so they are not children you would normally expect to be exposed to a lot of violence. Also, these were healthy children without a history of reported maltreatment.”

The researchers identified adolescent subjects’ lifetime exposure to violence and exposure within the previous 12 months, and also elicited a stress response in a laboratory setting. The team measured the children’s stress responses by comparing the cortisol levels present in samples of their saliva collected before and after the stress test was administered. In males, researchers found that as exposure to violence increased, cortisol reactivity decreased. The finding was not present in females.

“In enduring stressful conditions, we may have adapted evolutionarily to suppress our cortisol levels because higher and more prolonged levels of cortisol in the bloodstream can lead to negative health consequences, such as autoimmune disorders, lowered immunity, and arthritis. This may explain why cortisol reactivity was lower for males,” Susman says. “However, there is a theory that females may react to stressful situations by talking about it, which may be their way of reducing the negative effects of cortisol in the bloodstream. If parents and other adults are available to discuss episodes of violence with children, it might help the children, especially females, to reduce their cortisol levels.” —Sara LaJeunesse
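The comparison described above can be sketched numerically. A common way to quantify reactivity is the rise in cortisol from the pre-test baseline to the post-test sample; a negative correlation between that rise and an exposure score is the pattern reported for males. The measure below is one plausible definition (the study’s exact metric may differ), and every data value is invented for illustration.

```python
# Hypothetical sketch of cortisol reactivity vs. violence exposure.
# Reactivity here = post-stressor cortisol minus baseline cortisol.
# All numbers are invented; they only illustrate the reported pattern
# (more exposure, blunted cortisol response).

def reactivity(pre, post):
    """Rise in salivary cortisol from baseline to post-stress sample."""
    return post - pre


def pearson_r(xs, ys):
    """Plain-Python Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5


# Invented example: exposure score vs. saliva cortisol (nmol/L) before/after.
exposure = [0, 2, 5, 9]
pre = [4.0, 4.2, 4.1, 4.3]
post = [9.0, 8.0, 6.5, 5.0]
deltas = [reactivity(a, b) for a, b in zip(pre, post)]
print(pearson_r(exposure, deltas))  # strongly negative in this toy data
```

A blunted (smaller) rise at higher exposure yields a correlation near −1 here, which is the direction of the effect the researchers describe for male subjects.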


Reactions to everyday stressors predict future health

Contrary to popular perception, stressors don’t cause health problems. Instead, how people react to stressors determines whether they will suffer long-term health consequences, according to research by David Almeida, professor of human development and family studies. “If you have a lot of work to do today and you are really grumpy because of it, then you are more likely to suffer negative health consequences ten years from now than someone who also has a lot of work to do today, but doesn’t let it bother her,” Almeida says.

Using a subset of people who are participating in the MIDUS (Midlife in the United States) study, a national longitudinal study of health and well-being, Almeida and his colleagues investigated the relationships among stressful events in daily life, people’s reactions to those events, and their health ten years later. The team found that people who become upset by daily stressors and continue to dwell on them after they have passed were more likely to suffer from chronic health problems—especially pain, such as that related to arthritis, and cardiovascular issues—ten years later.

The researchers also found that certain types of people are more likely to experience stress in their lives. Younger people, for example, have more stress than older people; people with higher cognitive abilities have more stress than people with lower cognitive abilities; and people with higher levels of education have more stress than people with less education. —Sara LaJeunesse

Susan McHale

Time with parents is important for teens’ well-being

Professor of Human Development and Director of the Social Science Research Institute


Teenagers are famous for seeking independence from their parents, but research shows that well into the adolescent years, many teens continue to spend time with their parents, and that this shared time—especially shared time with fathers—has important implications for adolescents’ psychological and social adjustment. “The stereotype that teenagers spend all their time holed up in their rooms or hanging out with friends is indeed just a stereotype,” says research team member Susan McHale, professor of human development and director of the Social Science Research Institute. McHale and her colleagues examined changes in the amount of time youths spent with their parents from early to late adolescence, starting with families in which the oldest children were about 11. According to youths’ daily reports of their time, although parent-teen time when others were present declined from the early to late teen years, parent-teen time with just the parent and the teen present increased—a finding that contradicts the stereotype of teens growing apart from their parents. “This suggests that, while adolescents become more independent, they continue to have one-on-one opportunities to maintain close relationships with their parents,” McHale says. Furthermore,

teens who spent more time with their fathers with others present had better social skills with peers, and teens who spent more time alone with their fathers had higher self-esteem. The researchers also found that the decline in the time teens spent with parents and others was less pronounced for second-born than for first-born siblings. —Sara LaJeunesse

Family Dynamics: Watch Susan McHale, an expert on relationships within families, discuss aggression between siblings.


Social and Behavioral Sciences ›

An Ounce of Prevention by Krista Weidner

“And how many adolescents do you have?” Doug Coatsworth asks, leaning back in his chair with a knowing smile. Clearly, he can tell from my expression that I am having no problem relating to the topic at hand: emotionally charged interchanges between parents and teens. Although my three kids are over 18 and “semi-launched,” those tense parenting moments from the not-very-distant past are all too memorable.

Coatsworth is co-leader of family science and intervention programs and the Program on Empathy, Awareness, and Compassion in Education (PEACE) initiative at Penn State’s Prevention Research Center. He is the principal investigator for a study that uses mindfulness techniques to help parents monitor their thoughts and feelings while they’re interacting with their adolescent children. “When parents get caught up in that moment—and any parent of a pre-teen or teen knows this—they can slip into familiar patterns that lead to escalating emotions and lost tempers,” he says. “Mindfulness involves training your mind in ways that allow you to step back, slow down, and be really present to what is happening—to note your thoughts and emotions with a sense of dispassion. We work with parents to get them out of their heads and into their lives.”


When parents and kids can talk to each other without succumbing to escalating emotions, their relationship will benefit—and that, Coatsworth and his colleagues Mark Greenberg, Larissa Duncan, and Rob Nix believe, can reduce the risk that adolescents will get involved in dangerous activities like drinking, drugs, and sex.

Coatsworth’s mindfulness study is one of many projects under way through Penn State’s Prevention Research Center. Established in 1997 with an endowment by alumna Edna Bennett Pierce, the center is the largest of its kind in the nation. It is also recognized as one of the best. “What sets us apart is our broad range of focuses,” says interim director Ed Smith. “While we’re similar to other prevention centers in that we promote effective programs and work in communities to ensure they’re being done well, we incorporate a strong research component.”

At one end of the center’s spectrum of activity is basic child development research—how children’s brains develop, how parents, peers, and teachers can influence that development, what can derail normal growth and maturation. “This part


of our work doesn’t involve any programs or interventions,” says Smith. “It’s purely about understanding—whether it’s early childhood, adolescence, or early adulthood. Then the other extreme is about delivery. We’ve applied our understanding to a program, we’ve tested that program and know it works, and now we want to figure out the best way to deliver it.” What happens in the middle—testing a program to find out if it works—is key to the Prevention Research Center’s mission. Prevention programs are delivered on a full scale only after they’ve been proven effective through clinical trials and evaluation. “Whatever the objective—be it reduced substance use or reduced fighting and aggression—we keep tinkering until we think we have the right product,” Smith says. “Only after we’ve published our findings and they’ve been reviewed and critically assessed do we promote our programs.”

In its 16 years of existence, the Prevention Research Center has evolved along with advances in prevention science, a relatively new field that grew out of a shift in thinking about public health. The release of the 1964 Surgeon General’s report that alerted the nation to the health risks of smoking was the impetus for this shift. The report suggested that, in addition to traditional public health measures—such as administering immunizations and monitoring drinking water—changing people’s behavior could help them live longer and healthier lives.

“How do you push for those behavioral changes?” Smith says. “Therapy is one way, but it’s costly and it’s done on an individual basis. Plus you’re waiting until the person develops a problem before you intervene. A growing number of professionals started seeing a common thread tying together many fields: psychology, epidemiology, psychiatry, social work. That thread is promoting well-being and preventing disorders, and this is how prevention science emerged.”

With support from the National Institutes of Health—including the National Institute on Drug Abuse, the National Institute of Child Health and Human Development, the National Institute on Alcohol Abuse and Alcoholism, and the National Institute of Mental Health—the Prevention Research Center continues to expand its breadth of programs and projects. Here’s a sampling:

Talking to teens about drinking

“When it comes to teen drinking, there’s a traditional view that parents don’t have much of an impact, but here’s the deal—they do,” says psychologist Rob Turrisi. “Evidence shows that when parents intervene in the right ways, they can affect their teens’ decisions to drink.”

What Causes ADHD? See neuroscientist Lisa Gatzke-Kopp discuss the role the neurotransmitter chemical dopamine plays in the behavior patterns of children with ADHD. 




Turrisi developed a program for parents on how to communicate with teenagers about dangerous drinking, with a focus on college freshmen leaving home for the first time. His research shows that a number of factors are especially relevant when parents communicate with their teens: Teens need to believe their parents are giving them good advice, and they need to believe their parents are looking out for their best interest. It’s also important for the teen to see the parent as available and accessible. “A teenager who sees his parent as being too busy and generally unavailable won’t seek out that parent’s advice,” Turrisi says. Communication style matters, too. When parents talk with their kids, it’s important for them to show empathy and understanding, stay calm and relaxed, and be clear, direct, responsive, and supportive. Finally, after parents convey their expectations about drinking, they need to follow up to see if those expectations are being met, and then respond in a way that keeps communication channels open. “Other approaches simply encourage a conversation or two, but they don’t show parents how to follow up,” Turrisi says. “Parents who use our program learn how to ‘check in’ to see if what they talked about is being translated into action and then what to do after the check-in. Our approach encourages parents to be vigilant. And if they put all this into practice, their discussions with teens will be that much more effective.”

Bullies and brain activity

Watching kindergarteners watch cartoons can help scientists understand aggressive behavior. Neuroscientist Lisa Gatzke-Kopp is interested in how young children’s brains are organized, and in a recent study of kindergarteners in the Harrisburg (Pa.) School District she and her colleagues compared aggressive children, who get into fights easily or bully others, to nonaggressive kids. “We wanted to find out how aggressive children experience and manage their emotions—would it be different from the methods their nonaggressive classmates use?” says Gatzke-Kopp.

Symptoms of stress

Bullying affects both bystanders and targets


Children who repeatedly witness bullying may suffer more physical and emotional trauma than those who have more limited exposure to bullying, and that trauma may have a lifelong influence, say Penn State researchers. Children who witness bullying may have difficulty acquiring a sense of safety and affiliation with others, both of which are crucial human needs, according to Associate Professor of Counselor Education JoLynn Carney, whose research with Professor of Counselor Education Richard Hazler focuses on the effects of bullying on bystanders. “Bullying can also cause people who witness it to demonstrate physical stress symptoms of increased heart rate and perspiration as well as high levels of self-reported trauma even years after bullying events,” Carney says. Carney and Hazler employed a variety of approaches to study the effects of bullying on bystanders. In one study they used the School Bullying Survey, which Hazler developed with other colleagues, to survey sixth-grade students from a rural Midwestern school. They found that all students surveyed had been exposed to repetitive bullying, either as a target or as a witness. They also found that the ability of children to trust others was significantly related to less bullying exposure and more witnessing of interventions by others. Carney and Hazler also investigated how exposure to bullying at school is associated with students’ anxiety levels and adrenocortical activity. They found that the amount of combined bullying exposure from victimization and bystanding was related to lower cortisol levels at a time, just before lunch, when the potential for bullying was about to increase. “This study is groundbreaking in that it demonstrates that there are physiological impacts related to being exposed to bullying that relate to physical symptoms and influence behavior, as well as potential future physical and social implications,” says Carney.
According to Carney and Hazler, the general theme emerging from their research is that bullying doesn’t just affect victims. “Everyone is impacted,” says Hazler, “both in school and outside of school, and the influences can be life-long. Our research emphasizes the widespread impact of bullying and implies the need for individual and group interventions to more effectively deal with the problems.” —Suzanne Wayne



The researchers collected teacher ratings of each child’s behaviors, including aggression, disobedience, and self-control. They also brought in a mobile research laboratory to measure the children’s heart rate, skin conductance, and brain activity while they were watching cartoon video clips depicting fear, sadness, happiness, and anger. The goal was to see how both the aggressive and the nonaggressive children reacted to different emotions, and whether those reactions are linked to aggression.

Steve Tressler, Vista Professional Studios

The team found that 90 percent of the aggressive kids in the study fell into one of two categories: They were either low in verbal ability, or they were more easily aroused physiologically. “The kids with lower verbal ability have a harder time extracting what other people are feeling,” Gatzke-Kopp says. “They don’t have a nuanced sense of emotions—everything is either happy or sad to them—so they might not be good at seeing how their behavior is making another child feel. Because they have a hard time communicating verbally, hitting is the easier solution when they’re frustrated.” The second group of kids, who were more physiologically aroused, might have trouble distinguishing between a minor annoyance and a major threat. They are more apt to act on impulse, easily losing control of their behavior. “Kids who hit their classmates when they’re frustrated or disrupt the class are at especially high risk for long-term consequences, including delinquency, violence, dropping out of school, abusing substances, and even suicide,” Gatzke-Kopp says. “Research tells us that the earlier we can intervene, the better the chances of getting these children back on track. This study shows us there are different underlying causes for problem behavior, and we might need different types of treatments to change that behavior.”

Yoga for kids

The words “yoga class” probably don’t evoke a group of inner-city ten-year-olds practicing poses and breathing. But it’s happening in Baltimore, as part of the Prevention Research Center’s PEACE component. PEACE encompasses a broad range of programs that share the

Patricia Jennings, Lisa Gatzke-Kopp, Douglas Coatsworth, and Edward A. Smith

goal of promoting health and well-being in children, youth, and families through awareness, compassion, and empathy. One of those programs, led by psychologist Mark Greenberg, focuses on teaching yoga and mindfulness techniques. With partners from Johns Hopkins University and the Holistic Life Foundation—a Baltimore-based nonprofit organization that initiates human and environmental health programs—Greenberg and colleagues set up the program in four Baltimore public schools. The idea was to try yoga as a way of helping fourth and fifth graders from low-income families deal with stress by building self-regulation skills. “Poverty is

stressful, and stress can impair kids’ ability to regulate their thoughts and emotions,” says Greenberg. “Yoga can help.” Classes started with yoga positions, so the kids could be active, then ended with stillness and silence. Although kids weren’t sure what to make of the whole thing at first, they caught on quickly. Says Greenberg, “We began to see them really having fun with the poses and then becoming relaxed and quiet—abnormally quiet!” A pilot study shows that after completing yoga classes, kids were at lower risk for developing anxiety and depression because they were better at managing their thoughts and emotions. In particular, kids





reported lower levels of emotional arousal, intrusive thought patterns, and rumination (thinking about the same thing over and over). Currently, a second, larger study is under way to expand the project and measure more factors, including improvement in overall health and well-being.

Caring for teachers

Another program that falls under the center’s PEACE initiative focuses on teachers. Prevention scientist Patricia Jennings and her colleagues Christa Turksma and Richard Brown created a professional development program for teachers in collaboration with the Garrison Institute, a not-for-profit organization that explores the intersection of contemplation and engaged action in the world. CARE for Teachers (Cultivating Awareness and Resilience in Education) shows teachers how they can use mindfulness techniques to slow things down in the midst of a hectic classroom. “Being a teacher is challenging,” Jennings says. “Teachers are under a lot of pressure to fulfill curriculum and testing requirements. They have to deal with their stress in a classroom in front of twenty or thirty kids, and they are not supposed to get angry. It’s like having your brake and accelerator going at the same time, and it wears you out. In my own teaching days I remember being all too aware of how stress interfered with my ability to be present and teach in a way that allowed minds to open.” Combining mindfulness practices and emotional skills training, CARE has been demonstrated to work. A two-year pilot project shows that the program helps reduce various kinds of stress.


“I was particularly excited to see that teachers reported a sense of reduced time urgency,” Jennings says. “We hear all the time that teachers feel they don’t have enough time to get things done. But after participating in our pilot project, teachers told us—and they were actually amazed by this—that they felt a sort of psychological space. I think what happens when you’re under time pressure is that you spend so much time thinking about what you have to do that you’re not doing it. By deliberately slowing down, teachers perceived they had more time and were accomplishing more.”

Fighting boredom

“I’m bored.” It’s a common refrain among kids, and apparently it’s not just an American thing. Prevention scientist Ed Smith and his wife, Linda Caldwell, professor of recreation, park, and tourism management, are conducting a long-term study in South Africa, using school-based prevention techniques to reduce substance abuse and sexual risk among eighth and ninth graders. The study focuses on getting kids to make smart choices when it comes to their free time. “South Africa has the highest rate of HIV/AIDS in the world and we’re working with very high-risk, low-income populations,” Smith says. “We think it’s critical to reach teens with the message that they do have choices. When we started talking to these kids it became clear that teen boredom is a universal phenomenon. That sense of ‘whatever’ resonated so well with the South African kids.” Through role play and other activities, teachers delivering the program help kids recognize risky situations and come up

with alternative things to do that are exciting but don’t get them into trouble. “These kids don’t have the option of learning how to scuba dive or skateboard,” Smith says. “But we want them to realize they do have options. We’ve heard some kids say, ‘Well, I never thought about hiking up Table Mountain.’” Smith and Caldwell are in the eleventh year of their study. They work with two South African universities to run the program in 48 high schools in former apartheid-era townships. Clinical trials show the program works as a school curriculum—teens who were followed through eleventh grade reported less risky behavior. The next step is to adapt the program so that teachers can deliver it even more efficiently, and researchers are tweaking delivery variables such as training and support. Some teachers are simply handed the program, others get an afternoon training, and still others get a two-day, in-depth training. Similarly, some teachers receive support via text message or personal visit, while others are on their own. This research design, developed in conjunction with Linda Collins (Methodology Center) and John Graham (Biobehavioral Health), employs cutting-edge methods to solve real-life problems. “It’s complicated because we end up with this huge matrix of who is getting what,” Smith says. “But our purpose is relatively simple: We want to take this program, which we now know is effective, and maximize its efficiency and cost/benefit ratio. Is the enhanced training and support worth the cost? Will we see more changes in kids’ behavior as a result? We’ll find out. “And I’ll be honest with you—this is the best thing I’ve done in my life. You can’t

do anything bigger or more impactful than this. We’re having an effect on kids’ health, and that’s exciting.”

Douglas Coatsworth, Ph.D., is professor of human development and family studies. Edward A. Smith, Ph.D., is interim director of the Prevention Research Center, eas8@psu.edu. Robert Turrisi, Ph.D., is professor of biobehavioral health. Lisa Gatzke-Kopp, Ph.D., is assistant professor of human development. Mark Greenberg, Ph.D., is Edna Peterson Bennett Endowed Chair in Prevention Research and professor of human development and psychology. Patricia Jennings, Ph.D., is research assistant professor, Prevention Research Center for the Promotion of Human Development. The Prevention Research Center is based in the College of Health and Human Development and encompasses program areas that include Family Science and Intervention, Emerging Adulthood, School-based Prevention Research, and the Program on Empathy, Awareness, and Compassion in Education (PEACE).

EPIS Center


“Scientists, even the best in the world, are not the best promoters of products,” says Ed Smith, interim director of Penn State’s Prevention Research Center. That’s where the Evidence-based Prevention and Intervention Support (EPIS) Center, directed by Brian Bumbarger, comes in. After researchers have tested a program and know that it works, they hand it off to the EPIS Center for promotion throughout the state. If a school is interested in running an anti-bullying program or a program on preventing drug abuse, for example, the EPIS Center will recommend options. “This goes back to our focus on evidence-based programs,” Smith says. “We can say, ‘Here are three or four really effective programs, and we have scientific evidence showing that they work.’” EPIS Center staff provide schools and other organizations with the materials they need as well as trainers who go on site to teach effective delivery. Evidence-based programs featured through the EPIS Center include Big Brothers Big Sisters, Strengthening Families, and a bullying prevention program. —KW


Health and Life Sciences ›

A Cure for Leukemia?

Wait a minute. Cured? Really?

by Sara LaJeunesse


Two college researchers in the Department of Veterinary and Biomedical Sciences have cured leukemia in mice without any side effects, and they think their therapy will work in humans, too.

As a science journalist, I have been trained to be skeptical of such dramatic claims. To me, finding a cure for cancer that doesn’t leave patients nauseous, bald, and downright exhausted is as elusive as finding Shangri-La. Yet, professors Robert Paulson and Sandeep Prabhu have cured leukemia in mice, and they seem to have done it without the side effects and relapse risks that come with surgery, chemotherapy, and radiation treatments. Here’s how.

A Partnership is Born

It was a frosty morning in January when Robert Paulson knocked on Sandeep Prabhu’s office door with a paper that, unknown to either of them at the time, would change the course of their research and their lives. “There was one sentence in this paper stating that the compound 15d-PGJ2 is part of a class of compounds that may be able to treat leukemia stem cells,” Paulson, a professor of veterinary and biomedical sciences, tells me as he, Prabhu, and I walk through his laboratory in Henning Building. “So I wondered if the compound Sandeep worked on, which is similar to 15d-PGJ2, would also treat leukemia stem cells.” Paulson had been reading the paper because he studies leukemia, a cancer of the blood cells.

His lab is filled with several high-tech machines and dozens of chemical-containing glass jars stacked high on the shelves. Students staring at computers and surrounded by stacks of papers are so immersed in their work, they seem not to notice us as we pass by. Paulson and his lab group are especially interested in understanding the differences between leukemia cells and leukemia stem cells, which make leukemia cells. They also are working to improve models of leukemia in animals, and are devising a method for growing leukemia stem cells in the lab so they can study them in detail. Prabhu, an associate professor of immunology and molecular toxicology whose lab shares a wall with Paulson’s lab, studies various molecules that are created by the body in order to figure out what they do. One of his recent studies focused on the molecules that our bodies make when they metabolize fish oil. “We were looking at the metabolism of omega-3 fatty acids to examine the production of new and novel metabolites by immune cells because omega-3s are known to have all kinds of health benefits, such as protecting against heart attacks and strokes and assisting with brain development,” says Prabhu. “What we found is that if you give your cells omega-3s from fish oil, they make this compound called ∆12-PGJ3 [delta-12-prostaglandin J3—or J3, for short], which no one ever knew existed before.”

Although the researchers had worked for years on opposite sides of a wall, the pair had never before worked together. Yet armed with Prabhu’s knowledge that human cells produce J3 when given fish oil, and also knowing that J3 is similar to the compound that might be useful in treating leukemia stem cells, the scientists teamed up in a two-year quest to determine whether giving fish oil to mice would cure their leukemia.


What they found surprised them both.

Shailaja Hegde, a graduate student in pathobiology, has been helping them from the beginning.

The Big C

Cancer. It’s one of the most feared diagnoses in human health, and rightly so because half of all people with the disease die from it or its treatment. But some cancers are worse than others. Lung cancer, pancreatic cancer, and esophageal cancer, for example, are often deadly, while testicular cancer, breast cancer, and skin cancer, if caught early, can be cured with surgery, radiation, chemotherapy, or a combination of these treatments. Leukemia sits somewhere in the middle. Some forms of it are curable, but others are more resistant to treatment. For instance, 90 percent of people diagnosed with chronic myelogenous leukemia (CML) are still alive five years after diagnosis, while only 40 percent of people diagnosed with acute myelogenous leukemia (AML) make it past five years. In total, some 245,000 people in the United States have a form of leukemia, with 40,000 new cases added and more than 20,000 deaths each year. Since leukemia is a disorder of the white blood cells, which normally are responsible for fighting off pathogens, individuals with the disease may experience higher-than-normal rates of infection, such as infected tonsils or pneumonia. They also may bruise or bleed more readily than people without the disease because as abnormal white blood cells proliferate in their bone marrow, their platelets—those ever-important blood clotters—become displaced. Some patients also suffer from fevers, fatigue, and other flu-like symptoms.

Killing Stem Cells

While plenty of researchers worldwide are investigating leukemia’s origins, even more are searching for cures. Paulson and Prabhu are focusing specifically on curing AML and CML, two of the four forms of leukemia.

“Leukemia stem cells make what are referred to as bulk leukemia cells,” he explains. “Part of the problem with chemotherapy is that it kills the bulk cells because they’re dividing rapidly, but the stem cells, which divide less rapidly, can hide. Killing the stem cells is important because stem cells can divide and produce more cancer cells, as well as create more stem cells.”

Prabhu says the current therapy for CML and AML extends the patient’s life by keeping the number of leukemia cells low, but the drugs fail to completely cure the disease because they do not target leukemia stem cells.

Part of Hegde’s job is to generate leukemia stem cells in culture and then inject them into mice. Once they have the disease, she injects 600 nanograms of the J3 compound into the abdomens of the mice every day for a week.

According to Paulson and Prabhu, once inside the rodents’ bodies, the J3 compound activates the p53 tumor suppressor gene, which is responsible for maintaining genomic stability and regulating how cells respond to DNA damage.

“If there’s a problem with a cell, p53 gets induced and the cell dies by programmed cell death,” Paulson says. “Cells in our bodies are pretty altruistic. They die when they’re told to die. That’s how the body gets rid of abnormal cells that could cause problems. But cancer subverts this process. We found that by treating the cells with the J3 compound, we turn on the p53-dependent cell death pathway, which causes not only the leukemia cells to die, but the leukemia stem cells to die as well.” Killing the leukemia stem cells, he adds, is essential if you want to cure leukemia.

“We were able to show in the cell culture dish and in our mouse models that J3 works quite well at killing both leukemia cells and leukemia stem cells,” he says.


Robert Paulson, left, and Sandeep Prabhu


Steve Williams

Experts have suggested radiation, viruses, chemicals, tobacco use, and just plain crummy genetics as possible causes of leukemia, but no one really knows for sure what causes the disease.



Inspiring Collaboration: Watch Robert Paulson and Sandeep Prabhu explain how they connected their seemingly separate study areas during a weekly faculty lunch, resulting in a possible cure for leukemia.

“The compound really cures mice of leukemia. It worked for every single mouse we tried it on. And it cured them without any side effects and without relapse.” Prabhu notes that the lack of side effects is because fish oil is nontoxic, though he warns that taking too much of it can be somewhat dangerous, as fish oil is known to contain heavy metals, like mercury. “We were completely surprised by what we found,” he says. “We had no idea that the J3 compound would work so well.” Craig Jordan, the Philip and Marilyn Wehrheim Professor at the University of Rochester Medical Center who is collaborating with Paulson and Prabhu, also was surprised by the results. “The J3 compound is a very novel approach to leukemia therapy,” Jordan says, “and the findings in the mouse models were quite impressive.”

Cure in Hand—Now What?

So, Prabhu and Paulson have cured leukemia in mice using a compound derived from fish oil. I wonder, “Does this mean that popping a drugstore fish-oil capsule can help protect against leukemia?”


“Yes, it may,” Prabhu says, “depending on whether or not your body produces J3 when it metabolizes fish oil.” According to Prabhu, there is variation in the human population regarding how people metabolize fish oil. While some people take fish oil and produce J3, others may not be able to synthesize the compound. To avoid this discrepancy in their mouse models, the researchers injected the mice with the purified compound rather than with fish oil. Now that the team knows that J3 cures leukemia in mice when they inject it into the rodents’ abdomens, they want to know if the compound will work if administered orally. “We are investigating how stable the compound is and whether it will break down in the stomach, so we are trying some experiments using simulated gastric juice,” Prabhu says. “We also want to know, if you give it to a mouse in pill form, does enough get into the right place, or do you have to do an intravenous injection?” In addition, the scientists want to know if the compound will work the same way in humans. Craig Jordan has helped with this part of the project by providing the Penn State team with human leukemia cells. Paulson and Prabhu are growing these cells in culture dishes and then adding J3 to see if it kills them.

After investigating the effects of J3 on human leukemia cells in culture dishes, the team plans to grow human leukemia cells in mice that lack an immune system to see if the compound will work in vivo. If it does, they hope to partner with a company to set up a clinical trial. “Both of us know that a lot of people have cured cancer in mice and that most of those cures never made it to the clinic,” says Paulson. “But we’re optimistic that our therapy will translate into humans. If it does and we can help people live even a little bit longer, it would be an incredible thing.” Robert Paulson, Ph.D., is professor of veterinary and biomedical sciences, rfp5@. Sandeep Prabhu, Ph.D., is associate professor of immunology and molecular toxicology. This article originally appeared in Penn State AgScience magazine.

From Mice to Men

A Cure for Leukemia?

Will the therapy work in humans?

Paulson and Prabhu have cured leukemia in mice, but the most important question remains: Will the therapy work in humans? “The results are very promising,” says Thomas Loughran, director of the Penn State Hershey Cancer Institute and an expert on leukemia, “but there is still much more work to do.” Loughran says making the jump from mice to humans requires multiple time-consuming steps and about $1 billion. “Before a cancer therapy can begin clinical trials, it must go through a preclinical development phase,” he says. “This includes testing the therapy in culture to see if it works on human cancer cells and then testing to see if it works in larger animals, such as pigs, dogs, and nonhuman primates. If the therapy works safely in these animals, the researchers must get approval from the FDA [U.S. Food and Drug Administration] to begin clinical trials. The FDA wants to make sure the therapy makes perfect sense before it allows scientists to test it in humans.” Once the FDA has approved the therapy, the three phases of clinical trials can begin. “In phase I, you give a low dose of the medication to three or four patients,” Loughran says, noting that these patients typically are volunteers who have relapsed and are seeking a new medication that might help them. “If there are no side effects, you increase the dose. Once you find the maximum safe dose for patients, you move into phase II of clinical trials.”

In phase II, the maximum safe dose is given to a larger number of patients, typically about 30 people. “The goal here is to further examine the drug’s safety and effectiveness,” he says. Phase III is a randomized study in which some patients are given the drug while others are given the best standard therapy; neither the doctor nor the patient knows who has received the drug. Such a study usually includes hundreds or thousands of patients. “If the new medicine significantly improves survival or remission rates among patients, then the FDA will set up a panel to vote on whether the data is valid and whether the medicine merits becoming commercially available,” Loughran says. Loughran explains that Paulson and Prabhu’s J3 compound will have to go through all of these steps before it can be sold as a drug. And because university researchers have far fewer resources than their industry counterparts, the team will have to attract industry support to move through the clinical-trial process. Loughran’s own research, some of which he conducts in collaboration with Paulson, also involves the development of leukemia therapies. The discoverer of large granular lymphocytic (LGL) leukemia, a chronic but rare form of the disease, Loughran currently is seeking a therapy that targets cancer cells in LGL patients rather than “assaulting their entire bodies the way chemotherapy does.” “Curing cancer in humans is an expensive, time-consuming endeavor requiring a significant amount of interdisciplinary collaboration,” Loughran says. “But it’s worth it. I see patients every day who are suffering from leukemia. They are our parents, our children, our friends, and our neighbors. We need to do all that we can to help improve their outcomes.” —SL


Health and Life Sciences ›

Gypsy moth caterpillars hormonal slaves to virus gene

Research | Penn State 2013

Gypsy moth caterpillars infected with baculovirus forfeit safety and stay in the treetops during the day because a virus gene manipulates their hormones to eat continuously and forego molting, according to entomologists. The caterpillars die where they climb and infect other gypsy moth caterpillars. “Baculoviruses have been known to induce climbing behavior in their caterpillar hosts for over 100 years,” says Kelli Hoover, professor of entomology at Penn State. “Until recently, determining the evolutionary basis for these altered behaviors has proven difficult in the absence of a mechanistic explanation.”

One hundred years ago, researchers could not examine either the virus’ genetic material or the metabolic pathways of the caterpillar. Hoover and her team identified a specific viral gene, egt, that codes for an enzyme, EGT (UDP-glycosyltransferase), which inactivates the hormone that triggers molting. EGT also induces the caterpillars to climb to the treetops, hang onto the leaf or bark with their prolegs, and die. They then liquefy, raining viral particles over the leaves for other caterpillars to ingest and become infected. Genes that influence hormones are perfect targets for changing complex behaviors, Hoover noted. The viral gene egt blocks molting by inactivating the molting hormone ecdysone, keeping the insect in a feeding state. “It is good for the virus because if the host spends 24 hours not feeding while it prepares to molt, this is time that the host is not getting bigger—the virus wants to maximize the host’s biomass to make into more virus,” says Hoover. “In this case we’ve found that the gene also somehow induces the caterpillars to go to just the right location to enhance transmission of the virus to new hosts.”

The researchers are not completely certain why the caterpillars climb or stay aloft during daylight when they are infected. One possibility is that without the molting cue, the caterpillars simply have an urge to eat continuously and so remain in the treetops.

Gypsy moth caterpillar. credit: James McNeil

Hoover notes that this is one of the first studies to identify the gene of the parasite responsible for altering the behavior of the host animal. Many parasites manipulate their hosts, but in most cases, how this occurs is not known. —Matthew Swayne



More virulent malaria parasites evolve when vaccine is used

Injecting mice with a critical component of several human malaria vaccines now undergoing trials can create ecological conditions that favor the evolution of parasites that cause more severe disease in unvaccinated mice, research at Penn State has shown. “We are a long way from being able to assess the likelihood of this process occurring in humans,” says Andrew Read, Alumni Professor of Biological Sciences, “but our research suggests the need for vigilance. It is possible that more-virulent strains of malaria might evolve if a malaria vaccine goes into widespread use.”

How more-virulent malaria parasites evolve in response to vaccination is still a mystery, but it is not due to changes in the part of the parasite targeted by the vaccine, Read notes. No malaria vaccine has ever been approved for widespread use. “Effective malaria vaccines are notoriously difficult to develop because the malaria parasite is very complex,” Read says. “Hundreds of different malaria strains exist simultaneously within any local region where the disease is prevalent.” Most vaccine developers use only small sections of the malaria parasite to produce an antigen molecule that then becomes a key ingredient in a highly purified malaria vaccine. Read’s lab tested the antigen AMA-1, a component of several such vaccines now in various stages of clinical trials. “We were surprised to find that more-virulent strains of malaria evolved even while the gene encoding the key antigen remained unchanged,” says Victoria Barclay, the postdoctoral scholar in Read’s lab who conducted the laboratory experiments. “We did not detect any changes in the gene sequence.” The researchers conclude that evolution must have taken place elsewhere in the parasite’s genome. Read’s lab now is hunting for the exact locations on the parasite’s DNA where the mutations occurred. —Barbara Kennedy

New method of resurfacing bone improves odds of successful grafts

Coating a bone graft with an inorganic compound found in bones and teeth may significantly increase the likelihood of a successful implant, according to Penn State researchers. Natural bone grafts must be sterilized and processed with chemicals and radiation before implantation into the body to ensure that disease is not transmitted by the graft. Human bone has a rough surface, but once a graft is sterilized, that surface changes and is no longer optimal for stimulating bone formation in the body. “We created a method for resurfacing bone that had been processed so that it is now nearly as osteogenic as unprocessed bone—meaning it works nearly as well as bone that hadn’t been processed at all,” says Henry J. Donahue, Michael and Myrtle Baker Professor of Orthopaedics and Rehabilitation, Penn State College of Medicine. “That’s the bottom line.” Donahue, who is also a faculty member of the Huck Institutes of the Life Sciences, and Alayna Loiselle, postdoctoral fellow in orthopaedics and rehabilitation, Penn State College of Medicine, teamed up with Akhlesh Lakhtakia, Charles Godfrey Binder Professor of Engineering Science and Mechanics. Together they developed a way to create a rough surface on bone grafts that is similar in texture to the surface of an untreated bone, a similarity that promotes healing.

Bone graft covered with inorganic material

The researchers found that by coating a bone with the inorganic compound hydroxyapatite, using physical vapor deposition, they could closely mimic the rough surface of an untreated bone. To find the optimum thickness of hydroxyapatite, Donahue and Loiselle sterilized the graft samples in their lab at Penn State Hershey Medical Center. After sterilization, the samples went to the University Park campus, where physical vapor deposition layered different amounts of hydroxyapatite on the grafts. Then the samples were returned to Hershey for Donahue and Loiselle to test. “I thought we wouldn’t need to coat the bone more than a couple of hundred nanometers. As it turns out, it was much less than that,” says Lakhtakia. A hundred nanometers is about the size of a single viral particle. —Victoria M. Indivero



In touch with … Peter Hudson

Huck Institutes head champions expanding research capabilities and a broad approach to the life sciences.


Peter Hudson came to Penn State a decade ago from the University of Stirling, in his native Great Britain, to serve as the Verne Willaman Chair of Biology. In 2005 he became director of the Huck Institutes of the Life Sciences. The Huck provides a foundation for research collaboration across colleges and departments in the University, offering resources for faculty and students to carry out collaborative research. We sat down with him to learn more about the Huck.

What is an example of cross-college collaboration within the Huck? If you look at just one site—our new Millennium Science Complex—we have faculty from six colleges housed within the building. We have a center for neuroengineering, which brings together engineers, medics, biologists, agriculturalists, and bioengineers working on aspects of how the brain works, looking at the interface of materials and the brain and at infections of the brain. We also have a big group working on infectious diseases that spans scales from proteins to pandemics and from vectors to vaccines. Specific issues include such challenges as how to control vector-borne diseases like malaria. Vector-borne diseases are still causing massive mortality throughout the world, and it’s going to take humans at least 20 years to solve that problem. We have researchers working on a wide variety of biological and social aspects of malaria.

Why was the Huck Institutes founded? The Huck has existed for almost 15 years and was set up when current University President Rod Erickson was vice president for research. The life sciences are more than just biology and biochemistry—the Huck is where these disciplines sit at the crossroads between engineering and medicine. They encompass the physical sciences, including computer science. This broad interdisciplinary approach allows us to address such pressing issues as the emergence of new diseases, developing crops and plant systems that can feed the world given the impact of climate change, and how we can develop personalized medicine for all. We want to help break down the silo effects that existed between the academic colleges and identify the synergy that emerges with interdisciplinary teams.

How are researchers from the liberal arts working with infectious diseases? If you have an infection, such as HIV, there is a certain stigma associated with it—what are the consequences of that? We are also interested in how you can roll out vaccines in an efficient and effective way, so we must identify when epidemics are going to occur and how people will behave—that’s social science. We’ve undertaken modeling approaches to predict when measles will break out in Niger, utilizing Landsat imagery, virology, and the behavior of people, and then given simple predictions to vaccine organizations like Médecins Sans Frontières and the World Health Organization. Huck researchers are predominantly from the hard sciences and agricultural sciences, but there are also people from the liberal arts and other areas who are interested in the social science components of these issues.

What is the role for undergraduates at the Huck Institutes? Undergraduates are the next generation of scientists, so their involvement in research is imperative. It’s an area that I believe needs to be made stronger at this university. We have designed part of the Millennium Science Complex to accommodate undergraduates who come and work with different projects. In my own lab, for example, I have eight undergraduates who are undertaking field work, lab work, and entering data to figure out the role of infections in the massive explosion of mice we have seen in the Northeast over the past two years.

What got you interested in a career as a scientist? I was born a biologist. From day one I was interested in nothing but birds and insects and flowers—totally different from the rest of my family—and my life and enjoyment are centered on a lifelong fascination with wildlife and biological processes. I got my first bird book when I was four years old, and I still spend every minute I can studying biology. Check out my wildlife photographs on to see what I do on vacation.

So your personal research interests are nature-based? Totally, and while my training is as a population biologist and a behavioral ecologist, I am now using an ecological approach to study how diseases spread. I’m particularly interested in emerging diseases that come from wildlife and spill over into humans and the impact of diseases on wildlife populations. I want to know why, when, and where the critical processes of transmission occur—a poorly understood area of infectious disease biology.

What are the Huck Institutes’ research strengths? Just to pick three: we excel at genomics, infectious diseases, and plant sciences. In the last few years the Huck has been strategic in building up these areas, and improving our capability through a series of cluster hires. For example, we have brought in nearly 30 new faculty to the genomics area—including faculty in anthropology and political science, as well as some real stars in genomic analysis. Another strength of the Huck is that we help to build the strategic vision for research across the University. We are interested in more than what happens in a single department or a single college— we are interested in the broader issues. The Huck can help build the strategic aspects of research, particularly when we partner with colleges, and I think that’s a huge benefit to the University. Before the Huck existed, faculty in different colleges were unlikely to speak with each other very often, but now we provide the space, the resources, and the opportunities for those people to get together, work together, and address important issues.

What accomplishments are you particularly proud of? I see my role very much as a champion for the faculty in representing their views, but I also try to provide science leadership and work with faculty to build our research strategy. For example, with my colleagues in the Huck I have helped to secure funding for instruments and equipment that allow researchers to do things they didn’t even know they could do. We’ve built new facilities and buildings—like the impressive Millennium Science Complex—to work on the interface of materials and life sciences. We are building new capabilities in areas such as metabolomics and retaining our presence in genome analysis. We continue to invest in new instruments, like next-generation sequencers, and we’ve revolutionized our proteomic capability. We are in the process of getting two new electron microscopes for the MSC to improve our imaging capabilities. I think this all helps to attract and retain the very best faculty. —Victoria M. Indivero

In Memoriam
J. Lloyd Huck, 1922–2012

In 2002, the Life Sciences Consortium was renamed the Dorothy Foehr Huck and J. Lloyd Huck Institutes for Life Sciences in recognition of the Hucks’ leadership and generosity in support of the life sciences at Penn State. J. Lloyd Huck, retired chairman of the board of pharmaceutical firm Merck & Company, with his wife and fellow Penn State Class of 1943 member Dorothy Foehr Huck, had previously established endowments in fields ranging from molecular biology to nutrition that ultimately led to the creation of the Huck Institutes.


I am the Verne Willaman Professor of Biology. Mr. Willaman endowed my position, which gives me the flexibility to explore novel areas of science. I am currently working on disease in wolves, pneumonia in bighorn sheep, and disease systems in desert tortoises, examining the process of disease invasion to get new insights into how we can control emerging diseases. Income from the endowment that Verne created has allowed me to initiate that work. Sadly, Verne died recently, but his legacy lives on at Penn State.

Wafik El-Deiry

Professor of Medicine and Chief of Hematology/ Oncology, Penn State College of Medicine

Compound stimulates tumor-fighting protein in cancer therapy


A compound that stimulates the production of a tumor-fighting protein may improve the usefulness of the protein in cancer therapy, according to a team of researchers, including several from the Penn State Hershey Cancer Institute and College of Medicine. TRAIL is a natural anti-tumor protein that suppresses tumor development during immune surveillance—the immune system’s process of patrolling the body for cancer cells. This process is lost during cancer progression, which leads to uncontrolled growth and spread of tumors. The ability of TRAIL to initiate cell death selectively in cancer cells has led to ongoing clinical trials with artificially created TRAIL or antibody proteins that mimic its action. Use of the TRAIL protein as a drug has shown that it is safe, but there have been some issues, including stability of the protein, cost of the drug, and the ability of the drug to distribute throughout the body and get into tumors, especially in the brain.

“The TRAIL pathway is a powerful way to suppress tumors but current approaches have limitations,” says Wafik El-Deiry, professor of medicine and chief of the hematology/oncology division, Penn State College of Medicine.

Researchers have identified a compound called TRAIL-inducing Compound 10 (TIC10) as a potential solution. TIC10 stimulates the tumor suppression capabilities of TRAIL in both normal and tumor tissues, including in the brain, and induces tumor cell death in mice. Stimulation of TRAIL protein is sustained in both tumor and normal cells, with the normal cells contributing to the TIC10-induced cancer cell death through a bystander effect. It is effective in cancer cell samples and cell lines resistant to conventional therapies.

“Using a small molecule to significantly boost and overcome limitations of the TRAIL pathway appears to be a promising way to address difficult-to-treat cancers using a safe mechanism already used in those with a normal effective immune system,” El-Deiry says. —Matthew Solovey

New model of how brain functions are organized may revolutionize stroke rehab

Overlap images showing locations of lesions between groups with right and left hemisphere damage (color scale of magenta to red shows increasing overlap). Lesions are confined to either the left or right hemisphere. credit: Sainburg Lab


A new model of brain lateralization for movement could dramatically improve the future of rehabilitation for stroke patients, says Huck Institutes researcher Robert Sainburg, who proposed and confirmed the model through novel virtual reality and brain lesion experiments. Since the 1860s, neuroscientists have known that the human brain is organized into two hemispheres, each of which is responsible for different functions. Known as neural lateralization, this functional division has significant implications for the control of movement and is familiar in the phenomenon of right- and left-handedness.

Understanding the connections between neural lateralization and motor control is crucial to many applications, including the rehabilitation of stroke patients. While most people intuitively understand handedness, the neural foundations underlying motor asymmetry have until recently remained elusive, according to Sainburg, professor of kinesiology and neurology. Investigations by Sainburg and his colleagues have revealed a new model of motor lateralization that accounts for the neural foundations of handedness. “Each hemisphere of the brain is specialized for different aspects of motor control, and thus each arm is ‘dominant’ for different features of movement,” said Sainburg. “These specialized control mechanisms are seamlessly integrated into everyday activities. Our research has shown that this integration breaks down in neural disorders such as stroke, which produces different motor deficits depending on whether the right or left hemisphere has been damaged.” Traditionally, physical rehabilitation professionals have used the same protocols to practice movements of the paretic arm, regardless of the hemisphere that has been damaged. Sainburg’s research shows that each arm should be treated for different control deficits, and it also indicates that therapists should directly retrain patients in how to use the two arms together in order to recover function. —Seth Palmer

Overweight pregnant women not getting proper weight-gain advice

Overweight women are not receiving proper advice on healthy weight gain or appropriate exercise levels during their pregnancies, according to Penn State College of Medicine researchers. Researchers interviewed both overweight and obese women after the birth of their first child, and found that care providers advised half of all women to gain too much weight, did not discuss weight gain, or gave nonspecific advice.

“Excessive weight gain during pregnancy is particularly concerning for overweight and obese women given their already increased risk for pregnancy complications,” Associate Professor of Medicine and Public Health Sciences Cynthia Chuang points out. “Women received little, if any, feedback regarding whether their weight gain during pregnancy was healthy or not,” Chuang says. “Some women who received their care at obstetrical group practices and were seen by different providers in the same practice even received conflicting advice.”

Guidelines for weight gain are based on the weight of the woman at the start of pregnancy. Women of a normal weight are advised to gain 25 to 35 pounds, overweight women are advised to gain 15 to 25 pounds, and obese women are advised to gain less than 20 pounds.

The reasons why women are not being given proper advice are unclear, according to the researchers, who theorize that providers may find it awkward to acknowledge that a patient is overweight and do not want to cause embarrassment. Some doctors also may not calculate a pre-pregnancy BMI to better advise their patients. Providers need tools to address weight gain and exercise levels, Chuang says. Office-based tools like BMI calculators may help to provide preconception counseling and accurate weight gain targets. It also may be beneficial to offer educational materials prior to a first prenatal visit. —Matthew Solovey



Transforming health care through personalized medicine


By Dawn Costantini


It’s been nearly ten years since scientists completed the Human Genome Project, sequencing all 3 billion DNA letters of the human genome. While this was a groundbreaking feat, the biomedical community is only now beginning to realize the promise of that genomic revolution through an approach known as personalized medicine. Penn State Hershey is taking the next step forward in this important area of research and clinical care with its new Institute for Personalized Medicine, which will use a multifaceted approach to understand the correlation among a person’s biologic framework, the environment in which he or she lives, disease predisposition, and treatment options. By pursuing translational research—the kind of research that directly applies the latest scientific technologies to a patient’s clinical condition—physicians and scientists can tailor health care to individual patients and help improve medical outcomes.

Understanding the genome

The human genome is all of an individual’s DNA-based information, including our genes, as well as regulatory sequences that control gene expression and DNA for which no function has yet been established. With technology advancing at such a rapid pace, many discoveries have been made and many more are on the horizon. Knowledge of a person’s genome allows researchers to understand how his or her genetic make-up and metabolic profile affect susceptibility to specific diseases or response to specific therapies. Physicians can use this knowledge to outline predictive and preventive health strategies and to prescribe the right therapy for the right person at the right time.


“We now have a much more precise means of classifying and stratifying patients because we have much more information to accumulate on them,” says James R. Broach, director of the Penn State Hershey Institute for Personalized Medicine. “With that information we are now able to provide better correlations between patient outcome and the genetic and metabolic markers we can identify early on.” “We’re reaching a few tipping points in the development of biomedical research,” adds Daniel A. Notterman, associate vice president for health sciences research at Penn State. “One of those tipping points is that the costs for developing very large genomic and metabolomic data sets about an individual are rapidly decreasing. This is matched by a corresponding increase in computational power and the rapid adoption of electronic medical records.”

Scientists expect that in the near future, the cost of sequencing an entire genome will be low enough that it will be practical to do so for all individuals. One of the institute’s major initiatives is to create a bio-repository to collect specimens such as blood or saliva from patients treated at Penn State Milton S. Hershey Medical Center and its ambulatory practices. From this bio-repository and other data sources, the institute staff will use biologic and lifestyle information to correlate gene-environment interactions and connect this information with outcome data contained in the electronic medical record (EMR). The advent of the EMR is another tipping point in the development of personalized medicine. “The bio-repository by itself is just a bank of samples. It becomes valuable when we can link it to a person’s EMR,” explains Glenn S. Gerhard, administrative director of the bio-repository. “That’s where you can put it in the context of what happened to them in clinical care and when it becomes an engine of discovery for personalized medicine.” Capturing this amount of biologic data for research purposes is a massive undertaking. Beyond building the computational structure, there’s a need for computer scientists to figure out how to analyze all this new data. “New methods have to be developed from an informatics standpoint to analyze the data and to integrate all of it into clinical care using EMRs,” says Gerhard. Informatics, which refers to the technology necessary to store, compute, and retrieve both the biological and clinical data, will be a critical component in propelling personalized medicine to the forefront of health care. Penn State has already committed to acquiring the technology and experts to fulfill its vision for personalized medicine. “When it comes to the computational side of this, we have terrific computer engineers and some of the best bioinformatics in the world at University Park,” adds Notterman. “The scientific community is very excited about what this means to tailoring medicine to individuals.”

Helping patients now

Personalized medicine will help provide the narrowest treatment focus for patients and evaluate their risks for developing problems such as cardiovascular disease, dementia, autism, diabetes, and obesity. For some patients, there may be immediate benefits to this undertaking.

William Freeman, a faculty member in the Department of Pharmacology and faculty advisor to the institute’s genome sciences core, holds up a single flow cell on which a whole human genome can be sequenced. Human genome sequencing that once took months or years can now be finished in just a few days.

Personalized medicine is already producing remarkable results for cancer patients, in particular those with melanoma. In the past, patients whose melanoma had already metastasized rarely responded to chemotherapy. Scientists discovered one specific change in a particular protein, B-Raf, which was driving the growth of the tumor. A drug was developed to inhibit that hyperactive protein. "It has been an important advance. People on their deathbeds are showing a significant response, although cure remains elusive," says Broach. "However, if you don't have this mutation, it won't work, and it may even be harmful," adds Notterman.

This is where genome sequencing and genotyping play such vital roles. By simply knowing the genotype, or particular genetic traits, of the patient or the tumor, clinicians are able to recommend more targeted treatment plans. For example, Carla Gallagher, assistant professor of Public Health Sciences, is currently collaborating with Philip Lazarus and Joshua E. Muscat of the Penn State Hershey Cancer Institute. They are studying how genetic variations affect cancer risk and treatment plans. "We study a family of genes, known as UGTs, that detoxify carcinogens and metabolize treatment drugs," explains Gallagher. "By identifying people who have a particular genotype for developing cancer, we can develop a personalized approach to intervention through more frequent screenings and other preventative programs."

"Another way we're trying to personalize care is by reducing the toxicity of chemotherapy. We're all different, and the way our bodies handle medicine is different,"

Informatics, which refers to the technology necessary to store, compute, and retrieve both the biological and clinical data, will be a critical component in propelling personalized medicine to the forefront of health care.


Health and Life Sciences ›

Research | Penn State 2013

says Wafik El-Deiry, the Institute for Personalized Medicine's associate director for clinical translation. "Within our clinics, we are carefully monitoring care. Without knowing a person's genetic variation, we can give too little or too much of a dose." Through personalized medicine, however, researchers and clinicians can determine how an individual is metabolizing a drug and thus arrive at a far more accurate dosage. This not only reduces toxicity, but maximizes the benefits of anti-cancer therapy.

Knowing what kind of drugs will work on specific tumor mutations is an insight that could make clinical trials more effective and less costly. According to El-Deiry, "If only 10 percent of new patients respond to a drug in a clinical trial, then that may not be an exciting thing. However, if we knew ahead of time which tumor would respond to this drug, the trial could be much more selective in who enters it and the results would more efficiently bring forward the most effective drugs." In fact, Gallagher is also working with Elliot Epner on a clinical trial for the cancer treatment SAHA, which is FDA-approved to treat cutaneous T-cell lymphoma and is in clinical trials to treat many other types of cancer. "We know that if people have a particular genotype, they cannot metabolize SAHA. We are measuring levels of drug metabolites in patients to see if the genotype affects the response," Gallagher says.

This tailored approach to medicine is not just touching patients with cancer. The blood-thinning drug Coumadin is used to treat patients with blood clots. In the past, it was difficult to prescribe the right dosage; often, patients would experience excessive bleeding or develop another blood clot. "Recently, the FDA, in its label, began recommending doing a genotype of the patient to learn how quickly they metabolize the drug. You can demonstrate that this decreases their chance of being readmitted to the hospital," explains Notterman.
The original science that led to this discovery goes back to an article by Penn State Hershey’s Elliott S. Vesell, founding chair of the Department of Pharmacology. (See sidebar on page 57.)


The future of medicine

The early goal of the Institute for Personalized Medicine is to draw correlations between genotype and outcome. "As we gain more information, the shift will be from being a research effort to make those correlations to being a clinical service where, as a patient comes in, we get their genetic or metabolic profile and inform the clinician immediately that the patient will benefit from this treatment and not that treatment," Broach says. Penn State Hershey is poised for success in this field because of the many synergies between the College of Medicine, the Medical Center, and University Park. "We have a superb group of scientists and physicians who can help realize this vision.

I also think we have an outstanding infrastructure that includes electronic medical records (EMR) to make the relevant host and tumor genetic and genomic information readily available to the practicing clinicians, so we can improve the way we provide personalized care," says El-Deiry. "We are doing this now at Penn State Hershey."

In addition to defining response to therapy for people with existing conditions, personalized medicine and genetics can also help physicians and their patients define the risk of acquiring certain diseases. This information can help focus efforts at prevention. For example, risk of breast and ovarian cancer, depression, Alzheimer's disease, and even the extent of brain injury after repeated concussion each seem to be moderated by specific genetic variants, and knowing the genetic status of individuals may eventually provide guidance that can mitigate these risks.

Throughout its history, Penn State has fostered a culture of interdisciplinary work. When it comes to the future of personalized medicine, numerous teams and institutes are collaborating to discover new life-saving options for patients. Whether it's identifying someone's susceptibility to a disease and aggressively screening for it or designing a specific treatment plan that will respond to a particular genotype, each group contributes a depth and breadth of expertise that collectively can achieve the full potential of the genomics revolution. Through the new institute, and in particular the bio-repository, Penn State Hershey will be advancing medicine for the entire region by studying the local population and finding results that are applicable to the patients treated here. "There is an opportunity to put into practice what most of us believe is possible. I was excited to come here and be a part of the effort to substantiate it," says Broach.

James R. Broach, Ph.D., is director of the Penn State Hershey Institute for Personalized Medicine and chair of the Department of Biochemistry and Molecular Biology at Penn State Hershey Medical Center.

Daniel A. Notterman, M.A., M.D., is vice dean for research and graduate studies, professor of pediatrics, biochemistry and molecular biology, and associate vice president for health sciences research at Penn State.

Glenn S. Gerhard, M.D., is professor in the Departments of Biochemistry and Molecular Biology and Pathology and Laboratory Medicine, and administrative director of the Institute for Personalized Medicine's bio-repository.

Carla Gallagher, Ph.D., is assistant professor in the Department of Public Health Sciences.
Wafik El-Deiry, M.D., Ph.D., F.A.C.P., is Rose Dunlap Division Chair in Hematology/Oncology, associate director for translational research at the Penn State Hershey Cancer Institute, and associate director for clinical translation in the Institute for Personalized Medicine.

From the Beginning

Transforming Healthcare

Even in its infancy, Penn State College of Medicine recruited the best minds in medicine and research. In 1968, College leadership recruited a young Harvard graduate, Elliott S. Vesell, to serve as the founding chair of the Department of Pharmacology, a role he kept for thirty-two years. His internationally recognized work in human twin studies, which led to seminal papers assessing the role of genetic factors in large individual variations in drug response, was critical in establishing the scientific field now known as pharmacogenetics and pharmacogenomics.

"These studies showed that the pharmacokinetic variations in unrelated people were prominent, but if you had the same drug levels in the blood or serum, your response to the drug was very similar," Vesell explains. "So the idea was to overcome these pharmacokinetic differences by giving individuals different doses at different intervals to make the levels in the blood more alike." This idea is now used in practice when prescribing doses for drugs like Coumadin. At the same time, Vesell was studying environmental factors, such as age and other drugs, and saw that they could modify an individual's genetically influenced drug response. It was Vesell's work, in part, that laid the foundation for personalized medicine. "Personalized medicine is an easy thing to say, but it's a big challenge to bring about," says Vesell. "That's why we need an institute."

Vesell remains active in his work as he continues to write reviews and collaborate with other pioneers in his field. He has written 355 scientific papers throughout his career. "I'm still inspired by the vision and the ideas of this medical school, which are far ahead of the pack in terms of medical education," he says. "It inspires me to think of the future, because the future holds so much promise."

Elliott S. Vesell, M.D., Sc.D., is founding chair of the Department of Pharmacology at Penn State Hershey Medical Center. —DC


Arts and Humanities ›

Understanding an Era, Understanding Ourselves

Richards Center prepares next generation of Civil War scholars

By Doug Stanfield


At 4:30 a.m. on a Friday in April 1861, a shell from a 10-inch rebel mortar burst 100 feet over Fort Sumter in the harbor of Charleston, South Carolina, beginning a bombardment that lasted nearly 34 hours. The Union commander of the fort, Major Robert Anderson, surrendered at 4:30 p.m. the next day. No one on either side died during the battle. That could not be said of what came next. Between 1861 and 1865, at least 620,000 Americans died in the Civil War. These casualties exceed the nation's losses in all its other wars combined, from the Revolution through Vietnam. As a percentage of today's population, the death toll equates to 6.2 million.

Yet these gory statistics don't reveal the scope and depth of the real story. "You cannot understand who we are today without knowing the journey we have made through the decades before, and after, the War," says William A. Blair, professor of American history and director of Penn State's George and Ann Richards Civil War Era Center.


“The heart of our mission is jumpstarting new scholarship about the entire era. That process includes encouraging and facilitating research at the graduate level.” “Many of the discussions we have today concerning race relations and the size of government have their origins in the Civil War Era,” Blair says. “The Civil War ended slavery and expanded black rights in a ‘new burst of freedom’. The Fourteenth Amendment established citizenship as a right of birth of all people in the United States. “Additionally,” Blair says, “the war put into place the ingredients for the industrial expansion that helped turn the United States into a world power. And for Penn Staters, it created the Land-Grant Act upon which Penn State was founded.”

A Center Apart

Penn State's Richards Center, housed in Pond Laboratory on the University Park campus, was founded in 1998. Its mission, Blair says, is to answer questions about events over much of the 19th century that had received scant sustained attention from scholars.

“There were centers of study of the colonial era, and others focused on the Gilded Age and the early years of the 20th century,” he says. “And while much good scholarship was being done on the Civil War, there were few places that could gather scholars in an integrated program, train graduate students, and publish findings.” From its original mission of interpreting and reflecting on the transformative experience of the Civil War itself, the center’s scholarship now encompasses themes spanning Atlantic World slavery, emancipation, constitutionalism, the expansion of democracy, religion and social movements, women’s rights, immigration, western expansionism, war and society, and the struggles of labor. “So much of what we find,” Blair says, “overturns what was thought to be true.” One new measure of the center’s success is the launching of a major academic journal, the Journal of the Civil War Era, in 2011. (See sidebar.) In addition to fostering

William A. Blair

Professor of American History, Director of the George and Ann Richards Civil War Era Center

this forum, the center is training scholars who are making significant contributions in the field. Former students have published 15 books with leading academic presses since 1998, Blair says, “with a couple more on the way.” “The heart of our mission is jumpstarting new scholarship about the entire era,” Blair explains. That process includes encouraging and facilitating research at the graduate level. Herewith a sampling of work from a few promising young scholars.

Arming Slaves

Antwain Hunter, originally from Leominster, Massachusetts, is investigating the balancing act required of both slaves and their masters when it came to the use of firearms. Hunter, advised by Associate Professor of History Anthony Kaye, has focused on the extent to which white owners in North Carolina allowed their slaves to carry and use firearms, and how that tracked against increasingly fearful sentiments among whites following attempts to overthrow the slaveholding order. Hunter uses traditional sources, such as newspapers, legal records of indictments, and other public records, to tease out the voices of free and enslaved black people, most of whom left little record of their own. "There were far more free black people and slaves who had access to firearms both legally and illegally than we oftentimes acknowledge," he says. The South was an agricultural society, and arming slaves was seen as a necessary risk. Hunting allowed slaves to supplement their diets, which owners favored because it meant they did not have to augment slaves' rations themselves. In addition, slaves were given guns

(under strict supervision, usually) to protect their masters' lands and livestock during property disputes between landowners. Hunter has found that firearm possession and use by slaves was outlawed in most, if not all, slave states before the war. However, slaves, free blacks, and often slave masters promoted the use of firearms anyway. "This is in large part because blacks sought out what was best for themselves and their families despite the laws," he says, "and further because armed black people could be useful to white people as well."

Missions of Change

Another Ph.D. candidate, Kelly Marie Knight, is focusing on the way abolitionists used foreign missions to try to undermine white American beliefs in racial inferiority, which they saw as the main reason why slavery was able to thrive. In their publications, Knight says, organizations like the American Missionary Association (AMA) highlighted the achievements of black societies around the world. "For many people, the only way to learn about foreign cultures was through these publications," she adds. Thus the AMA, founded in 1846 in response to the Amistad incident, "had the freedom to paint whatever picture it wanted of the black communities it worked in." The missionary model also convinced some northerners that slavery wasn't just an oppressive labor system, but was something that endangered the eternal souls of slaves, Knight says. "Thus AMA publications emphasized how Christianity and slavery were diametrically opposed, and that true Christianity could only flourish in a community, black or white, when slavery was eliminated."

Journal of an Era


In 2011 the Richards Civil War Era Center launched the Journal of the Civil War Era, a peer-reviewed periodical of 19th-century American history. According to associate editor Anthony Kaye, the new journal is playing a substantive role in revitalizing the field. "[Founding editor] Bill [Blair] had this very good insight that there were all these little subfields of 19th-century history where people weren't talking to one another," Kaye says. "So creating this journal where they can all get into a conversation is a very useful thing."

Others agree. The Library Journal selected the Journal of the Civil War Era as one of the ten best new periodicals of 2011. The Society of Civil War Historians adopted the journal as its official publication, thus providing a substantial readership that gives authors broader visibility. Published in partnership with the University of North Carolina Press, the journal is guided by founding editor Blair, associate editors Judith Giesberg, Kaye, and Aaron Sheehan-Dean, and managing editor Matthew Isham. Its editorial board comprises distinguished historians from the Smithsonian Institution, Brown University, Yale University, Duke University, and the Universities of Virginia, Pennsylvania, Michigan, Texas, and Iowa, among others.


"This was a fairly radical message," she notes, "because it went against what most of the churches, north and south, believed the Bible had to say on the subject of slavery." The AMA had some limited success in convincing northern whites to support their organization, she says, but it was still a fringe group until after the Civil War, when AMA leaders used their experience in setting up mission schools abroad to create an extensive educational system for newly freed slaves.

Debt, Imperial Ambitions, and the War

Andrew Prymak's doctoral research focuses on imperialism and debt during and after the Civil War. Growing up in Greenville, South Carolina, Prymak was steeped in the history of the South. Before coming to Penn State, he served a year's internship at Furman University, home of extensive Civil War archives. His research documents connections between economics and politics, especially as they relate to questions of race and privilege.

Specifically, Prymak is looking at publications and other archived private and government records showing how federal politicians linked debt and national expansion, and how that connection led to political and policy decisions. “Politicians argued over whether mounting federal debt would undermine Reconstruction and weaken the country,” he says, “or conversely, could be used to fuel expansion and imperial ventures in the hemisphere that they believed would expand the country’s power and augment its wealth.”

“…as long as we have an interest in freedom and liberty, we should care about the Civil War era. It made us who we are.”

What's Behind the Beard

Sean Trainor, from Trenton, New Jersey, came to Penn State following undergraduate work at George Washington University and study at Oxford University's Pembroke College in England. One of Trainor's research interests is male fashion during the Civil War Era, particularly the wearing of beards. Amy Greenberg, Edwin Erle Sparks Professor of History, is his advisor.

Beards were not in fashion before the mid-19th century, he says, and had not been common since at least the 1600s. Men went to barbers because razors weren't cheap, and most men couldn't afford to keep one at home. Due to the Industrial Revolution, which made inexpensive shaving equipment available, and also to rising racial tensions (many barbers were black, and whites became more uncomfortable around black men with sharp razors), shaving became an at-home activity, Trainor says.

"But it was still a dangerous and bloody do-it-yourself activity, and gradually beards became more acceptable," he says. "They came to be part of 'what it meant to be a man.' It was part of a general societal embrace of 'more natural' behavior. Even the great Charles Darwin wrote treatises on the evolutionary aspects of beards."

Trainor has been awarded research fellowships at the Library Company of Philadelphia, the Historical Society of Pennsylvania, and Louisiana State University in support of his work.

Patrick Mansell

Why study this era?

Graduate students in Penn State's Richards Civil War Era Center: Andrew Prymak, left, Antwain Hunter, Sean Trainor, and Kelly Marie Knight.

At the end of a conversation in his office, when asked why modern Americans should care about an era they know so little about, Blair pauses. It’s a question he has heard many times. “Look at any number of important issues and you can see why,” he points out. “It is important in almost every area, including


women's rights, the story and struggle of African Americans, political change and how it's achieved; rebellion, economics, fashion, liberty, the whole concept of freedom, and race relations. "In the end, though, the war, much of what preceded it, and significant social issues that continued to play out for nearly 100 years more, was about race. Nothing else. "I know I repeat myself, but as long as we have an interest in freedom and liberty, we should care about the Civil War era. It made us who we are."

William Blair, Ph.D., is the College of Liberal Arts Research Professor and Professor of American History. He directs the George and Ann Richards Civil War Era Center.

In 2002, George and Ann Richards gave $3 million to provide Penn State's Civil War Era Center with a permanent source of income that would help fund graduate and faculty research, as well as outreach programs that would influence students and educators around the country. In recognition of the impact of their gift, the University renamed the Center the George and Ann Richards Civil War Era Center.

Both of George Richards' great-grandfathers served in the Union Army during the Civil War, one in the California Volunteer Infantry and the other in the First Missouri Light Artillery.

Alarm in the Neighborhood



When Anthony Kaye, associate professor of history, began looking into the Nat Turner slave rebellion, he says, "I started out thinking this would be a quick, small book. But the more I learned, the longer it got. I wanted to really understand this from Turner's point of view. It's taken me a long time, and I've really had to dig deep."

In 1831, Turner led about 60 slaves and free blacks in an uprising that killed between 55 and 65 white men, women, and children on farms just west of the Chesapeake Bay and near the North Carolina border. In the backlash, white militias and mobs killed between 100 and 200 blacks, most of whom had taken no part in the rebellion. Turner briefly escaped, but was eventually captured, tried, and hanged.

Kaye has been drilling deep into original materials, including newspaper accounts, deeds, militia rolls, and court records of every type. He wants to piece together the most complete and accurate record of an event that sent seismic shocks of alarm through the South that reverberated for years.

"Turner did what he did because he believed it was what God wanted him to do," Kaye says. "It took him nine years to finally accept the commands he said he received from 'the spirit who spoke to the prophets in former times.' At first he was misunderstood by other slaves, who thought he was telling them to obey their white masters. But he is saying, 'I have to obey the Master—God.'"

A concept Kaye developed in an earlier work about slave society was that of "neighborhoods," geographical and psychic terrain that slaves and slave owners had to share, involving all sorts of dealings. "Owners are figures to contend with in every part of life," he says.

"We need to stop thinking about slave revolts in terms of revolution," Kaye says. "A lot of slave 'rebels' thought of it as going to war against their owners. Not a general uprising in a military or political revolutionary sense. It was something more personal."

His book—the working title is Alarm in the Neighborhood—will be published by Hill and Wang. —DS


Arts and Humanities ›

Gettysburg guidebook adds new research perspective to historic battle

For three days in July 1863, thousands of Gettysburg civilians and tens of thousands of soldiers were caught up in a battle that left often conflicting data and anecdotes. Carol Reardon, George Winfree Professor of American History, says this makes writing about the battle both a dream and a nightmare for Civil War historians.

However, newly discovered sources, current battlefield restoration efforts, and fresh approaches to a well-established narrative have prompted Reardon and retired U.S. Army Col. Tom Vossler to co-author A Field Guide to Gettysburg (University of North Carolina Press, June 2013).

Rehabilitation experts are restoring important terrain features throughout the Gettysburg National Military Park to better match what the landscape looked like during the battle. The restoration effort provides historians with new ways to look at key facets of the battle and enables them to better understand one of the war's pivotal events.

Patrick Mansell

Carol Reardon


For example, trees have been felled that once blocked the view from the cupola on top of Schmucker Hall, which at the time of the battle served as a dorm and classroom at the Lutheran Seminary. Union cavalry general John Buford used the cupola as an observation post. According to Reardon, the newly unimpeded view offers historians and visitors a chance to see what Union generals saw on the first day of the battle.

As more records become digitized and placed online, historians can more easily access personal information about the soldiers who fought in the battle. The state of New York recently placed online the rosters of its Civil War regiments and batteries, some of which fought at Gettysburg. Reardon and Vossler relied on those details when they wrote sections of the book that covered the actions of the New York troops during the battle. They also used newly discovered letters from soldiers who fought at Gettysburg, from some of the first visitors to the battlefield, and from civilians who endured the battle.

Reardon says historians and Civil War history buffs tend to focus on army commanders Robert E. Lee and George Meade and their high-ranking subordinates, but during the course of her research, she often found the accounts of the regular soldiers and the civilians to be the most compelling. "We made sure we paid attention to the commanders, but some of the most fascinating vignettes come from the common soldiers there, as well as the stories of the civilians," she says. "We wanted to make sure we told their stories, too, and we tried to find something new for each of the stops." —Matthew Swayne

Religions play positive role in African AIDS crisis


While the Western press often targets religious groups for their roles in handling the African AIDS crisis, these groups tend to play positive—and critical—roles in fighting the epidemic, according to sociologists.

"There's no doubt that religions have done some good and some bad confronting AIDS in Africa," says Jenny Trinitapoli, assistant professor of sociology, religious studies, and demography at Penn State. "But the negative side is often exaggerated, while the good that religious groups do is often overlooked."

The researchers conducted extensive fieldwork in Malawi, made shorter visits to other African countries, including Kenya, Ghana, Mozambique, and Tanzania, and analyzed survey data from 30 African countries. Some religious groups in Africa are criticized for prohibiting condom use, a practice that can prevent the transmission of HIV, the virus that causes AIDS. However, most people do not avoid condoms because of religious teachings, according to the researchers, who report their findings in Religion and AIDS in Africa (2012, Oxford University Press). While religious leaders told the researchers they prefer abstinence and faithfulness in halting the spread of HIV, they had more complex stances on condom use. Trinitapoli, who wrote the book with Alexander Weinreb, associate professor of sociology at the University of Texas, said religious groups are also accused of condemning AIDS patients. She says this criticism is incomplete.

Gregory Collins

"In many parts of Africa, you will hear public messages and sermons arguing that AIDS was sent by God as some form of a punishment," says Trinitapoli. "But these messages are usually geared to the collective and do not reflect how individuals with HIV are treated in their communities." Another faulty impression in the Western press is that religious beliefs prevent Africans from seeking medical treatment for AIDS-related conditions. The researchers found that, on the contrary, many religious groups actively promote medical solutions.

Religious groups in Africa, particularly Christian congregations, have played a critical role as care-giving organizations during the AIDS epidemic in Africa, Trinitapoli says. “Besides the family network, religious communities are the most effective and important providers of care for the sick.” —Matthew Swayne


Arts and Humanities ›

In Touch With… Helen O'Leary

The strength and transformation of midlife is fodder for art that explores what can be constructed anew from the familiar and mundane, says Helen O’Leary.

Nobody gets through life without facing challenges, but an artist's ups and downs are often made tangible by what he or she creates to express and transform the experiences. For Helen O'Leary, a professor of art born and raised in County Wexford, Ireland, life's recent changes—including a divorce, followed by being selected to receive a 2010-2011 John Simon Guggenheim Fellowship—have inspired her to "root in the ruins and failures" of her personal and national history to visually map the relationship between language, literature, and art. "My work uses my life as subject matter, at middle age and mid career, post nuclear family, my continued unpacking and packing, belonging and retraction of homes between countries," writes O'Leary. We joined her in her studio recently to discuss her new work and the Guggenheim Fellowship term that took her to New York, Paris, and Berlin to investigate the texts and letters of Samuel Beckett and shape a material response to them.


“making things”

Q: What themes and forms are you exploring in your recent work? A: I’ve been writing stories for the last few years. I’m trying to work with language—both written language and painting language—that has an informal diary feel. That’s very important to me. Lately I’m working off a dating site and Craigslist to find seeds of stories. The themes I’m interested in are people who downsize, people who change their minds somehow, and also in things that are offered that won’t come through. I’m interested in that kind of uncertainty, using the Irish economy as a model, so I’m putting that all together from the very domestic to the larger armature of a country. On a larger level, I’m interested in the much-mythologized culture of loss in Ireland and I’m interested in using that in a very contemporary and offhand way through the person. God knows how I’ll put all that together, but I’m doing it. I’m trying!

Q: Tell me more about your childhood and its impact on your growth as an artist. A: My father died when I was very young. We got hit by a tornado first, then we got struck by lightning, then my father got a brain tumor. It was “bam, bam, bam!”—three things. And my mother was left with four girls in a culture where girls shouldn’t own land. It was expected that we would sell off the farm. But our project for the next eight years was to keep it going. We rented rooms to tourists before tourists were really a thing in Ireland. It brought the world to us, but it also made land really important to us. Land, to me, is also the canvas or the table, the tangible. Our world became very unconventional very quickly. I learned that you live by the skin of your teeth. It trained me for art school to realize that conventions might not work for you and could be broken, so when I came to making art—oh, I hate the word art—let’s say, making things, I would always look for another way to do it. I loved drawing on the kind of insubordination I grew up with, and using it.

Q: What is behind the emphasis on deconstruction in your recent visual art?

Patrick Mansell (4)

A: While I revel in painting—its rules, its beauty, its techniques—I need to fold my work back into the agricultural language I grew up with. I’m interested in the personal, my own story, and the history of storytelling. So I take things apart, forgetting conventions and reapply my own story to the form. In “Where Things Matter,” one of my latest works, I was interested in painting that would stand up without the usual structures of support. I am looking at my own life, the history of Sean-nós singing in Irish music, Beckett’s pared-down language, and the “currency of need” found in most houses when I was growing up.

Q: Why do you hate the word art? A: It’s so grandiose and above. I didn’t grow up with the idea that art was within my reach, but human expression was. Art to me was something that belonged in the “big house.” Growing up in rural Ireland in the 1960s through ‘80s, my mother often spoke about the “big house” and the class system that was clearly in place. She’d end each story with “Their ways weren’t our ways” and, later, when my family ran our boarding house, she would dismiss the tourists who stayed with us with the same comment. My childhood was defined by a culture where making things—food, shelter, ornament—and “making do” were central to both the physical and emotional survival of the family. So art is my world and my life and I believe in it, but I would just call it our need to speak rather than that word.

Q: Do you still primarily consider yourself a painter?

A: I consider myself a visual poet. If I was to find a niche, that’s where I’d say I’m sitting. I make postcards with found text on them that are like concrete poems. I make photographs of historically made books. And I make paintings that are large and kind of look like they’re becoming undone. I’m interested in things that leak, things that tear, things that come apart. I’m taking apart my house bit by bit and making it into slivers and then reconstructing it as armature for painting.

Q: Do you consider the writing you’re doing fiction or memoir?

A: I would say I’m working with memoir, stories of growing up on the farm in Wexford and my life now in the States. I’ve called my work “invention out of need,” using my own displacement as fodder for meaning. In terms of the writing, I started out just writing my story of growing up and my mother’s survival after she was widowed. And it turned out that I was writing about that as Ireland was losing its economic foothold, going from the second richest country in the European Union to the humiliation of our banking crisis, so I was looking at my youth, the Irish situation, and my situation now, and I was trying to put all those things together—but not in some “sad bastard” way. I’m interested in the kind of optimism that’s always present in the Irish psyche. Laughter is a huge part of it. When I’m trawling through these online sites, I’m looking for laughter more than anything else: people selling weird things or people with strange notions of businesses or…kind of more of a Samuel Beckett approach to life more than Oprah.

I think there’s another layer as well: You spend the first 50 years gathering stuff and you spend the next 50 years getting rid of stuff. So I’m getting rid of stuff. And I’m interested in what happens when you slice it up. I have disassembled the wooden structures of previous paintings—the stretchers, panels, and frames—and have cut them back to rudimentary hand-built slabs of wood, glued and patched together, their history of being stapled, splashed with bits of paint, and stapled again to linen clearly evident. At this point, I want painting to be picked clean; I like its thinness, its touch. I whittle for meaning. I’m literally culling my house, splintering things and re-assembling them as armature. It’s hopeful: through the process of deconstruction and reassembly, these structures imagine the possibility that painting might take root and find a place to press forward into fertile new terrain. Ultimately I’m after painting that gives joy to the eye and substance to the spirit.

—Melissa Beattie-Moss

Arts and Humanities ›

Indomitable Will

Research | Penn State 2013

When Charlie Kupfer, associate professor of American studies and history at Penn State Harrisburg, began research for his second book, Indomitable Will: Turning Defeat into Victory from Pearl Harbor to Midway, he sometimes felt like a time traveler. Kupfer spent hundreds of hours over the course of years at the National Archives and the Library of Congress listening to radio news broadcasts of World War II from CBS, NBC, and the Mutual Broadcasting System, then identifying common themes from communications, historical, and cultural perspectives. His goal—a new one in the study of World War II—was to determine why Americans viewed the war as a victorious march from beginning to end, when in fact the first six months after Pearl Harbor were ones of military defeat and bad news.

Indomitable Will reveals why Americans remained steadfast in their belief that the war would end in victory and lays out what Kupfer calls the “big ideas” that helped steel U.S. resolve. Among these were the importance of a robust Navy and the deepening alliance between the U.S. and Great Britain.

Kupfer says he’s gained new respect for the connectedness engendered by radio news, which made it the social media of its time. “I was astonished by the scope, depth, and quality of the media analysis in the ’40s and amazed at how plugged in people were then thanks to radio,” he says. “We are flattering ourselves unduly when we think that we’re the first connected generation. The people then felt themselves connected to events by coverage that was immediate, accurate, and frequently dazzling. A person in small-town Texas or New York City knew that the sounds they heard in the background when Edward R. Murrow broadcast from a London rooftop were the sounds of British and German aerial combat, happening at that very moment. Americans listened to the same broadcasters, so they had common listening experiences, which were discussed broadly in the culture.” —Yvonne Harhigh


Humanities mini-courses for doctors sharpen thinking and creativity

Mini-courses designed to increase creative stimulation and variety in physicians’ daily routines

can sharpen critical thinking skills, improve job satisfaction, and encourage innovative thinking, according to Penn State College of Medicine researchers.


The courses are an outgrowth of a pilot initiative called the Penn State Hershey Physician Writers Group, founded and facilitated by Kimberly Myers, associate professor of humanities and English. The group met every other week for three months and explored how medically related topics are featured in different literary genres. Participants wrote original pieces, which they discussed and edited with each other and Myers. “The process of literary analysis, which is both methodical and intuitive, helps to sharpen the cognitive processes inherent in medical diagnosis and treatment that are so vital in medical practice,” says Myers. “Group discussions also provide a refreshing opportunity for collaboration, which helps to form new alliances among colleagues.” Many physicians’ writings were published in professional journals, and the physicians reported overwhelming satisfaction with the experience.

As a result of the pilot program’s success, the researchers and their colleagues in the Department of Humanities developed and conducted eight mini-courses on different topics. The overarching goal was to provide humanities-related, clinically relevant learning opportunities for health care practitioners.

Participants, including physicians, nurses, administrative and support staff, medical and nursing students, and health researchers, reported a high degree of satisfaction with learning new disciplines outside of biomedicine, using their training in uncustomary ways, forming new camaraderie with their colleagues, and enjoying a respite from the stressful flow of the workday. “These courses offer an opportunity for intellectual and social ‘play’ to those who participate, which fosters workplace satisfaction and creative, innovative thinking,” says Daniel George, assistant professor of humanities. “Efforts that implement programs like these in other medical settings could potentially contribute to reviving the health care system, which would ultimately benefit both practitioners and their patients.” —Matthew Solovey



Evolution helped turn hairless skin into a canvas for self-expression

Hairless skin first evolved in humans as a way to keep cool—and then turned into a canvas to help them look cool, according to a Penn State anthropologist. About 1.5 to 2 million years ago, early humans, who were regularly on the move as hunters and scavengers, evolved into nearly hairless creatures to more efficiently sweat away excess body heat, says Nina Jablonski, Distinguished Professor of Anthropology. Later, humans began to decorate skin to increase attractiveness to the opposite sex and to express, among other things, group identity. “We can make a visual impact and present a completely different impression than we can with regular, undecorated skin,” says Jablonski. Over the millennia, people turned their skin into canvases of self-expression in different ways, including permanent methods, such as tattooing and branding, as well as temporary ones, including cosmetics and body painting, according to the researcher.

Jablonski says both males and females use forms of skin decoration to become more attractive to the opposite sex. “We can paint a great design on our bodies and use those designs to send all sorts of messages or express group memberships,” she adds. Prior to the evolution of mostly naked skin, humans were furry creatures, not unlike chimpanzees are now, Jablonski says. Skin decoration would not be possible if humans were still covered with fur. Jablonski says that she and other researchers based their estimate of when humans evolved hairless skin on the study of the fossil record and an examination of the molecular history of genes that code for proteins that help produce skin pigmentation. —Matthew Swayne

Labor origins of modern American poetry

Workers and their problems may not be the first thing one thinks of in contemplating the origins of modern American poetry. But perhaps they should be, as John Marsh points out in his recent book, Hog Butchers, Beggars, and Busboys: Poverty, Labor, and the Making of Modern American Poetry. Marsh, an assistant professor of English at Penn State, considers the relationship of prominent American poets to the so-called “labor problem” in the first two decades of the twentieth century, then brings that story forward into the turbulent 1930s. At the heart of Marsh’s book are eight poets who helped form the core of the American poetical canon: T.S. Eliot, Wallace Stevens, William Carlos Williams, Carl Sandburg, Robert Frost, Edna St. Vincent Millay, Langston Hughes, and Claude McKay. Hughes and McKay represent a leftward inclination of the early civil rights movement, Williams wrote about militant labor struggles in Paterson, New Jersey, and Sandburg celebrated the gritty Chicago hog butchers. Others seem less likely choices as labor poets, yet labor and poverty were truly the starting point for Eliot, Stevens, and Millay, who made “poetry out of the unpoetical,” as Marsh puts it. For example, he shows how young Eliot’s explorations of the working-class slums adjoining Harvard University led to much of the imagery of his early poems, which led in turn to “The Waste Land.” All eight, Marsh demonstrates, contributed significantly to making labor central to modern American poetry. “For poetry to be modern, it must represent figures…who themselves represent modern culture,” he writes. “In short, workers represent modern culture, and modern poetry is represented by workers.”


—Melissa Beattie-Moss

Cyberscience and Information Technology ›

Red Cell Lab
By Matthew Swayne

Whether intelligence analysts are trying to predict the next moves of an insurgent group or determining how best to deliver aid after a hurricane, an excess of information can often cause just as many problems as a lack of information.

Red Cell Analytics Lab, a laboratory in Penn State’s College of Information Sciences and Technology, uses technology, including the web and social media, and the latest understanding of intelligence analysis to turn information into intelligence during fluid, complex situations that are as timely as today’s headlines. The name “Red Cell Lab” refers to teams of military personnel, often called “red cells,” who are trained specifically to test the effectiveness of American military tactics. During the Cold War, for example, the U.S. Navy created the Top Gun school to train aviators against other American pilots who had been schooled to use the tactics and strategies of Soviet and Warsaw Pact pilots. The success of those programs has continued with current programs that train groups of soldiers to fight like Afghan and Iraqi insurgents; these groups then train fellow U.S. troops that will be deployed in those combat areas.

Similarly, Red Cell members at Penn State investigate threats and opportunities in a range of scenarios and test the effectiveness of possible responses. Red Cell Analytics has collaborated with the Marine Corps Warfighting Lab, the Office of the Secretary of Defense for Intelligence, the Penn State Office of Emergency Management, Boeing Corporation, and the U.S. Army War College. Col. Jake Graham, professor of practice in information sciences and technology, says that while the lab can easily adapt its research to military, counterterrorism, and security missions, lab members are also using their expertise to analyze possible outcomes in emergency and natural disaster relief operations. The lab features advanced technology for information visualization, including 3-D visual displays. Researchers can even print out three-dimensional models using a 3-D printer. This allows researchers to physically hold and examine models of data, rather than just observing the information on a video screen.

In one recent project, Red Cell students helped the military understand how insurgents might use unconventional tactics to defend against aircraft and helicopter attacks. Graham says Red Cell students are also devising a game that will analyze how communities will cooperate—or compete—for resources during natural disasters. “Every professor tries to bring real problems and challenges to the classroom,” says Graham. “What I try to do is bring those real problems to life.”

Researchers and students have collaborated with the staff of Raytheon Intelligence and Information Systems, a defense contractor, to design a series of games that helped detect personal biases. Since the September 11 terrorist attacks, Graham explains, the intelligence community has studied bias as a serious handicap in making decisions based on intelligence. For example, many critics blame the failure to detect warning signs that Al Qaeda was preparing to attack the United States on the personal biases of intelligence agents and flawed problem-solving methods—heuristics. “It isn’t just the 9-11 attacks that we’re studying,” says Graham. “There were several national intelligence failures, including the determination that there were weapons of mass destruction in Iraq, that were fraught with heuristic errors and biases.”

Another mission for the lab is to investigate how to best integrate human intelligence with computer-based intelligence, often referred to as hard and soft sensor analysis. As computer sensors, cameras, and other intelligence-gathering devices become ubiquitous, enormous amounts of data are now available to analysts. However, Graham says the availability of data does not always translate to intelligence. The way people process information is distinct from how computers offer that information, he says. For instance, a camera that shows a person walking down the street cannot easily ascertain the motive of that person.
Human observers, on the other hand, may offer information that is clouded by their own biases and less concrete than that given by hard sensors. “For example, you may ask a person the distance of an object and they may tell you it was ‘a ways away,’” explains Graham. “That could mean a lot of different things to different people.” In the lab, researchers can run complete experiments to explore new ways to leverage information from both types of sensors—human and computer.

Graham has also developed robust game-based tools for intelligence research. The Synthetic Counter Insurgency, or SYNCOIN, is a dataset of fictitious but realistic messages and intelligence reports about insurgent activity in Baghdad. Recently, intelligence analysts at Raytheon used the SYNCOIN data to design a game-based exercise to determine whether detecting confirmation bias was possible. The researchers at Raytheon say they expect the game may one day be used to develop training that helps intelligence analysts avoid confirmation bias.

As the tumultuous, grass-roots uprisings of the Arab Spring seem to indicate, social networks like Facebook and Twitter increasingly play a central role in communicating goals and organizing responses to everything from popular unrest to natural disasters. Red Cell students tested how people may use Twitter during an emergency, such as the touchdown of a tornado. Graham says the teams experiment with both positive uses of the social network—distributing aid and medical care—and negative uses—identifying targets for possible looters.

David Hall, dean of the College of Information Sciences and Technology, says that the Red Cell Lab goes to the heart of the College’s and the University’s mission. It not only provides research and service to the community, but involves students as researchers. “What Jake has done here at Red Cell Lab is interesting,” says Hall. “The students are the researchers and, in a way, they also become the research subjects.”

Matthew J. Lesniewski, a research assistant in the Red Cell Lab, says his experience as a member of the lab has given him opportunities he probably would not have had at other universities, including working with Penn State Emergency Management on how social media may influence how emergency workers respond to a disaster. “This semester, we were able to participate in a joint Emergency Operations Center exercise where men and women from a number of Penn State offices and Centre County emergency services responded to a mock disaster,” says Lesniewski. “In doing so, we learned a great deal about how our social media technologies play a role during a crisis.” Lesniewski says the flexibility of the Red Cell Lab learning environment allows him to study current situations in near-real time. “If I see a trend or pattern forming in current events, or if I happen upon an interesting case study that offers a challenging analytical vignette, I have the freedom to investigate with the full support of the Red Cell Lab,” Lesniewski says.

Real-Time Decision-making: See students talk about what it takes to analyze and respond effectively to a disaster scenario.

Col. Jake Graham with students in the Red Cell Lab

Col. Jacob Graham, USMC (Ret), MPA, is Professor of Practice in Security and Risk Analysis in the College of Information Sciences and Technology, jgraham@ist.

Cyberscience and Information Technology ›

Michelle Rodino-Colocino
Assistant Professor of Communications and Women’s Studies


Technology only a tool in search for solutions to poverty

Technology can serve as a tool to bridge the digital divide, but it is unlikely to be a complete solution in helping people find jobs and escape poverty. “People really want to believe that the latest technology will help us do all these great things and liberate us,” says researcher Michelle Rodino-Colocino, assistant professor of communications and women’s studies. “But it’s also a way of putting off the big problems and saying, ‘Let’s not touch these big problems because Internet access will turn it all around for us.’” Rodino-Colocino examined a plan in Walnut Hills, a diverse low-income community in Cincinnati, to provide wireless Internet (WiFi) service and computer training to poor, mainly female residents who do not own cars, in hopes of helping them become more employable and escape poverty.

“It’s a classic pick-yourself-up-by-your-bootstraps—or Internet connections, in this case—type of program,” says Rodino-Colocino. “But it doesn’t address these big problems.” She notes there are few online jobs that pay a living wage, and without a car or childcare, the participants would still find it difficult to find a job. Rodino-Colocino maintains that political action, rather than an overreliance on technology, would help low-income people solve problems associated with poverty, such as low wages and limited access to childcare. —Matthew Swayne

Factors identified that influence willingness to use new information technology

Michael D. Michalisin

People are more willing to use new technology when they perceive it to be low in complexity and high in relative advantage and “trialability,” according to a team of researchers that included Michael D. Michalisin, professor of management and business program coordinator at Penn State Worthington Scranton.

Cognitive barriers of individuals who are reluctant to use new information technology can cost an organization millions of dollars, the researchers pointed out. A better understanding of those barriers could improve efficiency and effectiveness, as well as the quality of the information underlying management decisions. Despite the importance of understanding what determines the acceptance of new systems, and ultimately the success of their implementation, empirical investigation into this issue is still in the developmental stages.

The researchers’ findings suggest that when introducing complex new information technology, it may be important to allow individuals the opportunity to try out the technology before implementing it. Managers should assess the likelihood of employee acceptance prior to investing large sums of capital in such technologies; otherwise the technology may not be optimally used. Tailoring IT demonstrations, training programs, and other interventions to illustrate a technology’s low complexity, relative advantage, and trialability can help users make better-informed technology-adoption decisions, which in turn can increase the success of implementing critical IT acquisitions. —Amy Gruzesky


S. Shyam Sundar

Distinguished Professor of Communications, Co-Director of the Media Effects Research Laboratory

No LOL matter:

Tween texting may lead to poor grammar skills

Text messaging may offer tweens a quick way to send notes to friends and family, but it could lead to declining language and grammar skills, say researchers in the College of Communications. Tweens who frequently use language adaptations—techspeak—when they text performed poorly on a grammar test, notes Drew Cingel, who conducted the research as an undergraduate student working with S. Shyam Sundar, Distinguished Professor of Communications and co-director of the Media Effects Research Laboratory. When tweens write in techspeak, they often use shortcuts such as homophones, omissions of non-essential letters, and initials to quickly and efficiently compose a text message.

“They may use a homophone, such as gr8 for great, or an initial, like LOL for laugh out loud,” Cingel explains. “An example of an omission that tweens use when texting is spelling the word would, w-u-d.” He says the use of these shortcuts may hinder a tween’s ability to switch between techspeak and the normal rules of grammar. Cingel collected data that assessed the grammar skills and texting habits of middle school students in a central Pennsylvania school district. Overall, grammar scores declined with the number of adaptations in sent text messages, controlling for age and grade. Not only did frequent texting negatively predict the test results, but both sending and receiving text adaptations were associated with poorer performance on the test. “If you send your kid a lot of texts with word adaptations, then he or she will probably imitate it,” Sundar says. “These adaptations could affect their off-line language skills that are important to language development and grammar skills, as well.” —Matthew Swayne
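The adaptation categories Cingel describes—homophones, initialisms, and letter omissions—can be illustrated with a short sketch. This is a hypothetical illustration, not the coding instrument used in the study; the word list and the scoring are invented for the example.

```python
import string

# An invented, toy lexicon of "techspeak" adaptations, grouped by the
# categories described above. The study's actual coding scheme is not shown here.
ADAPTATIONS = {
    "gr8",    # homophone for "great"
    "2nite",  # homophone for "tonight"
    "lol",    # initialism for "laugh out loud"
    "brb",    # initialism for "be right back"
    "wud",    # omission of non-essential letters in "would"
    "txt",    # omission of non-essential letters in "text"
}

def adaptation_ratio(message: str) -> float:
    """Fraction of words in a text message that are known adaptations."""
    words = [w.strip(string.punctuation).lower() for w in message.split()]
    words = [w for w in words if w]  # drop pure-punctuation tokens
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in ADAPTATIONS)
    return hits / len(words)

# Example: three of the four words are adaptations.
print(adaptation_ratio("gr8, I wud LOL"))  # → 0.75
```

A per-message count like this, tallied for sent and received texts, is the kind of predictor that could then be entered into a regression against grammar-test scores while controlling for age and grade, as the study did.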

Technology convergence may widen the digital divide


Technology is helping communication companies merge telephone, television, and Internet services, but government deregulation may leave some customers on the wrong side of the digital divide during this convergence, Penn State telecommunications researcher Rob Frieden has found. “Moving away from copper lines is an example of abandoning obsolete technology and embracing technology that is faster, better, cheaper, and more convenient,” says Frieden, Pioneers Chair in Cable Television and professor of telecommunications and law. “But the risk is that we may be creating a digital divide—not necessarily a divide between the rich and poor, but between the information rich and information poor.” Telephone companies are lobbying regulators to free them of their traditional role as a public utility. The companies cite convergence and the availability of new communication technologies, such as cellular

phones and fiber optic cable, that make copper-based telephone landlines obsolete. But Frieden points out that not all these alternatives are as affordable, ubiquitous, or reliable as copper landlines, a problem that could leave many rural residents underserved. Cable companies are classified as information service providers by the government and face less regulation than telephone companies, which are more heavily regulated as utilities. As utilities, phone companies are obligated to provide service to customers, an obligation that binds them to high-cost,

labor-intensive telephone landline technology. Frieden says telephone companies’ push to be released from that obligation may be the first step toward complete deregulation. While the companies suggest that market forces will ensure that all customers eventually receive equal service in a deregulated environment, Frieden is skeptical. “Everyone wants to say, the marketplace is great,” Frieden said. “But there’s also something called market failure, particularly in rural and low-income areas.” —Matthew Swayne


Cyberscience and Information Technology ›



Astrophysical Multimessenger Observatory Network


An Eye On the Universe


By Eileen Wise

Among the creation gods of ancient Egypt, Ra, the Sun God, was master of the physical, concrete world. Amon represented all of the subtle or unseen elements of creation. Together they formed a composite god called Amon-Ra, meaning “hidden light.” That is what the AMON project seeks to reveal—subtle aspects of the universe that have never been observed before. AMON stands for Astrophysical Multimessenger Observatory Network. Its mission is to form a network of high-energy observatories across the globe that will search for previously unseen astrophysical signals and send alerts to more traditional telescopes in order to corroborate the possible celestial events. Until the early 20th century, astronomers relied almost exclusively on visible light to view the sky. Their telescopes, though steadily increasing in power, were no different in this respect from the ones used by Galileo in 1610. Today we see much more of the universe by observing light from all across the electromagnetic spectrum. Gamma-ray-, x-ray-, infrared-,


and radio-astronomy have revolutionized astronomical observation, as has the advent of space-based telescopes to complement those on the ground. These new ways of seeing allowed the observation of violent cosmic events which, because of their transience, were previously undetectable, such as supernovae, gamma-ray bursts, and collisions between black holes. In addition, the past 50 years have seen tremendous progress in the sensitivity of instruments to detect cosmic rays—high-energy charged particles from outer space, such as protons and charged nuclei. Particle accelerators have enabled physicists to create, detect, and analyze other subatomic particles, such as neutrinos. These alternative messengers—particles that survive across vast distances in space—presented whole new avenues of exploration. Neutrinos are useful messengers because of their tiny mass and lack of charge, which enable them to pass through normal matter relatively unimpeded. These phantom particles can be used as probes to study distant events such as supernovae, the explosions that end the lives of highly massive stars. In this instance, the only

particle that is able to escape the extremely dense and energetic nature of such a collapse is the neutrino.

But it was not until 1987 that the concept of multimessenger astrophysics was born. At that time experiments deep underground detected a steady stream of neutrinos coming from the Sun. Shortly after that, a burst of neutrinos was detected from a supernova in the Large Magellanic Cloud (LMC), a satellite galaxy of the Milky Way. Establishing the direction and location of these events gave astrophysicists the first correlation of two messenger particles—neutrinos and photons—and the promise of searching for other messengers that could be expected to be produced in such transient events, including high-energy neutrons and disturbances in the fabric of spacetime, known as gravitational waves.

Penn State and the members of the AMON research consortium are engaged in a unique global collaboration, sharing resources, expertise, and data in multimessenger detection. The IceCube Observatory in Antarctica is detecting neutrinos with instruments buried deep in the ice over a broad (1 cubic kilometer) volume. In the Mediterranean Sea, at the ANTARES observatory, neutrino detection is occurring deep within the water. High-energy cosmic rays are being picked up at the Auger Observatory in the plains of Argentina. Gravitational waves are the still-elusive quarry sought by the LIGO observatories in Washington State and Louisiana and the Virgo observatory in Italy. And high-energy gamma rays continue to be detected at multiple facilities, like the HAWC Observatory in Mexico and the Swift and Fermi observatories in space.

AMON takes advantage of the ability of “triggering” telescopes at these and other observatories to collect data from broad swaths of sky and quickly share it within the network, so that a rapid response with narrow-field “follow-up” telescopes can map the event in time and space.
Data collected from all the “subtle messengers” combined will greatly increase the probability of observing phenomena that have never been observed before.
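The triggering-and-follow-up idea can be sketched in miniature. The code below is an illustrative toy, not AMON's actual pipeline: it pairs events from two hypothetical observatories when they arrive close in time and at sky positions consistent with their combined pointing errors, and it estimates how often such a pair would occur by chance under an assumed Poisson background. All event streams, rates, and thresholds here are invented for the example.

```python
import math
from collections import namedtuple
from itertools import product

# A toy event: arrival time (seconds), sky position (degrees), positional error (degrees).
Event = namedtuple("Event", "t ra dec err")

def angular_separation(e1, e2):
    """Great-circle separation between two events, in degrees."""
    r1, d1, r2, d2 = map(math.radians, (e1.ra, e1.dec, e2.ra, e2.dec))
    cos_sep = math.sin(d1) * math.sin(d2) + math.cos(d1) * math.cos(d2) * math.cos(r1 - r2)
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_sep))))

def coincidences(stream_a, stream_b, dt_max=100.0, n_sigma=3.0):
    """Pairs of events close in time and positionally consistent within n_sigma."""
    pairs = []
    for a, b in product(stream_a, stream_b):
        if abs(a.t - b.t) <= dt_max:
            combined_err = math.hypot(a.err, b.err)
            if angular_separation(a, b) <= n_sigma * combined_err:
                pairs.append((a, b))
    return pairs

def false_alarm_probability(rate_a, rate_b, live_time, dt_max, err_deg):
    """Chance of at least one accidental coincidence, assuming Poisson backgrounds."""
    omega = 2.0 * math.pi * (1.0 - math.cos(math.radians(err_deg)))  # error cone, steradians
    n_acc = rate_a * rate_b * live_time * 2.0 * dt_max * (omega / (4.0 * math.pi))
    return 1.0 - math.exp(-n_acc)

# Two invented event streams: one genuine coincidence hides among unrelated events.
neutrinos = [Event(1000.0, 30.0, -10.0, 1.0), Event(5000.0, 200.0, 40.0, 1.0)]
gamma_rays = [Event(1020.0, 30.5, -10.2, 0.5), Event(9000.0, 10.0, 0.0, 0.5)]

matches = coincidences(neutrinos, gamma_rays)
print(len(matches))  # → 1
```

A real network faces the harder versions of both steps: streaming the matching over live data rather than fixed lists, and calibrating the false-alarm rate carefully enough that follow-up telescopes are not flooded with spurious alerts.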

“AMON serves as the connective tissue—the network that stitches these global partners together,” says Derek Fox, associate professor of astronomy and astrophysics and science coordinator for the consortium.

Penn State has unique advantages that make it an ideal hub for the AMON project. The Research Computing and Cyberinfrastructure (RCC) unit of Information Technology Services enables scholars to do large-scale computations through linked services, including hardware, software, and personnel. The High Performance Computing (HPC) system within RCC is a shared resource among dozens of researchers in a host of departmental and interdisciplinary units at Penn State that meets the dual data challenges presented by the AMON project.

First, there is the need to continuously receive data from the triggering instruments. This requires computing systems with robust and consistently high “uptime.” The HPC has sub-systems rated at Tier III, with 99.999 percent uptime (roughly five minutes of downtime annually).

The second challenge is the need for vast amounts of computing power in “bursts” of time that can run rapid simulations on received probable coincidence data and quickly send “alerts” to narrow field-of-view follow-up instruments for confirmation. HPC has the flexibility to deliver those bursts of CPU power.

“This resource is very attractive to researchers collecting complex data,” explains Doug Cowen, professor of physics and of astronomy and astrophysics and a researcher with AMON and the IceCube neutrino observatory. “We are experimenting with a ‘probabilistic’ database that can collect disparate data, say, on neutrinos and gamma rays, and quickly determine the probability that both have come from the same source. This is cutting-edge database work.” Because of the efficiency of the HPC systems, the Penn State IceCube group has been able to process and rapidly deliver much more simulation data than any other similarly sized group.

Human and technical challenges remain. Getting thousands of scientists together to communicate openly across many

different languages and cultures is one of the biggest. “We need to build trust between the participating observatories, and this is a big part of the effort,” says Miles Smith, former director of operations for AMON, and a continuing participant. “We are asking them to share their most precious commodity—their data. Yet there is recognition among these diverse groups of scientists that only through cooperation and collaboration are we going to see the breakthroughs we seek in understanding the universe. There is a lot of incentive to work together. “Economically, it also makes sense to create a hub at one location where information is collected and sent to all the partners, rather than going through individual relationships between each observatory,” he adds. “This is a strategy of data sharing that is likely to become more common as scientists grapple with the increasingly complex data that needs to be analyzed.” “We are truly at the dawning of the age of multimessenger astronomy,” adds Fox. “We expect to see discoveries that will be revolutionary for our understanding of the universe, including being able to detect some of the most violent and dramatic phenomena.” Doug Cowen, Ph.D., is professor of physics and astronomy and astrophysics and a researcher with AMON and the IceCube neutrino observatory. Derek Fox, Ph.D., is associate professor of astronomy and astrophysics and science coordinator for AMON. Miles Smith, Ph.D., formerly a research associate in the Institute of Gravitation and the Cosmos in the Department of Physics and director of operations for AMON, is now at NASA’s Jet Propulsion Laboratory.


Funding for the initial development of AMON has come from the Office of the Vice President for Research and the Eberly College of Science.


We are at the dawn of multimessenger astrophysics, and Penn State has the unique position of being at the center of it.

Other Penn State researchers involved in AMON include Paul Sommers, Stephane Coutu, Gordana Tesic (Physics); Peter Meszaros, Josh Fixelle (Astronomy); Jogesh Babu (Statistics), and Prasenjit Mitra (Information Science and Technology). Abhay Ashtekar (Physics) and Padma Raghavan (Director of the Institute for Cyberscience and Associate Vice President for Research) have been active in securing funding for the project.

Industry Partnerships ›

Dressed to Kill (Cancer)

Research | Penn State 2013

Penn State start-up Keystone Nano is pioneering two new approaches to cancer therapy.

by Walt Mills


“Raise your hand if you haven’t been touched by cancer,” says Mylisa Parette to a roomful of strangers.

Parette, the research manager for Keystone Nano, has occasional opportunities to present her company’s technologies to business groups and wants to emphasize the scope of the problem that still confronts society. “It’s easier to see the effects of cancer when nobody raises their hand,” she says. Despite 40 years of the War on Cancer, one in two men and one in three women will be diagnosed with the disease at some point in their lifetime.

Parette and her Keystone Nano colleagues are working on a new approach to cancer treatment. The company grew out of a collaboration between two Penn State faculty members who realized that the nanoparticle research one was undertaking could solve the drug-delivery problems the other was facing. Mark Kester, a pharmacologist at the Penn State College of Medicine in Hershey, was working with a new drug that showed real promise as a cancer therapy but that could be dangerous if injected directly into the bloodstream. Jim Adair, a materials scientist at University Park, was creating nontoxic nanoparticles that could enclose drugs that might normally be toxic or hydrophobic and were small enough to be taken up by cells. The two combined their efforts and, licensing the resulting technology from Penn State, joined with entrepreneur Jeff Davidson, founder of the Biotechnology Institute and the Pennsylvania Biotechnology Association, to form Keystone Nano.

The new company’s first hire was Parette, whose job is to translate the lab-scale technology into something that can be ramped up to an industrial scale, and to prepare that technology for FDA approval leading to clinical trials. Davidson, Parette, and KN’s research team work out of the Zetachron building, a long, one-story science incubator a mile from Penn State’s University Park campus. Operated by the Centre County Industrial Development Corporation, the building was originally the home of the successful Penn State spin-out company that gave it its name. A second Keystone Nano lab was recently opened in the Hershey Center for Applied Research, a biotech incubator adjacent to the Penn State College of Medicine.

“Our excitement is that we think our technology has shown efficacy in a whole range of animal models,” Davidson, Keystone’s CEO, remarked during a recent meeting in the shared conference room at Zetachron. “We understand the method of action, the active ingredient. We think it has every chance of being useful in treating disease. Our question is, how do we push this forward from where we are today to determining, one way or another, that it really does work?”

Two approaches to drug delivery Keystone Nano is pioneering two approaches to cancer therapy, both of which rely on advances in nanotechnology to infiltrate tumors and deliver a therapeutic agent. The approach nearest to clinical trials is a ceramide nanoliposome, or what Davidson calls a “nano fat ball around an active ingredient.” Kester, in whose lab the approach was developed, thinks of it as a basketball with a thick bilayer coating that contains 30 percent active ceramide and a hollow interior that can hold another cancer drug. Kester is an expert on ceramide, a naturally occurring lipid, or fat molecule, that is involved with apoptosis, a type of programmed cell death. Part of the reason that cancer tumors are able to survive the body’s defenses, not to mention chemotherapy and radiation, he explains, is that the cancer can suppress ceramide activity in the tumor. The combination of a proven cancer drug, such as sorafenib, delivered in conjunction with ceramide could be a powerful approach to attacking drug-resistant tumors.

The second approach is Adair’s nanoparticles, called NanoJackets, because, Kester points out with a laugh, they are “dressed to kill.” Made from calcium phosphosilicate, a non-toxic material that is essentially the same biomaterial as teeth and bones, NanoJackets will encapsulate a variety of active pharmaceutical ingredients. They show promise both as powerful imaging agents for detecting early stage tumors, and as effective treatment for human breast cancer in animal models. Both approaches are based on the ability to deliver toxic drugs directly to the site of a cancer tumor without the familiar off-target toxicities that plague most current cancer therapies, leading to nausea, hair loss, nerve damage, and sometimes even death. “Both of our technologies are based on non-toxic materials,” Parette emphasizes. “That gives us a significant advantage in a clinical setting.”

The struggles of a startup There’s a good reason the transition from lab bench to production line is labeled the Valley of Death—it’s the place where most ideas go to die. This is even more the case for any technology meant to be used in the body. Big pharmaceutical companies claim to spend on average $1.25 billion on developing a new therapeutic. Davidson, the CEO whose main job is to try to keep the company above water financially, would be happy with an amount several zeros less than that.


“The problem is that most pharmaceutical companies want to get involved after the therapy has been proven safe and effective in humans,” Davidson remarks. To keep their research moving forward, the Keystone team has relied on federal Small Business Innovation Research and other grants, and on contract work for pharmaceutical and materials companies that want to reformulate their products in nano packages. The contract work, Davidson says, has helped keep technical staff employed and has advanced the company’s knowledge of how to scale up its own products.

Keystone Nano is now gearing up for a Phase I clinical trial, which the company hopes will begin in 2013. Before that can happen, Keystone must finish preclinical testing and assemble an Investigational New Drug (IND) package for the ceramide nanoliposome for federal regulatory approval. “In the preclinical stage there are always going to be things you run into,” says Parette. “So it’s hard to pinpoint the exact time we will be ready to go to the FDA. It’s a combination of factors that include a novel drug, ceramide, that hasn’t been used in this way before, plus a different delivery technology, the nanoliposomes.” Getting FDA approval involves collecting data from animal studies to demonstrate efficacy, safety, and an absence of off-target toxicity. In addition, the IND package has a chemistry, manufacturing, and controls section that requires Keystone to provide details of its manufacturing process. All in all, for a small company with only a handful of employees involved in developing new methods to measure novel compounds at the molecular level, these are formidable hurdles to overcome.



“Fortunately,” Davidson says, “Jim (Adair) is very helpful in identifying and helping us think about nanoparticle characteristics, transformations, purification, and analysis. And Mark (Kester) is very helpful in thinking about what tests you have to run to sort out efficacy and safety.”

Attacking liver cancer The Hershey Center for Applied Research is within a long stone’s throw of the Penn State Milton S. Hershey Medical Center and College of Medicine campus. Along with a number of startup biotech companies, the center houses the offices of the University’s Department of Pharmacology. Mark Kester’s faculty office is on the third floor, behind a door with a poster of his native Bronx, a heritage still evident in his speech patterns and his to-the-point style.

“Ceramide is an agent that kills only cancer cells in the doses we’re using,” Kester says. “Most therapeutics kill proliferating cells, which include hair follicles, white blood cells, and the gut lining. They can kill all the cancer, but kill the patient too. One problem with ceramide is that you’ve got a great cancer killer, but you can’t deliver it. In the blood, it would just turn into something resembling a salad dressing.” But by encapsulating the ceramide in a tiny, water soluble globule of fat, they can solve the delivery problem. Liposome delivery is a time-tested method. The ceramide nanoliposomes are 70-80 nanometers in diameter with a polyethylene glycol coating that hides them from macrophages, the body’s police. They can also be tagged on the surface with small antibodies that only lock onto cancer cells.

“The problem with nanoliposomes is they can be made in small quantities, but they are hard to scale up to preclinical and clinical quantities,” Kester says. “We needed Keystone Nano to solve this, which they’ve done. Keystone is also leading the preclinical testing initiatives, funding the research, and holding discussions with prospective partners. Once it goes to clinical trials, Jim (Adair) and I will not be involved in the clinical testing.”

The initial target is liver cancer, a disease that is widespread in Asia and growing quickly in the U.S. Nearly 700,000 people are diagnosed with the disease annually, and most succumb within a year of diagnosis. The current best therapy, sorafenib, provides only six to nine extra weeks of life. “We can do better,” Kester says. In at least 10 peer-reviewed papers, he and his colleagues have shown that ceramide nanoliposomes are capable of killing liver and breast cancer, melanoma, and certain types of leukemia.

A Phase I clinical trial, Davidson explains, is typically geared toward safety and determining dosage of the active agent, and in this case will involve 20 to 25 patients. Phase II deals with efficacy, and Phase III looks more critically at a larger patient population.

At Hershey Medical Center, where the trials may take place, there are in the neighborhood of 90 patients receiving treatment for liver cancer at any given time, so finding 20 who would be candidates for the trial should not pose a problem, Davidson believes. “They would have to agree to participate and receive an injection. However, the outcome for those patients is near-certain death, so they are quite willing to try something, if not for themselves, then for others.”

NanoJackets—the next technology Meanwhile, the nanoparticles called NanoJackets, created in Adair’s lab several years ago, have been undergoing extensive refinement and scale-up at Keystone. Davidson estimates they are about 18 months away from a clinical trial, but a recent $1 million grant from the Pennsylvania Department of Health through its Commonwealth Universal Research Enhancement (CURE) program should help speed the process. Adair’s particles range from 5 to 50 nanometers in diameter and come in concentrations of a million billion particles per milliliter. One of their remarkable qualities is their ability to increase the longevity and enhance the luminosity of fluorescent dyes.

Good Things Come in Small Packages: View an animation of Keystone Nano’s nano-enhanced therapies for the improved treatment of cancer.



In one scenario, NanoJackets could be loaded with a dye that glows brightly enough to light up very small tumors at significantly greater tissue depth than other fluorescent probes, and at the same time deliver a cancer drug. Combining the two effects makes it possible to track the particles into the tumor, release the active agent, and then watch to see if the drug is shrinking the tumor. Adair and Kester call it theranostics, therapeutics and diagnostics rolled into one. In passive targeting, nanoparticles circulate in the bloodstream until they find their way into tumors through the poorly formed blood vessels that fast-growing tumors develop. Once within the tumors, the particles can further diffuse to deliver drugs to nearby cancer cells, Parette explains. Keystone also uses a variety of methods to actively bind to cancer-specific cellular receptors, she adds. By attaching some kind of ligand, or binding molecule, to the surface of their particles, they can go after cells in leukemia and other non-solid tumors.



In a conference room at the Hershey Center for Applied Research, Kester shows off this ability, which could be extremely useful for tumor imaging. Small tubes of fluorescent dyes encapsulated in nanoparticles created over five years ago still glow green, blue, and orange under an infrared light.


The future of nanomedicine In the next few years, Parette says, cheap genetic sequencing should allow treatments to be devised for individualized cancer care. One promising technology involves siRNA—an artificial snippet of RNA that can knock out gene activity in cancer cells. The major issue to date has been how to deliver these short interfering RNA sequences to the proper location in enough quantity to shut down the cancer cells. NanoJackets could be one such delivery vehicle.

“The interesting part,” says Parette, “is that current drugs are untargeted—there is no selectivity. On the other hand, we’ve got multiple layers of targeting. We’ve got nano, which takes advantage of enhanced permeability and retention. Or we can actively target using ligands or antibodies. Then it turns out that the active ingredients we are working with are selective in and of themselves: Both the ceramide nanoliposomes and the siRNA NanoJackets are active in cancer cells but not in noncancerous cells.”

“The future of medicine is small,” Kester says with no evident irony. “Really small. Every pharmaceutical company is trying to figure out how to deliver molecular-based therapies.”

As early as 2015, he predicts, a patient coming into a hospital for cancer treatment will have a swab done to find out why he or she has cancer. Then the oncologist will design a drug that goes after only that particular mutation. And Keystone Nano will have a technology to deliver the drug directly into the cancer cell. Jeff Davidson, CEO of Keystone Nano, can be contacted at and Mylisa Parette, Ph.D., research manager, can be contacted at Jim Adair, Ph.D., is Keystone Nano’s chief science officer and professor of materials science and engineering and of bioengineering at Penn State. He can be contacted at jha3@ Mark Kester, Ph.D., is Keystone Nano’s chief medical officer and the Thomas Passananti professor of pharmacology and director of the Penn State Center for NanoMedicine and Materials in the College of Medicine. He can be contacted at Keystone Nano has received support from the Nanotechnology Institute of Pennsylvania, the National Cancer Institute, the Pennsylvania NanoMaterials Commercialization Center, Ben Franklin Technology Partners, and Penn State. To learn more, visit http:// A version of this article appeared in Penn State Focus on Materials.


Industry Partnerships ›

From the Earth to the Moon, via Penn State




Lunar Lion project offers students diverse research experiences.

By Bekka Coakley

Without textbooks, blueprints, or even a template to follow, Penn State students are working side-by-side with faculty in a rare opportunity to build a robotic spacecraft—the Lunar Lion—that will land on the moon and return high-resolution images, video footage, and scientific data. Led by Michael V. Paul, space systems engineer in the University’s Applied Research Laboratory, students and faculty in engineering, physics, astronomy, geoscience, journalism, and business are competing against the world’s rising stars in space exploration to win the Google Lunar XPRIZE Competition. “This is an opportunity for Penn State to establish itself as a leader in a growing field—a provider of the best research and the best graduates to the commercial and private space industry,” Paul says. Funding for the mission comes from a combination of philanthropy, corporate sponsors, and scientific partners in the private sector.

Maria Matthews, who graduated from Penn State in December with a Ph.D. in physics, served as the team’s business development coordinator. Her work has given her insight into the diversity of the space industry—and opened new doors for her in the field. Matthews, a California native, was focused on building a career in the space industry. Her advisor told her about Penn State’s Lunar Lion and recommended she get involved. “There are people from so many disciplines working on this project,” she said. “I thought I’d be here to work on systems engineering, but I’ve learned so much more about the business side of the project, and it’s a lot more exciting to me.” She recently started a job at an aerospace start-up in Huntsville, Alabama.

Paul, who himself was the spacecraft systems engineer for NASA’s MESSENGER mission to Mercury, says Penn State is the only university leading a team in the contest. The other competitors are privately funded. Nevertheless, he likes his team’s chances. “With a group of scientists and engineers who understand the difficult tasks of operating in harsh environments, coupled with the energy and ambition of Penn State students, the Lunar Lion can win this competition,” he says.

According to Kevin Walker, a senior from Annapolis, Maryland, and the student project coordinator, about 20 students are working closely on the project, with more joining every month. For Walker, an industrial engineering major, participating in the Lunar Lion initiative while balancing his studies and his Air Force ROTC obligations has taught him a lot about time management and building good habits while applying his growing technical skills. “I’ve learned a lot about technology and business management,” said Walker. “This has been an incredible experience, and when I get out of the Air Force I’d be interested in a career focused more on the design process or even management. I’ve learned a lot about making my own decisions based on where we need to go next with this project.”

Learning from NASA: See members of Penn State’s Lunar Lion team as they seek the expertise of NASA engineers at the Glenn Research Center.


It’s been 35 years since the last lunar landing. Penn State is trying to make sure there’s another one by 2015.

The Penn State Lunar Lion is joining 28 other teams from around the world in the $30 million Google Lunar XPRIZE Competition to land a privately funded spacecraft on the moon. To win, the team must build a robotic vehicle that can be launched to the moon by a commercial launch service, transmit high-resolution photographic images and video to mission control at Penn State, travel to a new location at least a third of a mile from the first, and collect more video and photographic images from there. The first team to carry out these tasks will win the grand prize of $20 million. Second place will win $5 million. The teams have until December 31, 2015, to finish the mission.

Reuben Bushnell, an electrical engineering graduate student from Baltimore working on power system design for Penn State’s moon lander, said that what makes the project more challenging is that the work each person does impacts everyone else’s. “So if we’re not communicating with each other, it could affect everyone else,” Bushnell said. “You’re responsible for your area, but you’re responsible for other areas as well.” “This is more like an industry internship, on campus,” Bushnell added. “We’re working with a multitude of suppliers for parts, and we’re learning a lot about troubleshooting.”

“It’s a great example of how Penn State is serving the country—we’re generating leaders here, in addition to growing technical expertise,” said Paul.

For more information about the Lunar Lion team visit For more information about the Google Lunar XPRIZE visit

According to Paul, this kind of careful, coordinated development of complex systems made the Apollo program and other ambitious space projects successful where competing programs around the world failed. The American process of systems engineering, he said, is key to the nation’s continued technical and economic strength, and the Lunar Lion is an exciting way to train Penn State students in the practice.

He says the Lunar Lion project is an opportunity for Penn State to position itself for future research in space systems and exploration at a time when the space industry is transforming, with the retirement of the Space Shuttle and the birth and recent successes of a growing number of private space companies. And, Paul says, as the Lunar Lion team attracts more funding and builds its momentum through the next phase, it will add to Penn State’s production of the technical and human capital that the country needs to spur economic growth, even before this university-led team launches for the moon.

As Corporate Partner of the Year, Chevron creates new opportunities

Named Penn State’s Corporate Partner of the Year in 2012-13, Chevron continues to provide wide-ranging support to the University. Chevron, the second-largest integrated oil company in the United States, has invested more than $19 million in a variety of Penn State initiatives, with approximately half of that amount representing philanthropic investment. For example, a half-million-dollar Chevron grant will support several initiatives across the University, including laboratory upgrades, faculty development, scholarly travel, and diversity programming in several academic colleges. According to Penn State President Rodney Erickson, partnering with Chevron “means new and better opportunities for our students, first-rate equipment to facilitate research and learning, and innovative research initiatives that lead to safer and more efficient practices for producing energy,” among other outcomes. Chevron Vice President Bruce Niemeyer calls the partnership “essential to our vision because relationships with quality institutions outside our walls bring new ideas and perspectives that help us realize the potential in our organization.” Penn State’s previous Corporate Partners of the Year include Toshiba/Westinghouse, Dow Chemical, Boeing, Highmark, PNC, Barnes and Noble, Lockheed Martin, and Bank of America.


Penn State University Libraries rank eighth among North American research libraries, according to the most recent Association of Research Libraries Investment Index Rankings, which are based on total library expenditures, including salaries and wages and library materials, plus the number of professional and support staff employees. Barbara I. Dewey, dean of University Libraries and Scholarly Communications, notes that such recognition is an important asset in “the recruitment and retention of undergraduate and graduate students as well as attracting and meeting the needs of top-notch faculty. It also reinforces our role as a leader in research projects with other top libraries in North America.”




This publication is available in alternate media on request. The Pennsylvania State University is committed to the policy that all persons shall have equal access to programs, facilities, admission and employment without regard to personal characteristics not related to ability, performance, or qualifications as determined by University policy or by state or federal authorities. It is the policy of the University to maintain an academic and work environment free of discrimination, including harassment. The Pennsylvania State University prohibits discrimination and harassment against any person because of age, ancestry, color, disability or handicap, genetic information, national origin, race, religious creed, sex, sexual orientation, gender identity, or veteran status and retaliation due to the reporting of discrimination or harassment. Discrimination, harassment, or retaliation against faculty, staff, or students will not be tolerated at The Pennsylvania State University. Direct all inquiries regarding this Nondiscrimination Policy to the Affirmative Action Director, The Pennsylvania State University, 328 Boucke Building, University Park, PA 16802-2801: tel. 814-863-0471/ TTY. U.Ed. RES 14-02

Research/Penn State The Pennsylvania State University 221 Ritenour Building University Park, PA 16802-4600

Nonprofit Org. U.S. Postage PAID State College, PA Permit No.1

Research & Discovery Newswire

News and features about Penn State’s world-class research enterprise. By email every two weeks. Sample it.

Research Penn State