CONVERGENCE The Magazine of Engineering and the Sciences at UC Santa Barbara
A New Century for Physics Fantastic Voyage Q & A with Michael Gazzaniga Engines of Discovery FALL 2005, THREE
Message From the Deans

This third issue of Convergence marks the end of the publication’s first year, and we have been delighted with the feedback we’ve received on the magazine. Readers say they love hearing about the latest thinking in science and technology at UC Santa Barbara. And they say they like the fresh style.

UCSB is emerging as a national leader in science and engineering, and it’s time we started talking more about the exciting initiatives here. We have so much to say. For example, you should know that:

- Four teaching faculty in Science & Engineering have won the Nobel Prize in the last seven years (five university-wide)
- Our faculty are consistently noted as among the most highly cited in their fields
- We are continuing to receive significant new and renewed funding in a wide range of areas, including bio-nanotechnology to treat cancer, lung and heart disease; materials research; cognitive neuroscience; and much more
- The College of Engineering is ranked #2 nationally in the number of faculty elected to the National Academy of Engineering
In 2006 we’ll continue to tell our story, not just here in the pages of Convergence, but in places and ways that we hope will surprise and delight you.
Matthew Tirrell Dean, College of Engineering
Martin Moskovits Dean of Mathematical, Life and Physical Sciences, College of Letters & Science
Evelyn Hu Co-Director, California NanoSystems Institute
A New Century for Physics
One hundred years after Einstein, what's next?
Picture tiny devices traveling through the body to kill tumors and treat diseased arteries.
The New Aerospace Rubber aircraft? The technology of space is pushing some envelopes.
Scoping Out the Synapse:
How did our ability to learn evolve?
What is this? Shorts...Have you heard?
Engines of Discovery:
Faster computers and smarter programs are remaking the practice of science.
'Tis the season to give...to us! SEE PAGE 12
A New Century for Physics

It has been an eventful 100 years since Einstein overthrew our traditional ideas of space and time. UC Santa Barbara scientists take up the question: What’s coming now?

The short answer is, we’ll just have to wait and see. Science is unpredictable, and scientists know better than to assume that the future will reliably follow the playbook of the present. “We’re never surprised by the unexpected,” says physics professor James Hartle.

The unexpected happened in a big way in 1905, when Einstein published a series of papers that transformed physics with radical and convincing theories on the nature of light, time, space, mass and energy. Among other things, the world was introduced to E = mc² and started learning to live with the principle of relativity.

But such revolutions are very rare. Rather than experiencing a new one, the more likely scenario sketched out by UCSB scientists is that physics will follow a path it has been on for several decades – that of working out the implications of Einstein’s work, along with other 20th-century insights into the often very strange behavior of the natural world. The theorists of the last century left behind questions that the present generation still needs to answer.

Hartle and other physicists at UCSB – both on the physics faculty and at the university’s Kavli Institute for Theoretical Physics – have played key roles in pushing their science into new theoretical territory. UCSB is especially notable for its work in string theory, a model of fundamental particles that raises the exciting prospect of what some physicists call a T.O.E. – theory of everything.

Physicists at the university involved in this and other path-finding work talked recently with Convergence about the paths their science may follow in coming years. Here are some of the questions and concerns they say will occupy physics as the second century of the Einstein era begins.
The Quantum-Relativity Conflict Over the past 100 years, physicists have been remarkably successful at explaining the very large and very small. Einstein’s ideas, which culminated in the general theory of relativity in 1916, deal with events on a vast canvas, describing how gravity shapes space and time. Quantum mechanics, which also developed early in the 20th century, describes the smallest units of matter and force – electrons, photons, quarks (the components of protons and neutrons) and other particles that, as far as we know, make up everything. It has also led to world-changing technologies, like that of the personal computer. (Einstein played a role here as well. One of his 1905 papers developed a key quantum idea, that light acts like a particle as well as a wave). But there’s a serious problem with these theories: they don’t match up. As UCSB physics professor Gary Horowitz explains, general relativity and quantum mechanics “work well in their domains, but they are not consistent.” Calculations that precisely explain phenomena in one realm, for instance, produce impossible results in the other. As Horowitz notes, the models of the very small and very large also differ on the basic conceptual question of whether one can predict a precise event at a specified point in time. “Quantum mechanics says you can only predict probabilities for various events,” Horowitz says. “General relativity doesn’t deal with probabilities at all.” The world of general relativity is also not hard to grasp
intuitively, he says, “once you’re comfortable with the idea of four dimensions and the curvature of space.” Quantum mechanics is strange even for scientifically trained minds. It describes a micro-world of furious and sometimes absurd activity – like two particles simultaneously taking different paths, or a particle “tunneling” through a wall even though it lacks the energy (by the laws of classical physics) to do so. If anything can be called the main event in theoretical physics today and for the near future, it is the search for a theory that resolves these conflicts. As UCSB Professor Steve Giddings puts the question, “What is the quantum theory underlying space-time?” That is, what theory can explain how gravity, which shapes reality on a vast scale, works at scales far too small to be observed with the most powerful microscopes? Is there common ground for the conceptual models of quantum mechanics and general relativity? Giddings, Horowitz and other UCSB faculty members are working on the possible answer to those questions in string theory.
Strings and Things Greatly simplified, string theory holds that what we call elementary particles are actually vibrating, oscillating one-dimensional entities. These “strings” appear as different types of particles – such as electrons, quarks, photons and gravitons (which transmit the force of gravity much as
photons transmit the energy of light) – based on their vibration patterns. First conceived about three decades ago, string theory may not yet be the consensus view of physicists, but it’s clearly the leading contender for that long-sought “T.O.E.” David Berenstein, an assistant professor of physics, says strings are interesting because they explain so much when they’re treated as a constant throughout nature: “You get all sorts of things from that one single idea,” he says. Horowitz says string theory has helped illuminate the quantum aspects of black holes. Kavli Institute professor Joe Polchinski says it may explain the physics just after the Big Bang, when the whole universe was compressed to a quantum scale and, presumably, would have obeyed both Einstein’s relativity and the rules of quantum mechanics. Polchinski also says string theory may explain the recently discovered “dark energy,” which has been detected through its curving of space-time. He is also pursuing the possibility that very large strings, as big as galaxies, might have been created at the birth of the universe. UC Santa Barbara has an especially strong reputation in string theory: Horowitz was among a group of physicists in the 1980s who produced a key breakthrough, describing
how string theory works in extra dimensions (beyond the four observable ones) to produce the physical properties of elementary particles. Andrew Strominger, another important string theorist, was on the UCSB faculty at the time and collaborated with Horowitz; he has since gone to Harvard University. Polchinski is known for his 1995 discovery of D-branes – structures in string theory that can have more than one dimension. David Gross, now director of the Kavli Institute, co-discovered a key variant of the string bestiary, the heterotic string, while at Princeton in the 1980s. Polchinski calls UCSB “one of the top four or five places in the world” for string-theory research. As physicists in coming years continue searching for a theory that covers space, time, gravity, the quantum realm, the first moments of the universe and the heart of black holes, their paths seem sure to lead through Santa Barbara.
More Big Science – and Maybe Some Surprises On the experimental side, physicists are looking forward to more powerful tools for probing the sub-atomic realm and mapping the universe. Theorists have high hopes for a pair of projects near completion or just starting up. One is the Large Hadron Collider (LHC), the world’s most powerful particle accelerator. Located near Geneva, Switzerland, it’s expected to be in operation by 2007. The other is a new observatory (actually a pair of them, in Louisiana and Washington State) designed to detect faint gravitational waves; called the Laser Interferometer Gravitational Wave Observatory (LIGO), it's just now starting to collect limited data. Do scientists expect to see the strings? Not directly, since the LHC – or any feasible accelerator – lacks the power to detect them. But it might offer indirect evidence to confirm string theory, and more powerful colliders in later years may offer more. Strings aside, the LHC has plenty of potential to spur discovery. Giddings suggests that, “in the luckiest scenario, we might even start manufacturing microscopic black holes.” (One of Giddings’ achievements was to figure out the signature of a black hole in a particle accelerator). And as Berenstein points out, the most interesting finds may be of unexpected phenomena that change the way physicists think. “We hope to see new physics,” he says. At the other end of the distance scale, LIGO is expected to give physicists new data on events that distort spacetime, sending out gravity waves that travel at the speed of light. Polchinski says the observatory, which measures tiny changes in laser light paths, might provide some evidence of the long strings that may have been formed at the birth of the universe. As detection of gravity waves gets more precise, says James Hartle, they may give us a way of “looking” at the opaque surface of black holes (given that name because their gravity is so strong that nothing, not even light, can escape). 
Hartle, who is best known for his work with physicist Stephen Hawking on the quantum mechanics of the Big Bang, says a black hole gives off gravitational waves at a distinctive frequency, depending on its shape. The waves are weak and hard to detect, he says, but this is what makes them interesting, “since once they are produced, they are hard to absorb.”
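As a rough back-of-the-envelope sketch (not from the article – the 4-kilometer arm length is LIGO's published figure, while the strain value is a representative design target, not a reported measurement), the size of the effect LIGO must resolve can be put in one relation:

```latex
% A gravitational wave of dimensionless strain h changes the length L of
% an interferometer arm by roughly
\[
  \Delta L \approx h\,L .
\]
% For LIGO's 4-kilometer arms and a target strain of order 10^{-21}:
\[
  \Delta L \approx 10^{-21} \times 4\times 10^{3}\,\mathrm{m}
          = 4\times 10^{-18}\,\mathrm{m},
\]
% far smaller than the diameter of a proton -- which is why the
% laser light paths must be measured so precisely.
```

This is why detecting gravity waves is so hard, and why each improvement in precision opens new astronomical territory.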
New Frontiers and Spin-Offs Physics has a history of producing surprisingly practical results from research that at first seems impossibly remote from everyday experience. Its increasingly deeper probing of the sub-atomic world has led to ubiquitous technologies – computers, television, medical imaging,
superconductivity and more. Einstein’s insights into matter and energy led to the atomic bomb and nuclear power. The indirect effects of physics research can also be important. Polchinski notes, for instance, that the World Wide Web was invented “by high energy experimentalists needing to exchange data efficiently.”

Some frontiers of physics today sound like theory at its purest. One is the debate over multiple universes – or regions of our own universe that are beyond our ability to detect and that may follow physical laws different from the ones we know. This line of thought gets mixed reviews, with some physicists sounding skeptical that it could ever produce worthwhile knowledge. Horowitz finds the topic important “if we want to understand how the universe came to be.” But Berenstein says, “It’s not easy to make anything useful out of that type of idea. That’s what ends up putting it in science-fiction books instead of research papers.”

On a more practical plane, Hartle sees new interest in biophysics, or “what physics may have to say about matter that happens to be living.” Polchinski agrees: “The application of methods from physics to biological problems is one that has had a long history and is heating up again.”

How the science of physics will change the world in the second Einstein century is anybody’s guess. But if the next hundred years is anything like the last, change will be profound, not just in the tools we use but also in the way we think. As Berenstein says, “Certain aspects of fundamental research capture the imagination of the public. These alter the way we perceive the world and thus enrich culture. This is another way in which science contributes to enrich our lives.” And the smartest prediction of all may be the most succinct: Prepare to be surprised.
The Fantastic Voyage of Nano-Bioengineering
Imagine tiny devices traveling through the body to kill tumors and treat diseased arteries. That’s the scenario taking shape through the collaboration of UC Santa Barbara researchers.
Some of the biggest new ideas in medicine are, in scale, very small. The convergence of disciplines called nano-bioengineering (NBE for short) is a prime example. Its aim is to attack life-threatening conditions with molecules and cell-like devices that can detect diseases and deliver the drugs to cure them. Scientists label them with terms such as “molecular machines” or “smart nanoparticles.” By whatever name, they are on the way to remaking medicine at the sub-micron level.

For the highly collaborative faculties at UC Santa Barbara, NBE is a natural research choice. The university is known for bringing diverse disciplines together on major projects, and NBE involves a wide range of them, including biology, chemistry, engineering, materials science and physics. Earlier this year, UCSB was chosen to participate in major nano-bioengineering research efforts funded by Program of Excellence in Nanotechnology (PEN) grants from the National Institutes of Health (NIH). The NIH awarded four grants in all, and UCSB was the only university to participate in two of them.

In one of these programs, UCSB is working with Washington University in St. Louis and UC Berkeley under a $12.5 million grant to develop nanoscale agents capable of diagnosing and treating pulmonary artery disease in its early stages. The other program brings UCSB together with the Burnham Institute and Scripps Research Institute, both of La Jolla, Calif., under a $13 million grant. Scientists from the three institutions are applying nanotechnologies
to design new methods for monitoring, treating and eliminating vulnerable plaque, the likely cause of death from sudden cardiac arrest. UCSB researchers are also applying nanotechnology to the more general medical problem of drug delivery, whether to treat heart disease, cancer or other diseases. They are working on ways to get chemotherapy agents to their targets without killing good cells, to piggyback drug carriers on red blood cells and to program molecules to attack specific kinds of tumors.
Technology That Delivers The common theme in all these NBE projects, other than the extremely small size of the tools under development, is the challenge of targeting – particular diseases, particular points in the body, particular patients. The promise of smart nano-agents is that they can solve one of the central problems in medicine: How to get the cure to the right spot with the least disruption and damage to a patient’s health. “The problem we’re trying to address is that of getting more drugs to the right place at the right time,” says Joseph Zasadzinski, a UCSB professor of chemical engineering and materials. Traditionally, the surest form of targeting is surgery. It’s also the most traumatic. Drugs can be less risky, but they are far less precise. And some of the most effective drugs, such as those used in chemotherapy, also destroy healthy tissue. Moreover, almost all the drug is wasted. “In general,
99% of the drug goes where you don’t want it,” Zasadzinski says. Cutting the waste through better targeting reduces both the required dose and the risk to the patient.
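The arithmetic behind that point is simple. As a purely illustrative sketch (the function name and numbers are hypothetical, not from any UCSB study), if only a fraction of an injected dose reaches the disease site, the total dose a patient must receive scales inversely with that fraction:

```python
def injected_dose(effective_dose_mg: float, targeting_fraction: float) -> float:
    """Total dose to inject so that `targeting_fraction` of it equals
    the effective dose required at the disease site."""
    if not 0 < targeting_fraction <= 1:
        raise ValueError("targeting fraction must be in (0, 1]")
    return effective_dose_mg / targeting_fraction

# With ~1% targeting ("99% of the drug goes where you don't want it"),
# delivering 1 mg to a tumor means injecting ~100 mg systemically.
poor = injected_dose(1.0, 0.01)

# Improving targeting tenfold cuts the injected dose -- and the exposure
# of healthy tissue -- by the same factor.
better = injected_dose(1.0, 0.10)
```

The design lesson is the one Zasadzinski draws: every gain in targeting efficiency pays off twice, in a smaller dose and in less collateral damage.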
Tiny delivery vehicles designed to release drugs only where needed are an obvious way, at least in theory, to solve this problem. If they work, they can achieve the precision of surgery without the shotgun side effects of conventionally administered drugs. The challenge is in engineering them. In order to reach and act on the targeted cells, they must be extremely small. But they also must be able to survive the body’s defenses and stay in the bloodstream. As with the nano-scale scientists coursing through a human body in the 1960s science-fiction classic “Fantastic Voyage,” drug delivery is a long mission in a sometimes hostile environment, requiring a certain amount of intelligence.
Zasadzinski and others are attacking the delivery problem by, in effect, building better subs. One type is a cell-like structure called a “vesosome,” which he compares to the double-bagging of fragile items at the supermarket. The outer “bag” is a membrane sphere perhaps 200 nanometers (200 billionths of a meter) in diameter. Inside it are smaller sacs, about 50 nanometers across, that are designed to hold chemotherapy agents until the vesosome can reach a targeted tumor. “The body is very efficient in identifying foreign objects in the blood stream. If you have just one bag, there are a lot of enzymes that chew through that bag,” Zasadzinski says. But with the backup of a second bag, the drug can remain encased for much longer, such as a day or more instead of just a few minutes.

Once the vesosome reaches a tumor, it can take advantage of the tendency of tumor blood vessels to leak into the surrounding malignant cells – an effect of the tumor’s abnormally fast growth. “The idea is to keep the nasty drug inside the double-bagging until it gets to the tumor where it can then start leaking,” he says.

Along with his vesosomes, Zasadzinski is working on a technology that he calls “a little bit more science fiction.” He and Craig Hawker, a UCSB professor of chemistry and materials, are developing a system that delivers drugs held in polymer cores within tiny gold shells (about 100 nanometers in diameter). The shells can stay in the body and the polymer can hold the drug indefinitely. But the shells can be melted and the drug released with a quick hit from an infrared laser. Millions of these tiny chemotherapy agents thus could be injected into the body, to be activated at specific times and in specific tumors. Since the drug would only be released at its target, it could be used in a concentrated, highly toxic form without endangering healthy tissue.

Small and Smart

For this zap-and-release method to work, a sufficient number of shells must first get to the target. The same goes for all nano-scale delivery methods. One of the key NBE challenges, along with protecting a drug or other therapeutic substance from the body’s natural defenses, is to get as much of the substance to the target and leave as little as possible elsewhere. Toward that end, nano-agents need to be engineered so that they read their environment, home in on particular problems and then treat them. Creating such “smart nano-particles,” as he calls them, is the main research focus for Hawker, who directs UCSB’s Materials Research Lab and is the university’s lead researcher on the $12.5 million PEN pulmonary artery disease project.
Clinicians at Washington University in St. Louis have found that certain biological markers show up long before symptoms of artery disease appear. Cells lining the blood vessels start to display small peptides (molecules made up of amino acids) on their surfaces as a sign of stress. Hawker and his fellow researchers want to develop particles that latch exclusively to these chemical markers. After they accumulate at a disease site, they can be detected with a scan such as an MRI. “If these particles show up in big lumps, then we know there’s trauma there that needs to be treated,” Hawker says.
Samir Mitragotri, a UCSB associate professor of chemical engineering, is investigating another delivery model, in which drugs are attached to red blood cells. The blood cells live for about a month and pass freely through the liver and kidneys. If nano-particles can piggyback on blood cells and avoid getting cleaned out of the blood stream by the liver and kidneys, they can keep drugs in the body much longer and would have a better chance of getting the drugs eventually to the designated targets.
He is also researching new, less invasive ways of getting large drug molecules into the body. “If you have to deliver a protein now, you have to use a needle – a big problem,” he says. So he’s looking at alternatives such as ultrasound, patches, chemicals and liquid jets.
Continued on page 24
Q & A with Michael Gazzaniga

UC Santa Barbara is known as a place where subjects get studied from all angles and across the disciplines. Nothing fits this multi-dimensional approach better than the human mind, which has fascinated scientists, poets and philosophers from the dawn of human thought. The university will be taking its already ground-breaking research in this area to a new level this fall when it opens its Sage Center for the Study of the Mind. The Center will draw upon a wide range of scholarly endeavors and technologies in the humanities, social sciences and the sciences to study the relationship of the mind and brain. Leading this exciting enterprise will be one of the world’s most distinguished researchers in the cognitive sciences, Michael Gazzaniga.

“It might have been more practical to do a smaller organism, but it wouldn’t have gotten the money.”

Gazzaniga is currently the David T. McLaughlin Distinguished University Professor at Dartmouth College and director of the college’s Program in Cognitive Neuroscience. He is president of the Cognitive Neuroscience Institute and, in 1993, founded the Cognitive Neuroscience Society. He is a Fellow of the American Association for the Advancement of Science, the American Neurological Association, the American Psychological Association and the American Academy of Arts and Sciences. He has served on the President’s Council on Bioethics since its inception in 2002. Gazzaniga is coming to UCSB early in 2006.

Gazzaniga has published many books, notably The Ethical Brain (his most recent), Mind Matters, and Nature’s Mind. His many scholarly publications include the landmark 1995 book for MIT Press, The Cognitive Neurosciences, now in its third edition, which is recognized as the sourcebook for the field.
Gazzaniga recently shared some thoughts with Convergence about the Sage Center, his research, the intersection of science and ethics, and the pleasure of returning to his native California:
Tell us about the Sage Center for Study of the Mind. What persuaded you to come back West to lead this new program?

I have had the pleasure of starting two major scientific centers and both were centered on studying the brain and its action. With the Sage Center the brain-science aspect plays a crucial role but the agenda is much larger. There are all kinds of academic disciplines interested in the nature of the human mind. Philosophers, economists, anthropologists, evolutionary biologists, humanists, computer scientists and more all have dogs in this fight for understanding human mental processes. The Sage Center will strive to bring these forces together and to illuminate the problem in a multidimensional way.

On a personal note, I started as an assistant professor at UCSB some 40 years ago. It is a good feeling to come home and to work on establishing this new Center.

What will set the Sage Center apart from other institutions that study the workings of the mind and brain? How will its mission and work be distinctive? And how will the larger UC Santa Barbara intellectual community play a role?

Most centers don’t have a canvas this large. Aspects of the problem are studied at other places and the attempt to integrate knowledge across relevant fields is not foremost. It is hard work to do this and quite frankly many specialists will be doubtful it can work. As a result it will be a subset from each discipline who will come together to work on this project.

UCSB is known for this sort of risk taking and it has done well in doing so. I hope for the same here.

How do you and your colleagues plan to address these questions at the new center?

Lots of brain imaging, lots of seminars where the goal will be to talk and think across classic disciplines. There will be all kinds of activities.

Given your long ties to UC Santa Barbara, what would you say has changed, and what has stayed largely the same, about the university since you first taught here nearly four decades ago?

When I first arrived at UCSB there were many great psychologists here. It was a good place for psychology. Scientists like David Premack, Howard and Tracy Kendler, John Foley, Walter Gogel, and many, many more played huge roles in launching UCSB.

At the same time the field was limited and the human brain could not really be studied in any kind of major way. The past 10 years has seen an explosion of methods for getting at the nagging questions of the nature of the human mind. Big science is no longer limited to physics, chemistry and biology. Studying the mind needs a similar effort.

What would you consider the most significant discovery you have ever made?

Several years ago my student Joseph LeDoux and I identified a capacity of the left brain called the “interpreter.” It is the process that seeks pattern and meaning in our actions, thoughts and feelings. It is a big part of what makes humans unique.

What research questions intrigue you the most right now?

For the past 35 years I have studied brain mechanisms associated with personal human consciousness, mostly in patients who had undergone separation of their cerebral hemispheres in an effort to control intractable epilepsy. The vast majority of those studies were behavioral in nature, and while they illuminated the nature of human cerebral specialization and important differences between the left and right brain, they did not tell us about how the underlying pathways functioned. With new brain imaging technologies, these pathways can now be studied in the most amazing way. Individual differences in how quickly we can solve cognitive problems can be traced to differences in the underlying organization of the nervous system.

At the same time, I have become captivated with another truth. Ask yourself the following question: How many of the thoughts that you have had in the last 48 hours would have any meaning if you were the only person on Earth? The answer is, not many. We are social animals to the core. We are always thinking about the other person, what their intentions towards us might be, how they feel and so on. I think a major goal of the new Center will be to understand the social aspects of the human condition.
Among your teachers, fellow scientists and possibly even students, who would you say has influenced you the most?

My mentor Roger Sperry had a huge impact on me, as did several outstanding colleagues over the years. I am especially indebted to Leon Festinger, David Premack, George Miller, and Donald MacKay.

Are you an optimist or a pessimist about the ability of the human race to do the right thing with new knowledge?

Very much an optimist.

Your specialty, cognitive neuroscience, touches on moral and even political controversies – such as questions of free will and personal responsibility. Do you think this science will someday lay such issues to rest? Or will it just keep stirring up new ones without settling the old?

In my new book, The Ethical Brain, I try to put some of them to rest. Neuroscience has lots to say about matters dealing with issues like the moral status of the embryo. It doesn’t have much to say about free will, personal responsibility and culpability. It’s a long story but it is in the book.

What has your service on the President’s Council on Bioethics told you about conflicts between science and ethics, or science and religious belief, in this country? Where is there common ground? Or are some differences irreconcilable?

It has been a fascinating experience. A council of people interested in ethical issues is different from a scientific council interested in ethics. The former has people from all walks of life, holding all kinds of varied views. Hashing out a position on this or that can be quite challenging and frequently impossible. That is why I wrote minority reports on several issues, in particular on the stem cell issue.

On the whole, do you think the Bioethics Council found some of that common ground? Or did it just succeed in airing differences?

Common ground was achieved on some issues and not on others. At the same time, one walks away with a respect for other views, even though one disagrees with them. I am always unsettled by the group thought that takes over any group. I like diverse views. Oddly, universities and other academic settings are rather uniform in their thinking. I have never found that one of our stronger points.

Most of the issues you examine in The Ethical Brain seem to stem from a mismatch between fast-moving technology and ethical thought that can’t seem to keep up. We are able to do things before we have decided if they are right or wrong. What’s your answer to this problem?

It will happen, and bad ideas will die out after some misuses. Trying to work out a perfect world before trying new ideas can be paralyzing. I think there are basic mechanisms in the human brain that right the ship when it starts to sink.

And finally… What do you do on a typical weekend?

I never see the difference between the weekdays and the weekend. I enjoy life every day and my capacity to enjoy life finds its way into work and play.

What are you reading these days for pure pleasure?

Descartes’ Baby, by Paul Bloom. I am Charlotte Simmons, by Tom Wolfe.
WHAT IS THIS?
See Answer On Inside Back Cover
have you heard?
News and Events from Engineering and the Sciences at UC Santa Barbara
UCSB has been chosen for two NIH Program of Excellence in Nanotechnology (PEN) grants. Notable is
the fact that UCSB is the only university to participate in two PEN nanotech grants. The work will develop novel technologies to diagnose and treat diseases of the heart and lungs. Both research efforts bring together bioengineers, materials scientists, biologists and physicians working in interdisciplinary teams. UCSB will be working with Washington University in St. Louis and UC Berkeley, under a $12.5 million grant, to develop nanoscale agents to provide early diagnosis and treatment of pulmonary artery disease. At UCSB, the project is being led by Professor Craig Hawker, Director of the Materials Research Lab. We will also be teaming with The Burnham Institute and Scripps Research Institute, both of La Jolla, Calif., to apply a $13 million grant to develop ways to use nanotechnologies to detect, monitor, treat and eliminate vulnerable plaque, the probable cause of death from sudden cardiac arrest. UCSB professors participating in the project include Matthew V. Tirrell, PhD, Dean of the College of Engineering and professor of chemical engineering; Andrew N. Cleland, PhD, associate professor of physics; Patrick Daugherty, PhD, associate professor of chemical engineering; Samir Mitragotri, PhD, assistant professor of chemical engineering; and Joseph Zasadzinski, PhD, professor of chemical engineering.
UCSB will receive $1,343,859 in state funds over three years to support stem cell research. The grant
was announced by the Independent Citizens’ Oversight Committee and is one of the first 15 awarded by the California Institute for Regenerative Medicine. The goal of UCSB’s stem cell research program is to understand how human embryonic stem cells can be differentiated into ocular cells that might be used to treat eye disease, especially macular degeneration. Some of the work will be carried out in the newly established Laboratory for Stem Cell Biology, where scientists are currently growing government-approved human stem cells. Participating researchers will come from the Department of Molecular, Cellular and Developmental Biology, the College of Engineering, and the Neuroscience Research Institute.
The National Science Foundation has renewed its support of UCSB’s Materials Research Lab (MRL) with a $20.52 million, six-year grant. The MRL was one of 11 Materials Research Science and Engineering Centers (MRSECs) that successfully competed for renewed support. A total of 29 centers are currently supported by the MRSEC program, with combined annual NSF support of $52.5 million. Craig Hawker, a UCSB professor of chemistry and materials, is Director of the MRL.
Collaborating with UCSD and the Burnham Institute, UCSB is part of a multi-center nanotechnology collaboration funded by the National Cancer Institute. Materials scientists at UCSB will develop particles that could attach to specific types of tumor cells, delivering payloads, taking images, modifying tumors, taking key measurements and delivering therapies. UCSB will receive about $2 million from the grant, a five-year initiative. Our researchers – who include Matthew Tirrell, professor and Dean of the College of Engineering; Evelyn Hu, a professor of electrical and computer engineering and materials, and Co-Director of the California NanoSystems Institute; Patrick Daugherty, an assistant professor of chemical engineering; and Dan Morse, a professor of molecular genetics and biochemistry – will work with chemists at UCSB to make nanoparticles coated with “blinkers,” molecules developed at the Burnham Institute that make the particles attach to specific types of tumor cells.
Biodefense and infectious disease study wins $1.5 million from the National Institute of Allergy and Infectious Diseases of the National Institutes of Health.
Peggy Cotter, an assistant professor of Molecular, Cellular and Developmental Biology, will serve as the project director. Some of the funds will be used to construct a high-security biosafety lab at UCSB in which the work will be conducted. NIH has funded ten regional centers nationwide, each a consortium of universities and research institutes, to research ways to counter bioterrorism and emerging infectious diseases. Also involved as an investigator at UCSB is David A. Low, a professor and vice-chair of Molecular, Cellular and Developmental Biology.
Smart bio-nanotubes could be developed for drug or gene delivery. They’re intelligent because they may be designed first to encapsulate a drug or gene and then to open up and deliver it to a particular location in the body. The researchers found that by manipulating the electrical charge of lipid bilayer membranes and microtubules from cells, they could create open or closed bio-nanotubes. Reported in the Proceedings of the National Academy of Sciences, the research resulted from a collaboration between the labs of Cyrus R. Safinya, professor of materials and physics, and Leslie Wilson, professor of biochemistry.
Cells have developed a surprising transport system for their endosomes, according to research just published in Physical Review Letters. By marking endosomes with fluorescent tags and watching them move in live cells, Samir Mitragotri, a UCSB professor of chemical engineering, and graduate students Chinmay Pangarkar and Anh Tuan Dinh learned that the endosomes travel to the cell’s nucleus using back-and-forth symmetrical movement, rather than going there more directly. This forward and reverse motion leads to even distribution of the endosomes on microtubules. An aster-like layout of the microtubules helps the endosomes accumulate at the nucleus. The researchers think this non-direct approach to the nucleus has evolved to allow hundreds of endosomes to bring nutrients and molecular information to the cell’s center for processing. There’s never a traffic jam on the microtubules, even if the cell moves or if there’s increased traffic flow.
Human bone has a type of glue that holds mineralized collagen fibrils together, according to the collaborative research of Paul K. Hansma, a professor of physics, with Daniel E. Morse, a professor of molecular genetics and biochemistry, and Galen D. Stucky, a professor of Chemistry and Biochemistry. The glue works as a molecular shock absorber and promotes healing; its bonds can uncoil under stress, and when they fail, fracture becomes more likely. The research, published in Nature Materials, explains at the molecular level how healthy bone resists fracture and how unhealthy bone breaks.
Three molecules – discovered in a search of 58,000 compounds – appear to inhibit a key perpetrator of Alzheimer’s disease. Each of the three small molecules protects a key protein, called “tau,” which gets hopelessly tangled in the brains of patients with Alzheimer’s. The finding, published in Chemistry and Biology, may assist in the development of drugs for the disease. In Alzheimer’s, an enzyme called CDK5 attaches phosphate to the tau protein, causing the disease. The three molecules interrupt the disease process, preventing CDK5 from attaching the phosphate to the protein. The research effort was headed by Ken Kosik, co-director of the Neuroscience Research Institute at UCSB.
The National Science Foundation has named Matthew Tirrell the Engineering Distinguished Lecturer for Fall 2005.
Tirrell, Richard A. Auhll Professor and Dean of the College of Engineering, presented the lecture, “Modular Materials by Self-Assembly,” on November 7 at the National Science Foundation in Arlington, Virginia. Previous lecturers include Nobel Laureates Heinrich Rohrer, Jean-Marie Lehn, Dan Tsui and John Pople.
Leda Cosmides, a UCSB professor of psychology and Co-director of the Center for Evolutionary Psychology, was given the
Pioneer Award by the National Institutes of Health. The award is a central component of NIH’s “Roadmap for Medical Research,” a program designed to transform the nation’s medical research capabilities and move discoveries from the lab to the bedside more quickly. Cosmides will receive up to $500,000 a year in direct research costs for the next five years. She and her husband and collaborator, John Tooby, a professor of anthropology at UCSB, have developed evolutionary and computational approaches to human motivation and neural development.
MIT’s Technology Review honored Haitao Zheng, an assistant professor of computer science at UCSB, as a top technology
innovator under age 35. Technology Review, a magazine published by the Massachusetts Institute of Technology, has named the TR35, its selection of the top technology innovators under age 35. Technology Review describes Zheng’s potential this way: “At 15, Haitao Zheng stood out at China’s competitive Xi’an Jiaotong University for both her youth and her brilliance. Today, her work on so-called cognitive radios stands out for its potential to make a promising technology practical. Using software, cognitive radios dynamically detect and exploit unused radio frequencies; the devices could alleviate competition for the ever shrinking amount of unassigned radio spectrum. To be truly useful, though, a cognitive radio must not only detect free spectrum but also select the best frequency for a given function, all without interfering with other devices. At Microsoft Research Asia, Zheng created algorithms that allow disparate devices to ‘negotiate,’ automatically allocating the available spectrum efficiently and fairly. Zheng
is continuing her research on open spectrum systems as an assistant professor of computer science at the University of California, Santa Barbara.”
The American Institute of Architects, California Council (AIACC) has given its Award for Regional and Urban Design to the architects of UC Santa Barbara’s new Engineering-
Science Building. The building, which was formally opened in October 2004, is a multidisciplinary research and teaching facility occupied primarily by faculty, staff and students from electrical and computer engineering. It contains about 90,000 square feet of nanofabrication and teaching facilities, research laboratories, teaching and conference spaces, and offices for faculty, staff, postdoctoral researchers and graduate students. The architects for the building were CO Architects, formerly Anshen+Allen Los Angeles.
Computer hacking award went to UCSB. (In the world
of computer science, winning this award is a good thing). Led by Giovanni Vigna, a UCSB associate professor of computer science, a group of computer science graduate students won the “Capture the Flag” competition at DEFCON,
the largest underground hacker convention in the world. The competition, held in Las Vegas July 29 to 31, pitted eight competitively selected teams against each other in a real-time hacker war. Each team was given access to a UNIX server that contained a number of undisclosed vulnerabilities. Each team’s task was to discover the vulnerabilities by analyzing the server’s software, fix them on its own server, and exploit them to compromise the security of the other teams. The goal? To steal or modify a number of secret files, called “flags.” Team members received “black badges” that give them lifetime free entrance to the convention.
Electrostatic Discharge Association International Research Award for 2005 was won by Kaustav Banerjee, an associate professor
of electrical and computer engineering at UCSB. The annual competition selects one outstanding researcher in the field of electrostatic discharge (ESD) from academic institutions around the world. Banerjee’s proposal, “Electrothermal engineering of multi-gate CMOS devices for improving robustness under ESD conditions,” was rated the most promising research work for the advancement of ESD knowledge as the semiconductor industry moves toward nanometer technologies. The award was announced during the opening ceremony of the EOS/ESD Symposium, held in Anaheim, California, in September 2005, and includes a grant of $10,000 to support Banerjee’s research.
Convergence, the magazine of Engineering and the Sciences (yes, us!) won an Award of Excellence from the Public
Relations Society of America's Los Angeles chapter on October 20.
engines of discovery
Faster computers and smarter programs are remaking the practice of science. There’s no substitute for genius. But computing power greatly expands a scientist’s reach – so much so that it is transforming science itself. In physics, researchers turn to supercomputers to simulate and analyze events in realms that are too small or too remote to be perceived. In biology, high-power processing is used to develop models and simulations to predict the behavior of organisms from the molecular level on up. In the eyes of many scientists, computation is now the third major pillar of science, standing with (and bolstering) experiment and theory.
Linda Petzold, a UC Santa Barbara professor of mechanical engineering and chair of the Computer Science department, says computers and algorithms – the step-by-step instructions that focus computers on the desired tasks – are coming into their own as engines of discovery, not just data-crunchers. “In many areas of science and engineering,” she says, “the boundary has been crossed where simulation, or simulation in combination with experiment, is more effective in some combination of time, cost and accuracy than experiment alone for real needs.”
A Science of Integration With high-performance computing playing a more crucial role in science and engineering, Petzold says it now warrants study as a field in itself. She calls it Computational Science and Engineering (CSE) and explains what is new about it: “Although it includes elements from computer science, applied mathematics and engineering and science, CSE focuses on the integration of knowledge and methodologies from all of these disciplines. As such, it is a subject that is distinct from any of them.” Petzold’s part in CSE includes work on the simulation of biochemical processes that involve a high degree of randomness and small populations of some chemical species. These “discrete stochastic systems,” as they are called, are crucial to cell activity such as the heat-shock response of E. coli, in which an estimated 20 to 30 molecules per cell (a tiny number in ordinary chemistry) help sense the folding state of the cell’s proteins and regulate the production of heat-shock proteins. Reactions in these systems cannot be predicted with the same mathematical tools used for normal large-scale chemical reactions, in which so many molecules are involved that the randomness – and uncertainty – is averaged out. In some biological processes, Petzold says, the outcome could hinge on the behavior of just a few molecules. What, then, does a biologist/computer scientist do to figure out how a molecule or a small group of them will alter an E. coli cell under heat stress? The answer is to do simulations – lots of them. The computer is given the range of possible reactions and runs the process over and over, each time changing the conditions based on random numbers. If the assumptions are correct, this process produces a probability curve that gives scientists a way to predict the outcome of the system.
Similar methods are being used to simulate the electrodeposition of the copper interconnects between computer chips. Manufacture of computer components has reached a scale so small that such detailed molecular-level simulation is now required. “There are relatively few molecules involved in populating the trenches,” Petzold says, “so an averaged (differential-equation based) model cannot capture the physics.” As with E. coli cell chemistry, this modeling requires considerable computing power, with algorithms and software to match.
Physics and the Computer Pioneers Of all the sciences, physics may have the longest history with high-performance computing. The first electronic computers grew out of the nuclear-weapons work of the Manhattan Project in the 1940s, when physicists used them to understand the dynamics of nuclear reactions. One of their most fruitful ideas was the Monte Carlo method, which solves mathematical problems by running simulations based on random numbers. Since then it has been applied widely, not just in the physical sciences but also in finance. But the potential sensed by those computer pioneers has only been realized in recent years, with the advent of much faster processors and more efficient computation methods.
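The run-it-many-times method described above can be sketched in a few lines of code. What follows is a generic toy example, not software from Petzold's group: a single chemical species is produced and degraded at assumed rates, and each run draws random waiting times between reactions, in the spirit of the Monte Carlo approach to discrete stochastic systems.

```python
import random

def stochastic_birth_death(k_make=10.0, k_decay=1.0, x0=0, t_end=50.0, seed=None):
    """Simulate one trajectory of a two-reaction system: a protein is
    produced at constant rate k_make, and each copy decays at rate
    k_decay. Returns the copy number at time t_end."""
    rng = random.Random(seed)
    t, x = 0.0, x0
    while True:
        # Propensities: how likely each reaction is to fire next.
        a_make, a_decay = k_make, k_decay * x
        a_total = a_make + a_decay
        # Random (exponential) waiting time until the next reaction.
        t += rng.expovariate(a_total)
        if t > t_end:
            return x
        # Pick which reaction fired, weighted by propensity.
        if rng.random() * a_total < a_make:
            x += 1   # one molecule produced
        else:
            x -= 1   # one molecule decayed

# Many runs build up the probability curve the article describes.
samples = [stochastic_birth_death(seed=i) for i in range(2000)]
mean = sum(samples) / len(samples)
```

The average over thousands of runs settles near the value a conventional rate-equation model would give (here, k_make/k_decay = 10 copies), while the run-to-run spread is exactly the part that only a stochastic simulation can capture.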
Bob Sugar, a UC Santa Barbara research professor in physics, knows at first hand just how far computing has come. When he earned his PhD at Princeton in 1964, computers were big, slow and touchy. Sugar’s doctoral thesis made some use of these machines, and Sugar remembers the ritual: “The university had a computer center, you took a deck of IBM cards, [and] you came back the next day.” Then, he said, you had to be ready for a message like, “too
many left parentheses in Card 15.” The next day you would come back and get, “too many right parentheses in Card 82.” Not surprisingly, even physicists tended to make do without computers. “If problems got so complicated that you couldn’t solve them analytically with paper and pencil, you made some approximations that allowed you to do so.”
Computers are more cooperative these days and, of course, much faster. Sugar says he became involved in large-scale computing in 1980, when available machines could run at hundreds of thousands of flops (floating-point operations per second). A floating-point operation uses numbers in scientific notation, such as 3 × 10⁻⁴ for 0.0003. Today, machines can run at several trillion flops. Now physicists are using them to investigate matter at the level of the smallest known particles, such as quarks and gluons – the building blocks of protons and neutrons.
“Very recently our group calculated the decay rates of some particles called D-mesons, which had not been measured,” Sugar says. “Within a week of our publishing these results, to within an accuracy of 8%, a research group at Cornell University measured the decay rates of these particles also to within 8%, and fortunately the results agreed.”
Sugar estimates that the D-meson calculation took 1.0 to 1.5 teraflop-years. (A teraflop is a trillion floating-point operations per second.) And he says this was just a “warm-up” for further calculations. Where do researchers find the computer capacity for all this work? The answer so far, according to Sugar, is that they find it where they can. Until recently, government supercomputer centers run by the National Science Foundation and the U.S. Department of Energy were the major sources. “Beggars can’t be choosers, so we use everything we can get our hands on,” he says.
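For a sense of scale, the arithmetic behind a teraflop-year can be worked out directly. The numbers below are round illustrative figures, not data from Sugar's actual runs.

```python
TERAFLOP = 1.0e12                      # floating-point operations per second
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # about 3.16e7 seconds

# One teraflop-year: a trillion operations per second, sustained for a year.
ops_per_teraflop_year = TERAFLOP * SECONDS_PER_YEAR   # roughly 3.2e19 operations

# The D-meson calculation, at the estimate of 1.0 to 1.5 teraflop-years:
low_ops = 1.0 * ops_per_teraflop_year
high_ops = 1.5 * ops_per_teraflop_year

# On a hypothetical 1980-era machine running at 500,000 flops,
# one teraflop-year of work would take about two million machine-years.
years_on_1980_machine = ops_per_teraflop_year / 5.0e5 / SECONDS_PER_YEAR
```

At hundreds of kiloflops, in other words, the same calculation would have taken on the order of millions of machine-years, which is why physicists of that era reached for pencil-and-paper approximations instead.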
From Supercomputers to “Smart Dust” But particle physicists have begun to acquire dedicated computers of their own. The government, through the Department of Energy, has begun to fund the design and construction of powerful and efficient computers dedicated to the study of elementary particle physics. In this field, Sugar says, “we seem to have reached a major milestone where we can reliably calculate some quantities to within 2% to 3%,” the accuracy level “that’s really needed to make reliable predictions.”
Beyond modeling and simulation, computers are also advancing many fields of science by simply doing what they have always done – calculate – at much greater speed. Eckart Meiburg, professor and chair of UC Santa Barbara’s Department of Mechanical Engineering, says today’s computers enable researchers to calculate the mathematics of complex, turbulent fluid flows, on scales as large as the atmosphere and as small as the pores between grains of rock. Hardware advances have made this possible, as have numerical models (such as the spectral element method) that simplify the computer’s work. Meiburg says parallel computing – dividing a task among multiple computers – may have been the biggest breakthrough of all. “It all of a sudden opened up new doors,” he says.
Computers are not just growing more powerful. They are getting smaller and cheaper as well. This trend also has important implications for science, says Joao Hespanha, an associate professor of electrical engineering at UCSB. With computer hardware now so compact and inexpensive to produce, he says, the age of so-called “smart dust” is approaching. Sensor networks of tiny computers, possibly in the thousands, might be set up to collect, process and share data. Hespanha says federal civilian and defense agencies are funding research into this technology, which could be used to monitor oceans and forests or theaters of war. The main cost and challenge of creating these networks would be to develop the algorithms that enable all those small, cheap computers to work together.
From supercomputers to smart dust, the progress of scientific computing is in many ways a matter of cross-fertilization. Scientists create new challenges for computers, and computers develop in ways that not only meet those challenges but also open new avenues of discovery. “As science and engineering move in new directions, so does computation,” says Petzold.
“It’s an enabling technology for a lot of disciplines.” And right now, she says, “I see more exciting challenges than I’ve seen in a long time.”
Computers enable researchers to calculate the mathematics of complex, turbulent fluid flows. Scientists are now able to simulate a gravity current as it propagates along the bottom wall of a channel (left) or fluid motion in a microchannel used for biological detection (right).
The New Aerospace Ready for rubber aircraft? We’re not quite there, but the technology of flight is pushing some envelopes.
Aerospace engineering is all about going farther, higher and faster as efficiently and safely as possible. That essential challenge hasn’t really changed since the days of the Wright Brothers, but today’s engineers are attacking it with tools, materials and design concepts that aviation pioneers could never have imagined. One such concept is that of flexible surfaces. These would be thin sheets of titanium, not rubber, but they would indeed be capable of changing shape in mid-flight. Nozzles of jet engines, wings or whole fuselages could be made to morph for optimal flight performance at various speeds, says Tony Evans, a UC Santa Barbara professor of materials and mechanical engineering. In an extreme case of this concept, says Evans, a plane’s wings would flap like those of a bird. “But think of a less extreme case, where the tip of the wing would move up or down, or rotate, as a function of the speed or position of the airplane,” he says. These “morphing structures,” as he calls them, could also cut jet noise by changing the shape of a nozzle – for quiet at takeoff and maximum efficiency at cruising altitude. Such technology is still early in development, but engineers such as Evans are well along in adapting it for new-generation craft, from light planes to hypersonic and space vehicles. Evans, whose accomplishments have earned him membership in both the National Academy of Sciences and the National Academy of Engineering, has wide-ranging expertise. His background is in materials science, but he now says he is “probably more of a mechanical engineer.” He works on a wide range of programs with colleagues at
Image courtesy of NASA
Harvard, Princeton, the University of Virginia and, in the U.K., Cambridge. In the commercial sector, he has worked closely with General Electric Co., the world’s leading maker of jet engines. In the materials area, Evans has focused his research on the properties of brittle materials, such as ceramics. These are crucial elements in much of the new aerospace technology. They come into play in advanced jet engines – one of the last places you would want to see brittleness. But as Evans notes, ceramic compounds can take a lot more heat than metal can, and engines are much more fuel-efficient when they run hotter. Over the last decade, he says, engine manufacturers have learned how to couple the ceramic’s heat resistance with the metal’s strength by layering the two types of material on engine components. Combustion temperatures in engines are now 200° C above the melting point of nickel alloy, doubling fuel efficiency in the process. In the research phase are new ceramic composites, such as silicon carbide, that are lighter than nickel alloy and can operate at higher temperatures.
Evans is also working with Boeing Aircraft Co. and Lockheed Martin Corp. on materials and design for new hypersonic craft, capable of flying up to eight times the speed of sound (just over 6,000 miles per hour at sea level). The technology they are developing would be used first in military aircraft but could also be applied later on to vehicles replacing the Space Shuttle. High speeds produce intense heat on the craft’s surface, and the heat-shielding solution for the shuttle – tiles – has an obvious and serious drawback: the tiles tend to fall off. Engineers, says Evans, are working
to replace shuttle tile “with a materials concept that is much more robust and is able to withstand extremely high heat flux.” One candidate is hafnium boride, which has excellent heat-shielding ability and an extremely high melting point. But, like other ceramics, it’s fragile. As in the jet engine, the engineering challenge is to wed its heat resistance to a metal’s strength, without adding too much weight. Hypersonic and space flight also may make use of shape-changing structures. Jet engines that morph with changes in speed are one idea under development (as the jet goes hypersonic, the engine’s inlet size needs to decrease for optimum efficiency). The same principle could be applied to the craft as a whole, making it narrower with increasing speed. Taking flexibility to this level will challenge engineers on several fronts. The surface material must be strong, light and thin, so that it can move without building up internal stress. The underlying structure must be strong, light, and movable at many points. Traditional frames won’t do. Controlling the morphing process to keep the shape at its most efficient under changing speeds and conditions will require heavy-duty computing power. But recent advances in all these areas – materials, structure and computing – are putting such futuristic technology within reach. Already, the aerospace industry has made huge strides in fuel efficiency. With fuel getting more expensive all the time, and with Washington showing a renewed interest in space travel to the moon and beyond, ideas once consigned to science fiction may soon, literally, take flight.
Neuroscientist Ken Kosik has come to UCSB to explore new frontiers. One of these is the evolutionary biology of learning.
Kosik came west from Harvard in 2004 with a distinguished, broad record of research and a desire to expand his range even more. UC Santa Barbara is known as an ideal place for scientists who refuse to be boxed in, and Kosik plans to make the most of the university’s opportunities for multi-disciplinary work. As Harriman Professor of Neuroscience and co-director of UCSB’s Neuroscience Research Institute (NRI), Kosik started his first year as a full-timer at UCSB this fall. He has several initiatives in mind, spanning the intellectual spectrum from social services to evolutionary biology. One is for a new clinical program in Santa Barbara to serve people with Alzheimer’s and other memory-related diseases. It would include a community-based center, affiliated with NRI, where patients and families would go not only for treatment but also for help with related issues – family, legal and social – that Kosik says “the physician is not equipped to deal with.” The physician’s role, he adds, “might be only 5% to 10% of the services that the family would receive.”
Kosik has also assumed what he calls a “facilitator” role in UCSB’s stem cell research. “I would be the first to confess that I am not a stem cell person,” he says, but he has brought together a consortium of researchers from several disciplines – including marine biology, engineering and materials science – to come up with projects that would advance stem-cell-related technology and tap some of the $3 billion in state funds to be made available through California’s Proposition 71. Kosik made his mark as a scientist largely for his work on Alzheimer’s disease, and he’s continuing that research with like-minded UCSB scientists. During the summer of 2005, he and other NRI researchers reported two major discoveries related to Alzheimer’s and other degenerative diseases. One was the finding by NRI research scientist Ratnesh Lal that cell degeneration can be traced back to mis-folded proteins in the cell membrane. The other was the discovery, by a team under Kosik’s leadership, of three molecules that appear to inhibit an enzyme that plays a key role in the development of Alzheimer’s. Finally, there is a budding project that Kosik calls the “evolution of the synapse.” Kosik wants to draw on UCSB expertise from all over the academic map – evolutionary and marine biology, computer science, materials science, engineering and physics – to retrace the origin of one of life’s most basic structures. The synapse is the intercellular space through which nerve impulses travel between cells. It is also the path of information and learning. Kosik wants to trace its lineage back to its earliest occurrence in primitive life, and to find the genes that gave rise to it. “What I suspect we are going to find is that the genes that are critical for a synapse existed before there were synapses,” Kosik says. “I find that extraordinarily profound, and there are many implications.” One is the possibility of new therapies for diseases such as Alzheimer’s. 
Another (more “blue-sky,” he says) is that of creating computers on a biological model: “By getting at the simplest forms of biological systems capable of computation, we might be able to fabricate artificial nanosystems that have computational properties.” Boris Shraiman, a physicist at UCSB’s Kavli Institute who is working with Kosik to analyze genetic data, says the aim is to “identify some minimal set of protein interactions that enable learning” – and to figure out how those interactions started.
Before functioning synapses existed, Shraiman says, “everything we know about the process indicates that the individual electrical players were already in place, but were involved in some other functions.” Then “some incremental event of evolution occurred” that made these players interact in a new and powerful way. By figuring out what turned certain protein molecules into learning machines, researchers might also understand what goes on when those machines go haywire or break down – as happens in diseases such as Alzheimer’s. Such knowledge may not produce any quick medical breakthroughs, but Kosik says “all research does not need to be justified in terms of its biomedical applicability.” He adds, “One of the good things about UCSB is a belief in basic research for its own sake.”
Continued from page 6
The Fantastic Voyage of Nano-Bioengineering
But he would like to take the potential of these particles further and make them multi-functional. In the next phase of development, he says, “we’re going to include another building block in the nano-particle,” allowing it to carry a drug and “not only detect disease but treat it at the same time.” What Hawker and others have in view is something like a synthetic virus, an engineered particle (about the size of an actual virus) that has the viral quality of surviving in the body in a dormant state until it is switched on by a chemical trigger. But that trigger has to be highly specific and attached only to the intended target. Otherwise, like the harmful viruses found in nature, the engineered nano-particle will switch on in healthy spots where it can do real harm. It’s one thing to come up with a molecule that would bind to a particular target and allow the release of a drug. It’s quite another to achieve a high level of specificity, and with it the absence of side effects. Patrick Daugherty, a UCSB assistant professor of chemical engineering, is attacking this problem by selecting molecules for different functions, such as binding to a tumor and activating a drug, and then combining them into a single entity. This is done by using a molecular library – a test tube with about 10 billion E. coli cells, each making a different compound – to find the one-in-a-billion cell that might perform the desired function. “Once we have the cell out,” Daugherty says, “we can crack open the cell and then read the DNA,” thereby getting the blueprint for engineering proteins.
Smart nanoparticles can then be assembled through a process Daugherty calls “layering.” One molecule could be added to ensure selectivity, such as by binding the particle only to tumor vasculature. That component alone would greatly cut the risk of a chemotherapy drug doing damage outside a tumor, but another molecule would be needed to make sure that the particle, once bound to the target, would then turn on. This “catalytic selectivity” layer would ensure that most of the drug would bind to the target and actually take effect. A third type of molecule, which Daugherty says is yet to be developed, might act as a sensor, confirming that the nano-particle is binding to a tumor and not latching on to healthy tissue by mistake. This could act to give a “go” signal after binding and before the catalyst kicks in. Daugherty says the use of binding and catalytic capabilities is not new to NBE. What is new is the potential of combining these and other qualities in complex, precisely programmed therapeutic agents that act like beneficial micro-organisms. “We are trying to imitate biology,” he says, “looking at forces in nature and harnessing them in molecular machines.” And how soon might all this engineering effort pay off in cures for cancer, cardiovascular disease and other killers? Daugherty estimates that some of the technologies he and his UCSB colleagues are hatching now might be in trials three or four years from now. In other words, the new era of small and smart medicine may be just around the corner.
What it is! Answer from page 12
It's an advanced superalloy gas-turbine jet engine blade. High-performance coating systems being developed by Tony Evans and his colleagues at UC Santa Barbara's Department of Materials and Mechanical Engineering will extend the limits and durability of these high-performance alloys.
FALL 2005, three Editor: Barbara Bronson Gray Design Director: Peter Allen Contributing Writer: Tom Gray Editorial Board: Matthew Tirrell, Dean, College of Engineering Martin Moskovits, Dean of Mathematical, Life and Physical Sciences, College of Letters and Science Evelyn Hu, Co-Director, California NanoSystems Institute Christy Ross, Assistant Dean for Strategy and Corporate Programs Kristi Newton-Day, Assistant Dean of Development, Science and Engineering Barbara Bronson Gray, Communications and Media Relations, Science and Engineering Peter Allen, Publications Director, Science and Engineering Joy Williams, Assistant Dean for Budget and Administration Denise Leming, Executive Assistant to the Dean Convergence is a publication of Engineering and the Sciences at the University of California, Santa Barbara, CA 93106-5130. •
To make a change of address: please send e-mail to firstname.lastname@example.org or send your name and new address to Jan Adelson, Development Coordinator, Science and Engineering, UC Santa Barbara, Santa Barbara, CA 93106-5130.
If you have questions about the publication, please contact Barbara Bronson Gray at email@example.com.
Copying for other than personal or internal reference use without express permission of Convergence at UC Santa Barbara is prohibited. The University of California, in accordance with applicable federal and State law and University policy, does not discriminate on the basis of race, color, national origin, religion, sex, gender identity, pregnancy (including childbirth and medical conditions related to pregnancy or childbirth), disability, age, medical condition (cancer-related), ancestry, marital status, citizenship, sexual orientation or status as a Vietnam-era veteran or special disabled veteran. The University also prohibits sexual harassment. This nondiscrimination policy covers admission, access and treatment in University programs and activities. Inquiries regarding the University’s student-related non-discrimination policies may be directed to: Office of the Affirmative Action Coordinator, University of California, Santa Barbara, 805.893.3105.
If you need Convergence in another format because of a disability, please contact Peter Allen: 805.893.4803.
CONVERGENCE UC Santa Barbara CA 93106-5130
Non Profit Org. U.S. Postage PAID Santa Barbara, CA Permit No. 104