

The University of Texas at Austin TEXAS ADVANCED COMPUTING CENTER

As I rode my bike around this spring, the barreling orange wings that usually dart into my face were sadly absent.

Austin is known to host more than 170 species of butterflies, most notably the Monarch, which has seen a significant decline in numbers over the past thirty years. Droughts and drastic temperatures have led to food shortages and habitat losses, resulting in major die-offs in monarch populations.


Graham Toal, a computer scientist at the University of Texas-Pan American (UTPA), has been using the Longhorn visualization system at the Texas Advanced Computing Center (TACC) to test a hypothesis: that certain random populations of butterflies have genes which permit them to wander to different locations, thereby allowing them to survive. Toal’s simulation is exploring the possibility that this diaspora enables butterflies to recuperate their population.



The monarch’s usual life cycle takes it south to Mexico in the winter and back up north towards the US-Canada border in the spring. Butterflies that stay in roughly the same place die of starvation, as neither region produces enough food to allow them to survive a year. These assumptions about butterfly migrations are only applicable if weather conditions are ideal and the food supply is sufficient, which has not been the case in recent years.



Paths that migrating birds or butterflies follow are called “flyways.” North American monarchs migrate south in the fall to California, Mexico, or Florida. On the way north in the spring they lay eggs; it is the young produced by those eggs, and by the next generation’s eggs, that return all the way north and start south again (left). The monarchs’ migration path can change slightly each year. The dots represent sighting reports sent in by citizens (right).

Toal’s simulation rests on a number of assumptions, one being that butterflies in a certain gene pool act on instinctive or innate behavior in response to current environmental factors, which effectively allows them to survive over others. This notion is not to be confused with Lamarckism, the “inheritance of acquired traits,” the idea that characteristics an organism develops during its lifetime become genetic and can be passed on to its offspring. The hypothesis also assumes that butterflies will stay in a region where there is food, but will forage in random directions for random distances when there is no food.

To test his ideas, Toal performed a virtual experiment with a population of creatures that resemble the Monarch butterfly. As long as each butterfly heads in a particular direction consistently when it encounters the same environmental conditions, evolution will eventually breed out those butterflies for which the random direction leads to fatal outcomes, and will set up a stable population when the random choices lead to a feeding ground, a breeding area with available milkweed, and an overwintering area.

The model starts with an initial population of 10 million butterflies, where each individual is updated once per hour over a simulated period of 8,000 days. A few simple rules govern the virtual butterflies’ lives:

• Butterflies which do not eat enough eventually die of poor health.
• Butterflies eventually die of old age.
• Butterflies can eat in the Northeast region during the summer season (the area shows as blue when it is a source of food).
• Butterflies can eat in the Mexico region (green when food is available).
• Butterflies can breed in the Mexico region when food is available.
• When in either of these regions, they don’t fly out of them.

Toal’s simulation shows that after the initial scattering some butterflies randomly disperse from those regions when they are no longer hospitable; however, some butterflies happen to forage in directions that lead them back to the original feeding regions when they are suitable again. If they or their offspring in turn randomly manage to return to the original food-producing region in time, they live long enough to reproduce.
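Toal’s code is not reproduced in the article; the sketch below is only a scaled-down illustration of the kind of agent-based model those rules describe. The region coordinates, lifespans, population size, and daily (rather than hourly) time step are invented for illustration; only the qualitative rules come from the text.

```python
# Scaled-down, illustrative sketch of the agent-based migration model described above.
# Coordinates, lifespans, and the seasonal schedule are invented; only the rules follow the article.
import random

NORTHEAST, MEXICO = (80, 80), (20, 20)          # feeding-region centers, arbitrary units
DIRS = [(dx, dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)]

def is_summer(day):
    return 90 < (day % 365) < 270

def has_food(region, day):
    # Northeast feeds in summer, Mexico in winter (simplified seasons).
    return is_summer(day) if region == NORTHEAST else not is_summer(day)

def near(pos, region, radius=10):
    return abs(pos[0] - region[0]) <= radius and abs(pos[1] - region[1]) <= radius

class Butterfly:
    def __init__(self, pos, genes):
        self.pos, self.genes = pos, genes       # genes: one fixed heading per season
        self.hunger = self.age = 0

    def step(self, day):
        self.age += 1
        for region in (NORTHEAST, MEXICO):
            if near(self.pos, region) and has_food(region, day):
                self.hunger = 0                 # eat; never leave a feeding region
                return
        dx, dy = self.genes[is_summer(day)]     # otherwise forage in the inherited direction
        self.pos = (self.pos[0] + dx, self.pos[1] + dy)
        self.hunger += 1

    def alive(self):
        return self.hunger < 70 and self.age < 500   # starvation and old age, in days

population = [Butterfly(MEXICO, {True: random.choice(DIRS), False: random.choice(DIRS)})
              for _ in range(5_000)]            # scaled down from 10 million agents

for day in range(2000):                         # scaled down from 8,000 simulated days
    population = [b for b in population if b.alive()]
    for b in population:
        b.step(day)
    if has_food(MEXICO, day) and len(population) < 10_000:    # breeding in Mexico
        population += [Butterfly(MEXICO, b.genes)
                       for b in population if near(b.pos, MEXICO)][:500]

print(len(population), "survivors; headings that persist:",
      {(b.genes[True], b.genes[False]) for b in population})
```

Run as written, only the lineages whose inherited summer heading happens to point toward the Northeast and whose winter heading points back toward Mexico persist, which is the kind of emergent cyclic route the article describes.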




“The descendants of the butterflies which randomly succeeded in establishing a cyclic route have the same genes, and therefore follow roughly the same route,” explained Toal.

The experiment is a proof of concept carried out using a small startup allocation available to researchers in the University of Texas system. The allocations are part of a continuing engagement with the UT system, most recently through the University of Texas Research Cyberinfrastructure Initiative. The work was done under the auspices of the Division of Information Technology at the University of Texas-Pan American as a demonstration project showing UTPA researchers how to add visualization to their HPC research projects.

In the current, simplified model the butterflies have no genetic variation and are effectively clones, so they are not likely to adapt well to changes in the environment. Toal intends to apply for a larger allocation in order to refine the model with details taken from the real behavior of Monarch butterflies. “A more detailed model would take into account all the inputs available to a butterfly, such as the temperature, the winds and the food supply,” continued Toal.

While the simulation is too crude to use to make predictions, it does succeed in showing that a cyclic migration loop is a plausible behavior that can emerge from a much simpler mechanism. This means there is hope yet for seeing these creatures again—they are currently in a recovery phase. Said Toal: “Butterflies have been in this pattern long enough to have adapted to these changes in the weather, and boom and bust years are part of their natural cycle now.”

Mourin Nizam | Science and Technology Writer

August 23 | Summer 2011



The electron microscopes that Wah Chiu oversees can image a virus down to its individual protein side chains. Yet questions remain about how to convert these images into useful information that aids physicians and scientists in the understanding and diagnosis of disease. Chiu is the founder and director of the National Center for Macromolecular Imaging (NCMI) — a National Institutes of Health-funded center that provides cryo-electron microscope (Cryo-EM) resources to the scientific and biomedical community — as well as the Center for Protein Folding Machinery. He is a leading developer of the computational methods through which Cryo-EM imaging becomes useful for a wide spectrum of applications. Chiu is exploring these issues with the goal of understanding enough about the architectural organization of the proteins that make up viruses to develop new therapeutic strategies by which infections can be limited.


“How do we extract useful three-dimensional information from multiple two-dimensional images, and then put them together in a coherent manner?” Chiu said. “This is exactly where the computing comes in.” Chiu has been working at the interface of microscopes and advanced computers since early in his career, both with the National Science Foundation and National Institutes of Health. He saw early on that computers would be essential for interpreting microscopic images at near atomic resolution. “The imaging capability was improving, the microscopes were getting better, and what I saw was needed was computing,” Chiu said. “That’s what I’ve put a lot of effort into in the last 10 years; to explore new image processing algorithms and to use high performance computers in our research.” Recently, Chiu has been testing his Cryo-EM analysis methods using the Ranger supercomputer at the Texas Advanced Computing Center.

To appreciate the problem, it is necessary to consider how Cryo-EM microscopes work. Electron microscopes image specimens by shooting electrons at samples (rather than photons, as in optical microscopes). These microscopes then capture the diffracting electrons to form a two-dimensional image, which contains structural details of molecules on a sub-nanometer scale. The technology also allows for the observation of frozen, hydrated biological specimens that have not been stained or chemically fixed in any way, showing them in their native solution environment.

Visualizing images from a single electron microscope sample is trivial. But to reconstruct data from 50,000 or 100,000 particle images to determine the three-dimensional location of atoms, and then to combine that data into a simplified visual model that humans can interpret, well, that requires advanced computers of the highest order.

“The three-dimensional density map is extremely difficult for a human being to comprehend, so we need to simplify the representations of the density in terms of lines, curves, and the connectivity from one amino acid to another in a protein molecule,” Chiu said.

The process is iterative, with human insight feeding into and improving the computational models, so biological discoveries can be made.

Unlike most applications that use high-performance computing, Chiu’s work is not a simulation. “We’re doing image processing of an experimental observation,” he explained. “What we’re getting is not an imaginary photograph. It all originates from physical experiments.”

Working with colleagues at Baylor, Chiu resolved precise images of several bacterial and animal viruses to understand how they change shape and package the viral DNA during the virus maturation process.




Wah Chiu, founder and director of the National Center for Macromolecular Imaging (NCMI).


“Viruses are mere proteins and nucleic acids. They start with one protein and build the whole virus with multiple proteins forming a shell,” Chiu explained. “But the shell is not a static object. It can breathe and expand, and that expansion requires a structural rearrangement of the proteins. In our model, we were able to see that rearrangement and explain exactly how the change takes place within one protein and at the interfaces among all its neighboring proteins.”

The resolved models illustrated “before and after” images of the virus, first empty and then filled with viral DNA. To enact the transformation – a virtual ballooning – some amino acids are forced apart while others are moved closer to each other, making way for the negatively charged DNA to enter the virus from one entry and, simultaneously, for the scaffolding proteins to exit from another opening. This discovery has important ramifications for many infectious diseases, including HIV, herpes, and bacteriophages.

Whereas most applications that run on supercomputers are highly parallel, Chiu’s method is “embarrassingly parallel,” requiring many small, independent calculations to create a complete three-dimensional picture of the virus.
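The article does not include NCMI’s code; the sketch below is only an illustration of what “embarrassingly parallel” means in this setting, with each particle image handled as an independent task and the results combined only at the end. The directory layout and the align_particle placeholder are hypothetical, not part of Chiu’s actual pipeline.

```python
# Illustration of an "embarrassingly parallel" workload: every particle image is an
# independent task, so the work scales out with almost no inter-process communication.
from multiprocessing import Pool
import glob

import numpy as np

def align_particle(path):
    """Load one noisy 2-D particle image and return it after (placeholder) processing."""
    image = np.load(path)
    # A real pipeline would estimate the particle's orientation against a reference
    # model here; that per-particle search is the independent, parallel step.
    return image

def average_particles(particle_dir, workers=16):
    paths = sorted(glob.glob(f"{particle_dir}/*.npy"))   # one file per particle image
    with Pool(workers) as pool:
        processed = pool.map(align_particle, paths)      # independent tasks, run in parallel
    return np.mean(processed, axis=0)                    # combine results only at the end

if __name__ == "__main__":
    density = average_particles("particles")             # hypothetical directory of images
```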




"How do we extract useful threedimensional information from multiple two-dimensional images, and then put them together in a coherent manner?" Chiu said. "This is exactly where the computing comes in." TACC systems were designed to accommodate serial, high- of the National Academy of Science (Jan. and July 2011), Structure throughput and other disparate research methods, all of which are (May 2011), and the Journal of Structural Biology (May 2011). required for science to advance. The NCMI assists more than 100 research projects a year, imaging “A lot of computer facilities are only interested in the CPUs, but samples relevant to disease and to the basic of understanding of have no interest in real-world applications,” Chiu said. “TACC is life’s processes. As director of the center and lead developer on interested in taking care of scientists like us, who have real world a number of community tools that allow for enhanced Cryo-EM problems to solve.” images, Chiu is supporting discoveries around the world. “Dr. Chiu’s research is an excellent example of the sort of work “Scientifically, these tools allows biologists to see things that they that we aspire to enable here at TACC,” said Michael Gonzales, couldn’t see before, things that are closer to reality rather than TACC’s computational biology program director. “By leveraging idealized situations,” Chiu said. “Just as physicists study the the advanced computing technologies of our center, his research is hydrogen atom, but the periodic table has more than hydrogen, in providing critical insight into the fundamental physical properties a similar way, we can study more realistic systems to understand governing viral infections.” the biological activities, not as an isolated test, but in the cellular environments.” Chiu’s research findings have been published in the Proceedings

Aaron Dubrow | Science and Technology Writer

July 27 | Summer 2011


Quantum Computing



Quantum computers may represent the next major evolution in technology. In theory, they would allow for faster and more complex computations using a fraction of the energy. However, in practice, building a quantum computer remains a very tricky engineering problem from the atomic level up.

At the atomic level, particles behave differently than they do in classical physics. According to the Heisenberg uncertainty principle, it is impossible to precisely determine the speed and location of a particle at any given moment. Instead, particles are characterized by a wave function that represents a probability that the particle will be in a given state. In quantum computing, instead of 0s and 1s, information can be encoded in the wave function and the infinite variations that are possible in the spectrum of the wave.

“You have a lot more flexibility in setting the values of the things that you compute,” said Chris Van de Walle, a professor in the Materials Department at the University of California, Santa Barbara, who studies potential quantum systems. “You could basically have any continuous value that is being encoded in the wave function of some entity that you are now using as your fundamental unit of computing.”

If it sounds far out, it is. Over the last decade, researchers have investigated various ways of designing a practical implementation of a quantum bit (or qubit), and none are near completion.

“If you can come up with such qubits and incorporate them in the computing architecture, it has been shown theoretically that you can solve problems computationally that are currently simply not feasible,” Van de Walle said. “The big challenge of course is to come up with specific implementations of these qubits.”
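As a back-of-the-envelope illustration of the idea that a qubit carries a continuous wave function rather than a 0 or 1, here is the standard textbook picture in a few lines of Python; it is not connected to the UCSB group’s tools.

```python
# A qubit state as a continuous wave function (two complex amplitudes), not a 0/1 bit.
# Textbook formalism for illustration only.
import numpy as np

theta = np.pi / 3                      # any continuous angle is allowed
state = np.array([np.cos(theta / 2),   # amplitude of |0>
                  np.sin(theta / 2)])  # amplitude of |1>

probabilities = np.abs(state) ** 2     # Born rule: chance of measuring 0 or 1
print(probabilities)                   # -> [0.75 0.25]; the probabilities always sum to 1
```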


Promising Results from Diamonds

One of the most promising implementations involves not a thing, but an absence: a defect in diamond that leads to a missing carbon in the material’s matrix, with a rogue nitrogen atom located nearby. This altered structure creates a hole, or vacancy, called an NV (nitrogen vacancy) center, with a specific wave function that many believe can be effectively manipulated for quantum computing.

In industry, defects are typically considered something to be avoided at all costs. However, in the case of materials for quantum computing, it is the defect that makes computation possible.

“For the quantum computing application, the defect is actually a good actor,” Van de Walle said. “It’s the qubit that you want to use as your unit of computation.”

The biggest advantage of NV centers in diamonds is their ability to operate at room temperature, rather than requiring near absolute zero temperatures, as other quantum computing systems do. Also, electrons in the NV center can remain coherent for a long time and can be manipulated by outside forces.

“You can control where the vacancy is formed in the crystal and you can also probe it very accurately with laser beams with a specific wavelength,” he explained.

Van de Walle, an expert in defects and impurities, has been working closely with David Awschalom, an experimentalist at UCSB and a leading quantum computing expert, to expose the atomic-level dynamics of the diamond center. Van de Walle’s computational simulations on the Ranger supercomputer at the Texas Advanced Computing Center were able to match experimental results for the NV center.

They also added a few crucial pieces of information to the corpus of knowledge about the NV defect. In particular, they found that the charge state of the defect plays a crucial role in achieving a useable wavelength. This means controlling the number of electrons that can enter the vacancy by suitable doping of the material.

“For NV centers in diamonds, the optimal charge state is a -1 charge state,” Van de Walle said. “For defects in other materials, it may be a different charge state, and just by guessing the charge state, you wouldn’t be able to know if it’s a good choice. But that’s what we can calculate.”





Beyond Diamonds

There is one obvious problem with the NV center implementation. Diamonds are expensive and difficult to work with. To make a practicable system that could replace digital, silicon-based chips, the researchers needed a material with the same characteristics, but one that was more common and easier to manipulate.

They used their earlier simulations as the basis for a set of criteria on what might make a useable quantum environment, and then started testing likely candidates. Most recently, Van de Walle simulated the behavior of silicon carbide (SiC). Like diamond, silicon carbide can be prepared so that four carbon atoms surround a vacancy where a silicon atom has been removed. Single nitrogen atoms, too, can be placed in the appropriate location through ion implantation.

“That really looked to us like the closest similarity we could have for any defect in a material other than diamond,” Van de Walle said. “Fortunately, silicon carbide is a lot cheaper, a lot more readily available, and a lot easier to process than diamond is, so it already meets a lot of the criteria that we are setting for the material to find a suitable qubit.”

The researchers found that the properties of the silicon carbide center are in many ways similar to the NV center in diamond and hence suitable for qubits.

Joel Varley and Justin Weber, who performed the computational research on defect centers for quantum computing in the Van de Walle group, recently graduated with Ph.D.s in Physics from the University of California, Santa Barbara. Photo credit: C. Van de Walle

In fact, in one way, the silicon carbide center might even be better than the center in diamond, because its excitation energy corresponds to a laser wavelength that is closer to commercially available, cheap lasers. The results of their research were published in the May 2010 edition of the Proceedings of the National Academy of Sciences.

To calculate the quantum mechanical interactions of hundreds of atoms requires thousands of computer processors working in tandem for days. “Without the ability to run on TACC’s supercomputers, we would simply not have been able to do this project.”

The high-fidelity quantum simulations are inspiring confidence among their experimental collaborators, and generating new ideas for lab experiments.

“The ability to take our expertise in the area of defects, and to use it creatively to design defects with certain properties is really great,” Van de Walle said. “It’s exciting to be able to dig into what we know about defects and use all of that knowledge to construct a defect with a given property.”

It just goes to show that sometimes perfection is overrated.

Professor Chris Van de Walle with an image of the defect in silicon carbide that is predicted to have applications in quantum information science. Photo credit N. Greenleaves.

Aaron Dubrow | Science and Technology Writer

July 20 | Summer 2011



Animations and visualizations are generated with various nanoHUB.org tools to enable insight into nanotechnology and nanoscience. Above, a simulation shows a graphene nanoribbon (GNR) that can be either a zig-zag (left image) or arm-chair (right image) type. Both zig-zag and arm-chair type GNRs are shown with varying widths. Additional animations are available at http://nanohub.org/resources/8882

Imagine if the rapid technological progress we’ve become accustomed to suddenly leveled off. Many experts believe this could occur if silicon transistors — the basis for nearly all electronics — reach their miniaturization limit, which is believed to be less than a decade away.

This scenario may come as a relief to some — no need to buy the latest gadget. But economically it would be a disaster for the United States. Not only has the semiconductor industry been the U.S.’s biggest export over the last five years, it is widely recognized as a key driver for economic growth globally. According to the Semiconductor Industry Association, in 2004, from a worldwide base of $213 billion, semiconductors enabled the generation of some $12 trillion in electronic systems business and $5 trillion in services, representing close to 10% of world gross domestic product.

Economic progress like this cannot be slowed without a fight. Consequently, a massive scientific effort is underway to find new materials, new methods, or even new paradigms that can replace silicon transistors in a fast, cost-effective way. This race, inside the R&D centers of multinational corporations like Intel, IBM, GlobalFoundries, Advanced Micro Devices, Samsung, and others, and also in academia, has led to several promising ideas. Nanotransistors made of graphene and quantum computers [featured in Parts 1 and 2 of this series] are leading contenders for future devices, but both involve unproven materials and processes.

Aaron Dubrow | Science and Technology Writer

Among the promising designs being explored at the Midwest Institute for Nanoelectronics Discovery (MIND) are "tunneling" transistors that use "III-V" materials, composed of elements from the third and fifth columns of the periodic table. These materials consume less energy and can be made smaller than silicon without degrading.

"III-V materials have been studied extensively," said Gerhard Klimeck, director of the Network for Computational Nanotechnology (which hosts nanoHUB.org) and a professor of electrical and computer engineering at Purdue University. "But they have not reached Intel or IBM because industry has been able to build transistors with silicon and it's expensive to completely retool."

The III-V materials have made inroads in certain niche applications like optical and high-speed communications. However, they have not cracked the CPU market, where estimates for building fabrication plants based on new materials or technologies are in the range of several billion dollars. Because of the size of the investment, a great deal of preliminary research needs to be done before any manufacturer will make the leap.

What's wrong with silicon? you ask. First, silicon chips use unsustainable amounts of power; second, by packing so many transistors on a chip, they can reach temperatures high enough to melt metal; and third, an odd quantum characteristic called tunneling allows electrons, at small length scales, to burrow under a barrier and leak charge. Tunneling is considered a major problem in CMOS semiconductor design. "It's a leakage path that we don't want," Klimeck said.


“But maybe tunneling can turn from an obstacle into a virtue in these devices.”

A transistor’s job is twofold. Not only does the device have to switch on and off, it must also be able to distinguish between the two states. Since the off state is always a little leaky, the goal is to increase the ratio of “on” current to “off” current to at least 10,000. There are fundamental limits in this regard for today’s CMOS technology, but III-V materials, and specifically the tunnel FET (TFET) transistors that Klimeck is exploring, can perform better. They are often called “steep sub-threshold swing devices” because they swing from almost no current to full current with a very steep slope. As a consequence, they would require less power while still performing the same number of operations.

"If you can switch from on to off in a smaller swing, you can reduce the whole swing from 0.9 volts, which we have today, to 0.5 or 0.4 volts, which is what we're aiming for," Klimeck said. That factor of two reduction in voltage results in four times less power required. "That's a huge improvement if you can maintain the same current flowing through your valve."
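A quick sanity check of that power claim, using the standard dynamic-power scaling P ∝ CV²f with capacitance and switching frequency held fixed (the 0.45 V figure below simply splits the quoted "0.5 or 0.4 volts"):

```python
# Back-of-the-envelope check of the power claim above (standard CMOS dynamic-power
# scaling, P ~ C * V^2 * f, with capacitance C and frequency f held fixed).
v_today, v_target = 0.9, 0.45            # supply voltages in volts (0.45 ~ "0.5 or 0.4")
ratio = (v_today / v_target) ** 2        # power scales with the square of the voltage
print(f"Power reduction: {ratio:.1f}x")  # -> 4.0x, the "four times less power"
```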

Computer modeling and simulation help the researchers explore the design space and physical properties of the materials, showing how one constructs a real device, atom by atom, in terms of geometries and growth.

"We try to understand on the simulation side what can be done and provide the experimentalists with ideas," Klimeck said.

Recent simulations on the Ranger supercomputer at the Texas Advanced Computing Center (TACC) and the Jaguar supercomputer at the National Center for Computational Sciences led to a greater understanding of the quantum and atomic-level dynamics at play in the nanoscale device. Determining the energetics and electron pathways of these new nanoscale forms of III-V materials required more than 15 million processor hours on Ranger and 23 million hours on Jaguar between 2008 and 2011.

The research group, led by Alan Seabaugh of the University of Notre Dame, found that the sub-threshold conduction problem is related to the way electrons gather in the device. The group started out with a design developed at the University of California, Berkeley that was released to much excitement in Nature magazine in 2010. Using the computational tools they developed, the researchers found that the off current for the transistor was extremely high — a big problem for the device design.

To explain the physics of the problem, Klimeck likened the electrons involved in computing to water molecules in a bucket. The bucket has a hard bottom, but it has a fuzzy upper layer where electrons act like water vapor. The vapor cannot be controlled or "gated," resulting in a large voltage range to turn the switch on and off. Band-to-band tunneling transistors have (figuratively speaking) a top on the bucket. Therefore the flow of the electrons can be tightly controlled without any temperature-dependent "vapor," and the devices can turn on and off with a smaller voltage swing.

Klimeck and his collaborators filed a patent sponsored by the Nanoelectronics Research Initiative (NRI) for their improved tunneling design, and published several papers on the subject in the Journal of Applied Physics and IEEE Electron Device Letters in 2010-2011.

"The tunnel FETs look fairly similar to the CMOS transistor that we see today, though they use very different materials and actually turn off and on by a quantum effect called tunneling," said Jeff Welser, director of the Nanoelectronics Research Initiative, which funds the studies at the MIND center. "It turns out that by using tunneling, you can get transistors to turn on much more quickly."

Though esoteric, the search for new nanotransistors is incredibly important for national competitiveness and economic security. Semiconductors are not only the U.S.'s largest export, they are the foundation for the last four decades of incredible growth in wealth, health and scientific advancement.

"Making sure that the nation continues to be on the leading edge of this export is of utmost importance, and it's timely to do that because we know that the industry does not have a solution at the 8 nanometer level," Klimeck said. "If we do not find a solution to continue to improve computers, the technical advancement that we've seen in the last 40 years will stop."

Gerhard Klimeck, director of the Network for Computational Nanotechnology and a professor of electrical and computer engineering at Purdue University.

With the scaling down of metal oxide semiconductor field-effect transistors (better known as MOSFETs), researchers are looking at new transistor designs. Among them is the gate-all-around nanowire MOSFET. Due to quantum mechanical confinement in both transverse directions, an inversion channel is formed at the center of the device. This phenomenon is called volume inversion. The threshold voltage for the simulated nanowire device in the accompanying image is ~0.45 V.

June 29 | Summer 2011


If you had the power to improve your life and your community, and to make a significant contribution to future generations, would you? One hundred Austin residents in the Mueller development declared a resounding “yes” and have joined forces with the Pecan Street Project to learn about smart grid technology and how to use energy more effectively in their homes.

As global energy prices continue to soar, and with power generation accounting for 40 percent of the U.S. carbon footprint, energy efficiency is an increasingly important consideration. Now, more than ever, there is significant momentum from both the general public and government to make “smart grids” a high priority. According to Michael Webber, associate director of the Center for International Energy and Environmental Policy at The University of Texas at Austin, utilities and energy companies are expected to spend $1-2 trillion over the next few decades on building, updating, and upgrading their grids nationwide. At the same time, energy consumers are expected to spend tens of billions of dollars on energy-related appliances in the home.

“Austin is a great test bed because we have energy-conscious, savvy residents who are willing to be partners in the process,” Webber said. “In addition, we have an energy mix with similar diversity to the nation as a whole (nuclear, coal, natural gas, wind, etc.). And, we have very high peak loads in the summer because of the need for air conditioning. These peak loads create problems for the grid; therefore, we have more to gain by finding innovative ways to manage energy consumption.”

A smart grid is a system that delivers electric power to consumers in a more intelligent manner than is now possible, with enhanced controls that protect equipment and foster the safe integration of distributed energy sources throughout a neighborhood, a city, a region, and even a continent. By adding monitoring, analysis, control, and communication capabilities to the electrical delivery system, smart grids hold the potential to maximize throughput while reducing energy consumption.

“Before smart grid advocates and companies ask customers to invest in new products and services, we all need a better understanding of what they want, what they’ll use and what they’ll get excited about,” said Pecan Street Project Executive Director Brewster McCracken. “Our work at Mueller is the most comprehensive energy consumer research being conducted anywhere in the country. It’s the perfect place for real-world energy research, and we’re thrilled that the Mueller residents have invited us into their community.”

Bert Haskell, technology director for the Pecan Street Project, is responsible for reviewing the different technologies involved in smart grid research, selecting the best architecture, and developing the optimal solution for consumer smart grid usage.

“Our objective for the smart grid demonstration project at Mueller is to understand how the grid is going to benefit the consumer, and that makes us very unique compared to other smart grid projects,” Haskell said. “Most of them are planned and run by a utility, and the utility is trying to benefit itself. We have the full cooperation and support of Austin Energy, who is very interested in discovering how they can best serve their customers.”

Faith Singer-Villalobos | Science and Technology Writer




In a smart grid world, the consumer is given real-time, accurate information about their energy use and can make decisions on how much to use, what time of day to use it, and how much to pay for the energy. For example, you may want to keep your house at 75 degrees Fahrenheit when prices are low, but you may decide to raise your thermostat to 78 degrees if prices are high. Similarly, you may want to dry your clothes for $0.05 per kilowatt-hour at 9:00 p.m. instead of $0.15 per kilowatt-hour at 2:00 p.m.

Real estate agent Garreth Wilcock moved to the Mueller development from a sprawling 1960s ranch house and quickly realized the benefits of living in a planned community that promotes energy efficiency. “If we can use what we’re learning here to impact the way homes are built and the way people can take advantage of changes in energy rates… that’s exciting.”

Communities like Mueller are also positioned to take advantage of new technologies, including plug-in hybrid electric vehicles, various forms of distributed generation, solar energy, smart metering, lighting management systems, and distribution automation.
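As a rough illustration of that time-of-use trade-off, assuming a typical 3.5 kWh dryer load (a figure not given in the article):

```python
# Illustrative time-of-use comparison for the clothes-dryer example above.
# The 3.5 kWh per load figure is an assumed, typical value, not from the article.
load_kwh = 3.5
off_peak, on_peak = 0.05, 0.15              # $/kWh at 9:00 p.m. vs. 2:00 p.m.
savings = load_kwh * (on_peak - off_peak)
print(f"Savings per load: ${savings:.2f}")  # -> $0.35 by shifting the load off-peak
```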

Mueller residents Garreth Wilcock and Kathy Sokolic serve on the Pecan Street Project’s executive committee and have an Incenergy monitoring system installed

As early adopters, Pecan Street Project participants Wilcock and Kathy Sokolic participate on an advisory council to review new ideas and products and help decide what goes into the houses. Sokolic recently had her home evaluated for an electric car charger, which could have implications for the rest of the houses in the Mueller development.

“It’s really important for me to practice what I preach,” Sokolic said. “I drive my car a lot so I don’t feel I live the lifestyle that I should. But, being able to move here, I was able to get solar panels, and I can move forward with all kinds of green initiatives. Living in an energy-efficient house and having the ability to participate in this program is fantastic.”

How Supercomputing Plays a Role

As you can imagine, the smart grid demonstration project at Mueller is generating complex and large datasets that require powerful supercomputers to capture, integrate, and verify the information, and to make sure that it is properly synchronized and analyzed.

Enter the Texas Advanced Computing Center (TACC), one of the leading supercomputer centers in the nation, located in Austin at The University of Texas.

“TACC has some of the world’s fastest computers, so we’re confident they can do any kind of crunching, rendering, or data manipulation,” Haskell said. “They have the technical expertise to look at different database structures and know how to organize the data so it’s more efficiently managed. We’re very excited to work with TACC to come up with new paradigms on how to intuitively portray what’s going on with the grid and energy systems.”

With the sensor installations in place at each of the 100 homes, new data is generated every 15 seconds showing precisely how much energy is being used on an individual circuit. Initially, TACC developed a special data transfer format to pull all of the data into a database on TACC’s “Corral” storage system. To date, the database contains approximately 400 million individual power readings and continues to grow.

Currently, TACC is collaborating with Austin Energy to compare their readings with the instantaneous usage readings from the participating homes at the Mueller development. Together, TACC and Austin Energy are calibrating the data to develop an accurate baseline about energy usage in the entire city of Austin.

“We’re trying to create very rich resources for people to use in analyzing patterns of energy usage,” said Chris Jordan, a member of TACC’s Advanced Computing Systems group. “We’re helping to enable forms of research that we can’t even foresee right now, and over time as the resources grow and become more varied, we expect whole new forms of research to be conducted. We’re really interested to see what people can do with it, such as how the data stream can transfer itself into a decision-making device for city planners and individual consumers.”
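Some rough arithmetic on the data rates quoted above, assuming a single aggregate reading per home every 15 seconds (per-circuit monitoring would multiply the count):

```python
# Rough scale check for the Mueller data stream described above.
homes = 100
readings_per_home_per_day = 24 * 60 * 60 // 15       # one every 15 seconds -> 5,760
daily_total = homes * readings_per_home_per_day      # 576,000 readings per day (floor)
days_to_400M = 400_000_000 / daily_total             # ~694 days at this minimum rate
print(daily_total, round(days_to_400M))
```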


One of the weaknesses in smart grid systems is the way they visualize data, which is often not intuitive. Since TACC is a leader in providing visualization resources and services to the national science community, and even conducts its own research into visualization software and algorithms, it was a perfect partner for the Pecan Street Project’s research at Mueller.

“Everything can provide information, but to give the information context it needs to be meaningful,” said Paul Navratil, research associate and manager of TACC’s Visualization Software group. “In most scientific contexts, you’re simulating a real phenomenon. With this project, we have abstract information, so we have to work to make the information meaningful to the researchers, consumers, and industry partners who want to demonstrate the value of this project.”

Navratil says it is a massive data mining problem, but this is something he and the other experts at TACC work with on a daily basis.

“For us to maintain the data at this rate and give a good response time for someone who wants to query the site is certainly a challenge,” Navratil said. “We not only have to show the information in an understandable format, we have to show it quickly, and that’s the soup-to-nuts challenge of having an interactive data and visualization site.”

To date, TACC staff members have developed an interface that graphs energy usage characteristics of the homes spread out across the city.

Installation of the monitoring system is simple and quick. A licensed electrician needs access to the resident’s circuit panels (breaker boxes) inside and outside.

Phase Two of Pecan Street Project’s Demonstration Project at Mueller

While Phase One of the project is a blind study focusing on data collection alone, in 2012, Pecan Street Project will focus on behavior change and integrate more customer-centric technologies.

“Customer-centric to me is really customer value – what do they want and need?” Haskell said. “I guess there’s a certain ‘wow’ factor around the idea that you can control a power system in your house from your iPhone, but to me that’s not customer-centric. To me, customer-centric is that it all takes care of itself, and the customer doesn’t have to think about it, but has the lowest possible power bill each month.” And, even new services they can elect to pay for.

Overall, Pecan Street Project is trying to understand how energy management systems can be integrated into our lifestyle. Haskell continued: “That’s what we want to figure out―how that future automated home environment will interface to the smart grid to provide the peak energy demand characteristics that the utility needs to run their network without creating a burden on the consumer.”

A deep curiosity has been awakened within Wilcock, Sokolic, and the other project participants; they are a group of people who are on a learning journey. They share articles with one another, meet regularly to discuss the smart grid effort, and often help each other with projects at each other’s homes.

“It’s about finding other ways of looking at things,” said Sokolic, “and we definitely have to work together.”

Learn More

Pecan Street Project is interested in getting more people to make use of this data. The availability of clean, affordable, reliable energy is central to our economic and societal objectives. For more information on how to get involved, please visit: http://www.pecanstreetproject.org/

June 15 | Summer 2011




If your house ever caught fire, it’s a safe bet you would want the firefighters who show up to have met Craig Weinschenk. A master’s-turned-doctoral student in mechanical engineering, he has been studying fire science since he arrived at the University in 2006.

After applying to graduate programs across the country, the New Jersey native began making the rounds to visit the universities. Almost immediately after he set foot on UT’s campus, Weinschenk met mechanical engineering professor Ofodike Ezekoye, an expert in combustion and heat transfer, and realized what he was meant to do — even if it came as a surprise to him at the time.

“I met Dr. Ezekoye and we started talking about some research he was doing,” says Weinschenk. “I saw a melted fire helmet and asked him about it. We talked about my friends, many of whom are firefighters. Dr. Ezekoye said, ‘We have a grant coming through. Do you want to work on it?’ And now I’m in Texas.”

Weinschenk, a recipient of the Meason/Klaerner Endowed Graduate Fellowship in Engineering and other endowed funds, is using engineering science to better understand firefighting tactics and to bridge the gap between science and real-life firefighting situations.

Though there is a dedicated “burn building” on UT’s Pickle Research Campus, he uses supercomputers in the Texas Advanced Computing Center (TACC) to simulate fire situations in order to study how flames grow and how air impacts fires. He also gets hands-on training from the Austin Fire Department to better understand how fires work and how best to handle them.

“I attack the fire problems from both the experimental and computational side,” he says. “We want to improve science and technology, but we’re trying to adapt everything we do to assist firefighters.”

Early on in his time at UT, Weinschenk had a profound moment while attending a firefighting cadet class. “They give fire science lectures, and instructors were quizzing students and asking questions,” says Weinschenk. “The fact that they asked for my input made me think, ‘Wow, there’s really a chance to do something good here.’”

And do something he has. Weinschenk, who Ezekoye calls “a rising star” in fire research, has co-authored a paper that was published in Fire Technology and served as president of the UT chapter of the Society of Fire Protection Engineering. He says it has been “amazing” to be at UT for the past four-plus years. He has greatly enjoyed working with Ezekoye, who he calls “one of the smartest people I have ever met.”

“There are so many opportunities” for graduate students to grow at the University, says Weinschenk, who earned a master’s degree in 2007 and is on track to receive his PhD in May 2011. “In addition to conferences, there are discussions with other academics and professors. There is so much work going on. If you talk to anyone around the country and say that you do research at UT, it has a credibility that is rare.”

Lauren Edwards

June 8 | Summer 2011





On June 1, 2001, the newly reorganized Texas Advanced Computing Center (TACC) officially began supporting computational researchers at The University of Texas at Austin (UT Austin) and throughout the national academic community.

Now home to some of the most powerful and recognized supercomputers in the open science community, TACC began 10 years ago by building from a predecessor organization, and by inheriting a dozen employees, a space on the J.J. Pickle Research Campus, and a small 88-processor, liquid-cooled Cray T3E, the original Lonestar system. From these humble beginnings, TACC began a rapid ascent to become one of the leading supercomputing centers in the world.

Born from the shared vision of leadership at UT Austin and TACC’s director, Jay Boisseau, TACC has become an epicenter for research that advances science and society through the application of advanced computing technologies.

"The University of Texas, situated in Austin, presented a tremendous opportunity to build a world-class advanced computing center that supported outstanding science not just at UT, but across the nation," Boisseau said. "The quality of the university, the depth of the talent pool, the high profile of the university and the city, and the small, but dedicated staff that were already on hand, presented the elements for a new plan, a new center, and laid the foundation for what we've accomplished thus far."

Over the past decade, TACC's expert staff and systems have supported important scientific work, from emergency simulations of the Gulf oil spill, which helped the Coast Guard protect property and wildlife, to the first models of the H1N1 virus, which enabled scientists to understand the virus's potential resistance to antiviral medication, to the clearest picture yet of how the early universe formed. In addition, TACC helped predict the storm surge from Hurricane Ike, delivered geospatial support during the Haiti disaster, and is currently providing emergency computing resources to Japanese researchers who are unable to access their own systems in the wake of the earthquake and tsunami.

Aaron Dubrow | Science and Technology Writer

Deployed in February 2011, Lonestar 4 is TACC’s newest supercomputer and the third largest system on the NSF TeraGrid. It ranks among the most powerful academic supercomputers in the world with 302 teraflops peak performance, 44.3 terabytes total memory, and 1.2 petabytes raw disk.

The center has deployed increasingly powerful computing systems, which have enabled important scientific accomplishments. These include three systems that debuted in the top 30 "most powerful in the world" on the Top 500 list for open science: Lonestar 2 (#26 in 2003); Lonestar 3 (#12 in 2006); and Ranger (#4 in 2008). At $59 million, the Ranger award also represented the largest single grant to The University of Texas from the National Science Foundation (NSF).

Prior to the formation of TACC, the staffing and systems for advanced computing were at an all-time low on campus. An external review board had reported to UT Austin leadership in 1999 that if the University wanted to sustain and extend leadership in research in the 21st century, it needed to develop its computational capacity. As a first step, the Vice President for Research, Juan Sanchez, hired Jay Boisseau, who got his PhD from UT Austin and who had previously worked at the San Diego Supercomputing Center and the Arctic Region Supercomputing Center, to lead the effort. Boisseau rapidly set about expanding the core team inherited from ACCES, and recruiting additional talented staff to broaden TACC’s technology scope and to help realize his vision.

Said Sanchez: “TACC grew from a vision to the reality it is today thanks to the strong commitment of The University of Texas at Austin to become a leading player in advanced computing, and the dedication, focus and expertise of its director, Dr. Boisseau, and his outstanding staff.”

Leveraging top-tier research faculty at the University, local technology partners like Dell Inc., and funding from the NSF, TACC developed rapidly from a small center to a leading provider of computational resources nationwide. TACC currently has nearly 100 employees and continues to expand.

Additionally, TACC currently operates the world's highest-resolution tiled display (2008: Stallion), and the largest remote and collaborative interactive visualization cluster (2010: Longhorn).

TACC did not emerge in a vacuum. UT Austin had operated supercomputers through a variety of institutes and centers since 1986, including the UT System Center for High Performance Computing, the UT Austin High Performance Computing Facility, and the UT Austin Advanced Computing Center for Engineering and Science (ACCES).

“Ranger” Principal Investigator (PI) Jay Boisseau and co-PIs Karl Schulz, Tommy Minyard, and Omar Ghattas (not pictured) brought the 579.4-teraflop supercomputer to The University of Texas at Austin, where it helps the nation’s top scientists address some of the world’s most challenging problems.




As TACC resources grew in capability and the center hired additional staff, bringing great expertise, the center’s position in the high performance computing community grew as well. In 2002, the High Performance Computing Across Texas (HiPCAT) consortium was formally established by researchers at Rice University, Texas A&M, Texas Tech, University of Houston, and UT Austin, with Boisseau as the first director. In 2004, TACC was selected to join the NSF TeraGrid, the world’s largest distributed infrastructure for open scientific research.

In 2007, TACC began providing resources on Lonestar 3 to other UT System institutions, a role that has now grown in scale with Lonestar 4 and with the UT Research Cyberinfrastructure project. In 2009, the NSF awarded a $7 million grant to TACC to provide a new compute resource (Longhorn) and the largest, most comprehensive suite of visualization and data analysis services to the open science community. And in 2010, TACC was selected as one of four U.S. advanced computing centers awarded an $8.9 million eXtreme Digital (XD) Technology Insertion Service (TIS) award to evaluate and recommend new technologies as part of the NSF TeraGrid and its follow-on initiative.

In February 2011, TACC deployed a powerful new supercomputer, Lonestar 4, for the national scientific community. The center also received word in May that the National Science Board had approved $121 million for the follow-on to the NSF TeraGrid, known as the Extreme Science and Engineering Discovery Environment (XSEDE), in which TACC will play a leading role.

Galaxy Formation in the Early Universe: This is a visualization of a galaxy formation dataset, about 5 million particles simulated for 631 timesteps on Ranger. This simulation and the corresponding visualizations help answer questions about the formation of the early Universe, about 100 million years after the Big Bang. This research also helps guide the observations of the James Webb Space Telescope, the replacement for the Hubble Space Telescope, scheduled for launch in 2013. [Image credit: Christopher Burns, Thomas Greif, Volker Bromm, and Ralf Klessen]

The emergence of TACC as a world-class supercomputing center has arisen in the context of computational science becoming the third method of investigation, which, in conjunction with theory and experimentation, is driving advances in all fields of research. The resources that TACC deploys enable scientists to explore phenomena too large (black holes), too small (quarks), too dangerous (explosions), or too expensive (drug discovery) to investigate in the laboratory.

High performance computing is also used to predict the outcome of complex natural phenomena. This is the case for Clint Dawson, one of the leaders in forecasting storm surges associated with tropical storms.

“We rely on our partnership with TACC because, without them, we wouldn’t be able to do real-time forecasting of extreme weather events,” said Dawson, head of the Computational Hydraulics Group housed in the Institute for Computational Engineering and Sciences (ICES) at UT Austin, and a longtime user of the center’s systems.

This sentiment is shared by nearly all of the scientists and engineers who use TACC’s systems. The majority of computational cycles are allocated by the NSF to the most promising computational science research; some cycles are reserved for researchers at Texas institutions of higher learning, including community colleges and minority-serving institutions. As much as a new telescope or electron microscope drives discoveries in astronomy or biology, advanced computing systems allow for new kinds of investigations that push knowledge forward across all scientific disciplines.

“We wouldn’t be able to do anything without TACC,” said Mikhail Matz, a professor of integrative biology at UT Austin who combines the power of supercomputers with next-generation gene sequencers. “We can generate massive amounts of genetic sequences, but then what? The main challenge here is to figure out the most appropriate and effective way of dealing with this huge amount of data, and extracting the information you want. To do that, we need very powerful computers.”

The particles in the visualization represent portions of the oil spill, and their positions are either hypothetical or reflect the observed position of the oil on the surface. The data is visualized using Longhorn and MINERVA, an open source geospatial software package. [Credits: Univ. North Carolina at Chapel Hill, Institute of Marine Sciences; Univ. Notre Dame, Computational Hydraulics Laboratory; Univ. Texas, Computational Hydraulics Group, ICES; Univ. Texas, Center for Space Research; Univ. Texas, Texas Advanced Computing Center; Seahorse Coastal Consulting]

But TACC is more than the host of powerful computing systems. It is also home to an inimitable group of technologists who are instrumental in accelerating science, often by working directly with researchers to make sure their codes run quickly and effectively.

“In order to do these large-scale science runs, it’s a big team effort,” said Philip Maechling, information architect for the Southern California Earthquake Center, who uses Ranger to simulate earthquakes and predict their impact on structures in the Los Angeles basin. “You need the help of a lot of people on our end, but also the help of the staff at TACC in order to get all the pieces to come together.”

Working with Maechling’s team, TACC has helped advance earthquake science and contributed to the development of updated seismic hazard estimates and improved building codes for California. For users like Dawson, Matz, and Maechling, access to TACC’s Ranger supercomputer and other systems means faster time-to-solution, higher-resolution models, more accurate predictions, and the ability to do transformative science with the potential for social impact.

“We’ve made our systems reliable, high performance and scalable, and we’ve provided great user support,” said Boisseau. “Our systems are constantly in demand — often far in excess of what we can even provide — because we’ve established a reputation for making TACC a great environment for scientific research.” TACC supports more than 1,000 projects, and several thousand researchers, each year on its diverse systems.

TACC 10th Anniversary Celebration and Colloquium

On Friday, June 24, TACC will commemorate its 10th Anniversary with a half-day celebration and colloquium event on the J.J. Pickle Research Campus. The event will bring together experts in the high performance computing community, top scientific researchers who use TACC’s resources, and leadership from the center to discuss the past, present and future of advanced computing, and the ways in which high performance computing is advancing science and society. [A full description and calendar of events is available online at: http://www.tacc.utexas.edu/10-year-celebration/.]

The “TACC 10th Anniversary Celebration and Colloquium” took place on Friday, June 24, at the J.J. Pickle Research Campus. Many of TACC’s partners, colleagues, users, and supporters shared in this achievement as the center celebrated 10 years of enabling scientific discoveries and advanced computing achievements. The event focused on three themes: TACC’s Past, Present, and Future. The Past included viewing a 10-year timeline of significant events and an interview with Jay Boisseau about the center’s history; the Present consisted of system tours, science talks about current research, demos, and a visualization lounge; and the Future showcased an expert panel of HPC leaders discussing their vision for the next 10 years of HPC. Enjoy this photo essay of TACC’s 10th Anniversary Celebration and Colloquium! Feel free to share it with friends and colleagues and post it to your social media outlets.

June 1 | Summer 2011







Texas Advanced Computing Center