Center for High-Performance Computing and Communications (HPCC) 2012
Excellence across the disciplines
About HPCC
Photo: Derek Lazzaro
The Center for High-Performance Computing and Communications (HPCC) serves as a foundation and model for computational research at the University of Southern California. HPCC bridges USC’s unique strengths in scientific computing, computer science, and communications by supporting research groups in a variety of disciplines, including astronomy, civil and environmental engineering, computational biology, epigenomics, linguistics, materials science, pharmacology, and structural seismology. HPCC resources enable researchers to move beyond the limitations of traditional research.

A prime example of this is the work of Professor Simon Tavaré, holder of the George and Louise Kawamoto Chair in Biological Sciences at the USC Dornsife College of Letters, Arts, and Sciences. In 2011, Tavaré was named a fellow of the Royal Society, the world’s oldest scientific academy in continuous existence and the highest distinction a British scientist can receive. Tavaré also serves as principal investigator for the Center of Excellence in Genomic Science at USC, which received a five-year, $12.1 million grant renewal from the National Institutes of Health in 2009. Under his leadership, the center is investigating the process by which genotypic variation translates into phenotypic variation. The group hopes to create a unified picture of how different genetic variants interact with the environment to influence aspects of human disease.

Tavaré’s group uses HPCC resources to study aspects of tumor evolution. He and his research team use variation in the DNA of tumor cells to infer a tumor’s ancestral history. To generate the molecular data, they have developed a high-throughput experimental technology for assessing variability in millions of individual tumor cells. Approximate Bayesian computation methods are then used to infer parameters of biological interest. One aim of this research is to identify the role that cancer stem cells play in the evolution of tumors.

The overarching goal of HPCC is to expand the reach of “big computing” by addressing system-level questions in social, economic, and cultural research. With its advanced computing and communications resources, HPCC helps to create the virtual organizations, virtual worlds, and immersive environments that will continue to transform and globalize higher education in the twenty-first century.

Elizabeth Garrett
Provost and Senior Vice President for Academic Affairs
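The approximate Bayesian computation (ABC) approach mentioned above can be sketched in its simplest rejection-sampling form. The toy simulator and summary statistic below are illustrative stand-ins, not the group's actual tumor-evolution model:

```python
import random

def simulate_mutant_fraction(mutation_rate, n_cells=200, rng=random):
    """Toy forward simulator: fraction of sampled cells carrying a
    mutation (a stand-in for a mechanistic tumor-growth model)."""
    return sum(rng.random() < mutation_rate for _ in range(n_cells)) / n_cells

def abc_rejection(observed, prior_draw, simulate, tolerance, n_draws=5000):
    """ABC rejection sampling: draw parameters from the prior, simulate
    data, and keep only draws whose simulated summary statistic lands
    within `tolerance` of the observed statistic. The accepted draws
    form an approximate posterior sample."""
    return [theta for theta in (prior_draw() for _ in range(n_draws))
            if abs(simulate(theta) - observed) < tolerance]

random.seed(0)
observed_stat = 0.3  # observed fraction of mutated cells (illustrative)
posterior = abc_rejection(observed_stat,
                          prior_draw=lambda: random.uniform(0.0, 1.0),
                          simulate=simulate_mutant_fraction,
                          tolerance=0.05)
estimate = sum(posterior) / len(posterior)  # posterior mean, near 0.3
```

Real applications replace the toy simulator with a model of tumor growth and use much richer summary statistics, but the accept/reject logic is the same.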
Launched in 2000, the University of Southern California’s Center for High-Performance Computing and Communications (HPCC) is a global leader in research computing.
Among supercomputers in an academic setting, HPCC’s supercomputer cluster is the 9th fastest in the United States. Among all supercomputers in the world, it is ranked 120th. It claimed these distinctions by achieving a benchmark in fall 2011 of 149.9 teraflops, or 149.9 trillion floating-point calculations per second, on its 2,225-node, 10-gigabit-backbone cluster. USC’s HPCC has achieved its current status among the world’s top supercomputer sites through local investments and without national funding. The local aspect of HPCC allows USC faculty and graduate students unfettered access to a world-class facility, enabling them to tackle even the most challenging data-intensive scientific problems. HPCC helps graduate students develop the twenty-first-century skills necessary to become global leaders in advanced technologies and critical scientific discoveries.
Cover: Adarsh Shekhar, a graduate student in the USC Viterbi School of Engineering’s Mork Family Department of Chemical Engineering and Materials Science, investigates a body-centered cubic supercrystal consisting of DNA-functionalized gold nanoparticles.
Page 02: Adarsh Shekhar, a graduate student in the USC Viterbi School of Engineering’s Mork Family Department of Chemical Engineering and Materials Science; Shinnosuke Hattori, a visiting scholar at the USC Collaboratory for Advanced Computing and Simulations; and Manaschai Kunaseth, a graduate student in the USC Viterbi School of Engineering’s Department of Computer Science, visualize the formation of vortex rings around the periphery of a gas bubble during its shock-induced collapse in water.
Page 03: Zaoshi Amy Yuan, a graduate student in the USC Viterbi School of Engineering’s Mork Family Department of Chemical Engineering and Materials Science, studies the fast diffusion of an electron-hole pair in an organic solar cell.
Priya Vashishta
Professor of Chemical Engineering and Materials Science, Computer Science, and Physics and Astronomy
Faculty Executive Director, USC Center for High-Performance Computing and Communications
Data Visualization and Collaboration

Located in the Seaver Science Center, the Center for Data Visualization and Collaboration houses technologies that provide researchers with the means to teleconference, collaborate, and share complex information in high-resolution visual forms.
Powerful tools for visualization give USC faculty and graduate students the ability to view HPCC-generated simulation data from different perspectives and perform large-scale, in-depth analyses requiring specialized equipment in a relatively risk-free setting.
The center includes a tiled display wall and an AccessGrid that provides video, audio, and networking equipment with high-speed access to distributed data and computers. This equipment may also be linked to support group-to-group interactions across USC’s network and the Internet. The room contains three 65-inch, movable LCD screens that are capable of displaying one continuous image or three separate images.
AccessGrid
The AccessGrid allows users to videoconference and collaborate interactively over the Internet. Three projectors, suspended from the ceiling, display computer simulations, films, live television feeds, or websites onto a wall-sized screen. Participants from a variety of locations, ranging from university classrooms and labs to hotel conference rooms and government offices, can come together virtually for meetings, collaborative work sessions, seminars, lectures, and training.
Tile Wall
The tile wall provides users with a scalable means of displaying wall-sized images or computer simulations at very high resolution. Moving or still digital images are sliced into a dozen smaller images and then projected by a dozen digital projectors onto a 14-foot-wide, 8-foot-high, monolithic, all-glass screen. The rear-projected composite image, which contains roughly 17.5 million pixels, has remarkable clarity whether viewed close up or at a distance.
HPCC Infrastructure
HPCC comprises a diverse mix of computing and data resources. Two Linux clusters constitute the principal computing resource. In addition, HPCC has a central facility that provides more than 1.2 petabytes of combined disk storage and potential access to nearly a petabyte of tape storage.
Housed in USC's state-of-the-art data center, HPCC allows faculty members to maximize their individual research resources by contributing to HPCC's condominium-style compute environment. Under this arrangement, researchers may use their grant awards to acquire HPCC nodes, which are accessed through the same interface as the standard HPCC nodes. HPCC staff members monitor and maintain the equipment on a round-the-clock basis, enabling researchers to focus on their primary research. Researchers have sole access to their condo nodes except for twice a year when HPCC runs the LINPACK benchmarks.
Linux Clusters
HPCC operates two Linux clusters. The 2,225-node cluster combines quad-core, hex-core, and dodeca-core dual-processor Dell, Oracle Sun, Hewlett-Packard, and IBM nodes on a 10-gigabit Myrinet backbone. The second cluster contains 514 dual-processor nodes, nearly half of which use dual-core AMD Opteron processors, on a 2-gigabit Myrinet network. In each cluster, a bidirectional, low-latency Myrinet fiber network interconnects the nodes, allowing for the development of massive production jobs that require high-speed communications among computational elements.
Networks
HPCC has multiple-gigabit interfaces to the USC campus network. These interfaces facilitate massive data transfers and interactive access to the computing facilities at 10 gigabits per second, creating a private cloud environment for big data across USC’s campus network.

Founded in 1988 and headquartered at USC, Los Nettos is a private regional network that spans the greater Los Angeles area, providing high-bandwidth Internet connections to a variety of universities, colleges, research facilities, hospitals, city governments, public schools, and museums. Los Nettos members include the California Institute of Technology (Caltech), the Claremont Colleges, the Jet Propulsion Laboratory, Loyola Marymount University, Occidental College, USC, and various associates. Los Nettos engages in strategic partnerships and peering relationships to provide its members and associates with cost-effective and efficient routing to research, education, and not-for-profit networks, as well as to the Internet at large.
Los Nettos' infrastructure of dark fiber and leased circuits uses leading-edge optical technologies to enhance the network’s flexibility and provide services such as private virtual local area networks (VLANs), dynamic interconnects, and partitioned wavelengths between member sites. Through its affiliations with the Corporation for Education Network Initiatives in California (CENIC), Internet2, National LambdaRail, the Global Lambda Integrated Facility (GLIF), and Internet exchanges such as Pacific Wave, CIIX, and Any2, Los Nettos offers high-capacity access to many national and international research and education networks.

Pacific Wave
Pacific Wave has developed a distributed Internet exchange facility on the West Coast, currently in Seattle, Sunnyvale, and Los Angeles, that enables interconnection among U.S. and international research and education networks around the Pacific Rim. The Pacific Wave exchange, operated by Pacific Northwest Gigapop (PNWGP) and CENIC, enables data to pass directly between major national and international networks, increasing the efficiency and speed of data transfer while eliminating the costs associated with routing data through multiple circuits and dedicated interconnects. This infrastructure supports a wide variety of advanced science and engineering applications. USC researchers have access to this Internet exchange and its international and national participants through the Los Nettos regional network.
The 514-dual-processor-node Linux cluster also includes 5 large-memory nodes, each with 64 gigabytes of RAM and 8 dual-core AMD Opteron processors. In spring 2012, 4 new large-memory nodes were deployed on the 2,225-node Linux cluster, each with 1 terabyte of RAM and 20 dual-core Intel Xeon processors. Provisioning of the two low-latency, high-speed Myrinet networks allows for the development of large-scale, parallel production jobs. The combined main memory of the two Linux clusters is more than 48 terabytes; their combined temporary disk storage is more than 1.5 petabytes.
Benjamin Berman
Benjamin Berman is an assistant professor in the Department of Preventive Medicine at the Keck School of Medicine of USC. One of the first faculty members hired into the department’s recently formed Division of Bioinformatics, Berman focuses his research on the computational analysis of high-throughput genetic and genomic datasets. In particular, he works to identify key DNA sequences that influence a normal cell’s development into a cancer cell. Identifying these sequences may lead to personalized cancer treatments in the future.
Berman also serves as an investigator and founding member of the USC Epigenome Center. Established in 2007 as the first of its kind in the nation, the center provides an interdisciplinary environment for molecular and computational scientists to work side-by-side on cutting-edge, high-throughput epigenetic and population-based research. Berman and others at the Center are at work on a five-year collaboration with Johns Hopkins University (JHU) to gather epigenome data from each of the major types of cancer. The USC/JHU project is responsible for the epigenetic component of The Cancer Genome Atlas, a large, multi-institute effort to comprehensively map molecular changes in cancer. DNA sequencing at the Epigenome Center can produce up to a terabyte a day of raw data; Berman and his group use HPCC resources and a parallel computational framework based on MapReduce to stitch together hundreds of billions of individual DNA sequences into a biologically meaningful genomic map of normal tissue or a single tumor. Thousands of individual compute jobs are coordinated by parallel workflow software developed by researchers at the USC Viterbi School of Engineering’s Information Sciences Institute.
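MapReduce decomposes genome-scale analyses like these into an embarrassingly parallel "map" step and a key-grouped "reduce" step. A minimal single-process sketch, counting k-mers in hypothetical short sequencing reads (not the center's actual pipeline), looks like this:

```python
from collections import defaultdict
from itertools import chain

K = 4  # k-mer length (illustrative choice)

def map_phase(read):
    """Map: emit one (k-mer, 1) pair per overlapping k-mer in a read.
    In a real deployment, each mapper handles a shard of the reads."""
    return [(read[i:i + K], 1) for i in range(len(read) - K + 1)]

def reduce_phase(pairs):
    """Reduce: sum the counts for each k-mer. The framework's shuffle
    step normally groups pairs by key before reducers run."""
    counts = defaultdict(int)
    for kmer, n in pairs:
        counts[kmer] += n
    return dict(counts)

reads = ["ACGTAC", "CGTACG", "TACGTA"]  # hypothetical sequencing reads
kmer_counts = reduce_phase(chain.from_iterable(map_phase(r) for r in reads))
# kmer_counts == {'ACGT': 2, 'CGTA': 3, 'GTAC': 2, 'TACG': 2}
```

Because each read is mapped independently and each key is reduced independently, both phases parallelize across thousands of compute jobs with no shared state, which is what makes the pattern suitable for terabyte-a-day sequencing output.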
Berman recently led a USC Epigenome Center study that used this computational framework to analyze tumors from colorectal cancer patients. The work, published in Nature Genetics, identified novel biochemical changes occurring genome-wide in cancer cells. Berman’s research is funded by the Kenneth T. and Eileen L. Norris Foundation and the National Institutes of Health.
Below: Epigenetic map showing a small section of a colon cancer patient’s genome. A novel epigenetic signature was identified that distinguishes active from inactive parts of the genome.
Patrick Lynett
Patrick Lynett holds the John and Dorothy Shea Early Career Chair in Civil Engineering and is an associate professor in the Department of Civil and Environmental Engineering at the USC Viterbi School of Engineering. In his research, Lynett creates hydraulic models of coastal processes, studying shallow water wave phenomena with a particular focus on the impact of complex waves and currents on critical coastal infrastructure, such as nuclear power plants. Over the past decade, Lynett has participated in a wide range of studies on numerous coastal disasters, from hurricanes and tsunamis to oil spills.
With the aid of HPCC resources, Lynett has developed a hybrid hydrodynamic software tool capable of tackling the complex physics at work in complicated water wave processes. This comprehensive computational model simulates and predicts such processes by incorporating data across a great range of scales: from thousands of kilometers down to less than a meter, and from the deep ocean to the shoreline. By evaluating the distinct tsunami hazards present in various regions and exploring how beach systems evolve, Lynett’s work helps to explain phenomena such as rip currents and beach erosion.

Lynett’s research is funded in part by the National Science Foundation, the Office of Naval Research, the Nuclear Regulatory Commission, the Army Corps of Engineers, and the NOAA Sea Grant College Program. He was awarded a Guggenheim Fellowship in the area of Natural Sciences in 2010 and has been honored with the Department of the Army Commander’s Award Medal for Public Service by the U.S. Army Corps of Engineers.
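One reason such a model must span so many scales is that the speed of a long (tsunami-length) wave depends only on water depth, c = sqrt(g·H). A quick calculation with illustrative depths shows how sharply the dynamics change between open ocean and shoreline:

```python
import math

def long_wave_speed(depth_m, g=9.81):
    """Shallow-water (long-wave) phase speed in m/s: c = sqrt(g * H).
    Valid when the wavelength greatly exceeds the depth, as it does
    for tsunamis even over the deep ocean."""
    return math.sqrt(g * depth_m)

deep_ocean = long_wave_speed(4000.0)  # ~4 km depth: about 198 m/s (~713 km/h)
near_shore = long_wave_speed(10.0)    # ~10 m depth: about 9.9 m/s
```

As the wave slows on approach to shore, conservation of energy flux forces its height to grow, which is why near-shore behavior demands far finer model resolution than open-ocean propagation.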
Above: Snapshot from a simulation of tsunami-induced currents in Ventura Harbor. The shades of red indicate the areas of greatest current speeds; the circular features are whirlpools.
As part of a team of engineers that performed extensive forensic analysis of the failure of levees and floodwalls in New Orleans during Hurricane Katrina, Lynett employed numerical tools to hindcast wave and surge heights. Lynett has also worked with large ports and nuclear power plants to assess tsunami hazards, including chaotic tsunami-induced motions, such as whirlpools and water jets.
Meghan Miller
Meghan Miller is an assistant professor in the Department of Earth Sciences at the USC Dana and David Dornsife College of Letters, Arts, and Sciences. Her work centers on structural seismology and tectonophysics, through which she studies earthquakes, tectonic processes, and plate motions as a means of understanding the Earth’s evolution.
Miller focuses on the interplay between mantle convection and surface geology, specifically on subduction zone processes and continental dynamics. Much of her fieldwork is conducted near seismically and volcanically active plate boundaries, such as the Caribbean, western North America, and the Mediterranean. To gain insight into the deep structure of the planet, she and her team examine how seismic waves travel through the Earth, using records of recent major quakes, such as those in Japan and New Zealand, captured by sensitive seismic instruments her group deployed in Morocco and Spain.

Working with Professor Thorsten Becker, also in the Department of Earth Sciences, Miller uses HPCC resources to investigate mechanical interactions between subducted oceanic and stable continental lithospheres and how these interactions alter mantle convection at the Caribbean-South American and western Mediterranean plate margins. Miller and Becker combine seismological observations and computational geodynamic models to better constrain the tectonic history of these regions and, more generally, to investigate plate boundary evolution.

Miller’s work on detecting an active delamination of continental lithosphere beneath the Colorado Plateau was recently published in the journal Nature. She and her colleagues were able to identify the only known instance of active delamination, a process that may have contributed to the formation of the Grand Canyon, and they are now searching other regions for possible active delamination.

Miller’s research is funded by the EarthScope, Continental Dynamics, and Geophysics programs of the National Science Foundation (NSF). Miller was chosen as an EarthScope Distinguished Lecturer in 2010-2011 and honored by the NSF Faculty Early Career Development (CAREER) Program.
Below: Map of broadband seismometers deployed in the Mediterranean region. The orange triangles represent the instruments deployed as part of Miller's NSF-funded project to understand the deep Earth structure beneath the region.
Remo Rohs
Remo Rohs is an assistant professor of biological sciences, chemistry, and physics in the USC Dana and David Dornsife College of Letters, Arts, and Sciences. Rohs and his team strive to integrate two key aspects of DNA research: sequence and structure.
When scientists analyze genomes from the perspective of sequencing, they are looking for the one-dimensional placement of the four possible nucleotides—labeled A, C, G, and T—within a given sample of DNA. Structural analysis, meanwhile, generates three-dimensional representations of these complicated chemical units. There is a nontrivial relationship between DNA sequence and shape. For instance, shape helps determine how accessible a given base pair might be for protein contacts. A clearer grasp of how transcription factors bind DNA could have significant applications for drug design and cancer therapy.
Rohs and his group recently published a paper on their new method in the journal Cell. His other research interests include nucleosome positioning and the effects of chemical modifications, such as DNA methylation and base-pairing variants, on DNA shape and on protein-DNA recognition. Rohs’s research is currently funded by a grant from the American Cancer Society, the USC-Technion Visiting Fellows Program, and an Andrew Viterbi fellowship.
Photo by Ian Slaymaker
Above: Eight Drosophila Hox proteins bind to very similar target sites but execute distinct in-vivo functions. The figure illustrates that the cofactor Extradenticle (Exd) (yellow) unlocks the specificity of Hox proteins (cyan) for recognizing DNA target sites (metallic). Specificity fingerprints of Hox proteins reveal that DNA shape is a determining factor in achieving specificity.
Rohs is at work on creating computational methods for the high-throughput prediction of DNA shape. Supercomputing plays an integral role in analyzing the massive data sets required to predict the local DNA shape of whole genomes. The underlying data for genome-wide shape prediction are generated in computationally intensive all-atom Monte Carlo simulations. Rohs and his team have performed thousands of these simulations on the HPCC cluster. The ability to make DNA shape predictions in a high-throughput manner will have significant impacts on genome analysis.
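The Metropolis Monte Carlo scheme at the heart of such all-atom simulations can be sketched in one dimension. Here a toy harmonic energy function stands in for a molecular force field; the sampler accepts uphill moves with the Boltzmann probability exp(-beta * dE):

```python
import math
import random

def metropolis(energy, x0, n_steps, step_size, beta=1.0, seed=0):
    """Minimal Metropolis Monte Carlo sampler: propose a random
    perturbation, accept it with probability min(1, exp(-beta * dE)),
    and record the chain of visited states."""
    rng = random.Random(seed)
    x, e = x0, energy(x0)
    samples = []
    for _ in range(n_steps):
        x_new = x + rng.uniform(-step_size, step_size)
        e_new = energy(x_new)
        # Always accept downhill moves; accept uphill moves stochastically.
        if e_new <= e or rng.random() < math.exp(-beta * (e_new - e)):
            x, e = x_new, e_new
        samples.append(x)
    return samples

# Toy harmonic "energy" with its minimum at x = 0; after burn-in the
# chain should scatter around the minimum with unit variance (beta = 1).
chain = metropolis(lambda x: 0.5 * x * x, x0=5.0, n_steps=20000, step_size=1.0)
equilibrated = chain[5000:]
mean = sum(equilibrated) / len(equilibrated)
```

A production simulation replaces the scalar coordinate with thousands of atomic positions and the toy energy with a full force field, but the accept/reject step is unchanged, which is why the method parallelizes naturally into the thousands of independent runs described above.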
Center for High-Performance Computing and Communications (HPCC) www.usc.edu/hpcc
Principal Investigators and Projects Supported by HPCC

Frank Alber Image Analysis in Cryo-Electron Tomograms / 3D Structural Characterization of Genomes Hooman Allayee GWAS for Complex Diseases Andrea Armani Modeling of Optical Devices Prithviraj Banerjee Human Motion Detection and Tracking for Video James Baurley Algorithm to Learn Pathway Structure Thorsten Becker Global Mantle Convection and Crustal Dynamics Todd Brun Quantum Error Correction for Orbital Angular Momentum in Photons Nikolaus Buenning Investigating the Current Drought Affecting the Western U.S. Peter Calabrese Genetic Analysis Using Sperm Typing Gary Chen Simulation of Genetic Variation in Mixed Populations Liang Chen Developing Statistical Methods for High-Throughput Sequencing Data Analysis and Gene Expression Regulation Analysis Ting Chen Computational Analysis of Metagenomic Sequencing Data / Robust and Portable Workflow-Based Tools for mRNA and Genome Resequencing Yibu Chen Implementing Bioinformatics Software on a Custom Computer Cluster Limei Cheng Pneuma Extension Keith Chin Atomistic Simulation of Protein/Surface Interaction and Protein/Surfactant Interaction David Conti Statistical Methods in Genetic Epidemiology Stephen Cronin Plasmon Resonant Nanostructures Enoch Dames Combustion Chemistry Julian Domaradzki Numerical Simulations of Turbulent Flows Fokion Egolfopoulos Aerodynamic and Kinetics Processes in Homogeneous and Heterogeneous Reacting Flows Ian Ehrenreich Genetic Variation in Microbes Julien Emile-Geay Modeling the Tropical Climate of the Past Millennium Robert Farley ATPase Molecular Dynamics Simulations Masabumi Furuhata Market Mechanisms for Agent Coordination William Gauderman Genome-Wide Association Analysis for Childhood Respiratory Outcomes Panayiotis Georgiou Behavioral Signal Processing for Motivational Interviewing Roger Ghanem Algorithms and Applications for Uncertainty Quantification Aman Goel Semantic Annotation Using Conditional Random Fields Sam 
Gustman Conversion of the USC Shoah Foundation Institute Video Testimonies into Multiple Digital Formats Stephan Haas Magnetic Fields and Disorder in Quantum Spin Systems John Heidemann Los Angeles Network Data Exchange and Repository (LANDER) Project Phillip Hendrickson Large-Scale Hippocampal Model Jerry Hobbs Machine Reading Mohamadreza Hosseini Southern California Air Quality Study Eduard Hovy Semantics-Oriented Natural Language Processing Research Kung-Chuan Hsu Quantum Key Expansion Manish Jain Finding Stackelberg Equilibria Sewoong Jung Particle Tracking Using Lattice Boltzmann Method Ardeshir Kianercy Boltzmann Q-learning Network Formation Samuel Kim Topic Models Kevin Knight Statistical-Based Natural Language Processing James Knowles Analysis of DNA Sequences Generated with Next-Gen Technology Amy Koid Using Metagenomics to Interrogate the Functional and Genomic Diversity of Picoeukaryotes at the San Pedro Ocean Time Series Georgios Konstantinidis Query Reformulation for Health Data Integration Anna Krylov Theory and Practice of Electronic Structure Calculations C.C. Jay Kuo TREC Video Understanding Experiments Manish Kurse Structure and Function of the Fingers’ Tendinous Apparatus Jun-Young Kwak Sustainable Energy Project Jiin Lee Computational Fluid Dynamic Analysis of Bridges Exposed to Hurricane Waves Mihye Lee Bureau van Dijk Data Work Natasha Lepore Morphometry in Brain MRI Images of Babies and Fetuses Kristina Lerman Social Media Dynamics Yiyu Li Molecular Modeling of Protein Structures Michael Lieber DNA Sequence Alignment with Mouse Genome Patrick Lynett Large-Scale Modeling of Wave-Driven Coastal Processes Philip Maechling Cyberinfrastructure for Earthquake System Science Leandro Marcolino The Multi-Agent Monte Carlo Go Algorithm
Paul Marjoram CNV Calling and Evolutionary Modeling for Tumors Mara Mather fMRI Analyses of Emotion and Cognition Studies Patric Muggli Full-Scale Modeling of High-Energy Particle Accelerators Aiichiro Nakano Ultrascale Atomistic Simulation of Chemical Reactions Shrikanth Narayanan Tracking Dynamics of Vocal Tract Shaping Nouri Neamati Virtual Screening of Large Databases of Small-Molecule Drugs Ramakant Nevatia Video Content Analysis Magnus Nordborg Genotype-Phenotype Map Elnaz Nouri Inverse Reinforcement Learning Sergey Nuzhdin Approximate Bayesian Computation and Behavioral Dynamics John O'Brien Energy Frontiers Research Center Michelle Povinelli Large-Scale Simulations of Microphotonic Devices Kristen Pudenz Adiabatic Quantum Algorithms Peter Qin Molecular Dynamics Analyses of Nitroxides Attached to Nucleic Acids Lars Rahm High-Energy Density Materials Edward Rhodes Helioseismic Studies of Solar Internal Structure and Dynamics Remo Rohs High-Throughput Analysis of DNA Shape on a Genome-Wide Basis Daniel Ruderman Modeling Tumor/Host Interaction Robert Sacker Modeling the Formation of E. 
Coli Morphotypes Muhammad Sahimi Landfill Modeling and Estimation Patrick Schopf Computer Simulations of Chemical Reactions in Enzymes and Solution Fei Sha Analysis of Statistical Methods in Machine Learning Abhishek Sharma Job Scheduling for Data Centers Manoj Singh Computational Biochemistry Andrew Smith Analytic Tools to Examine High-Resolution and Genome-Scale DNA Methylation Data Darko Stefanovski Steered Molecular Dynamics of P-Type ATPases Lowell Stott Analyzing California Climate Change Alexander Stram Increasing Accessibility to 1000 Genomes Data Daniel Stram Genome-Wide Association Scans Fengzhu Sun Computational Methods for Metagenomic Data Analysis Alireza Tabatabaeenejad Airborne Microwave Observatory of Subcanopy and Subsurface (AirMOSS) Yaqi Tao Kondo Effect by Friedel Artificially Inserted Resonance (FAIR) Method Alexander Tartakovsky MURI Air Force Project Simon Tavaré Genomic Analysis of the Genotype-to-Phenotype Map Paul Thomas Reconstructing the Evolution of All Known Genes Barry Thompson Stereoregular Electroactive Polymers Matthew Thornton Stem Cell Microenvironment in the Maintenance of Pluripotency and Reprogramming Cong Trinh Symmetry Breaking of Zinc Dipyrrine Compounds Sinchai Tsao Large-Scale Volumetric Segmentation in Vascular Dementia, Mild Cognitive Impairment, and Alzheimer's Disease Anton Valouev Cancer Genomics Priya Vashishta Billion Atom Molecular Dynamics Simulations of Energetic Materials Tom Vernier Molecular Dynamics Simulations of Biomolecular Structures in Nanosecond, Megavolt-per-Meter Pulsed Electric Fields Greg Ver Steeg Clustering with Prior Information Nathan Walworth Cyanobacterial Transcriptomics Chen Wang Routing Courier Delivery Services with Urgent Demand Fang Wang Asymmetric Catalysis Fangyue Wang Anchor Text Aggregation Research Joseph Wang High-Performance Computer Particle Simulation Kai Wang Next Generation Sequencing Data Analysis Pan Wang Attrition and Skewness in Longitudinal Twin Studies: A Comparative 
Study of Structural Equation Modeling (SEM) Estimation Methods Arieh Warshel Computational Biophysics and Biochemistry Richard Watanabe The BetaGene Study Michael Waterman Genomic Analysis of the Genotype-to-Phenotype Map John Wood Cortical Mapping in Thalassemia Major Daiying Wu Analysis of ChIP-Seq Data Shanshan Xu Multivariate Regression Maocai Yan Molecular Modeling of APOBEC and Design of Anti-HIV Molecules Size Zheng Molecular Dynamics (MD) Simulation Using GROMACS Software Xianghong Zhou Development of a Knowledge Base for Interconnections among Diseases and a Bioinformatics Infrastructure for the MAPGen Consortium
Design: Warren Group | Studio Deluxe Principal Photography: John Livzey