The Case for ARSC
Keeping the competitive edge for research at America's Arctic university. Academic supercomputing at the University of Alaska Fairbanks: research and analysis of phenomena ranging in size from microns to millions of miles, and in time from fractions of a second to millennia.
The Arctic Region Supercomputing Center, where scientific discovery, analysis and prediction are made possible through high performance computing.
University Relations Historical Photograph Collection, Rasmuson Library
Has it really been 50 years since students at the University of Alaska got their first hands-on experience with a computer? It's safe to say that in the 1960s few could predict the computational power today's students at UAF are capable of harnessing.
One of the first computers on the University of Alaska Fairbanks campus for student use was this IBM 1620 Data Processing System, which was considered state-of-the-art in the 1960s.
UAF is home to the Arctic Region Supercomputing Center, the sole provider of open research computing capabilities for the Department of Defense High Performance Computing Modernization Program. ARSC's petabyte-scale data storage facilities, and supercomputers capable of performing trillions of arithmetic calculations per second, provide 24x7 resources for a worldwide community of researchers.
At UAF, ARSC supports more than 280 scientists by helping them find, write or improve the software they need to do their research and to conduct computationally intensive investigations in areas of high priority to the state, including climate modeling, atmospheric science and ocean science.
By the numbers: In January 2009, the DoD High Performance Computing Modernization Program (HPCMP) designated ARSC as one of six DoD Supercomputing Resource Centers (DSRC) in the nation. The elevation of ARSC to DSRC status provides increased and sustained funds to the Center at a level equal to other DSRCs in the HPC Modernization Program at about $14 million annually. ARSC is the only DSRC in the program not affiliated with a branch of the military, and is the sole provider of open research computing capabilities for HPCMP. The academic perspective ARSC brings to the HPCMP has broad implications for connecting the research, computing and military communities.
[Chart: ARSC DoD Expenditures (millions), by UAF fiscal year through FY11 (projected). Series: GSA on Our Behalf; Core Research Systems Support. GSA = Government Services Agency; ICR = Indirect Cost Recovery.]
ARSC has heavily invested in supporting computational research at the university. Academic expenditures on direct research and core research systems have totaled more than $40 million in the past eight years.
[Chart: DoD Funding to ARSC (millions), by federal fiscal year. Total = $169.9 million. Series: OSD; DARPA; Dlab; Congressional. DoD = Department of Defense; OSD = Office of the Secretary of Defense; DARPA = Defense Advanced Research Projects Agency; Dlab = Discovery Lab.]
A consistent, high level of funding through DoD has shifted to base funding from the Office of the Secretary of Defense.
[Chart: ARSC Indirect Cost Recovery (ICR, millions), by UAF fiscal year through FY11 (projected). Series: UAF and SW Share of ICR; ARSC Share of ICR.]
ARSC-based research has consistently provided ICR totaling $18.4 million to the university since 2002.
Northern Problems, Global Solutions
Supercomputers are an indispensable tool when it comes to understanding and solving some of today's most pressing problems. Through collaboration and computation, ARSC provides scientists with the high performance computing resources they need for finding solutions to big problems.
Increasing access to supercomputers is crucial to keeping U.S. research competitive, according to the National Science Foundation’s 2007 report “Cyberinfrastructure Vision for 21st Century Discovery.” Institutions with academic supercomputing centers also provide a competitive edge when it comes to recruiting top-notch faculty and students.
Graph 1: ARSC-supported research projects are spread through a wide range of disciplines at UAF. • GI - Geophysical Institute • SFOS + IMS - School of Fisheries and Ocean Sciences, Institute of Marine Science • IARC - International Arctic Research Center • IAB - Institute of Arctic Biology • CNSM - College of Natural Science and Mathematics • CEM + INE - College of Engineering and Mines, Institute of Northern Engineering • ARSC - Arctic Region Supercomputing Center • UAA - University of Alaska Anchorage
Graph 2: ARSC-supported research projects span many academic units.
Graph 3: Some research areas have greater storage requirements than others.
Graph 4: Some research areas require more computational effort than others.
Combined, graphs 3 and 4 demonstrate that ARSC supports a variety of research areas that have widely differing requirements in terms of computational effort and data storage.
The number of UAF projects at ARSC and the amount of CPU hours used (processing effort) are broken into broad categories of Computational Research Areas (CRAs): Air Quality Modeling of atmospheric pollutants, particulate matter and volatile organic chemicals. Transport, concentration, temporal cycles and chemical processes; study of phenomena such as wildfire smoke and volcanic ash plumes.
Climate Understanding long-term changes to Earth’s climate including the atmosphere, the ocean and land surface. Use of proxy data for historic measures; prediction of future scenarios based on estimation of numerical parameters and physical processes.
Computational Fluid Mechanics Understanding how fluids such as liquids and gases move through, around or within objects helps engineers design permafrost-resistant roadbeds and spacecraft able to withstand reentry into the Earth's atmosphere.
Cryosphere Modeling Study of ocean ice, including multi-year ice, seasonal ice and shore-fast ice. Land-based ice sheets and glaciers. Permafrost and hydrology from microscopic to landscape scale.
Data Intensive Computing Emphasis on the transport, storage, manipulation and recombination of data, including large-scale datasets. Data may be numeric or textual, structured or unstructured.
Ecosystems Study of the interaction among flora, fauna and the physical environment with a focus on sensitivity to change; natural evolution over time; physical, chemical and biological processes.
Environmental Quality Modeling and Simulation Investigations of environmental impacts include predictive modeling to forecast the effects of melting permafrost on plants and animals, or simulations to gain a better understanding of genetic stock-mixtures, which helps provide information for resource management of Alaska fisheries.
Land Surface Modeling Land surface changes in response to climate change, anthropogenic impacts, extreme events and other factors. Phenomena of interest include soil composition, subsurface characteristics, hydrology and land cover.
Materials Science Understanding the interaction among physical objects. Material fabrication techniques, interaction of materials with the environment, and resiliency of materials. Microscopic and macroscopic characteristics over time, under different conditions.
Molecular Dynamics Design and understanding of chemical, polymer or biomolecular systems. Topics include pollution, development of new materials like Kevlar, understanding quantum chemical processes for improved energy conversion systems and semiconductors, as well as classical molecular dynamics.
Oceanography Study of ocean processes and dynamics. Transport of energy (such as subsea ocean currents), waves, salinity, density and heat. Ocean interaction with the atmosphere and ice. The role of bathymetry (underwater topography) in ocean processes. May include the impact of physical ocean properties on ecosystems.
Seismology and Tsunami Forecasting Inference of the Earth’s sub-surface properties from seismic observations. Examination of how different initial conditions impact outcomes from earthquake and tsunami events, such as earthquake severity and tsunami run-up.
Space/Astrophysics Formation and lifecycles of galaxies and stars. Models to predict the velocity, density and magnetic field of the solar wind, which impacts global communications and radar systems. Interaction of the ionosphere with the troposphere. Aurora modeling.
Systems and Complexity Focus on how complex systems interact with their environment, and with each other. How does the behavior of these systems change as a result of different input? Complex systems often have non-linear properties, where small perturbations can result in large variations in behavior.
Weather and Atmospheric Science Numerical simulations and real-time forecasts of atmospheric phenomena and impacts. Temporal scales from minutes to days, including both highly localized and regional scales. Interaction of the atmosphere with the Sun, oceans and land.
Other This includes computational time and projects supporting ARSC’s Undergraduate Research Challenge Program, HPC training and workshops for UAF and visiting faculty.
Building capacity for cyberinfrastructure at UAF Bold moves by ARSC are bringing additional supercomputing resources to the community while providing for the sustainability of the Center. ARSC employees are at the core of the Center’s success. Center staff are recognized nationally by DoD and the academic supercomputing community for providing expert support to users of supercomputing technology for research.
DoD Capacity The High Performance Computing Modernization Program (HPCMP) has demonstrated its confidence in the expertise of ARSC by approving the acquisition of a new DoD supercomputer and data storage system to be installed in the $72 million Petascale Computing Facility at the University of Illinois National Center for Supercomputing Applications (NCSA) and remotely run by ARSC. The partnership with NCSA, arguably the best and, by Summer 2011, the largest academic supercomputing center in the nation, provides significant growth and sustainment opportunities for ARSC in support of DoD research, data storage and security.
Academic Capacity In partnership and collaboration with UAF researchers, ARSC will install a new supercomputer and a new data storage system in the Butrovich Building at the University of Alaska in Spring 2010. The opportunity for ARSC to focus efforts on supporting academic computational research needs at UAF has been jumpstarted by the $6 million NSF-funded PACMAN project (Pacific Area Climate Monitoring and Analysis Network). The cyberinfrastructure systems supporting PACMAN will be used to investigate the effect of climate change on freshwater resources in Alaska and Hawaii. Those and related systems will also support the 286 UAF researchers in 71 projects whose work currently would not be possible without the use of ARSC supercomputing resources.
“Cyberinfrastructure consists of computing systems, data storage systems, advanced instruments and data repositories, visualization environments, and people, all linked together by software and high performance networks to improve research productivity and enable breakthroughs not otherwise possible.” Testimony by Indiana University Associate Dean of Research Technology Craig Stewart before the U.S. House Committee on Science and Technology, July 2008
ARSC is one of 60 member institutions in the Coalition for Academic Scientific Computation (CASC), a nonprofit organization of supercomputing centers, research universities and federal laboratories dedicated to advocating the use of the most advanced computing technology to accelerate scientific discovery for national competitiveness, global security and economic success, and to developing a diverse and well-prepared 21st century workforce. ARSC Director Frank Williams is a former national chair of CASC.
“Problems of national need will not be solved unless more institutions and more professors can use supercomputers for their toughest problems.” —National Science Foundation “Cyberinfrastructure Vision for 21st Century Discovery,” 2007
PHOTO CREDITS, COVER (top-bottom) • An illustration of the UAF Eulerian Parallel Polar Ionosphere Model (EPPIM), a computer model of the northern polar atmosphere for understanding and predicting space weather events. • ARSC HPC Specialist Anton Kulchitsky provides technical training to participants in the 2009 ARSC Undergraduate Research Challenge. • ARSC graduate students Chao Peng and Siyuan Wang developed this model of an Athabascan village. BACK (top-bottom) • Three ES Login nodes have been added to Pingo, ARSC’s 3,456-processor Cray XT5. • ARSC Consultant Don Bahls provides assistance during the 2007 Engineering Day at the University of Alaska Fairbanks.
UAF is an AA/EO employer and educational institution. 02.11.10