SCIENCE
Data-intensive supercomputing, modeling, and the state of climate science
Data accessibility, rather than computing speed, may become the greater challenge in scientific computing in coming years. With data-intensive supercomputing taking center stage during the tenth biennial Computing in Atmospheric Science (CAS) workshop, more than 100 scientists, industry experts, and computing professionals from 13 countries discussed challenges and new approaches to advance climate and weather research in the coming exascale era and beyond.
The workshop, held in Annecy, France, was organized and hosted by the National Center for Atmospheric Research (NCAR), and provided a venue for scientists, facilities managers, computing systems experts, and industry vendors to discuss future needs and craft creative solutions in scientific supercomputing.
“Managing data has become as important as computing speed when it comes to scientific computing, and NCAR has architected its future cyberinfrastructure to be installed in Wyoming around this need,” said Al Kellie, director of NCAR’s Computational and Information Systems Laboratory. “Exploring these options now is key to ensuring that scientific research can proceed at greater haste down the road.”
Presenters at the workshop gave varying estimates for the advent of exascale supercomputing, when supercomputers will be capable of 10^18 floating-point operations per second. Some predicted that it would happen as early as 2017, others closer to 2020. Estimates of the power needed to run such a system ranged upward from 20 megawatts (MW). Electricity consumption and its associated costs were identified as limiting factors for the future of scientific computing.
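To put those figures in perspective, the short Python sketch below works through the arithmetic they imply: the energy efficiency an exascale system would need at a 20 MW power budget, and a rough annual electricity bill. The 10^18 operations-per-second target and the 20 MW figure come from the workshop estimates above; the electricity price and year-round operation are illustrative assumptions, not reported values.

```python
# Back-of-the-envelope arithmetic for the exascale figures discussed above.
# The 20 MW power draw reflects the workshop estimates; the electricity rate
# and continuous year-round operation are illustrative assumptions.

EXAFLOPS = 1e18          # target: 10^18 floating-point operations per second
POWER_W = 20e6           # assumed sustained power draw: 20 MW
RATE_PER_KWH = 0.10      # assumed electricity price, USD per kilowatt-hour

# Efficiency the hardware must reach: billions of operations per watt
efficiency_gflops_per_w = (EXAFLOPS / 1e9) / POWER_W

# Energy consumed and paid for over a year of continuous operation
annual_kwh = (POWER_W / 1e3) * 24 * 365
annual_cost_usd = annual_kwh * RATE_PER_KWH

print(f"Required efficiency: {efficiency_gflops_per_w:.0f} GFLOPS per watt")
print(f"Annual energy use:   {annual_kwh:,.0f} kWh")
print(f"Annual energy cost:  ${annual_cost_usd:,.0f}")
```

At these assumed numbers the sketch yields roughly 50 GFLOPS per watt and an electricity bill in the tens of millions of dollars per year, which is why power consumption dominated the discussion.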
In efforts to reduce operating costs and energy requirements, presenters discussed new, experimental technologies that could limit power use, new architectures to improve efficiency, and alternatives like cloud computing and Internet-based science gateways.
“To help the user with the tuning of the application, to increase the number of scientific results that we can squeeze out on a per megawatt basis, somehow making life easier for the researchers is what is important,” said Thomas Ludwig, director of the German Climate Computing Centre, DKRZ, and professor of scientific computing at the University of Hamburg. “Here, we get input from climate researchers about what they need, with models for example, in computational resources, and we see how we can provide for that.”
In addition to discussions of data-intensive supercomputing characteristics, presenters touched on a number of other topics related to climate and weather modeling, computational benchmarking methods, and the uses of GPU-based computing.
Gerald Meehl, an NCAR scientist, presented an overview of the Coupled Model Intercomparison Project (CMIP). Currently in its fifth phase, it is the most ambitious coordinated multi-model climate change experiment ever attempted. Approximately 20 modeling groups from around the world are using high-end supercomputers to complete a comprehensive set of climate change simulations that will be used to advance scientists’ knowledge of climate variability and climate change, as well as for consideration in the Fifth Assessment Report of the Intergovernmental Panel on Climate Change.
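The value of a coordinated experiment like CMIP comes from being able to compare and combine output across independent models. The sketch below shows that idea in miniature, computing a multi-model ensemble mean and spread; the model names and temperature-anomaly values are hypothetical placeholders, not CMIP5 results.

```python
# Minimal sketch of the kind of multi-model analysis CMIP enables:
# combining output from several modeling groups into an ensemble statistic.
# All names and values below are hypothetical placeholders.

from statistics import mean, stdev

# Hypothetical end-of-century global-mean temperature anomalies (deg C)
# for a single emissions scenario, one value per modeling group
model_anomalies = {
    "model_a": 2.1,
    "model_b": 2.6,
    "model_c": 1.9,
    "model_d": 2.4,
}

ensemble_mean = mean(model_anomalies.values())
ensemble_spread = stdev(model_anomalies.values())

print(f"Multi-model mean anomaly:     {ensemble_mean:.2f} deg C")
print(f"Inter-model spread (1 sigma): {ensemble_spread:.2f} deg C")
```

In practice, CMIP analyses work on full gridded model output rather than single numbers, but the principle of aggregating across roughly 20 independent models is the same.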
“It is critical that we inform ourselves about technological solutions to the challenges of exascale computing, so that we can ensure our scientific computing develops on a complementary course,” said Anke Kamrath, who heads the operations and services division of NCAR’s Computational and Information Systems Laboratory. “What we are talking about here in highly technical terms translates into improved weather forecasts, wiser long-term resource planning, and increased safety for everyone.”