We hear a lot about how climate change will alter land, sea and ice. But how will this affect the clouds?
“Low clouds could dry out and shrink like ice caps,” explains Michael Pritchard, professor of Earth system science at UC Irvine. “Or they could thicken and become more reflective.”
These two scenarios would result in very different future climates. And that, says Pritchard, is part of the problem.
“If you ask two different climate models what the future will look like when we add a lot more CO2, you get two very different answers. And the main reason for that is the way clouds are included in the climate models.”
No one denies that clouds and aerosols – bits of soot and dust that nucleate cloud droplets – are an important part of the climate equation. The problem is that these phenomena occur on time and length scales that current models come nowhere near resolving. They are therefore included in the models through a variety of approximations.
Analyses of global climate models consistently show that clouds are the greatest source of uncertainty and instability.
RE-TOOLING COMMUNITY CODES
While America’s most advanced global climate models struggle to approach 4-kilometer global resolution, Pritchard believes models need a resolution of at least 100 meters to capture the small-scale turbulent eddies that form shallow cloud systems – 40 times more resolved in every direction. By Moore’s Law, it could take until 2060 before the computing power is available to capture that level of detail.
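To make that timeline concrete: going from 4 kilometers to 100 meters means 40 times finer spacing in each horizontal direction, and a finer grid also forces a shorter timestep. Here is a back-of-the-envelope sketch, assuming compute cost scales like the cube of the refinement factor and a Moore’s Law doubling every two years – both illustrative assumptions, not figures from the article:

```python
import math

# Illustrative back-of-the-envelope estimate (assumptions, not article figures).
refinement = 4000 / 100        # 4 km -> 100 m: 40x finer in each direction
# Cost grows with both horizontal dimensions plus the proportionally shorter
# timestep required for numerical stability: roughly refinement**3.
cost_factor = refinement ** 3  # ~64,000x more compute

doublings = math.log2(cost_factor)  # ~16 doublings of compute needed
years = doublings * 2               # Moore's Law: ~2 years per doubling
print(f"~{cost_factor:,.0f}x compute, ~{doublings:.0f} doublings, ~{years:.0f} years")
# Starting from the early 2020s, that lands in the mid-to-late 2050s --
# consistent with the article's "until 2060" estimate.
```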
Pritchard is attempting to fill this glaring gap by splitting the climate modeling problem into two parts: a coarse-grained, lower-resolution (100 km) planetary model and many small patches with 100- to 200-meter resolution. The two simulations run independently, then exchange data every 30 minutes to make sure neither drifts off course or becomes unrealistic.
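In code terms, that coupling looks roughly like the loop below. This is a toy sketch: the class names, the scalar “state,” and the physics stand-ins are invented for illustration; only the two-way exchange at a 30-minute interval comes from the description above.

```python
# Toy sketch of a multi-scale modeling framework (MMF) coupling loop.
# All names and the single-number "state" are hypothetical illustrations,
# not actual CESM/E3SM code.

COUPLING_INTERVAL_S = 30 * 60  # exchange data every 30 simulated minutes

class PlanetaryModel:
    """Coarse global model (~100 km columns); each column's state is a toy scalar."""
    def __init__(self, n_columns):
        self.state = [290.0] * n_columns          # e.g. column-mean temperature (K)
    def step(self, dt):
        # Stand-in for coarse-grid dynamics: slow uniform drift.
        self.state = [s + 1e-5 * dt for s in self.state]

class CloudPatch:
    """Embedded high-resolution patch (~100-200 m) inside one coarse column."""
    def __init__(self, col):
        self.col, self.local = col, 290.0
    def step(self, dt, forcing):
        # Stand-in for cloud-resolving physics: relax toward the large-scale
        # forcing while producing a tendency to hand back upscale.
        self.local += 0.1 * (forcing - self.local)
        return (self.local - forcing) / dt

def run(n_intervals, n_columns=4):
    planet = PlanetaryModel(n_columns)
    patches = [CloudPatch(c) for c in range(n_columns)]
    for _ in range(n_intervals):
        planet.step(COUPLING_INTERVAL_S)           # both sides run independently...
        for p in patches:
            tend = p.step(COUPLING_INTERVAL_S, planet.state[p.col])
            # ...then exchange data so neither simulation drifts unrealistically.
            planet.state[p.col] += tend * COUPLING_INTERVAL_S
    return planet.state

print(run(48))  # one simulated day at 30-minute coupling
```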
His team reported the results of these efforts in the Journal of Advances in Modeling Earth Systems in April 2022. The research is supported by grants from the National Science Foundation (NSF) and the Department of Energy (DOE).
This method of climate simulation, called the “multi-scale modeling framework (MMF),” has been around since 2000 and has long been an option in the Community Earth System Model (CESM), developed at the National Center for Atmospheric Research. The idea has recently experienced a renaissance at the Department of Energy, where researchers from the Energy Exascale Earth System Model (E3SM) have pushed it toward new computational frontiers as part of the Exascale Computing Project. Pritchard’s co-author, Walter Hannah of Lawrence Livermore National Laboratory, is helping to lead this effort.
“The model gets around the most difficult problem – modeling the entire planet,” Pritchard explained. “It has thousands of small micromodels that capture things like the realistic formation of shallow clouds that only emerge in very high resolution.”
“The multi-scale modeling framework approach is also ideal for DOE’s upcoming GPU-based exascale computers,” said Mark Taylor, chief computational scientist for DOE’s Energy Exascale Earth System Model (E3SM) project and researcher at Sandia National Laboratories. “Each GPU has the power to run hundreds of micromodels while matching the throughput of the low-resolution, coarse-grained planetary model.”
Pritchard’s research and new approach are made possible in part by the NSF-funded Frontera supercomputer at the Texas Advanced Computing Center (TACC). The fastest academic supercomputer in the world, Frontera lets Pritchard run his models at time and length scales accessible on only a handful of systems in the United States and test their potential for cloud modeling.
“We have developed a way for a supercomputer to best distribute the work of simulating cloud physics across different parts of the world that deserve different resolutions… so that it runs much faster,” the team wrote.
Simulating the atmosphere in this way gives Pritchard the resolution needed to capture the physical processes and turbulent eddies involved in cloud formation. The researchers showed that the multi-model approach produced no unwanted side effects, even where patches using different cloud-resolving grid structures met.
“We were happy to see that the differences were small,” he said. “This will provide new flexibility to all climate model users who want to focus high resolution on different locations.”
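One way to picture that flexibility is as a scheduling problem: finer cloud patches where the science demands them, coarser ones elsewhere, with the uneven work spread across compute resources. Here is a minimal sketch in which the regions, resolutions, and cost model are all illustrative assumptions, not details from the paper:

```python
# Toy sketch of the regionalization idea: give each region its own cloud-patch
# resolution, then spread the resulting (uneven) work across workers.

REGION_RESOLUTION_M = {
    "tropical_pacific": 100,  # region of scientific interest: finest grid
    "midlatitudes": 200,
    "polar": 400,             # cheaper, coarser patches elsewhere
}

def relative_cost(res_m, base_m=400):
    # Halving the grid spacing roughly doubles cost per horizontal
    # dimension and shrinks the timestep: ~ (base/res)**3.
    return (base_m / res_m) ** 3

costs = {r: relative_cost(m) for r, m in REGION_RESOLUTION_M.items()}

def assign(costs, n_workers):
    """Greedy balancing: hand the next-priciest region to the least-loaded worker."""
    loads = [0.0] * n_workers
    plan = {}
    for region, cost in sorted(costs.items(), key=lambda kv: -kv[1]):
        w = loads.index(min(loads))
        plan[region] = w
        loads[w] += cost
    return plan, loads

plan, loads = assign(costs, n_workers=2)
print(plan, loads)  # the 100 m region dominates the budget, as expected
```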
Unraveling and reconnecting the different scales of the CESM model was a challenge Pritchard’s team overcame. Another involved reprogramming the model so that it could take advantage of the ever-increasing number of processors available on modern supercomputing systems.
Pritchard and his team – UCI postdoctoral researcher Liran Peng and University of Washington researcher Peter Blossey – tackled this problem by breaking the internal domains of CESM’s embedded cloud models into smaller parts that could be solved in parallel using MPI, or Message Passing Interface – a way of exchanging messages between multiple computers running a parallel program across distributed memory – and orchestrating these calculations to use many more processors.
“It already seems to deliver a fourfold speedup with great efficiency. That means I can be four times as ambitious for my cloud-resolving models,” he said. “I’m really optimistic that this dream of regionalization, combined with the MPI decomposition, leads to a totally different landscape of what’s possible.”
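The decomposition idea itself is standard MPI practice: split each embedded domain into slabs, advance each slab on its own rank, and trade boundary “halo” cells every step. Below is a minimal mpi4py sketch of that pattern; the 1-D field and smoothing update are toy stand-ins for the actual cloud solver, and only the decompose-and-exchange pattern reflects the approach described above.

```python
# Minimal mpi4py sketch of slab decomposition with halo exchange.
# Run with, e.g.: mpirun -n 4 python halo_sketch.py
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n_local = 64                        # interior cells owned by this rank
u = np.random.rand(n_local + 2)     # +2 ghost cells for neighbor data
left = rank - 1 if rank > 0 else MPI.PROC_NULL
right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

for _ in range(100):
    # Trade one-cell halos with neighbors (a no-op at the domain edges).
    comm.Sendrecv(u[1:2], dest=left, recvbuf=u[-1:], source=right)
    comm.Sendrecv(u[-2:-1], dest=right, recvbuf=u[0:1], source=left)
    # Toy local update standing in for the cloud-physics solve on this slab.
    u[1:-1] = 0.5 * u[1:-1] + 0.25 * (u[:-2] + u[2:])

print(f"rank {rank}: slab mean = {u[1:-1].mean():.3f}")
```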
MACHINE LEARNING CLOUDS
Pritchard sees another promising approach in machine learning, which his team has been exploring since 2017. “I was very provoked by how well a dumb sheet of neurons can reproduce these partial differential equations,” Pritchard said.
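The appeal is easy to demonstrate in miniature: a small feed-forward network can learn a nonlinear state-to-tendency mapping from examples alone. In the sketch below the data are synthetic and the “truth” function is a toy stand-in; in the real setting, training pairs would come from cloud-resolving simulation output.

```python
# Minimal sketch of the emulation idea: train a small neural network to map a
# coarse model column's state to the tendencies a cloud-resolving model would
# produce. The synthetic data and toy "truth" function are assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(20_000, 8))    # stand-in column state (T, q, winds, ...)
# Toy nonlinear stand-in for the expensive cloud-physics tendency.
y = np.tanh(X @ rng.normal(size=(8, 2))) + 0.1 * X[:, :2]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=300, random_state=0)
net.fit(X_tr, y_tr)                 # the "dumb sheet of neurons"
print("held-out R^2:", net.score(X_te, y_te))
```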
In a paper submitted last fall, Pritchard, lead author Tom Beucler of UCI, and others describe a machine learning approach that successfully predicts atmospheric conditions even in climate regimes it was never trained on, where others have struggled to do so.
This “climate-invariant” model incorporates physical knowledge of climate processes into the machine learning algorithms. Their study – which used Stampede2 at TACC, Cheyenne at the National Center for Atmospheric Research, and Expanse at the San Diego Supercomputer Center – showed that the machine learning method can maintain high accuracy across a wide range of climates and geographies.
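One flavor of the climate-invariant trick: instead of feeding the network raw variables, rescale them into quantities whose distributions stay put as the climate shifts – relative humidity, for instance, rather than specific humidity. The sketch below uses the standard Tetens saturation formula; treating this single rescaling as representative of the method is an illustrative simplification, not the paper’s full recipe.

```python
# Sketch of a physically based input rescaling for "climate-invariant" ML.
import numpy as np

def saturation_vapor_pressure(T_kelvin):
    """Tetens approximation, in Pa, for temperature in K."""
    T_c = T_kelvin - 273.15
    return 610.94 * np.exp(17.625 * T_c / (T_c + 243.04))

def to_climate_invariant(T, q, p):
    """Map specific humidity q (kg/kg) at temperature T (K), pressure p (Pa)
    to relative humidity: roughly bounded in [0, 1] in any climate, whereas
    raw q shifts strongly as the planet warms."""
    e_sat = saturation_vapor_pressure(T)
    q_sat = 0.622 * e_sat / (p - 0.378 * e_sat)  # saturation specific humidity
    return q / q_sat

# A warmer climate shifts q's raw distribution but barely moves the rescaled one:
for T in (285.0, 293.0):
    q = 0.8 * 0.622 * saturation_vapor_pressure(T) / 101325.0  # ~80% RH air
    print(f"T={T} K  q={q:.4f}  RH={to_climate_invariant(T, q, 101325.0):.2f}")
```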
“If machine learning of high-resolution cloud physics were successful, it would transform everything about how we do climate simulations,” Pritchard said. “I’m interested to see how reproducibly and reliably the machine learning approach can succeed in complex environments.”
Pritchard is well placed to do so. He serves on the executive committee of the NSF Center for Learning the Earth with Artificial Intelligence and Physics, or LEAP – a new Science and Technology Center funded by the NSF in 2021 and led by his longtime collaborator on this topic, Professor Pierre Gentine. LEAP brings together climate and data scientists to narrow the range of uncertainty in climate modeling, providing more accurate and actionable climate projections with immediate societal impact.
“All the research I’ve done before is what I would call ‘limited throughput,’” Pritchard said. “My job was to produce simulations spanning 10 to 100 years. That constrained all of my grid choices. However, if the goal is to produce short simulations to train machine learning models, that’s a different landscape.”
Pritchard hopes to soon use the results of his 50-meter embedded models to begin building a large training library. “It’s a really nice dataset to do machine learning on.”
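Assembling such a library amounts to logging, for every coupling interval, the coarse-scale state each patch saw (the input) and the tendencies it returned (the target). A minimal sketch, with invented array shapes and file layout:

```python
# Sketch of harvesting supervised training pairs from embedded high-res runs.
# Field shapes and the file name are hypothetical illustrations.
import numpy as np

def harvest(snapshots):
    """snapshots: iterable of (coarse_state, patch_tendency) pairs, one per
    coupling interval; returns stacked (X, y) arrays for model training."""
    X, y = zip(*snapshots)
    return np.stack(X), np.stack(y)

# Toy stand-in for one short simulation's worth of coupling intervals.
rng = np.random.default_rng(1)
snaps = [(rng.normal(size=8), rng.normal(size=2)) for _ in range(48)]
X, y = harvest(snaps)
np.savez("training_library_chunk.npz", X=X, y=y)  # one shard of the library
print(X.shape, y.shape)
```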
But will AI mature fast enough? Time is running out to understand the fate of the clouds.
“If these clouds shrink, as the ice caps will, exposing darker surfaces, that will amplify global warming and all the hazards that come with it. But if they do the opposite of the ice caps and thicken, which they could, that’s less dangerous. Some have estimated this to be a multi-trillion-dollar issue for society. And it has been in question for a long time,” Pritchard said.
Simulation by simulation, federally funded supercomputers are helping Pritchard and others approach the answer to this critical question.
“I’m torn between genuine gratitude for the U.S. national computing infrastructure, which is so amazing at helping us develop and run climate models,” Pritchard said, “and the feeling that we need a Manhattan Project level of new federal funding and interagency coordination to really solve this problem.”
—
The research was funded by the National Science Foundation (NSF) Climate and Large-Scale Dynamics program under grants AGS-1912134 and AGS-1912130; and under the auspices of the US Department of Energy by Lawrence Livermore National Laboratory under contract DE-AC52-07NA27344. The research used computing resources from the Texas Advanced Computing Center and the Extreme Science and Engineering Discovery Environment (XSEDE), which is supported by NSF grant ACI-1548562.