Tuesday, October 5, 2021

Syukuro Manabe on Supercomputers and the Greenhouse Effect (Datornytt, 1990)

(Originally published in the Swedish computer newspaper Datornytt in the summer of 1990)

Supercomputers Track the Greenhouse Effect

Syukuro Manabe

The climate researcher Syukuro Manabe doesn’t eat a lot. His appetite for data, however, knows hardly any limits, which is why it helps to have access to a Cray Y-MP/832. But will this $25 million supercomputer be able to erase the question marks surrounding the greenhouse effect?

Since the 1800s, mankind has pumped ever more carbon dioxide, methane and hydrocarbons into the atmosphere, trapping heat that would otherwise have escaped into space. This is the so-called greenhouse effect, which many researchers fear will devastate the Earth’s climate.

There is an intense debate and growing political interest around the greenhouse effect. Is it real, and if so, how fast is global warming proceeding? Researchers are increasingly relying on supercomputers to answer these questions.

“I can’t imagine modern atmospheric and climate science without computers,” says Dr. Syukuro Manabe when I meet him in Princeton, New Jersey, in late May.

He has worked on advanced computer models to study the climate for about a quarter century. In 1966, he and Richard T. Wetherald managed to use computer simulations to demonstrate that the Earth’s average temperature would rise by 4 degrees Celsius (7.2 degrees Fahrenheit) if the level of CO2 (carbon dioxide) in the Earth’s atmosphere doubled (compared to the pre-industrial period).

Today, Manabe is head of greenhouse research at the Geophysical Fluid Dynamics Laboratory (GFDL) in Princeton, New Jersey. It is one of the leading centers for climate research in the U.S. and is part of the National Oceanic and Atmospheric Administration, which in turn belongs to the U.S. Department of Commerce.

The GFDL is not as large as the National Center for Atmospheric Research (NCAR) in Boulder, Colorado, but being small can be a benefit.

“I had over 100 hours of supercomputer time left over last month,” says Ronald J. Stouffer, who is one of a handful of researchers in Manabe’s group. Since mid-May, they have had the privilege of hooking up their Sun workstations to one of the largest Cray computers. For the moment, they have all the computing power they could wish for. If there is anything they do wish for, it is more storage capacity and better tools for analyzing the data.

“Things are going to get better,” Lou Umscheid reassures us. He is responsible for the computer systems at GFDL. At the moment, they are evaluating a Silicon Graphics graphics workstation.

“We realize that we need the most advanced technology possible to analyze the data coming out of the supercomputer,” he says. “The problem is that we don’t have software for our particular applications. We need to do quite a bit of adaptation work.”

“It will take time before we have three-dimensional graphics, color and so on, but there are still a lot of things we can do with our Sun workstations,” Umscheid says.

Personally, Syukuru Manabe doesn’t have a lot of direct contact with computers. He is the only one on the team who doesn’t have a workstation, and says that he mostly walks around bouncing questions off the members of the group. He says that people once thought he was going too far in using computers and building overly complicated climate models. Today it’s the other way around.

“It’s not I who has changed,” he says.

What role do supercomputers play in our understanding of climate change?

“We are trying to track the atmospheric circulation, which is ruled by physical laws, using hydrodynamic equations. We need supercomputers to be able to solve these equations as exactly as possible,” he says.

Researchers at GFDL use a theoretical model that describes the development of the atmosphere through an imagined grid covering the Earth. Flows into and between the grid’s climate boxes determine the global climate. This is called hydrodynamics.
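
As a rough illustration of the grid-box idea (a toy sketch of my own in Python, not the GFDL model, and one that ignores spherical geometry, the poles and all real physics), a field such as temperature can be stored in a latitude-longitude array and updated from the exchange between neighboring boxes:

    # A toy grid-box sketch (my own illustration, not GFDL's model): the globe is
    # divided into boxes, and a field such as temperature is updated from the
    # exchange between neighboring boxes. Real models add spherical geometry,
    # proper treatment of the poles, winds, radiation and much more.
    import numpy as np

    n_lat, n_lon = 45, 90                    # a coarse latitude-longitude grid
    temp = np.full((n_lat, n_lon), 288.0)    # temperature field in Kelvin
    temp[:5, :] -= 30.0                      # a crude cold band for demonstration

    def step(field, k=0.1):
        """One exchange step: every box relaxes toward its four neighbors."""
        north = np.roll(field, -1, axis=0)
        south = np.roll(field, 1, axis=0)
        east = np.roll(field, -1, axis=1)
        west = np.roll(field, 1, axis=1)
        return field + k * (north + south + east + west - 4 * field)

    for _ in range(100):
        temp = step(temp)

    print(round(float(temp.mean()), 2))      # the exchange conserves the global mean

Because every box is updated by exactly the same operation, the whole grid can be processed as one array operation, which is the property Manabe points to below when he talks about parallel processors.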


Manabe says that the next goal in building the model is to integrate the development of the atmosphere with that of the world’s oceans. The latter have a huge impact on the climate since they absorb, redistribute (via surface and deep-water currents), and give off large amounts of heat.

“The faster the supercomputers we have, the higher the resolution we can use in our models, which gives us more precise answers,” he says.

So far, they have used grids where each box is 400 by 400 kilometers. With the new Cray computer, it is possible to use a grid twice as fine, with 200 by 200 kilometer boxes as the base.
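
To get a feel for what that change in resolution means, here is a back-of-the-envelope calculation (my own arithmetic, not from the interview): halving the box size roughly quadruples the number of boxes per atmospheric layer, and since a finer grid also needs a shorter time step, the total computing cost grows even faster than the box count suggests.

    # Back-of-the-envelope arithmetic (mine, not from the article): halving the
    # box size quadruples the number of boxes covering the Earth per layer.
    EARTH_SURFACE_KM2 = 510e6        # approximate surface area of the Earth

    for box_km in (400, 200):
        n_boxes = EARTH_SURFACE_KM2 / box_km**2
        print(f"{box_km} km boxes: about {n_boxes:,.0f} per atmospheric layer")
    # 400 km boxes: about 3,188 per layer
    # 200 km boxes: about 12,750 per layer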

“We are using exactly the same mathematical operation for each atmospheric box, which makes this an ideal situation for computers with parallel processors,” Manabe says. “That is why we will probably switch to massively parallel computers in the future.”

He is thinking of machines like Thinking Machines’ Connection Machine, which has 65,000 simple processors but costs only a fraction of what a Cray does.

Why haven’t you already done that?

“We want other people to be the guinea pigs so we can learn from their experiences! We don’t want to make the mistakes ourselves. Another reason is that the Cray Y-MP is more versatile.”

“More powerful computers give us a better understanding of what happens (in the model),” he says. “Instead of conducting one experiment, you can do ten. If you have eight processors, you can run eight experiments concurrently. You can do sensitivity studies by varying different parameters. You can study what role different parameters play for the climate, which helps you understand feedback in the system.”
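
As a hypothetical illustration of the kind of sensitivity study Manabe describes (the function, parameter names and numbers below are invented for the sketch, and a real model run would take hours rather than microseconds), the same toy experiment can be launched concurrently with different parameter values:

    # A hypothetical sketch of a sensitivity study: run the same toy "experiment"
    # concurrently with different parameter values and compare the outcomes.
    # The function, parameter names and numbers are invented for illustration.
    from concurrent.futures import ProcessPoolExecutor

    def run_experiment(co2_multiplier):
        """Stand-in for a full model run; returns a made-up mean temperature."""
        baseline, sensitivity = 288.0, 4.0           # illustrative values only
        return baseline + sensitivity * (co2_multiplier - 1.0)

    if __name__ == "__main__":
        multipliers = [1.0, 1.25, 1.5, 1.75, 2.0, 2.5, 3.0, 4.0]   # eight runs
        with ProcessPoolExecutor(max_workers=8) as pool:           # eight workers
            results = list(pool.map(run_experiment, multipliers))
        for m, t in zip(multipliers, results):
            print(f"CO2 x{m}: mean temperature {t:.1f} K")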

The science is moving forward thanks to more powerful computers, but Manabe warns against excessive belief in what computers can do.

“People ask if we can improve the precision of the model with more powerful computers. Yes, I say, but it would be highly misleading to say that it is the only thing we can do,” he says.

The climate model Manabe’s group is working on consists of two parts. One consists of hydrodynamic equations that describe known physical relationships. Here, better computers generally deliver better results. The other part contains components that are less well understood or cannot as easily be described in terms of physical laws. There are many kinds of feedback between the Earth, the atmosphere, and the oceans, presenting very difficult problems for the researchers and their computer models.

“For example, how does a cumulus cloud affect the state of the atmosphere? How do you treat the heat exchange between the surface of the Earth and the atmosphere? The Earth is covered by trees and land, which makes the whole thing very complicated,” Manabe says.

“Such problems cannot be solved through more computer power. We must conduct field experiments, theoretical studies, diagnostic studies, to figure out how to describe these processes in our models,” Manabe says.

“Uncertainties about forecasts of the future climate will remain for a long time,” he says.

Forever?

“Yes! Assume that you have a perfect model. But how can you know that it is perfect, when every model is a simplified version of reality? You don’t know what impact that simplification has on your forecast. How should we assess the effect of our ignorance about those parts where we don’t understand the laws of physics well enough?”

He warns against the tendency to respond to uncertainty by adding more and more supporting assumptions to the model.

“The computer is powerful, but it is a double-edged sword. It can hurt us! The trend today is to stuff the model with as much as possible now that the computers are so powerful, without knowing the consequences.”

“I worry about the uncontrolled march towards ‘complistic’ models. Simplistic models are also dangerous, but right now I am more concerned about ‘complistic’ people.”

“People put in everything they can see through the window,” Manabe says. “You can’t do it with the hydrodynamic models, but you can put in lots of different assumptions when dealing with processes on the ground, cumulus clouds and other things that we don’t know how to handle. You create very complicated algorithms, but who knows how many programming errors there are in the model?

“People think you can do experiments where you include everything, as if you were throwing ingredients onto a pizza. I don’t like such ideas,” he says.

“Then they have the model make a random prediction and claim that this is the most sophisticated, complicated model in the world. Hence it must be right. But the main thing is to understand what is going on in the model. Can you trust it?”

“Nobody should believe that the world’s most complicated model will let us go to Mount Sinai like another Moses and get the answer straight from God,” he says sarcastically.

Is there a risk the strong political interest will corrupt the greenhouse research?

“Yes, people will feel that they are in such a hurry that they don’t have time to try to understand what they are doing. People are traveling too much (to give speeches at conferences).”

“We have such a beautiful computer. For it to be worth the effort, we had better sit at our desks and do our job.”

Hans Sandberg


Dr. Manabe gets the call from the Royal Swedish Academy of Sciences.


