Even with supercomputers, hurricane prediction is tough
Thousands of miles from the rain and wind of Hurricane Ivan, a model of the storm swirls in the memory and processors of a supercomputer that predicts its likely course and strength. Working through complex mathematical equations that describe the atmosphere's behavior across the globe, hundreds of microprocessors perform billions of calculations each second on observations collected by sensors dropped from aircraft and by other monitors.

The result, after more than an hour of number crunching at the U.S. Navy's Fleet Numerical weather-computing center, is just one of many predictions generated by supercomputers around the world that help frame such life-or-death decisions as whether to order evacuations and where to safely set up shelters.

The programs that model the atmosphere and the high-performance computers that run them have revolutionized weather forecasting, improving our ability to predict the paths of hurricanes and fluctuations in their intensity. In fact, the hurricane track error in the National Hurricane Center's three-day forecasts has been cut in half since 1998, and since last year the center's meteorologists have been issuing five-day forecasts with increasing confidence and accuracy.

"It's one of the most remarkable improvements in forecasting that I've seen in my career," said Russ Elsberry, professor of meteorology at the Naval Postgraduate School in Monterey.

The forecasts aren't perfect, particularly when it comes to long-term tracking and intensity. A few days ago, Ivan was thought to be headed for the Florida peninsula. Last month, Hurricane Charley's intensity unexpectedly jumped just before the storm slammed into Punta Gorda, Fla.

Still, emergency officials were able to order 2 million people to evacuate to higher ground once Ivan turned toward the Gulf Coast, thanks in part to the models run by supercomputers and the confidence forecasters have in the results.

Improving predictions is not just a matter of buying more hardware, though that helps. The numerical models themselves undergo continuous revision as researchers learn more about the atmosphere and improve the accuracy of their algorithms. The models -- actually complicated software written in a computer language called Fortran -- attempt to account for everything happening in the atmosphere on a global basis.

"You don't just sit down and produce one overnight," said Mike Clancy, chief scientist at the Navy's Fleet Numerical Meteorology and Oceanography Center in Monterey.

The programs incorporate laws familiar to anyone who passed a high school physics course, then factor in the chaos of atmospheric development over time -- an immensely complicated process.

To organize it all, a three-dimensional grid -- a virtual net -- is layered over the Earth. At each point where the grid lines intersect, initial observations of air pressure, humidity, temperature, wind speed and other variables are processed by the computer to figure out how they will change at fixed points in the future.

Just as a 5-megapixel digital camera depicts reality more accurately than a 1-megapixel device, a higher-resolution grid captures a sharper picture of the atmosphere and helps produce more accurate forecasts. That, of course, requires more computing power.

"You're always trying to make the resolution higher and higher. What limits that is the power of the supercomputer," Clancy said. "At the same time, you have to do these computations in a reasonable amount of time."

Still, the initial observations -- usually extrapolated from readings made by instruments called dropsondes, which hurricane-hunter aircraft crews toss into storms -- are often just approximations themselves, a factor that can make the results less reliable. To increase confidence, forecasters run several different types of models, each of which produces different results. Emergency planners can then make their own judgments.
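To make the grid-and-time-step scheme described a few paragraphs above concrete, here is a deliberately tiny sketch in Python. The grid size, wind values and time step are invented for illustration; real operational models are written in Fortran, solve far more complete physics and run on three-dimensional global grids. Only the structure -- fields sampled at grid intersections, advanced through many small time steps -- reflects the approach the forecasters describe.

    import numpy as np

    # Invented toy parameters: a coarse 2-D "global" grid.
    LAT_POINTS, LON_POINTS = 90, 180  # grid resolution; finer = sharper, slower
    DX = 111_000.0                    # grid spacing in meters (about 1 degree)
    DT = 600.0                        # time step: 10 minutes

    # Initial observations at every grid intersection (temperature, kelvins)
    # plus a steady wind (meters per second, eastward and northward).
    temperature = 288.0 + 5.0 * np.random.randn(LAT_POINTS, LON_POINTS)
    u_wind, v_wind = 10.0, 5.0

    def step(field, u, v, dt=DT, dx=DX):
        """Advance the field one time step with a simple upwind scheme."""
        # Gradients estimated from neighboring grid points; np.roll wraps
        # around the edges, a crude stand-in for a global grid.
        d_dx = (field - np.roll(field, 1, axis=1)) / dx
        d_dy = (field - np.roll(field, 1, axis=0)) / dx
        return field - dt * (u * d_dx + v * d_dy)

    # A 24-hour "forecast" is 144 ten-minute steps. Doubling the horizontal
    # resolution quadruples the number of grid points (and usually forces
    # smaller time steps), which is why resolution is limited by the
    # power of the supercomputer.
    for _ in range(144):
        temperature = step(temperature, u_wind, v_wind)

And here is an equally small sketch of the multi-model idea just mentioned: combine the track forecasts from several models into a consensus position, and treat their disagreement as a rough measure of confidence. The model names and positions below are invented; operational consensus aids work on the same principle with real model output.

    import numpy as np

    # Hypothetical 24-hour-interval track forecasts (latitude, longitude)
    # from three fictional models, all starting from the same observed fix.
    model_tracks = {
        "model_a": np.array([[18.0, -60.0], [20.5, -65.0], [23.5, -70.0]]),
        "model_b": np.array([[18.0, -60.0], [21.0, -64.0], [25.0, -68.5]]),
        "model_c": np.array([[18.0, -60.0], [20.0, -66.0], [22.5, -72.0]]),
    }

    tracks = np.stack(list(model_tracks.values()))  # (models, times, lat/lon)
    consensus = tracks.mean(axis=0)  # average predicted position at each time
    spread = tracks.std(axis=0)      # disagreement: low spread = more confidence

    for hour, (pos, sd) in enumerate(zip(consensus, spread)):
        print(f"+{24 * hour}h: lat {pos[0]:.1f}, lon {pos[1]:.1f} "
              f"(spread: {sd[0]:.1f} / {sd[1]:.1f} deg)")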
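A model whose consensus track diverges sharply from the others is, in this toy view, exactly the situation that sends emergency planners back to their own judgment.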
"At the same time, you have to do these computations in a reasonable amount of time.'' Still, the initial observations -- usually extrapolated from the readings made by instruments called dropsondes that are tossed into storms by the crews of hurricane-hunter aircraft -- are often just approximations themselves -- a factor that can cause results to be less reliable. To increase confidence, there are several different types of models, each of which produces different results. Emergency planners can then make their own judgments. In the next few years, researchers at the National Atmospheric and Oceanographic Administration plan to unveil a new, higher resolution model that better addresses the interaction of the sea, land and atmosphere. Among other measurements, it will use data from NOAA's Gulfstream IV jet, which is being outfitted with a Doppler radar that will provide a three-dimensional description, from sea level to the top of a storm. "We believe this is of fundamental importance in addressing and improving the intensity forecast,'' said Naomi Surgi, advanced project leader for hurricanes at the National Weather Service's Environmental Modeling Center. It also should help scientists better understand and predict behavior when there aren't strong steering currents in the atmosphere. The new model will run at a Maryland supercomputer facility that currently ingests 116 million observations each day to provide guidance, not just for hurricanes but for all forecast models, said Surgi. The new facility consists of clusters of IBM Corp. servers with hundreds of microprocessors of the type used in personal computers. The Fleet Numerical computers, which were mostly built by SGI, also are based on so-called commodity chips. Such systems can cost from $1 million to $30 million, but the extra computing muscle will be needed as researchers struggle to do evermore calculations, evermore quickly. "The data we predict is very perishable,'' Clancy said. "If we took three days to do a three-day forecast, it wouldn't be relevant.'' By MATTHEW FORDAHL, Associated Press Fleet Numerical: https://www.fnmoc.navy.mil/ National Hurricane Center: http://www.nhc.noaa.gov/