NSF Launches Grid Computing Weather Project


As Hurricane Isabel raked the mid-Atlantic coast, the National Science Foundation (NSF) announced an $11.25 million project for the National Center for Supercomputing Applications (NCSA) to collaborate with the University of Illinois and seven other institutions to improve researchers’ ability to study and predict dangerous weather.


The NSF Information Technology Research grant will support the Linked Environments for Atmospheric Discovery (LEAD) project, which will allow researchers, educators, and students to run atmospheric models and other tools in far more realistic, real-time settings than is currently possible.


Currently, weather forecasting models typically run on fixed schedules over fixed regions, regardless of weather conditions. The LEAD project will develop grid computing environments for on-demand detection, simulation, and prediction of thunderstorms, tornadoes, and other destructive weather.


With LEAD, users will be able to access, manage, analyze, and display data; their desktop computers will connect them to a broad array of tools and to national databases. Dynamic orchestration tools will allow the system to automatically respond to evolving weather by ingesting data at the most crucial times.
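

The article gives no implementation details, but the adaptive behavior it describes — ingesting data more aggressively when the weather turns dangerous, rather than on the fixed schedules noted above — can be illustrated with a toy sketch. Everything here (the radar feed, the 50 dBZ severity threshold, the polling intervals) is a hypothetical stand-in, not LEAD’s actual design:

```python
import random
import time

# Hypothetical severity threshold: peak radar reflectivity (dBZ) above
# which conditions are treated as potentially severe. Illustrative only.
SEVERE_DBZ = 50.0

def read_radar_feed():
    """Stand-in for a live radar feed; returns a synthetic peak
    reflectivity (dBZ) for the monitored region. Real LEAD components
    would pull from national observing networks."""
    return random.uniform(20.0, 65.0)

def ingest(interval_s):
    """Stand-in for one data-ingestion step run at the given cadence."""
    print(f"ingesting observations (next poll in {interval_s}s)")

def orchestrate(cycles=10):
    """Toy orchestration loop: poll slowly during quiet weather, but
    shorten the ingestion interval as soon as a severe signature
    appears, so data arrive at the most crucial times."""
    for _ in range(cycles):
        peak_dbz = read_radar_feed()
        # Adapt the cadence to conditions instead of a fixed schedule.
        interval = 60 if peak_dbz < SEVERE_DBZ else 5
        ingest(interval)
        time.sleep(0)  # placeholder; a real system would wait `interval` seconds

if __name__ == "__main__":
    orchestrate()
```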


NCSA will integrate the components of LEAD as they are developed at the various institutions, and it is also one of five sites serving as a grid and Web services testbed. LEAD’s features will be tested and rolled out in three phases over five years.


“LEAD will give weather researchers the robust, flexible cyber-infrastructure they need to better understand and better predict these destructive events,” said NCSA Director Dan Reed.


For example, he said, a better understanding of the conditions that create tornadoes could lead to improved prediction and more timely, accurate warnings. A researcher pursuing this goal could use the LEAD portal to access and sort years of data stored in national databases. This culled data could be stored at distributed sites where the LEAD components will be developed and tested. Data assimilation and data mining tools could then be used to further refine and categorize the information.


Based on the data, the researcher will be able to run hundreds of numerical simulations of storms to understand in more detail why some storms produce tornadoes and others do not. Using data mining tools, the researcher will be able to sift through the hundreds of terabytes of output from these simulations, “rapidly gaining insight” into the conditions that are most likely to produce tornadoes.
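

As an illustration of what that sifting step might look like, the toy sketch below filters a synthetic ensemble by a rotation diagnostic. Using updraft helicity as the tornado proxy, and the cutoff value, are assumptions made for the example, not details from the article:

```python
import random

def max_updraft_helicity(seed):
    """Stand-in for a diagnostic extracted from one simulation's output.
    Updraft helicity (m^2/s^2) is a common proxy for rotating storms;
    the synthetic values here are illustrative only."""
    random.seed(seed)
    return random.uniform(0.0, 250.0)

# Hypothetical ensemble: one entry per simulated storm.
ensemble = range(200)

# Hypothetical cutoff separating strongly rotating runs from the rest.
CUTOFF = 150.0

# Mining step: keep only the runs whose rotation exceeds the cutoff,
# the candidates for "why did these storms spin up and the others not?"
tornadic = [run for run in ensemble if max_updraft_helicity(run) >= CUTOFF]
print(f"{len(tornadic)} of {len(ensemble)} runs exceed the rotation cutoff")
```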


Ultimately, the researcher will be able to move beyond simulations based on historical data to running real-time forecasts with streaming data feeds. The LEAD system will allow identification of thunderstorms as they form, automatically triggering data-gathering tools, requesting grid computing resources, and generating results as the weather unfolds.
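

A minimal sketch of that detect-trigger-launch pattern, again with invented names and thresholds standing in for whatever LEAD’s components actually do:

```python
from dataclasses import dataclass

@dataclass
class StormCell:
    """Minimal stand-in for a storm cell detected in streaming data."""
    cell_id: str
    peak_dbz: float   # peak radar reflectivity
    growing: bool     # whether the cell is intensifying

def detect_cells():
    """Stand-in for automated storm detection on a streaming radar feed."""
    return [StormCell("A1", 58.0, True), StormCell("B2", 34.0, False)]

def request_grid_nodes(n):
    """Stand-in for requesting compute resources from a grid scheduler."""
    print(f"requested {n} grid nodes")
    return True

def launch_forecast(cell):
    """Stand-in for launching an on-demand, storm-scale forecast run."""
    print(f"launched forecast over cell {cell.cell_id}")

# Trigger rule (hypothetical): any intensifying cell above 50 dBZ gets
# an on-demand forecast; quiet cells are left to the routine schedule.
for cell in detect_cells():
    if cell.growing and cell.peak_dbz > 50.0:
        if request_grid_nodes(n=64):
            launch_forecast(cell)
```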


“Our ultimate goal is to create a system that takes full advantage of all the atmospheric data that is constantly being collected, the power of supercomputers, and the speed of high-performance networks,” said Bob Wilhelmson, a senior research scientist at NCSA and a co-principal investigator with LEAD. “Being able to analyze this data in real-time and constantly update our models and forecasts could help us pinpoint where a tornado is likely to occur or where a hurricane will hit land.”
