Real-time 'Movies' Will Predict Wildfire Behavior
Someday firefighters will be able to manage wildfires by computer. Rochester Institute of Technology recently won a $300,000 grant from the National Science Foundation to translate remote-sensing data about wildfires into real-time "mini-movies" that fire managers can download on laptop computers at the scene of a blaze. The model and visualization will predict the fire's behavior for the following hour.

This four-year collaborative project also involves researchers from the National Center for Atmospheric Research (NCAR) in Boulder, Texas A&M University, the University of Colorado at Denver and the University of Kentucky.

Leading RIT's research effort is Anthony Vodacek, assistant professor in the Chester F. Carlson Center for Imaging Science. Vodacek also heads RIT's Forest Fire Imaging Experimental System (FIRES), a precursor to the Wildfire Airborne Sensor Program (WASP). Other team members will include CIS senior research scientist Robert Kremens and postdoctoral fellow Ambrose Ononye. FIRES and WASP research were made possible through the efforts of Congressman Jim Walsh, chair of the House VA/HUD Independent Agencies Appropriations Subcommittee, who has provided nearly $8 million through the NASA budget over four years to support wildfire-detection research at RIT.

The RIT team has two roles in its new project:

- To collect real-time data about wildfires using the airborne sensor, WASP, and ground-based sensors; and
- To use computer animation to visualize predicted fire behavior.

In between those two steps is a unique fire-behavior model that forms the core of the project. Information collected by the RIT team will be fed into the model, created by Vodacek's colleague Janice Coen at NCAR. Based on fire-behavior models in use by the U.S. Forest Service, Coen's model will combine RIT's data with the influence of weather conditions. The model will output a three-dimensional "movie" of the fire sophisticated enough to predict dangerous fire behavior, such as leaping flames. Other members of the collaborative team will be in charge of feeding a wide variety of raw data to the model for rapid retrieval at the fire scene.

For this relay of information to succeed, Vodacek will need to make the scientific data meaningful to the firefighters. "Coen's model can track smoke and hot gases in the atmosphere," he says. "We need to translate that into what a fire looks like by using computer animation. It fits very well into what we've been doing in the FIRES project."

Vodacek's team will create synthetic scenes of fires to visualize live blazes based on Coen's model, which will tell them where flames will be in any particular situation. "We would translate it into what a person would see," Vodacek says. "Essentially, a little movie would be generated. In the end, the goal is to make it look real to the fire manager."

The process will work like this: overhead and ground sensors will collect real-time data about a fire and feed it into the model. The data will be transferred to a supercomputer, where the model is run, and the results will be sent back to the field. (The fire could be in Montana and the supercomputer in Georgia, Vodacek notes.) The link is the laptop the fire manager will use to watch how the fire is predicted to behave for about an hour.
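The article does not describe the project's actual software, but the relay it outlines maps onto a simple sensor-to-supercomputer-to-laptop loop. The Python sketch below illustrates one such cycle; every name in it (FireObservation, run_fire_model, the five-minute frame step) is a hypothetical stand-in, not the team's code.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# Hypothetical record combining airborne (WASP) and ground-sensor
# readings; the field names are illustrative, not the project's schema.
@dataclass
class FireObservation:
    timestamp: datetime
    latitude: float
    longitude: float
    surface_temp_k: float   # apparent surface temperature, kelvin
    wind_speed_ms: float    # local wind speed, m/s
    wind_dir_deg: float     # wind direction, degrees

@dataclass
class FireForecast:
    issued_at: datetime
    horizon: timedelta
    frames: list[str]       # rendered animation frames for the field laptop

def run_fire_model(obs: list[FireObservation],
                   horizon: timedelta = timedelta(hours=1)) -> FireForecast:
    """Stand-in for the coupled weather/fire model run on a remote
    supercomputer. A real run would also ingest weather fields; this
    stub just emits one placeholder frame per five-minute step."""
    steps = int(horizon / timedelta(minutes=5))
    frames = [f"frame_{i:02d}" for i in range(steps)]
    return FireForecast(issued_at=datetime.now(timezone.utc),
                        horizon=horizon, frames=frames)

def field_update(obs: list[FireObservation]) -> FireForecast:
    """One cycle of the relay: sensors -> supercomputer -> field laptop.
    The two network hops are elided here; in practice each adds latency."""
    return run_fire_model(obs)   # remote in the real system, local here

if __name__ == "__main__":
    now = datetime.now(timezone.utc)
    readings = [FireObservation(now, 46.87, -113.99, 600.0, 8.0, 225.0)]
    forecast = field_update(readings)
    print(f"{len(forecast.frames)} frames covering the next {forecast.horizon}")
```

In the real system, the model run and the two network hops would dominate the cycle time, which is the latency Vodacek addresses below.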
"The idea is that all of this will occur as close as possible to real time," Vodacek says. "By the time it takes to collect the data, run it through the model and send it back to the field, it may be 15 minutes old. But, still, that gives you a 45-minute outlook, potentially." The overall goal of the project is to demonstrate the entire system at the end of four years.