Monday, January 10, 2011

UT researchers map oil spill destruction

Less than two weeks after the explosion of the BP Deepwater Horizon oil rig killed 11 people and the ruptured well began leaking between two and four million liters of oil per day, the calls started coming in.

The oil would soon reach the Louisiana coast, where it would do untold amounts of damage to the local marshes, wetlands and channels. Could the team that successfully modeled hurricane storm surges along the Louisiana, Mississippi and Texas coastlines help?

"We started working on the project fairly quickly, probably around the 10th of May," said Clint Dawson, head of the Computational Hydraulics Group at the Institute for Computational Engineering and Sciences.

With the highly accurate descriptions of the Gulf of Mexico's coastline that Dawson and his colleagues previously used for hurricane simulations, the team hoped to model the spread of oil into the complex maze of coastal marshes and wetlands, something other existing models simply cannot do.

Rapidly responding in an emergency

However, to process information quickly enough to be of use to emergency response teams in the Gulf, Rick Luettich, a professor and head of the Institute of Marine Sciences at the University of North Carolina at Chapel Hill; Joannes Westerink, a civil engineering professor at the University of Notre Dame; and Dawson needed funding and computational hours as soon as possible.

Normally, this sort of request would take months to get approved. Luckily, both the National Science Foundation and TeraGrid have rapid response programs for exactly this sort of situation. The researchers were quickly promised part of the funds they needed, with the remainder to come from the Department of Homeland Security.

The research team began laying the groundwork for the new model using computational resources donated by the Texas Advanced Computing Center (TACC) from their own allocation. And on May 28, TeraGrid announced an emergency allocation of one million computing hours on TACC's Ranger supercomputer.

Getting it done

"We have a large group of people working on it basically 24/7," Dawson said.

And they certainly have their work cut out for them.

The simulations use satellite imagery of the spill from the Center for Space Research at the university, along with a 72-hour forecast of meteorological data from the National Centers for Environmental Prediction, which is released every six hours. The team converts that data into the format the model expects and feeds it into the Advanced Circulation Model for Oceanic, Coastal and Estuarine Waters (ADCIRC), which runs on Ranger.

Three to four hours later, ADCIRC produces a 72-hour forecast of the area's ocean currents at a 50-meter resolution, 10 to 20 times more detailed than other models of the oil spill.
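As a rough illustration only, the six-hour forecast cycle described above could be organized as a simple batch loop like the sketch below. This is not the team's actual tooling; every path, command and helper name in it (fetch_inputs, run_model, the sbatch call, run_adcirc.sh) is a hypothetical placeholder, and the real workflow on Ranger involves far more data conversion and job management than shown here.

    # Minimal sketch of the six-hour forecast cycle, assuming a simple
    # batch workflow. All names and commands below are hypothetical
    # placeholders, not the research team's actual scripts.
    import subprocess
    import time
    from pathlib import Path

    CYCLE_SECONDS = 6 * 3600   # new meteorological data arrives every six hours
    WORK_DIR = Path("runs")    # hypothetical local working directory

    def fetch_inputs(cycle_dir: Path) -> None:
        """Placeholder: pull the latest meteorological forecast and the
        satellite-derived oil extent, then write them out in the input
        format the circulation model expects."""
        cycle_dir.mkdir(parents=True, exist_ok=True)
        # ... real code would convert the raw data into model input files here ...

    def run_model(cycle_dir: Path) -> None:
        """Placeholder: submit the parallel ADCIRC job to the batch
        scheduler; 'sbatch run_adcirc.sh' is only an illustrative stand-in
        for however the run is actually launched on the supercomputer."""
        subprocess.run(["sbatch", "run_adcirc.sh"], cwd=cycle_dir, check=True)

    def main() -> None:
        while True:
            cycle_dir = WORK_DIR / time.strftime("%Y%m%d%H")
            fetch_inputs(cycle_dir)
            run_model(cycle_dir)   # yields a 72-hour forecast of currents
            time.sleep(CYCLE_SECONDS)

    if __name__ == "__main__":
        main()

Since each run takes three to four hours on Ranger, a six-hour cadence like this leaves only a narrow window for gathering and converting the input data before the next cycle begins.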
