The National Grid analytics team: Pavel Ozhogin, Han Linda Wang, Rich Wester, Kristen Vincent and Jorge Calzada. Credit: Joyce Silvia
When a storm approaches, Michael G. McCallan, director of emergency planning at National Grid, has to predict which pieces of the power company's infrastructure will take the biggest hit and then calculate the best response.
National Grid, which serves parts of the Northeast, used to rely on institutional knowledge and operational experience when preparing for storms. Now it uses big data and analytics to make much more targeted and accurate predictions.
In collaboration with MIT, National Grid developed a tool called the Predictive Storm Damage Model. Deployed in early 2014, the tool uses real-time local weather data, advanced weather forecasting models and outage modeling technology to pinpoint areas that are likely to sustain damage during a big weather event.
"We're trying to predict the likelihood of damage to a certain region based on the weather forecast," says Jorge Calzada, National Grid's director of business planning and performance. The modeling tool allows the utility to prepare resources in advance with a greater degree of accuracy. And better predictions can translate into shorter power outages and lower costs for the company, McCallan says.
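The article doesn't describe how the Predictive Storm Damage Model works internally, but the idea of scoring a region's damage likelihood from forecast inputs can be illustrated with a minimal sketch. The feature names, weights, and logistic form below are purely hypothetical assumptions for illustration, not National Grid's actual model:

```python
import math

# Hypothetical feature weights -- the real model's features and
# coefficients are not public; these values are for illustration only.
WEIGHTS = {"wind_gust_mph": 0.08, "rain_inches": 0.3, "tree_density": 0.5}
BIAS = -6.0

def damage_likelihood(region_weather):
    """Return an estimated probability (0..1) that a region sustains
    outage-causing damage, using a simple logistic model."""
    score = BIAS + sum(WEIGHTS[k] * region_weather[k] for k in WEIGHTS)
    return 1 / (1 + math.exp(-score))

# Two example regions under the same storm forecast
mild = {"wind_gust_mph": 25, "rain_inches": 0.5, "tree_density": 2.0}
severe = {"wind_gust_mph": 65, "rain_inches": 2.0, "tree_density": 4.0}

print(damage_likelihood(mild))    # low probability
print(damage_likelihood(severe))  # much higher probability
```

A utility could rank regions by a score like this and pre-stage crews and equipment where the predicted likelihood is highest, which is the kind of targeted preparation the article describes.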
Calzada says one of the biggest challenges was getting real-time weather data from areas that didn't have many weather stations. So as part of this project, National Grid donated 55 professional-grade, Internet-connected WeatherBug Weather Stations and installed them at various municipal sites.