Australian oil and gas giant Woodside prides itself on its technology and engineering heritage. So when it came to tapping historical and streaming data stores to improve operational efficiency, and to predict and circumvent potential issues in its production facilities, the company went for the most cutting-edge machine learning technology on offer: IBM’s Watson.
Speaking at the recent Chief Analytics Officer Forum in Sydney, Woodside’s principal data scientist, Elsa Jordan, shared with attendees the company’s journey to build a data science capability from scratch in one year that could be utilised by employees right across the organisation.
Woodside established its data science practice in January 2015 and, as part of its approach, is running the largest commercial instance of the Watson advisor engine.
“Our premise was think big, prototype small, scale fast,” Jordan said. “Key to that was using machine learning algorithms. What appealed was that we could learn from history, predict from streaming data and keep on learning as new information becomes available.”
The key question was how Woodside could capture the 20-odd years of valuable data on its energy projects, exploit this know-how from historical projects, and make the information available to employees at any time, Jordan said.
“Data science is the answer to that – it provides the speed to access knowledge and the ability to derive insights from our data,” she said.
Any tech or business investment of such considerable size usually has a trigger, and for Woodside it was a rare event in 2013, with the potential to shut down an entire production facility, that set the data science train in motion.
The event revolved around how a production facility strips acid gas from the carbon stream prior to liquefaction. Much like gas in a shaken-up Coke bottle, this ‘foaming’ can build up inside the unit and, in Woodside’s case, result in significant production outages. The problem is that engineers can’t tell whether that ‘bottle’ has been shaken up or not.
“Similarly, we can’t tell if foaming is happening in our cold vessels, and there isn’t a piece of equipment that can measure this,” Jordan explained. “We have to rely on proxies, like pressure measurements.”
Woodside pulled together its engineers to look at data after such an event to see whether there were any telltale signs that it was looming.
“The question was whether we had the pre-event data that could warn us in the future, and the answer was yes,” Jordan continued. “But it’s encoded into millions of rows, from readings across thousands of sensors, to logs and regular data.”
Woodside turned to machine learning to handle the vast volumes of data needing to be analysed. The resulting model provided a probabilistic reading on the likelihood of the foaming situation happening, and importantly, could flag it well in advance.
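The article does not describe Woodside’s actual model, but the general approach it outlines can be sketched as a probabilistic classifier trained on windows of historical sensor readings, each labelled by whether a foaming event followed. The example below is a minimal illustration of that idea using synthetic data and scikit-learn’s logistic regression; the feature layout, labels, and alert threshold are all assumptions for illustration, not Woodside’s method.

```python
# Hypothetical sketch of a probabilistic early-warning model.
# Synthetic sensor data stands in for real historical readings;
# label 1 means a foaming event followed the window.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# 500 time windows x 8 sensor features (e.g. pressure proxies).
n_windows, n_sensors = 500, 8
X = rng.normal(size=(n_windows, n_sensors))

# Assumed ground truth: events are driven by the first sensor plus noise.
y = (X[:, 0] + 0.5 * rng.normal(size=n_windows) > 1.0).astype(int)

model = LogisticRegression().fit(X, y)

# predict_proba gives the likelihood of a foaming event for new readings;
# an alert can be raised well in advance when it crosses a threshold.
p_foaming = model.predict_proba(X)[:, 1]
alerts = p_foaming > 0.8
```

In practice such a model would be trained on the “millions of rows” of pre-event sensor history described above and scored continuously against streaming data, with the probability output serving as the early-warning flag.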