AIChE: Big Data for Safety

Tuesday, March 28, 2017 @ 03:03 PM gHale


By Gregory Hale
Industry collects petabytes of data that could be used for predictive maintenance to improve asset integrity and reliability, but there is a catch: Manufacturing generates the data, yet hardly anyone is using it.

“One to two percent of data produced is being utilized for predictive maintenance,” said Carl March with McKinsey & Co. during a Monday presentation at the AIChE Spring Meeting and 13th Global Congress on Process Safety in San Antonio, TX. “It is good to have the insight, but you have to put it in action. We have more data where we can see things early on. People have to change the way they work.”


What is also at issue, much like the question of workers accepting automation a decade or so ago, is people do not want to hang their hat on the numbers they see.

“People don’t trust the data,” March said. “They can have all the information, but won’t do anything about it because they don’t trust the data. Advanced analytics generates previously untapped insights in predictive maintenance from troves of historical data.”

Predictive maintenance has been a major topic in the industry for years, and while people have talked about it, there has not been a wholesale shift toward it because the technology has not been fully accepted. That, however, is changing.

March said the benefits of implementing predictive maintenance include increased availability and a reduction in costs.

He mentioned one case in point: An oil and gas operator had an issue with a low-pressure gas compressor on an offshore rig. There was plenty of data available about the compressor, but the unplanned downtime was unacceptable.

Logging and maintenance data were available, as was information from more than 1,200 sensors.

“The question was how could we reduce the cost of maintenance and improve reliability,” he said.

That is where the Big Data move came into play.

“We had to collect data to understand how things can impact each other,” March said.

They had 120 gigabytes of data available. What they had to do was understand the target variable, so they collected data from several sources. After identifying failures and constant target variables, they created a model to predict time to failure.
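As an illustration only, here is a minimal sketch of how such a time-to-failure model might be put together, assuming pandas and scikit-learn with synthetic data standing in for the real compressor sensors (the presentation did not name the tools or the algorithm used):

```python
# Hypothetical sketch: predicting time to failure from sensor data.
# Synthetic readings stand in for the real compressor tags; the actual
# project's tooling and model were not described in the talk.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Stand-in for periodic readings from a handful of compressor sensors.
n_samples = 5000
sensors = pd.DataFrame(
    rng.normal(size=(n_samples, 5)),
    columns=[f"sensor_{i}" for i in range(5)],
)

# Target variable: hours remaining until the next logged failure.
# In practice this would come from joining sensor timestamps against
# the maintenance and failure logs; here it is synthetic.
hours_to_failure = np.clip(
    168 + 60 * sensors["sensor_0"].to_numpy()
    + rng.normal(scale=20, size=n_samples),
    0, None,
)

X_train, X_test, y_train, y_test = train_test_split(
    sensors, hours_to_failure, test_size=0.2, random_state=0
)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# An alert could be raised when the predicted time to failure drops
# below a planning threshold, for example four days.
predicted = model.predict(X_test)
alerts = predicted < 4 * 24
print(f"Windows flagged for intervention: {alerts.sum()} of {len(alerts)}")
```

The design point the case study hinges on is the target variable: each window of sensor readings is labeled with the time remaining until the next logged failure, which is what lets a model raise an alert early enough to plan an intervention.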

The goal was to reduce downtime by 70 percent by planning an intervention when a failure was imminent. Through the use of a Big Data model, the prediction of time to failure was reduced from 14 days to four.

They originally received data from 1,236 sensors and narrowed that down to 43 by understanding and culling the data. After the number crunching, they were able to predict time to failure with roughly 80 percent accuracy. With the initial results in hand, they tuned the model further to reduce the number of false negatives.
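Purely as a hypothetical illustration of that kind of sensor pruning and false-negative tuning (the actual methods were not described), one could rank sensor tags by feature importance and then lower the alert threshold of a classifier, trading false negatives for false alarms:

```python
# Illustrative only: ranking sensor tags by importance and tuning the
# alert threshold. Self-contained synthetic data; not the project's code.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

n_samples, n_sensors = 5000, 50          # stand-in for the 1,236 real tags
X = pd.DataFrame(rng.normal(size=(n_samples, n_sensors)),
                 columns=[f"tag_{i}" for i in range(n_sensors)])
# Label: 1 if a failure occurred within the planning window, else 0.
y = (X["tag_0"] + 0.5 * X["tag_1"]
     + rng.normal(scale=0.5, size=n_samples)) > 1.5

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)

# Keep only the most informative tags (the project went from 1,236 to 43).
ranked = pd.Series(clf.feature_importances_, index=X.columns)
keep = ranked.sort_values(ascending=False).head(10).index
print("Retained sensors:", list(keep))

# Lowering the probability threshold below 0.5 flags more windows as
# "imminent failure", cutting false negatives at the cost of more alarms.
proba = clf.predict_proba(X_test)[:, 1]
for threshold in (0.5, 0.3):
    pred = proba >= threshold
    false_neg = int(np.sum(~pred & np.asarray(y_test)))
    print(f"threshold={threshold}: false negatives={false_neg}")
```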

As a result of the model, an early warning of an imminent failure means workers can plan repairs, gather spares and shut down the equipment in a safe and orderly manner.

Introducing predictive maintenance by leveraging data dramatically boosted asset integrity and yielded insights from advanced analytics methods that are not possible with conventional technologies today.

Implementing predictive maintenance generated three benefits that reinforce one another:
• Predicting when equipment will fail, which helps reduce safety and integrity threats
• Avoiding failures, which reduces the scale of the value at risk
• Enhancing specific knowledge of the behavior of critical equipment


