Closing the Loop on Safety

Tuesday, May 30, 2017 @ 09:05 AM gHale

By Gregory Hale
When the second edition of IEC 61511 was approved, the world-renowned safety standard introduced new requirements, some of which increased the emphasis on data quality and performance monitoring.

“There is a new emphasis on real world data,” said Stephen Thomas, SIS Engineering Lead – OASIS Program at Chevron during his talk Wednesday at the 2017 27th Annual Triconex User Group meeting in Lake Forest, CA. “We need to close the uncertainty of reliability. The world we live in right now is open loop control. We start collecting data with new tools, but without a mechanism to recognize and (understand) the data.”


That is where Bayesian inference comes into play.

Bayesian inference is a method of statistical inference in which Bayes’ theorem is used to update the probability of a hypothesis as more evidence or information becomes available. It is an important technique throughout statistics, and Bayesian updating is especially valuable in the dynamic analysis of a sequence of data.
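To make the updating idea concrete, here is a minimal sketch (not from the talk) of a Bayesian failure-rate update, assuming a conjugate Gamma prior on the rate and Poisson-distributed failure counts; the prior parameters below are illustrative assumptions, not industry values:

```python
# Minimal sketch of Bayesian updating for a failure rate lambda
# (failures per hour), assuming a conjugate Gamma prior and Poisson
# failure counts. All numbers are illustrative assumptions.

def update_gamma(alpha, beta, failures, hours):
    """Gamma(alpha, beta) prior plus observed evidence -> Gamma posterior."""
    return alpha + failures, beta + hours

alpha, beta = 0.5, 100_000.0        # assumed generic prior (hypothetical)
alpha, beta = update_gamma(alpha, beta, failures=1, hours=876_000)

posterior_mean = alpha / beta       # updated point estimate of lambda
print(f"{posterior_mean:.2e} failures/hour")
```

As more operating hours and failures are folded in, the posterior is pulled steadily away from the prior and toward the observed data, which is exactly the "update as evidence arrives" behavior described above.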

“You have to have a framework to collect the data and understand it,” Thomas said. “We need to close the loop.”

In short, the traditional way to assess values on failure rates, which the entire industry uses, has its flaws.

Classical statistical methods make no assumptions about the population prior to taking sample data, which means the probabilities are initially equally likely, Thomas said. He used the failure rate of valves as an example, whether observed over one year, 100 years, or 100,000 years.

Compare and Contrast
He gave a scenario comparing a traditional approach and a Bayesian approach to estimating a failure rate.

“We have 20 identical valves installed in similar services across a large site. We have five years of prior use data in which we have experienced one failure over 876,000 hours (100 service years) for the valves. We want to use one of these valves in a new SIF in a similar service. What failure rate based on prior use should we use in our SIL calculations?”

Without going through all the math, the traditional method proved far more conservative than the Bayesian approach.
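The talk's exact calculation was not shown, but a hedged sketch of the contrast is possible using the scenario's numbers (one failure in 876,000 hours): a classical one-sided 70% upper confidence bound on the rate via the chi-squared relation, versus a Bayesian posterior mean under an assumed (hypothetical) generic Gamma prior:

```python
import math

T = 876_000.0   # service hours from the scenario (100 service years)
k = 1           # observed failures

# Classical: one-sided 70% upper confidence bound on lambda, using
# lambda_U = chi2(gamma, 2k + 2) / (2T). With k = 1 that is 4 degrees
# of freedom, where the chi-squared CDF has a closed form, so we can
# invert it by bisection without external libraries.
def chi2_cdf_4df(x):
    return 1.0 - math.exp(-x / 2.0) * (1.0 + x / 2.0)

def chi2_quantile_4df(p, lo=0.0, hi=100.0):
    for _ in range(200):
        mid = (lo + hi) / 2.0
        if chi2_cdf_4df(mid) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

lam_classical = chi2_quantile_4df(0.70) / (2.0 * T)

# Bayesian: the same evidence folded into an assumed generic Gamma
# prior (illustrative values); the posterior mean is the estimate.
alpha0, beta0 = 0.5, 100_000.0
lam_bayes = (alpha0 + k) / (beta0 + T)

print(f"classical 70% upper bound: {lam_classical:.2e} /h")
print(f"Bayesian posterior mean:   {lam_bayes:.2e} /h")
```

Under these assumptions the classical bound comes out roughly twice the Bayesian estimate, which is consistent with the direction of the comparison reported from the talk, though the specific prior and confidence level here are illustrative choices.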

In essence, they could use true numbers from their own use to make failure rate assumptions and not rely upon generic industry numbers.

“We can have prior use with Bayesian inference. Over time we can correct our assumptions. The prior distribution represents prior knowledge about probability,” Thomas said.

It only makes sense, as “classical methods are not suited to sparse data. Data for high reliability devices is always sparse,” he said. That can mean the required confidence levels become harder to achieve at higher SIL levels.

Like anything else in the industry, moving to this level means having to change a way of thinking.

“People are not successfully using prior use right now,” he said. “Classical methods take a lot of data, take a lot of time, and sometimes luck is involved.”

Bayesian Approach:
• Leverages prior knowledge to supplement scarce failure rate data
• Closes the loop by updating models with new user data
• Can be implemented step-wise as data and resources are available
• Can be tailored completely to meet user needs
• Can save time and money by reducing hardware and testing

The Bayesian approach develops a prior distribution, gathers new evidence, and updates the prior to create a posterior distribution.

The true power of the Bayesian approach can be leveraged using a hierarchical approach, in which information can be carried from one device to the next, he said.

Even a single layer generic prior use can reduce data requirements by 40 to 50 percent. A user can start with a single layer and add layers as appropriate, he said.

A user can start with generic data and update with site and service-specific data as it is gathered. That allows for knowledge to build over time, he said.
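That build-over-time pattern can be sketched as a sequence of conjugate updates: start from an assumed generic-industry prior and fold in site- and service-specific data year by year (the prior parameters and yearly counts below are illustrative, not real fleet data):

```python
# Sketch of building knowledge over time: a generic Gamma prior on the
# failure rate is updated with site-specific (failures, hours) evidence
# each year. All numbers are illustrative assumptions.

def update(alpha, beta, failures, hours):
    # Gamma-Poisson conjugate update: prior + evidence -> posterior.
    return alpha + failures, beta + hours

alpha, beta = 1.0, 500_000.0   # assumed generic industry prior

# Hypothetical yearly evidence for 20 valves (20 * 8,760 h per year).
yearly_data = [(0, 175_200), (1, 175_200), (0, 175_200)]

for year, (failures, hours) in enumerate(yearly_data, start=1):
    alpha, beta = update(alpha, beta, failures, hours)
    print(f"after year {year}: rate estimate {alpha / beta:.2e} /h")
```

Each pass narrows the posterior around the site's own experience, so the estimate drifts from the generic starting point toward service-specific reality as data accumulates.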

It is possible to reduce hardware and testing costs for “good” services and concentrate efforts on difficult services.

“When we are talking about low failure rates, data is not abundant. Leverage the data you have instead of using industry data points,” Thomas said. “Traditional methods make it hard to do this. Bayesian methods use actual data to make it easier.”
