Plant control is fundamental to how manufacturers run their operations. Since the advent of computers, process industries have witnessed the impact of new technologies on safety, stability and resiliency. Here is a five-decade overview of those technologies:
- 1950s: Electromechanical systems set the basis of modern process control by replacing mechanical processes that required intensive human monitoring and intervention.
- 1970s: Distributed control systems (DCS) translated those physical systems into logical operations as part of a plantwide infrastructure for faster response times and flexibility.
- 2000s: Almost every process manufacturer rolled out historians to obtain and store heterogeneous process and asset information in a single platform.
The adoption dilemma
Those technologies seem commonplace today, which makes it tempting to conclude that their adoption was well received at the time, given the perceived benefits. However, adoption has always been a dilemma in the manufacturing environment. Why should a manufacturer risk changing an architecture that has allowed them to make a product for several decades?
Let’s use historians as an example. Even though they became the norm in the 2000s, they had been around since the 1980s. For nearly 20 years, process manufacturers tried to make do with the native historian functions of their incumbent DCS.
And they failed in this mission.
It took significant sunk costs for process manufacturers to understand that a resilient automation architecture involves a best-of-breed approach, with multiple solutions working in unison.
Fundamentally, whenever manufacturers added one of those layers to the manufacturing stack, data generation increased proportionally, and so did the ability to generate insights from it. Advancements in computational capabilities have enabled the next step toward better process control: data analytics and machine learning.
Making the same old mistakes
However, the same problems keep resurfacing. Manufacturers are still trying to leverage their current automation backbone for functions it was not necessarily designed for.
- “Why can’t we use our PC-based control systems to implement machine learning models?”
- “If I can build cockpits in my historian, why can’t we take it to the next level and optimize formulations and control set points with it?”
- “There are some new function blocks in my DCS that can build forecasting models for key process variables. Why don't we try them?”
The answer to such questions comes with good and bad news.
- Analytics and machine learning capabilities are supplemental features, not the core competency, of process control tools. They tend to be limited to a few techniques and quickly hit a technical ceiling. For example, linear regression has become synonymous with modelling in the process industries, yet it is just one of many techniques for modelling a physical process.
- Process control tools lack proper model governance. Relying on conventional automation solutions to manage analytical models is like relying on an analytics solution to manage an inventory of control loops: it will not yield good results, because the model life cycle is different from the control loop life cycle.
- It took decades to build and refine the process control tools that are offered in the market. In the same fashion, it takes decades to develop a solid analytics and machine learning platform.
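To make the modelling limitation above concrete, here is a minimal sketch on synthetic data: a linear fit cannot capture a nonlinear yield-versus-temperature relationship that even a simple quadratic model handles. All numbers and variable names are illustrative, not drawn from any real plant.

```python
import numpy as np

# Synthetic (illustrative) process data: reactor yield responds
# nonlinearly to temperature, a common case a linear model misses.
rng = np.random.default_rng(0)
temp = rng.uniform(150, 250, 200)                      # deg C
yield_pct = 90 - 0.004 * (temp - 200) ** 2 + rng.normal(0, 0.5, 200)

# Linear regression: one of many techniques, not always the right one.
lin_coef = np.polyfit(temp, yield_pct, deg=1)
lin_rmse = np.sqrt(np.mean((np.polyval(lin_coef, temp) - yield_pct) ** 2))

# A quadratic fit captures the curvature in this sketch.
quad_coef = np.polyfit(temp, yield_pct, deg=2)
quad_rmse = np.sqrt(np.mean((np.polyval(quad_coef, temp) - yield_pct) ** 2))

print(f"linear RMSE:    {lin_rmse:.2f}")
print(f"quadratic RMSE: {quad_rmse:.2f}")
```

The point is not that quadratic fits are the answer; it is that a tool offering only one modelling technique hits its ceiling as soon as the process stops cooperating.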
The answer is the best-of-breed approach that I mentioned earlier. I will summarize three use cases for process manufacturers.
While a DCS is great at keeping control loops in check, it does not necessarily converge on optimal process control settings. Advanced process control systems are also hard to keep tuned, and they struggle to account for process variations and disturbances in the long run. Machine learning is the natural next step for modelling complex processes. For example, a specialty polymer manufacturer increased plant output by over 30% using SAS to give the plant's operators detailed, real-time guidance on how to adjust process set points for optimal performance.
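The general idea of model-based set-point guidance can be sketched as follows, on synthetic data, with a polynomial fit standing in for a trained machine learning model; the variable names and numbers are illustrative only.

```python
import numpy as np

# Synthetic (illustrative) data: plant output responds nonlinearly
# to a controllable set point, e.g., a feed rate.
rng = np.random.default_rng(3)
setpoint = rng.uniform(0, 10, 300)
output = 5 + 2 * setpoint - 0.15 * setpoint**2 + rng.normal(0, 0.3, 300)

# Fit a model of output vs. set point (a simple stand-in here for
# whatever model the analytics platform would train).
coef = np.polyfit(setpoint, output, deg=2)

# Recommend the set point with the highest predicted output.
grid = np.linspace(0, 10, 1001)
best = grid[np.argmax(np.polyval(coef, grid))]
print(f"recommended set point: {best:.2f}")
```

In practice the recommendation loop also has to respect safety limits and re-fit as conditions drift, which is exactly where dedicated model governance earns its keep.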
A DCS operates under control limits (or thresholds). Most industries have tight tolerances and still face quality problems even when process variables stay within those limits: small variations add up throughout the production process, which can lead to off-spec products. Machine learning techniques can predict and detect quality problems before they arise by determining optimal values based on past behavior, current process conditions and current feedstock specifications. For example, a steel manufacturer was able to detect 99% of defects using SAS computer vision and machine learning capabilities, compared to 63% with the incumbent solution suite.
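The accumulation effect can be sketched on synthetic data: every variable stays inside its individual control limit, so per-variable rules miss most off-spec product, while a signal that combines all variables (standing in here for a trained model) catches it. Limits, thresholds and names below are all hypothetical.

```python
import numpy as np

# Synthetic (illustrative) data: deviations at three process stages,
# each individually well inside +/-3-sigma control limits.
rng = np.random.default_rng(1)
dev = rng.normal(0, 1, (5000, 3)).clip(-3, 3)

# Assumed ground truth for this sketch: product goes off-spec when
# the small deviations accumulate across stages.
off_spec = dev.sum(axis=1) > 4.5

# Rule-based check: flag only if any single variable breaches +/-2.5.
rule_flag = (np.abs(dev) > 2.5).any(axis=1)

# Combined signal: the summed deviation, standing in for a model
# that sees all variables together.
model_flag = dev.sum(axis=1) > 4.0

def recall(flag):
    # Fraction of off-spec batches the check actually flags.
    return flag[off_spec].mean()

print(f"rule recall:  {recall(rule_flag):.0%}")
print(f"model recall: {recall(model_flag):.0%}")
```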
Some DCS, historians and enterprise asset management solutions offer the ability to monitor asset health. However, they are usually limited to absolute values and trends. The typical rules ("greater-than/less-than") let manufacturers capture some asset-health events based on recommended operating conditions, but they fail to detect the complex correlations and unknown factors that truly indicate potential asset failure.
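The gap between single-variable rules and multivariate detection can be sketched with a Mahalanobis-distance check on synthetic data; the signals, alarm limits and thresholds below are all illustrative assumptions.

```python
import numpy as np

# Synthetic (illustrative) history of two correlated health signals:
# vibration normally tracks motor load.
rng = np.random.default_rng(2)
load = rng.normal(70, 5, 1000)                   # % motor load
vib = 0.04 * load + rng.normal(0, 0.1, 1000)     # mm/s vibration

hist = np.column_stack([load, vib])
mean = hist.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(hist, rowvar=False))

def mahalanobis(x):
    # Distance of a reading from normal behavior, accounting for
    # the correlation between the signals.
    d = x - mean
    return float(np.sqrt(d @ cov_inv @ d))

# New reading: both values inside their (hypothetical) alarm limits
# of load < 85 and vibration < 3.5, yet vibration is far too high
# for this load -- a broken correlation, not a breached limit.
reading = np.array([60.0, 3.3])

rule_alarm = bool(reading[0] > 85 or reading[1] > 3.5)
dist_alarm = mahalanobis(reading) > 3.0

print(f"rule alarm: {rule_alarm}, distance alarm: {dist_alarm}")
```

A simple greater-than rule sees two healthy numbers; the multivariate check sees one reading that normal operation never produces.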
In one case, a SAS customer reduced mean time to repair from 38 days to 10 by anticipating critical events on large rotary equipment. By predicting failures weeks in advance, the customer could schedule maintenance into the production plan and purchase made-to-order spare parts, avoiding rush fees.
Consider SAS' 40-plus years of platform innovation when adding data analytics to your automation stack. Our ability to integrate with DCS, historians and other data sources will shorten your time to value, no matter which projects are imminent in your programs.