Advanced Analytics:
Accelerating Insights for Engineers
The surge in data volumes within the chemical process industries has made it difficult for engineers to find the insights they need to improve processes. Advanced analytics applications enable engineers and others to harness the data for improved operations.
Every day, the chemical process industries (CPI) produce data at an unprecedented rate, including historian-based process data and contextual data from business and manufacturing systems. Plants can generate terabytes of sensor data per day, and companies can generate scores of terabytes of data per day. With the rise of sensors and other equipment connected via the industrial internet of things (IIoT), these data volumes are expanding exponentially.
And the problem of too much data and too little insight will only accelerate as IIoT takes hold. IIoT growth forecasts translate directly into data growth: the International Data Corporation (IDC) expects worldwide spending on IoT to reach $745 billion in 2019, led by the manufacturing sector [1]. That represents a massive increase in sensor data. Despite the hype around “smart” sensors, the data will go to waste absent robust analytics to enable insights.
In fact, “data rich, information poor” (DRIP) threatens to drown organizations in a sea of underutilized data (Figure 1), driving a search for new solutions. For the last 30 years, spreadsheets have been the analysis tool of choice in process manufacturing, but this general-purpose tool is no longer sufficient for complex analysis of the expanding volumes of time-series data.
At all levels of the manufacturing enterprise, from operators to engineers to field staff to quality control, and up to the executives, everyone needs useful insights for daily decision-making. And they need self-service access to data without the need for assistance from information technology (IT) experts and data scientists. Fortunately, a solution is at hand in the form of advanced analytics.
Improved solutions for analyzing data
A new class of analytics applications leverages new technologies, including big data and machine learning (ML) innovations, to accelerate engineers’ time to insights. Advanced analytics applications cover the crucial “last mile” between data and insights, enabling employees in many different roles to do their own analysis and share their findings in the moment, without requiring the intervention of a data scientist or other IT expert.
A number of compelling scenarios come into focus with the use of today’s advanced analytics. For example, as explained in more detail in a subsequent section of this article, predictive analytics on equipment can warn of impending failure so operators can take action to prevent an unplanned event. In one example, accurate batch-cycle-time analysis helped to focus process improvement efforts, which shaved off 10 minutes of total batch time and led to an annual production increase of 300 batches. In another example, management received near real-time alerts about commodity pricing, enabling them to make decisions which boosted plant profitability.
These improvements are not only within reach, but are achievable today with the use of advanced analytics applications. Furthermore, chemical processors can now realize significant benefits without needing to implement an expensive and time-consuming digital transformation initiative, because advanced analytics applications integrate new capabilities into existing operational technology (OT) environments and can connect with cloud offerings for greater agility.
This is a welcome development for CPI companies, which have been quick to embrace the cloud, especially for analytics workloads. The benefits of a cloud deployment model are clear: faster time to solution, an elastic infrastructure that can grow and shrink with ease, greater agility and reduced complexity. That said, cloud deployment is not a requirement for advanced analytics deployments, and many users either choose, or are tethered to, on-premises deployments.
A taxonomy for analytics
Analytics per se is nothing new, but advanced analytics is, and it accelerates the outcomes for all the different types of analytics applied in process manufacturing:
- Descriptive analytics describes what happened, using reports, charts and key performance indicators (KPIs) based on collected data, all of which may be shared in near real time
- Monitoring analytics tracks asset, batch or operations performance and seeks to answer the question “what is happening now?”
- Diagnostic analytics seeks to identify why something happened based on analysis of historical data, often called root-cause analysis
- Predictive analytics helps engineers identify what will likely happen based on real-time and historical data, enabling corrective action to be taken in advance
- Prescriptive analytics aims to optimize outcomes by informing plant employees of their best actions based on existing conditions
The benefits yielded by advanced analytics applications are not speculative, but are instead being realized today by chemical manufacturers to improve outcomes, including increased efficiencies, boosted quality, better visibility, increased agility and higher profit margins. Once advanced analytics is placed in the hands of plant engineers and other experts (Figure 2), improvements quickly follow, as shown in the following examples.
Applications
Speeding batch cycle times. For a large specialty chemical company, a small improvement in the cycle time of a batch process often results in huge financial improvements. In one case, a company had a process that produced 10 batches per day, with a cycle time per batch of 2.4 hours or 144 minutes, resulting in 3,640 batches per year over 364 production days. The engineering team used advanced analytics for current-state analysis and “what-if” modeling, identifying optimizations that shaved 10 minutes off each batch. The new rate was 10.7 batches per day, or 3,912 batches per year, almost 300 additional batches from what at first glance appeared to be just an incremental improvement.
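The arithmetic behind this example can be verified in a few lines. This is a simple recreation of the figures quoted above, assuming 24-hour continuous operation and the 364 production days implied by the 3,640-batch baseline:

```python
# Batch cycle-time arithmetic from the specialty chemical example.
MINUTES_PER_DAY = 24 * 60   # 1,440 minutes of production time per day
PRODUCTION_DAYS = 364       # implied by 10 batches/day -> 3,640 batches/year

def annual_batches(cycle_minutes: float) -> float:
    """Batches per year at a given cycle time, assuming continuous operation."""
    batches_per_day = MINUTES_PER_DAY / cycle_minutes
    return batches_per_day * PRODUCTION_DAYS

before = annual_batches(144)   # 2.4-hour cycle -> 3,640 batches/year
after = annual_batches(134)    # 10 minutes shaved off each batch
print(round(before), round(after), round(after - before))  # 3640 3912 272
```

The 272-batch gain, nearly 300, comes entirely from a 7% reduction in cycle time, illustrating how small per-batch improvements compound across a year of production.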
In the past, cycle-time analysis required an engineer to review past batches to determine the slowest or most variable phases of the process. Specifically, an engineer looked at a spreadsheet of numbers and dates instead of a trend. After creating “phases” with a start and end trigger for one batch, an engineer would make manual adjustments until the desired batch construct was determined.
Defining these start and stop triggers was exceedingly tedious with spreadsheets, and more manual work would then be required to apply this analysis against additional batches for more detail. Excessive time and work were required before an engineer could find the best cycle times and implement the changes required to realize those incremental gains across all batches.
Now, during regular daily meetings, operations personnel are able to quickly analyze batches produced yesterday, compare the cycle times to the best time, and, if necessary, quickly investigate why the cycle times were slow or variable. The results of these investigations are then used to improve the process, with actions typically taken within hours, instead of after weeks of spreadsheet-based analysis.
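The start/end trigger approach described above can be sketched in a few lines. This is a minimal illustration, not any vendor's implementation: a phase begins when a signal rises above a start threshold and ends when it falls below an end threshold, and the thresholds and the toy temperature trace are invented for the example:

```python
# Minimal sketch of start/end trigger detection for batch "phases".
def find_phases(signal, start_thresh, end_thresh):
    """Return (start_index, end_index) pairs for each detected phase."""
    phases, start = [], None
    for i, value in enumerate(signal):
        if start is None and value >= start_thresh:
            start = i                  # start trigger fires
        elif start is not None and value < end_thresh:
            phases.append((start, i))  # end trigger fires
            start = None
    return phases

# Toy reactor-temperature trace containing two heating phases
temps = [20, 25, 80, 95, 90, 30, 22, 85, 92, 40, 21]
print(find_phases(temps, start_thresh=75, end_thresh=50))  # [(2, 5), (7, 9)]
```

Once phases are defined this way, comparing their durations across batches replaces the manual spreadsheet adjustments described above.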
Optimizing clean-in-place cycles. Clean-in-place (CIP) is a technique often used by specialty chemical companies to avoid contamination from one batch to the next when switching among different product types. Because no product can be produced during a CIP cycle, it is imperative to reduce the time it takes to complete cleaning.
Figure 3 illustrates how an advanced analytics application was used to optimize CIP cycles by focusing on the critical process parameters, which, in this example, are conductivity of the cleaning fluid, along with fluid flow. The first step is definition of the CIP phases using data drawn from the batch historian or event database. Once the phases are defined, Step 2 adds conductivity and flow signals to the display. The engineer then performs Step 3 by searching for times with a positive value for flow combined with low conductivity. A condition is then created for instances when flow is above a defined threshold and conductivity is below another defined threshold.
Overcleaning is then identified, in Step 4, as times when flow is above threshold and conductivity is low. Step 5 shows potential utility savings by calculating total flow during periods of overcleaning. The final step is to create a scorecard with KPIs to show potential utility savings, along with reductions in CIP cycle times by minimizing periods of overcleaning.
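Steps 3 through 5 above reduce to a simple condition over two signals. The following sketch flags samples where flow remains above a threshold while conductivity is already low (overcleaning), then totals the flow during those periods as a proxy for potential utility savings; the thresholds, sample data, and one-minute sample interval are all assumptions for illustration:

```python
# Sketch of the CIP overcleaning condition: flow above threshold while
# conductivity is below threshold (rinse water already effectively clean).
FLOW_THRESH = 50.0   # L/min; assumed flow threshold
COND_THRESH = 2.0    # mS/cm; assumed "clean" conductivity threshold

flow = [80, 80, 75, 70, 65, 60, 0]              # one sample per minute
conductivity = [9.0, 6.5, 3.1, 1.8, 1.5, 1.2, 1.1]

overclean = [f > FLOW_THRESH and c < COND_THRESH
             for f, c in zip(flow, conductivity)]
wasted_volume = sum(f for f, flag in zip(flow, overclean) if flag)  # liters

print(overclean.count(True), wasted_volume)  # 3 minutes overcleaning, 195 L
```

Summing the wasted volume across all CIP cycles yields the utility-savings KPI described in the final step.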
Improving quality. Beyond boosting efficiency, the use of advanced analytics can go right to the heart of a CPI company’s most important performance indicator: quality. In one case, a large-scale specialty chemical manufacturer needed to ensure tight control of finished-product properties, which directly determine product quality and, ultimately, profitability.
Most companies in the industry, including this one, control the finished product properties based on feedback from laboratory results received some hours after material is produced. If a predictive model could be developed to reliably and accurately forecast the product properties based on conditions in an upstream part of the process, the company could make process adjustments in near real time, minimizing margin loss from product downgrades.
Using advanced analytics, company engineers built a model to predict product quality excursions (Figure 4). When these events were detected, engineers were able to quickly adjust the process to improve quality.
Using the model for near real-time quality control, rather than relying on traditional feedback methods based on laboratory results, the company reduced product margin losses by more than $1 million annually.
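The core idea, forecasting a lab-measured property from upstream conditions so excursions can be flagged hours before lab results arrive, can be illustrated with a toy model. The variable names, data, and spec limit below are invented for the sketch, and a one-predictor least-squares fit stands in for whatever model the company actually built:

```python
# Toy sketch of predictive quality control: fit a linear model relating an
# upstream process variable to a lab-measured product property.
def fit_line(x, y):
    """Ordinary least-squares fit y = a*x + b for a single predictor."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

# Historical pairs: upstream reactor temperature vs. lab-measured viscosity
temps = [180.0, 185.0, 190.0, 195.0, 200.0]
visc = [52.0, 50.0, 48.0, 46.0, 44.0]
a, b = fit_line(temps, visc)

SPEC_MIN = 45.0                   # hypothetical lower spec limit
predicted = a * 198.0 + b         # forecast viscosity at current conditions
print(predicted < SPEC_MIN)       # True -> likely excursion, adjust now
```

When the forecast crosses the spec limit, operators can adjust the process immediately instead of waiting hours for the laboratory result, which is the mechanism behind the margin savings described above.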
The analytics advantage
The surge in data volumes within the CPI is increasing the difficulty of finding insights. Yet the pressure to find insights and be “analytics-driven” is higher than ever, placing demands on plant executives and process engineers alike.
Insights that take too long to discover languish because they cannot easily be published and shared with others. Advanced analytics applications address these and other issues by connecting with data from a wide array of sources to surface insights much more quickly in a format that is easy to share, enabling actions to improve business results and profitability.
Advanced analytics bridge the gap between the glut of process data and the engineers who need the insights. Advanced analytics enable engineers, operators, executives and other team members to do their own analysis in the moment, enabling breakthrough improvements.
Edited by Dorothy Lozowski
Reference
1. IDC Forecasts Worldwide Spending on the Internet of Things to Reach $745 Billion in 2019, Led by the Manufacturing, Consumer, Transportation, and Utilities Sectors, www.idc.com/getdoc.jsp?containerId=prUS44596319, January 2019.
About the Author
Michael Risse is the CMO and vice president at Seeq Corporation, (1301 2nd Avenue, Seattle, WA 98101; Telephone: 206-909-9852; Email: [email protected]) a company building advanced analytics applications for engineers and analysts that accelerate insights into industrial process data. He was formerly a consultant with big data platform and application companies, and prior to that worked with Microsoft for 20 years. Risse is a graduate of the University of Wisconsin at Madison, and he lives in Seattle.