Since the introduction of the first PLC devices to automate industrial processes, there has been a pull from different groups within the organization (engineering and operations) for more and more visibility into that data, in order to improve reliability, performance, and consistency across operations. When coupled with the right business data, this automation data can drive integrated workflows and processes, and assign a corresponding urgency to the issues it detects. There are many ways to interact with PLC sensor data, but more often than not the data is simplified and persisted in a data historian to provide engineering-level detail of all values captured from the automation layer.

This is typically a huge volume of raw sensor data, probably more than any one person can sift through without guidance and tools that point out where the anomalies are. By anomalies I am not talking about simple threshold violations, but rather complex interdependencies that can wreak havoc on the quality of the product being produced. This complex event detection can leverage standard statistical process control (SPC) rules as well as custom-defined rules that are specific to a material, to a process (multiple machines in a work center), or to both.

The trick is to find operations people who also understand this and are looking for new and improved tools that let them track these events and generate insight to drive decision making inline with production operations or phases. The reality is that this often spans multiple disciplines and areas of expertise on the business side (engineering, quality, supervisors/team leads, and operations), and the more coordination IT leadership can drive across these disparate groups, sponsored by thought leaders from the business, the more hidden opportunities for performance improvements and productivity gains can be unlocked.
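To make the SPC idea concrete, here is a minimal sketch of two of the classic Western Electric rules applied to a batch of sensor readings. Everything here is illustrative: in a real deployment rules like these would run continuously inside the streaming engine rather than in batch Python, and the function name, values, and control limits are invented for this example.

```python
# Minimal sketch of two common SPC (Western Electric) rule checks.
# Illustrative only; a real deployment runs such rules inline in the
# streaming engine against live sensor data, not in batch Python.

def spc_violations(values, mean, sigma):
    """Return (index, rule) pairs where an SPC rule fires on the readings."""
    flagged = []
    for i, v in enumerate(values):
        # Rule 1: a single point more than 3 sigma from the process mean.
        if abs(v - mean) > 3 * sigma:
            flagged.append((i, "rule1: beyond 3-sigma"))
        # Rule 4 (one common variant): nine consecutive points
        # falling on the same side of the mean.
        if i >= 8:
            window = values[i - 8 : i + 1]
            if all(x > mean for x in window) or all(x < mean for x in window):
                flagged.append((i, "rule4: 9 points on one side"))
    return flagged

readings = [10.1, 10.2, 9.9, 10.0, 14.2, 10.1, 10.3,
            10.2, 10.1, 10.2, 10.3, 10.1, 10.2]
print(spc_violations(readings, mean=10.0, sigma=0.5))
# → [(4, 'rule1: beyond 3-sigma'), (12, 'rule4: 9 points on one side')]
```

Note that the spike at index 4 trips the simple threshold rule, while index 12 trips a rule that no single reading could: it is the interdependency across consecutive points, not any one value, that signals the process drift.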
This type of inline quality data collection and analysis is possible today with existing SAP software; for technical details and insight into how this integration is achieved, see my blog, SAP MII & ESP Integration Guide (Technical).
Overview of the Components
There are various components involved here, and each one plays a different role in how metrics and KPIs like quality, production volumes, utilization, and asset health (really, these are all inter-related) are calculated and delivered. Each component has its own use cases, but together they programmatically populate these metrics and drive actionable data and intelligence into business workflows, rather than forcing you to collect and assemble the data into information as separate pieces. A fully integrated environment that can reach across heterogeneous execution, historian, SCADA, DCS, HMI, and enterprise systems (the list goes on and on) gives end-users a single channel to interact with and analyze data in the same context as the rest of the business, and that can provide tremendous process simplification and coordination value to an organization. It can also yield real-time insight into how profitable operations decisions actually are as they are being made, which helps drive consistency in problem remediation and ensures that issue priorities are upheld and work is completed in a timely manner.
SAP Manufacturing Integration & Intelligence (MII) and Plant Connectivity (PCo) - HOMEPAGE
In case you are unaware, SAP has a product designed for the industrial space to bridge the gap between the operations and enterprise levels: SAP MII (SAP PCo is currently bundled/licensed with SAP MII). SAP MII, currently at version 15.0, provides out-of-the-box integration from ERP down to the operations layer and back up again, along with performance management reporting driven from the automation, execution, and operator levels. This style of performance and operations management reporting is nothing new and has been around for a long time; the main difference here is the out-of-the-box data model, integration, and worker UI screens designed specifically for operations people. It has a built-in data model for tracking Overall Equipment Effectiveness (OEE) across your operations and driving real-time views of your efficiency, along with integration to manage the classification of events (in ERP) and multi-site analytics views of performance (in HANA).
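For readers new to OEE, the metric itself is simply the product of three factors: availability, performance, and quality. The sketch below shows the standard textbook calculation; the function and field names are my own for illustration and do not reflect SAP MII's internal data model.

```python
# Standard OEE calculation: availability x performance x quality.
# Field names are illustrative, not taken from the SAP MII data model.

def oee(planned_time, run_time, ideal_cycle_time, total_count, good_count):
    availability = run_time / planned_time            # uptime vs. planned time
    performance = (ideal_cycle_time * total_count) / run_time  # speed vs. ideal
    quality = good_count / total_count                # good parts vs. all parts
    return availability * performance * quality

# Example: a 480-minute shift with 400 minutes of run time, a 0.5-minute
# ideal cycle time, 700 parts produced, and 665 of them good.
print(round(oee(480, 400, 0.5, 700, 665), 3))  # → 0.693
```

Even with each individual factor in the 80-95% range, the combined OEE lands near 69%, which is why tracking the three factors separately (as the MII data model does) matters: it shows where the losses actually come from.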
The SAP Edge Services engine (including DEP, formerly called Event Stream Processor) enables the inline processing and correlation of multiple parallel streams of data. Sounds confusing? It's really not, and it's something you do every day in your head. Think about what goes through your mind when you plan something as simple as a walk around your neighborhood: you step outside to check the temperature and wind speed, look for clouds in the distance, and note the time of day, and all of this determines what you should wear and when you should plan on being home. This is essentially what Edge Services provides, but with the automation layer as the input driving the parallel correlation and analysis of data, highlighting key areas of concern inline with the process. It can do this much faster and more consistently than people can, and it stands to provide huge value in driving corrective actions that improve product quality and asset reliability.
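The "walk around the neighborhood" analogy can be sketched in a few lines of code: keep the latest value from each parallel stream and re-evaluate a joint rule whenever any stream updates. This is a toy illustration of the pattern, not SAP's API; the tag names and thresholds are invented, and a real streaming engine does this continuously and declaratively rather than in a Python loop.

```python
# Toy sketch of correlating parallel streams: hold the latest value per
# tag and evaluate a joint rule on every update. Tag names and thresholds
# are invented for illustration; this is not the Edge Services API.

latest = {}

def on_event(tag, value, alerts):
    latest[tag] = value
    # Joint rule across two streams: elevated temperature AND elevated
    # vibration together suggest a developing asset problem, even if
    # neither reading alone violates its own single-stream threshold.
    if latest.get("temp_c", 0) > 80 and latest.get("vibration_mm_s", 0) > 7:
        alerts.append(dict(latest))

alerts = []
for tag, value in [("temp_c", 85), ("vibration_mm_s", 3),
                   ("vibration_mm_s", 9), ("temp_c", 84)]:
    on_event(tag, value, alerts)
print(len(alerts))  # → 2: fires when the joint condition first holds, then again on the next update
```

The point of the sketch is the correlation: the first vibration reading (3 mm/s) raises no alert despite the high temperature, and only the combination of conditions across the two streams triggers one.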
SAP HANA (in case you haven't heard of it yet) is an in-memory analytical database engine with built-in application capabilities that can drive a huge number of analytical transactions. Raw data from the streaming engine (Edge Services and Smart Data Streaming) or operations data from MII can be loaded into it to drive centralized analytics on both large and small datasets. There are many more things that can be done with the HANA engine, and plenty of collateral online can guide you on ways to use it across a wide variety of business verticals; it is particularly useful for driving predictive analytics, which is a big growth area for the HANA engine.