
Enriching Readings from MQTT Sensors with the Applicable Manufacturing Context
This article examines how recent innovations in SAP Digital Manufacturing can be leveraged to enrich readings from an MQTT sensor. Adding the applicable manufacturing execution context to the readings of the sensor can generate meaningful insights. We'll explore this through an example that creates an alert when a soldering process at a work center exceeds a certain temperature threshold.
Introduction
Since the advent of Industry 4.0, factories have become increasingly intelligent. Intelligence at the shopfloor level is often driven by device connectivity: the vertical integration of sensor or machine data into a horizontal business process. This IT/OT convergence allows further automation of manufacturing operations.
But how can this work in practice? Data from sensors is meaningless if not put into the applicable manufacturing context. Once enriched with context, however, it can deliver valuable insights and turn into a powerful tool to identify and respond to production exceptions as they occur.
With release 2405 of SAP Digital Manufacturing, the following powerful capabilities were delivered:
In this article we’ll look at how these innovations can work together to analyze temperature readings from an MQTT sensor and enrich them with the applicable manufacturing execution context at a work center while a soldering operation is executed for a given production lot or SFC (Shop Floor Control number), respectively. Only when the temperature reading is outside a permitted range do we enrich the sensor data and, as an example response, create an alert displaying both the applicable manufacturing execution context and the sensor values to the user. Alternatively, you could send the enriched data to any custom web server for further processing.
Alerts in SAP Digital Manufacturing introduce an intended step of human decision making and can include configurable action buttons to resolve the alert. For example, one such action could trigger the creation of an issue in SAP Quality Issue Resolution, among many other options. For more details, please refer to my earlier article Create an Issue in SAP Quality Issue Resolution from an Alert in SAP Digital Manufacturing.
Flow at Runtime
Let’s look at the scenario in some more detail as depicted in the figure below. It consists of three components:
The usage of global variables requires that both production processes share the same design in the Design Production Process app and run in the same runtime – here in the process engine in the cloud.
Configuration of Component 1: Writing Context from POD to Global Variable
Component 1 in the above figure writes the manufacturing execution context from the POD into a global variable when an operation activity is started for a selected SFC. Its configuration requires the following steps:
Let’s see in more detail how the above steps 1-4 are done:
Step 1: In the “Design Production Processes” app create a new design “Sensor2Alert”.
Step 2: Open the header detail view so that the design details become visible. They also contain a counter for the global variables created for this design; currently it shows zero. Clicking on the “0” opens a popup window to create global variables. Select the “Design-wide” tab, because you need to create a design-wide global variable that can be shared across the two production processes still to be created within the design “Sensor2Alert”. Create a design-wide global variable “ME_Context” with multiple fields to accommodate the manufacturing execution context from the POD. To do so, choose “Structure” as the data type and select “AlertPropertySetDMEShopfloor” as the schema, which has been shipped since DM release 2408. Lastly, select “DMC_Cloud” as the runtime environment, because a global variable can only be shared within one runtime environment (for example, DMC_Cloud or an instance of a production connector).
Note that the selection of the SAP predefined schema “AlertPropertySetDMEShopfloor” introduces all the fields we need to cater to the manufacturing execution context from the Work Center POD:
Step 3: Create a new production process “Write_GlobalVariable” of runtime type “Cloud” and check the box “Visible to Production Connector / Plant Connectivity Runtime”. Then, draw the production process consisting of a “Start” and an “End” control and the SAP pre-delivered service “Write Global Variables” between them. Next, you need to add start parameters to accommodate the desired manufacturing execution context from the POD that is to be written into the global variable “ME_Context”.
Now, you need to edit the header of the production process and activate the “Publish to Service Registry” toggle. To get there, click on the “…” as highlighted in the screenshot below. This is needed later when selecting the production process in the “POD Designer” app. Otherwise, it wouldn’t show up there.
Next, select the WriteGlobalVariable box and click “Add” on the Input tab in the right pane. In the popup, select the design-wide global variable “ME_Context”.
Then, clicking on the icon next to the listed global variable “ME_Context” in the right pane opens a popup to map the input parameters to the fields of the global variable as needed. Note that there you can leave some fields of the global variable empty, if they are not needed to contextualize the sensor readings:
Click on Save All and then on Quick Deploy. Once done, you can click on the Debug button for a quick test of the production process. It will open a popup to enter the start parameters of the production process. Enter some arbitrary values and use the Debugger to click on the button “Next Step” multiple times until you’ve executed the last step. In the lower right frame “Parameters and Variables”, you should see “ME_Context_successStatus” as last entry with the Boolean value “true”.
Step 4: Open the “POD Designer” app and select your work center POD – in our case it is called “DEMO POD”. Open the “Activities” tab and click the icon with the two gear wheels in the top menu bar to open the configuration panel on the right. Then, click on the green Start button and on the “Assign Actions” button in the configuration panel.
It will open the “Configure Action Button” popup. Click on Add, select “Production Process” as type from the drop down. As Type Definition, click on the selection icon and search & select the production process “Write_GlobalVariable”. It will show up as “P_Sensor2Alert_Write_GlobalVariable”. Then, click Create.
The production process should now show up in the “Configure Action Button” popup. Click on the icon with the two gear wheels to open its configuration section. In here, you can map the POD variables containing the manufacturing execution context of the work center POD to the listed start parameters of the production process.
Close the Popup and save the changes you made in the POD designer.
To test your configuration, open your work center POD from the launchpad, select a work center and resource, and then a work item from the list. This will open the POD for the selected SFC. Select the first operation activity and click the Start button. Now open the “Monitor Production Processes” app and select the “Write_GlobalVariable” process the Start button just triggered. Select the “Process Parameters View” and click on the global variable values link. It will show a popup with the manufacturing execution context values passed on from the POD as start parameters to the production process, which has then written them as values into the global variable.
Configuration of Component 3: Create Alert
Deviating a bit from the flow at runtime, we’ll continue with the configuration of component 3, which consists of a production process with two steps. It retrieves the sensor reading with timestamp from input parameters and, in step 1, reads the applicable manufacturing execution context from the global variable. In step 2, it maps these values to the payload of the custom service “CreateTemperatureAlert”. Note that you need a custom alert type with two property set types to cater for both the manufacturing execution context and the sensor reading with timestamp. We have named this custom alert type “Temperature_Alert---Soldering_Station”. For the context, you can reuse the SAP-defined property set type “DME_Shopfloor”, whereas you need to create the custom property set type “Sensor_Reading” for the sensor reading.
Therefore, the configuration of component 3 requires the following steps:
Configuration steps 1-4 are beyond the scope of this article. Steps 1 and 2 are documented in the Alerts section of the Application Help for Execution, while steps 3 and 4 are documented in the Operations Guide for SAP Digital Manufacturing in the Enable the Alert Service section. We have attached the schemas used in our setup to this article so that you can easily reproduce it in your DM subaccount. For the configuration of the alert type in step 2, we took advantage of the deduplication feature available for alert types: If the temperature sensor creates high-frequency temperature readings outside of the permitted range, we don’t want it to trigger a new alert to appear in the “Manage Alerts” app with each new reading. Therefore, we have enabled deduplication in the header settings of our alert type and set its period to 2 minutes.
It will suppress the creation of new alerts within a two-minute time window after the first occurrence of the event that triggered the alert – here, the sensor reading. Instead, it will increase the counter “duplications” of the alert with each repeated occurrence. If the event occurs again two minutes after its first occurrence, a new alert will be generated and shown in the “Manage Alerts” app (and so on).
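The deduplication behaviour described above can be sketched in a few lines of Python. This is an illustrative model only (SAP DM implements deduplication internally); the class and its names are hypothetical, assuming the 2-minute window from our alert type configuration:

```python
from datetime import datetime, timedelta

class AlertDeduplicator:
    """Toy model of alert-type deduplication with a fixed time window."""

    def __init__(self, window_minutes=2):
        self.window = timedelta(minutes=window_minutes)
        self.first_seen = None   # timestamp of the event that opened the window
        self.duplications = 0    # counter shown on the alert in "Manage Alerts"

    def on_event(self, ts):
        """Return True if a new alert is created, False if deduplicated."""
        if self.first_seen is None or ts - self.first_seen >= self.window:
            self.first_seen = ts
            self.duplications = 0
            return True
        self.duplications += 1
        return False

dedup = AlertDeduplicator()
t0 = datetime(2024, 9, 5, 9, 10, 0)
print(dedup.on_event(t0))                          # first occurrence: new alert
print(dedup.on_event(t0 + timedelta(seconds=30)))  # within window: suppressed
print(dedup.on_event(t0 + timedelta(minutes=2)))   # window elapsed: new alert
```
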
Step 5 & 6: Create a new production process “Create_Temp_Alert” for the runtime environment DMC_Cloud within the design “Sensor2Alert”. When maintaining the header parameters of the new production process, make sure that you switch on the toggles to make the process visible to the production connector and to publish it to the service registry. From the “Controls” and “Services and Processes” panels on the left, drag and drop the “Start” control, the SAP pre-delivered service “Read Global Variable”, your custom service “CreateTemperatureAlert”, and the “End” control into the canvas and connect them with lines as shown in the screenshot below. Your custom service “CreateTemperatureAlert” is listed below the web server it has been created for. To make it appear in the “Services and Processes” panel, you need to click the “Select Services” button, select the check boxes along the tree structure down to your service, and save.
Now, select the “Start” control in the canvas and click on the “Manage Parameters” on the panel that becomes visible on the right. Add the two parameters “in_Temperature” and “in_Timestamp” – both with data type “String”.
Step 7: Select the box “Read Global Variable” on the canvas, click the “Add” button in the panel that becomes visible on the right, and select the global variable “ME_Context”.
Step 8: Now, select the box “CreateTemperatureAlert” in the canvas and maintain its payload parameters in the right panel as follows:
Then continue with the structure “Processor” if you want a default processor to be assigned to the alert at the time of creation. Please note that, in the 2408 release, you can only enter a single processor:
Continue with the structure array “DME_Shopfloor” containing the alert properties representing the manufacturing execution context the alert is created for. Here, you need to map each alert property to the respective field of the global variable:
Once done, it should look like in the screenshot below. You can also click on “Switch to Code” if you prefer the code view for a quick check.
Lastly, you need to maintain the structure array “Sensor_Reading”. Here you need to map the start parameters “in_Temperature” and “in_Timestamp” to the respective alert properties:
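Conceptually, the fully mapped payload of “CreateTemperatureAlert” resembles the JSON below. This is a hedged sketch only: the exact field names (propertyName, propertyValue, userId) and the sample values are hypothetical and depend on how you defined your alert type and property set types:

```python
import json

# Hypothetical payload shape after mapping the processor, the global-variable
# fields (DME_Shopfloor) and the start parameters (Sensor_Reading).
alert_payload = {
    "Processor": {"userId": "PROCESSOR_USER"},   # assumed single-processor field
    "DME_Shopfloor": [                            # context read from ME_Context
        {"propertyName": "workCenter", "propertyValue": "WC_SOLDER"},
    ],
    "Sensor_Reading": [                           # values from in_* parameters
        {"propertyName": "temperature", "propertyValue": "245.3"},
        {"propertyName": "timestamp", "propertyValue": "2024-09-05T09:10:00Z"},
    ],
}
print(json.dumps(alert_payload, indent=2))
```
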
Now, click on “Save All” and then “Quick Deploy” at the top of the screen. Again, use the “Debug” button to test the production process by providing arbitrary values for the start parameters in_Temperature and in_Timestamp. The timestamp needs to use the ISO 8601 standard format, e.g. "2024-09-05T09:10:00Z".
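If you generate test timestamps programmatically, Python’s standard library produces the required format directly; the formatting string below is a straightforward way to get a UTC timestamp with the trailing “Z”:

```python
from datetime import datetime, timezone

# Current UTC time in the ISO 8601 format expected by the production process,
# e.g. "2024-09-05T09:10:00Z".
ts = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
print(ts)
```
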
Configuration of Component 2: MQTT Subscription
Component 2 consists of the following subcomponents:
The configuration of the connectivity between the 3rd party message broker and the production connector instance happens in the cloud and is then deployed and activated on the production connector instance, which in our setup runs on a VMware virtual machine. It requires client certificates to comply with the security standards of SAP Digital Manufacturing. The details are documented in the Third-Party MQTT Broker and Manage Message Brokers sections of the Application Help for Production Connectivity Model.
As mentioned earlier, we are using MQTTX as a client to simulate a temperature sensor and publish temperature readings with a timestamp to the topic “dmc/temperature” on the 3rd party MQTT broker. We have also created a subscription to this topic in MQTTX to mirror the message back to our MQTTX client so that we know the message broker has correctly received our message. Of course, this is optional.
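Instead of MQTTX, you could also script the simulated sensor. The sketch below builds a JSON payload with the two fields our subscription expects (temperature and timestamp; the field names and values are assumptions based on our setup) and shows how it could be published with the open-source paho-mqtt package. Broker host and port are placeholders:

```python
import json
from datetime import datetime, timezone

# Hypothetical sensor reading, deliberately outside the permitted range
# [190, 230] °C so that it would trigger the alert.
payload = json.dumps({
    "temperature": 245.3,
    "timestamp": datetime(2024, 9, 5, 9, 10, 0, tzinfo=timezone.utc)
                 .strftime("%Y-%m-%dT%H:%M:%SZ"),
})

def publish_reading(host, payload, topic="dmc/temperature"):
    """Publish one reading to the broker (paho-mqtt 1.x style API shown;
    host, port, and credentials depend on your broker setup)."""
    import paho.mqtt.client as mqtt
    client = mqtt.Client()
    client.connect(host, 1883)
    client.publish(topic, payload)
    client.disconnect()

print(payload)
```
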
Before setting up the MQTT subscription, we need to add the topic on the 3rd party MQTT broker that the production connector will access. This is done in the “Manage Message Brokers” app. Select your production connector instance in this app and open the tab “Production Connectors”.
Click on the “Edit” button and open the detail view of the production connector. In there, click on “Add” and enter the topic name (in our setup, it’s “dmc/temperature”) that you want to subscribe to. Also, check the “Subscribe” and optionally the “Publish” checkbox next to the topic.
Then, click on the “Update” button on the bottom of the page.
Now, we are ready to create the MQTT subscription in the “Manage Automatic Triggers” app. Open the app, select “Message” as the subscription type in the filters section, select your message broker, and click “Go” to display the MQTT subscriptions that have already been created for your production connector acting as an MQTT client.
Click on “Create”. You will need to make settings in the three main sections “Context”, “Trigger Condition”, and “Action”. Let’s start with the “Context” section and make the following selections:
In the table “Message Payload Structure”, you need to maintain the “Aliases”. Enter the respective fields of the payload sent by the sensor (MQTTX client). In our case, they are identical to the field names in the schema. The “Context” section should now look as shown in the screenshot below:
The “Trigger Condition” section defines the condition that messages published to the subscribed topic “dmc/temperature” must match for the action to be triggered. We expect that a sensor publishes temperature readings within the permitted value range most of the time. Only in exceptional cases would it not, and only then do we need to trigger our production process “Create_Temp_Alert” in the cloud to create an alert. Using a trigger condition helps keep workload away from the cloud: the analysis of whether the temperature readings are inside the permitted value range happens on the production connector running on premise on the shopfloor.
In our example, the permitted value range for the soldering temperature is the interval [190, 230] in degrees centigrade. Temperature readings below 190 or above 230 degrees centigrade are considered exceptions and shall trigger an alert.
Choose the condition type “While true” and use the expression editor to compose the condition using the alias temperature and the available operators <, >, and ||.
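The resulting condition, temperature < 190 || temperature > 230, can be checked against a few sample readings. A minimal Python equivalent (the function name is ours, not part of SAP DM) makes the boundary behaviour explicit; readings of exactly 190 or 230 are still inside the permitted range and do not trigger:

```python
LOW, HIGH = 190.0, 230.0  # permitted soldering temperature range in °C

def outside_range(temperature):
    # Mirrors the trigger condition: temperature < 190 || temperature > 230
    return temperature < LOW or temperature > HIGH

print(outside_range(210.0))  # within range: no alert
print(outside_range(245.3))  # above range: triggers Create_Temp_Alert
```
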
Let’s move on to the “Action” section. Select “Production Process” as Action Type. Then, in “Available Processes”, first search and select your production process design “Sensor2Alert”. Then, you can select the production process “Create_Temp_Alert” from this design.
You need to map the sensor payload aliases to the start parameters of the production process.
Lastly, save and click on “Quick Deploy” on the top.
Now you can test the setup by publishing the payload message shown in the first screenshot of this subsection from the MQTTX client. It should trigger a new alert. The properties of this alert should contain the published message (simulated sensor reading) and the manufacturing execution context last saved from the work center POD to the global variable via the production process “Write_GlobalVariable”.
This concludes our example of how sensor readings from MQTT sensors can be enriched with the applicable manufacturing execution context and used for further processing inside or outside of DM. We created an alert from it; however, you could also send the enriched data to an external endpoint for further processing or analytical use cases.