Supply Chain Management Blogs by SAP
Frank_Rambo
Product and Topic Expert

Enriching Readings from MQTT Sensors with the Applicable Manufacturing Context

This article examines how recent innovations in SAP Digital Manufacturing can be leveraged to enrich readings from an MQTT sensor. Adding the applicable manufacturing execution context to the readings of the sensor can generate meaningful insights. We'll explore this through an example that creates an alert when a soldering process at a work center exceeds a certain temperature threshold.

Introduction

Since the advent of Industry 4.0, factories have become increasingly intelligent. Intelligence at the shopfloor level is often driven by device connectivity: the vertical integration of sensor or machine data into a horizontal business process. This IT/OT convergence allows further automation of manufacturing operations.

 But how can this work in practice? Data from sensors is meaningless if not put into the applicable manufacturing context. Once enriched with context, however, it can deliver valuable insights and turn into a powerful tool to identify and respond to production exceptions as they occur.

 With release 2405 of SAP Digital Manufacturing, the following powerful capabilities were delivered:

  • Reading and writing data to global variables across instances of one process, or across same-runtime processes in one design, in the Design Production Processes app
  • Integration of 3rd party MQTT brokers (the SAP MQTT broker, part of SAP Digital Manufacturing for edge, has been available since release 2402)
  • Creating subscriptions for MQTT messages published to 3rd party message brokers
  • The Alert Service in SAP Digital Manufacturing, which notifies production operators or supervisors about events or exceptions happening on the shop floor and introduces human decision making when responding to them. Alerts can be created either manually within PODs or automatically by production processes. They contain, as alert properties, the relevant execution context such as plant, work center, resource, operation, material, SFC, order, and so on.

In this article, we’ll look at how these innovations can work together to analyze and enrich the temperature readings of an MQTT sensor with the applicable manufacturing execution context at a work center while executing a soldering operation for a given production lot, or SFC (Shopfloor Control). Only when the temperature reading is outside the permitted range do we enrich the sensor data and, as an example response, create an alert displaying both the applicable manufacturing execution context and the sensor values to the user. Alternatively, you could send the enriched data to any custom web server for further processing.

Alerts in SAP Digital Manufacturing introduce an intended step of human decision making and can include configurable action buttons to resolve the alert. For example, such a button can trigger the creation of an issue in SAP Quality Issue Resolution, among many other options. For more details, please refer to my earlier article Create an Issue in SAP Quality Issue Resolution from an Alert in SAP Digital Manufacturing.

Flow at Runtime

Let’s look at the scenario in some more detail as depicted in the figure below. It consists of three components: 

  1. When starting an operation activity for an SFC at the work center, production process 1 is executed in the cloud, which writes the manufacturing execution context of the started SFC to a global variable. In our case, the context consists of an order number, finished material, SFC, plant, work center, resource, and operation activity.
  2. We simulate a sensor with MQTTX, the MQTT client toolbox from EMQ. It publishes a temperature reading with a timestamp to a topic on a 3rd party MQTT broker. We are using Apache ActiveMQ Artemis in our setup, but it could be any other broker. We have connected the 3rd party MQTT broker to an instance of the DM production connector. The production connector runs an MQTT subscription on the same topic and triggers an action if the temperature reading is outside the permitted value range. In our scenario, the action triggers production process 2 in the cloud and passes the simulated temperature reading and timestamp as start parameters.
  3. Production process 2 reads the ME context from the global variable and maps it together with the temperature reading and timestamp to the payload parameters of the “CreateTemperatureAlert” service to create an alert of custom type “Temperature_Alert---Soldering_Station”.

The usage of global variables requires that both production processes share the same design in the Design Production Process app and run in the same runtime – here in the process engine in the cloud.
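The runtime flow above can be pictured as a rough sketch in plain Python. The names and shapes below are hypothetical illustrations of the two production processes and the shared global variable; the real logic is configured graphically inside SAP Digital Manufacturing, not written as code:

```python
# Design-wide global variable shared by both processes
# (same design "Sensor2Alert", same runtime: DMC_Cloud).
ME_context = {}

def write_global_variable(order, material, sfc, plant, work_center, resource, operation):
    """Production process 1: store the execution context when the SFC is started."""
    ME_context.update({
        "order": order, "material": material, "sfc": sfc, "plant": plant,
        "workCenter": work_center, "resource": resource, "operation": operation,
    })

def create_temp_alert(in_temperature, in_timestamp):
    """Production process 2: enrich the sensor reading with the stored context."""
    return {
        "AlertType": "Temperature_Alert---Soldering_Station",
        "DME_Shopfloor": dict(ME_context),  # manufacturing execution context
        "Sensor_Reading": {"temperature": in_temperature,
                           "timestamp": in_timestamp},
    }

# Component 1 runs when the operation activity is started in the POD ...
write_global_variable("100001", "FG-BOARD", "SFC-42", "P001",
                      "WC-SOLDER", "RES-1", "0010")
# ... and component 3 runs when the trigger condition fires.
alert = create_temp_alert("245.3", "2024-09-05T09:10:00Z")
```

The key point the sketch illustrates is the decoupling: the context writer and the alert creator never call each other directly; they only share the global variable.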

Frank_Rambo_0-1726578754468.png

Configuration of Component 1: Writing Context from POD to Global Variable
Component 1 in the above figure writes the manufacturing execution context from the POD into a global variable when an operation activity is started for a selected SFC. Its configuration requires the following steps:

  1. Create a new production process design within which the global variable will be shared
  2. Create a global variable for the manufacturing execution context
  3. Create a new production process with a process step that writes input parameters containing the context to the global variable
  4. Configure the Start button in the work center POD to execute the production process when an operation activity for an SFC is started

Let’s see in more detail how the above steps 1-4 are done:

Step 1: In the “Design Production Processes” app create a new design “Sensor2Alert”.

Step 2: Open the header detail view so that the design details become visible. They also contain a counter for the global variables that have been created for this design; currently it shows zero. Clicking on the “0” opens a popup window to create global variables. Select the “Design-wide” tab, because you need to create a design-wide global variable that can be shared across the two production processes still to be created within the design “Sensor2Alert”. Create a design-wide global variable “ME_context” with multiple fields to accommodate the manufacturing execution context from the POD. To do so, choose “Structure” as the data type and select “AlertPropertySetDMEShopfloor” as the schema, which has shipped since DM release 2408. Lastly, select “DMC_Cloud” as the runtime environment, because a global variable can only be shared within one runtime environment – either DMC_Cloud or an instance of a production connector, for example.

Frank_Rambo_1-1726578754478.png

Note that the selection of the SAP predefined schema “AlertPropertySetDMEShopfloor” introduces all the fields we need to cater to the manufacturing execution context from the Work Center POD:

Frank_Rambo_2-1726578754484.png

Step 3: Create a new production process “Write_GlobalVariable” of runtime type “Cloud” and check the box “Visible to Production Connector / Plant Connectivity Runtime”. Then, draw the production process consisting of a “Start” and an “End” control with the SAP pre-delivered service “Write Global Variables” between them. Next, add start parameters to accommodate the manufacturing execution context from the POD that is to be written into the global variable “ME_Context”.

Frank_Rambo_3-1726578754490.png

Now, you need to edit the header of the production process and activate the “Publish to Service Registry” toggle. To get there, click on the “…” as highlighted in the screenshot below. This is needed later when selecting the production process in the “POD Designer” app. Otherwise, it wouldn’t show up there.

Frank_Rambo_4-1726578754496.png

Next, select the WriteGlobalVariable box and click “Add” on the Input tab in the right pane. In the popup, select the design-wide global variable “ME_Context”.

Frank_Rambo_5-1726578754500.png

Then, clicking on the icon next to the listed global variable “ME_Context” in the right pane opens a popup to map the input parameters to the fields of the global variable as needed. Note that you can leave fields of the global variable empty there if they are not needed to contextualize the sensor readings:

Frank_Rambo_6-1726578754504.png

Click on Save All and then on Quick Deploy. Once done, you can click on the Debug button for a quick test of the production process. It will open a popup to enter the start parameters of the production process. Enter some arbitrary values and use the Debugger to click on the button “Next Step” multiple times until you’ve executed the last step. In the lower right frame “Parameters and Variables”, you should see “ME_Context_successStatus” as last entry with the Boolean value “true”.

Step 4: Open the “POD Designer” app and select your work center POD – in our case it is called “DEMO POD”. Click on the “Activities Tab” and click the icon with the two gear wheels on the top menu bar to open the configuration panel on the right. Then, click on the green Start button and on the “Assign Actions” button in the configuration panel.

Frank_Rambo_7-1726578754510.png

It will open the “Configure Action Button” popup. Click on Add, select “Production Process” as type from the drop down. As Type Definition, click on the selection icon and search & select the production process “Write_GlobalVariable”. It will show up as “P_Sensor2Alert_Write_GlobalVariable”. Then, click Create.

Frank_Rambo_8-1726578754513.png

The production process should now show up in the “Configure Action Button” popup. Click on the icon with the two gear wheels to open its configuration section. In here, you can map the POD variables containing the manufacturing execution context of the work center POD to the listed start parameters of the production process.

Frank_Rambo_9-1726578754517.png

Close the Popup and save the changes you made in the POD designer.

To test your configuration, open your work center POD from the launchpad, select a work center and resource, and then a work item from the list. This will open the POD for the selected SFC. Select the first operation activity and click the Start button. Now open the “Monitor Production Processes” app and select the “Write_GlobalVariable” process the Start button just triggered. Select the “Process Parameters View” and click on the global variable values link. It will show a popup with the manufacturing execution context values passed on from the POD as start parameters to the production process, which has then written them as values into the global variable.

Frank_Rambo_10-1726578754523.png

Configuration of Component 3: Create Alert

Deviating a bit from the flow at runtime, we’ll continue with the configuration of component 3, which consists of a production process with two steps. It retrieves the sensor reading with timestamp from input parameters and, in step 1, reads the applicable manufacturing execution context from the global variable. In step 2, it maps these values to the payload of the custom service “CreateTemperatureAlert”. Note that you need a custom alert type with two property set types to cater for both the manufacturing execution context and the sensor reading with timestamp. We have named this custom alert type “Temperature_Alert---Soldering_Station”. For the context, you can reuse the SAP-defined property set type “DME_Shopfloor”, whereas you need to create the custom property set type "Sensor_Reading" for the sensor reading.

Therefore, the configuration of component 3 requires the following steps: 

  1. Create custom property set type “Sensor_Reading” with the two properties “temperature” and “timestamp” – both of data type “String”
  2. Create custom alert type “Temperature_Alert---Soldering_Station” with the two property set types “DME_Shopfloor” and “Sensor_Reading”
  3. Create request body and response schemas in service registry for new custom service to create alerts of type “Temperature_Alert---Soldering_Station”
  4. Create a new custom service in service registry to create alerts of type “Temperature_Alert---Soldering_Station”
  5. Create a new production process in the production process design “Sensor2Alert” consisting of the two steps “Read Global Variable” and “CreateTemperatureAlert”
  6. Create the two input parameters “in_temperature” and “in_timestamp” (both as Strings)
  7. Select the global variable to read for service “Read Global Variable”.
  8. Maintain payload parameters of the service “CreateTemperatureAlert”

Configuration steps 1-4 go beyond the scope of this article. Steps 1 and 2 are documented in the Alerts section of the Application Help for Execution, while steps 3 and 4 are documented in the Operations Guide for SAP Digital Manufacturing in the Enable the Alert Service section. We have attached the schemas used in our setup to this article so that you can easily reproduce it in your DM subaccount. For the configuration of the alert type in step 2, we took advantage of the deduplication feature available for alert types: if the temperature sensor creates high-frequency temperature readings outside of the permitted range, we don’t want each new reading to trigger a new alert in the “Manage Alerts” app. Therefore, we have enabled deduplication in the header settings of our alert type and set its period to 2 minutes.

Frank_Rambo_11-1726578754526.png

It will suppress the creation of new alerts within a two-minute time window after the first occurrence of the event that triggered the alert – here, the sensor reading. Instead, it will increase the alert’s “duplications” counter with each repeated occurrence. If the event occurs again more than two minutes after its first occurrence, a new alert will be generated and shown in the “Manage Alerts” app (and so on).
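The deduplication behaviour can be sketched as a minimal Python model of the two-minute window (an illustration only, not the actual Alert Service implementation):

```python
from datetime import datetime, timedelta

DEDUP_WINDOW = timedelta(minutes=2)  # deduplication period from the alert type header

class AlertDeduplicator:
    """Minimal model of the deduplication window described above."""
    def __init__(self):
        self.window_start = None
        self.duplications = 0
        self.alerts_created = 0

    def on_event(self, now):
        if self.window_start is None or now - self.window_start >= DEDUP_WINDOW:
            self.window_start = now   # new window opens with a fresh alert
            self.duplications = 0
            self.alerts_created += 1
        else:
            self.duplications += 1    # suppressed: only the counter increases

d = AlertDeduplicator()
t0 = datetime(2024, 9, 5, 9, 10, 0)
d.on_event(t0)                          # first reading -> new alert
d.on_event(t0 + timedelta(seconds=30))  # inside window -> suppressed
d.on_event(t0 + timedelta(minutes=3))   # outside window -> another new alert
```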

Frank_Rambo_12-1726578754534.png

Step 5 & 6: Create a new production process “Create_Temp_Alert” for runtime environment DMC_Cloud within the design “Sensor2Alert”. When maintaining the header parameters of the new production process, make sure that you switch on the toggles to make the process visible to the production connector and to publish it to the service registry. From the “Controls” and “Services and Processes” panels on the left, drag and drop the “Start” control, the SAP pre-delivered service “Read Global Variable”, your custom service “CreateTemperatureAlert”, and the “End” control into the canvas and connect them with lines as shown in the screenshot below. Your custom service “CreateTemperatureAlert” is listed below the webserver it has been created for. To make it appear in the “Services and Processes” panel, you need to click the “Select Services” button, select the check boxes along the tree structure to your service, and save.

Now, select the “Start” control in the canvas and click on “Manage Parameters” in the panel that becomes visible on the right. Add the two parameters “in_Temperature” and “in_Timestamp” – both with data type “String”.

Frank_Rambo_13-1726578754542.png

Step 7: Select the box “Read Global Variable” on the canvas, click the “Add” button in the panel that becomes visible on the right, and select the global variable “ME_Context”.

Frank_Rambo_14-1726578754556.png

Step 8: Now, select the box “CreateTemperatureAlert” in the canvas and maintain its payload parameters in the right panel as follows:

  • Destination: Select SAP_DMC_DEFAULT_SERVICE_KEY from the dropdown, if empty
  • Header parameter alert-type: Enter your alert type name in double quotes, e.g. "Temperature_Alert---Soldering_Station"
  • AlertType: Repeat your alert type name "Temperature_Alert---Soldering_Station"
  • Deduplication Key: Any String in double quotes – for example "P_Sensor2Alert_Create_Temp_Alert"
  • Info: Any title you want to display in the “Manage Alerts” app in the alert details – for example "Soldering temperature outside of permitted range"
  • Severity: Any of the severity IDs defined for the selected alert type – in our case, "WARNING" from the standard severity set
  • Status: Any of the status IDs defined for the selected alert type – in our case, "NEW" from the standard status set
  • TriggeredOn: Click on the dropdown and select the start parameter “in_Timestamp”
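Putting the parameters above together, the request body sent to the alert service might look roughly like the following. This is a hypothetical sketch – the exact field names and nesting come from the schemas you created in the service registry:

```python
import json

# Hypothetical request body for the custom "CreateTemperatureAlert" service;
# the top-level fields follow the payload parameters listed above.
body = {
    "AlertType": "Temperature_Alert---Soldering_Station",
    "DeduplicationKey": "P_Sensor2Alert_Create_Temp_Alert",
    "Info": "Soldering temperature outside of permitted range",
    "Severity": "WARNING",
    "Status": "NEW",
    "TriggeredOn": "2024-09-05T09:10:00Z",  # mapped from start parameter in_Timestamp
}
request_json = json.dumps(body)
```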

Then continue with the structure “Processor” if you want a default processor to be assigned to the alert at the time of creation. Please note that, in the 2408 release, you can only enter a single processor:

Frank_Rambo_15-1726578754563.png

Continue with the structure array “DME_Shopfloor” containing the alert properties representing the manufacturing execution context the alert is created for. Here, you need to map each alert property to the respective field of the global variable:

 

Frank_Rambo_16-1726578754576.png

Once done, it should look like in the screenshot below. You can also click on “Switch to Code” if you prefer the code view for a quick check.

Frank_Rambo_17-1726578754587.png

Lastly, you need to maintain the structure array “Sensor_Reading”. Here, you need to map the start parameters “in_Temperature” and “in_Timestamp” to the respective alert properties:

Frank_Rambo_18-1726578754595.png

Now, click on “Save All” and then “Quick Deploy” at the top of the screen. Again, use the “Debug” button to test the production process by providing arbitrary values for the start parameters in_Temperature and in_Timestamp. The timestamp needs to use the ISO standard format, e.g. "2024-09-05T09:10:00Z".
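If you want to generate a valid test timestamp instead of typing one by hand, a small helper like this (standard-library Python) produces the expected format:

```python
from datetime import datetime, timezone

def iso_timestamp(dt=None):
    """Return a UTC timestamp in the ISO 8601 format the debugger expects,
    e.g. "2024-09-05T09:10:00Z"."""
    dt = dt or datetime.now(timezone.utc)
    return dt.strftime("%Y-%m-%dT%H:%M:%SZ")

example = iso_timestamp(datetime(2024, 9, 5, 9, 10, 0, tzinfo=timezone.utc))
```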

Configuration of Component 2: MQTT Subscription

Component 2 consists of the following subcomponents: 

  1. DM production connector instance
  2. 3rd party MQTT broker connected to the production connector using client certificates
  3. MQTTX client connected to 3rd party MQTT broker (optionally using client certificates)
  4. MQTT subscription

The configuration of the connectivity between the 3rd party message broker and the production connector instance happens in the cloud and is then deployed and activated on the production connector instance, which, in our setup, runs on a VMware VM. It requires client certificates to comply with the security standards of SAP Digital Manufacturing. The details are documented in the Third-Party MQTT Broker and Manage Message Brokers sections of the Application Help for Production Connectivity Model.

As mentioned earlier, we are using MQTTX as a client to simulate a temperature sensor and publish a temperature reading with a timestamp to the topic “dmc/temperature” on the 3rd party MQTT broker. We have also created a subscription to this topic in MQTTX to mirror the message back to our MQTTX client so that we know the message broker has correctly received our message. Of course, this is optional.
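The message published from MQTTX is a small JSON document. A sketch of its shape, with field names assumed to match the “Sensor_Reading” schema:

```python
import json

# Simulated sensor message published to the topic "dmc/temperature".
# Field names are assumptions matching the "Sensor_Reading" schema
# ("temperature" and "timestamp", both strings).
message = json.dumps({
    "temperature": "245.3",               # outside the permitted range [190, 230]
    "timestamp": "2024-09-05T09:10:00Z",
})
```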

Frank_Rambo_19-1726578754613.png

Before setting up the MQTT subscription, we need to add the topic on the 3rd party MQTT broker that the production connector will access. This is done in the “Manage Message Brokers” app. Select your production connector instance in this app and open the tab “Production Connectors”.

Frank_Rambo_20-1726578754615.png

Click on the “Edit” button and open the detail view of the production connector. In there, click on “Add” and enter the topic name (in our setup, “dmc/temperature”) that you want to subscribe to. Also, check the “Subscribe” and optionally the “Publish” check mark next to the topic.

Frank_Rambo_21-1726578754619.png

Then, click on the “Update” button on the bottom of the page.

 Now, we are ready to create the MQTT subscription in the “Manage Automatic Triggers” app. Open the app and select “Message” in the filters section as subscription type and your message broker and click on “Go” to display the MQTT subscriptions that have already been created for your production connector acting as an MQTT client.

Frank_Rambo_22-1726578754620.png

Click on “Create”. You will need to make settings in the three main sections “Context”, “Trigger Condition”, and “Action”. Let’s start with the “Context” section and make the following selections:

  • Subscription Type: Message
  • Message Broker: <your 3rd party message broker>
  • Production Connector: <your production connector connected to the selected message broker>
  • Topic: dmc/temperature
  • Message Payload Data Type: Structure
  • Schema: Name of the schema describing the sensor payload structure. You can reuse the schema you have created in the service registry for the alert property set type “Sensor_Reading”. In our case, it is called “Sensor_Reading”

Frank_Rambo_23-1726578754621.png

In the table “Message Payload Structure”, you need to maintain the “Aliases”. Enter the respective fields of the payload sent by the sensor (MQTTX client). In our case, they are identical to the field names in the schema. The “Context” section should now look as shown in the screenshot below:

Frank_Rambo_24-1726578754623.png

The “Trigger Condition” section defines the condition that messages published to the subscribed topic “dmc/temperature” must match for the action to be triggered. We expect that a sensor would publish temperature readings within the permitted value range most of the time. Only in exceptional cases would they fall outside it, and only then do we need to trigger our production process “Create_Temp_Alert” in the cloud to create an alert. Using a trigger condition keeps workload away from the cloud: the analysis of whether the temperature readings are inside the permitted value range happens on the production connector running on-premise on the shopfloor.

In our example, the permitted value range for the soldering temperature is the interval [190, 230] in degrees centigrade. Temperature readings less than 190 or greater than 230 degrees centigrade are considered exceptions and shall trigger an alert.

Choose the condition type “While true” and use the expression editor to compose the condition using the alias temperature and the available operators <, >, and ||.
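Expressed in Python for illustration, the trigger condition is simply:

```python
LOW, HIGH = 190, 230  # permitted soldering temperature range in degrees centigrade

def should_trigger(temperature: float) -> bool:
    # Mirrors the trigger condition: temperature < 190 || temperature > 230
    return temperature < LOW or temperature > HIGH
```

Note that readings exactly at the boundaries (190 and 230) are inside the permitted range and do not trigger the action.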

Frank_Rambo_25-1726578754625.png

Let’s move on to the “Action” section. Select “Production Process” as Action Type. Then, in “Available Processes”, first search and select your production process design “Sensor2Alert”. Then, you can select the production process “Create_Temp_Alert” from this design.

Frank_Rambo_26-1726578754631.png

You need to map the sensor payload aliases to the start parameters of the production process.

Frank_Rambo_27-1726578754634.png

Lastly, save and click on “Quick Deploy” on the top.

Now you can test the setup by publishing the payload message shown in the first screenshot of this subsection from the MQTTX client. It should trigger a new alert. The properties of this alert should contain the published message (simulated sensor reading) and the manufacturing execution context last saved from the work center POD to the global variable via the production process “Write_GlobalVariable”.

This concludes our example of how sensor readings from MQTT sensors can be enriched with the applicable manufacturing execution context and used for further processing inside or outside of DM. We created an alert from it; however, you could also send the enriched data to an external endpoint for further processing or analytical use cases.

8 Comments
Sayuj
Participant

Hi Frank,

Thank you so much for the blog. It is an interesting topic.
Just want to ask about the TLS certificate step. I have generated the production connector SHA1 certificate. I would like to know the next step to create the authentication in the MQTT broker.

Could you please explain in short how you established this connection, with some images? At the moment I'm using the Mosquitto broker.

KR,
Sayuj

Frank_Rambo
Product and Topic Expert

Hi Sayuj, I didn't do this part of the setup myself but relied on an existing setup. Therefore, I have added the relevant links to the SAP Help Portal, which explain this part of the setup. If the information there isn't helpful, please create a ticket and the respective developers will help you with it.

vishalvk828
Explorer

Hi Frank,

Thanks for sharing the information on MQTT. 

For our OPC UA communication we have an API in production process design to write the value to the tag; do we have such an option with the MQTT interface? If I need to send some value to an MQTT topic on an ad-hoc basis via a production process, is that possible?

Regards,

Vishal 

ankit12
Active Participant

Hi Frank,

We are also trying to connect to an MQTT broker in our DM system. The only problem we are facing is that our MQTT broker uses user authentication with a username and password. When we try to connect to the MQTT broker using the production connector, we are unable to connect and get the error below:

"Unable to connect to the broker.Connecting with MQTT server failed (BadUserNameOrPassword)."

The question is: we are unable to find where to maintain this username and password in SAP DM – or is there something we are doing wrong? Can you please help?

A quick response will be appreciated.

Regards,
Ankit Gupta

Frank_Rambo
Product and Topic Expert

@vishalvk828 It is possible to publish data / data structures from a subscription (on indicators or MQTT broker topics) to a 3rd party or SAP broker topic. Unfortunately, we don't have a "publish to MQTT Broker" service available, so it is currently not possible to publish data to an MQTT broker topic out of a production process or automation sequence.

 
vishalvk828
Explorer

@Frank_Rambo Thanks for the reply. 

I hope this is part of the roadmap.

Frank_Rambo
Product and Topic Expert

Hi Vishalvk828,

the "publish to MQTT Broker" service is planned to 2502. I've already seen a demo of it. It will make it possible to publish basic and complex data types to a topic on an MQTT broker.

vishalvk828
Explorer

@Frank_Rambo This looks good.