Automating data collections in Production Operator Dashboards (PODs) by reading indicators from shopfloor systems is a well-known scenario in SAP Digital Manufacturing used by many customers. In this article, learn how to leverage readings from sensors published to 3rd-party MQTT brokers as well.
Introduction
Since release 2405 you can enable your own 3rd-party MQTT brokers to integrate with SAP Digital Manufacturing (DM) and use the “Manage Automatic Triggers” app to subscribe to MQTT topics and trigger actions when new data from MQTT devices or sensors has been published. These innovative features open the door to new types of scenarios that weren’t possible in DM before.

The first type of scenario is device- or sensor-driven: a sensor publishes a new reading onto a topic on a third-party MQTT broker, from where it is picked up by an MQTT subscription in DM, which in turn triggers an action within or outside of DM. For a detailed example of such a sensor-driven scenario, please refer to my former blog “Enriching Readings from MQTT Sensors with the Applicable Manufacturing Context”, which explains how sensor readings published to an MQTT broker can be enriched with manufacturing execution context and create an alert in DM if they violate given thresholds.

In this blog, I’d like to tackle a second type of scenario that is initiated from a production operator dashboard (POD), for example by a production operator clicking a button. I’ll walk you through the example of automating a data collection from a POD with data published to your 3rd-party MQTT broker. Automating data collections by reading indicators mapped to tags in shop floor systems is a well-known scenario that is based on a request-response pattern: clicking a button on a POD triggers a production process, which invokes the “Read Indicator” service to fetch the indicator value from the respective tag in the shopfloor system that is mapped to the indicator.
Unlike shopfloor systems that follow a request-response pattern, MQTT brokers use a publish-subscribe pattern, and clicking a button on a POD does not trigger a sensor to publish a new reading. Here, the trick is buffering the latest published reading of the sensor in a global variable in DM and fetching it from there when the button is clicked.
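Conceptually, the buffering pattern works like this; a minimal plain-Python sketch (the class and method names are illustrative stand-ins, not DM APIs — in DM, the buffer is the global variable and the two methods correspond to the “Write Global Variable” and “Read Global Variable” services):

```python
import threading

class LatestReadingBuffer:
    """Buffers the most recent sensor reading, mimicking the DM global variable."""

    def __init__(self):
        self._lock = threading.Lock()
        self._reading = None

    def on_message(self, payload):
        # Called by the MQTT subscription for every published reading;
        # each new reading overwrites the previous one.
        with self._lock:
            self._reading = payload

    def read(self):
        # Called when the operator clicks the button in the POD.
        with self._lock:
            return self._reading

buffer = LatestReadingBuffer()
buffer.on_message({"temperature": 21.5, "timestamp": "2024-06-01T10:15:00Z"})
buffer.on_message({"temperature": 22.1, "timestamp": "2024-06-01T10:15:05Z"})
print(buffer.read())  # only the latest reading is kept
```

This decouples the sensor's publishing rate from the operator's reading rate: however often the sensor publishes, the button click always sees exactly one value, the most recent one.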
Flow at Runtime
Let’s look at the scenario in some more detail as depicted in the figure below. The flow can be split into three steps:
It is important to understand that steps 1 and 2 on the one hand and step 3 on the other hand run fully asynchronously. The frequency of execution of the first two steps is driven by the sensor, which may publish its temperature readings with timestamps to the MQTT broker at a far higher frequency than production operators manually trigger data collections in step 3 from the POD. Therefore, we recommend running the built-in services to write and read the global variable on plant level as we do in our example. This way you avoid unnecessary data transfer to the cloud triggered by each new temperature reading, which may happen at high frequency. If you have control over the sensor, we recommend throttling its readings down to a reasonable frequency that matches your requirements with regard to precision and accuracy of the collected data.
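The throttling idea can be sketched in a few lines of plain Python (this is a generic rate-limiting pattern, not a DM feature; all names are illustrative):

```python
import time

class ThrottledPublisher:
    """Forwards a reading to the broker only when a minimum interval has
    elapsed since the last forwarded reading (illustrative sketch)."""

    def __init__(self, min_interval_s, clock=time.monotonic):
        self.min_interval_s = min_interval_s
        self.clock = clock
        self._last_sent = None

    def maybe_publish(self, reading, send):
        now = self.clock()
        if self._last_sent is None or now - self._last_sent >= self.min_interval_s:
            self._last_sent = now
            send(reading)   # e.g. an MQTT publish call in a real device
            return True
        return False        # reading dropped: too soon after the last one

# Simulated clock: readings arrive at t=0s, 1s, and 6s; minimum interval 5s.
sent, t = [], [0.0]
publisher = ThrottledPublisher(min_interval_s=5.0, clock=lambda: t[0])
for step, value in [(0.0, 20.1), (1.0, 20.2), (6.0, 20.5)]:
    t[0] = step
    publisher.maybe_publish(value, sent.append)
print(sent)  # -> [20.1, 20.5]
```

The reading at t=1s is dropped because it arrives within the 5-second window; only the readings at t=0s and t=6s reach the broker.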
Configuration
In the remainder of this blog, I’ll walk you through the configuration of this scenario. It is done in the following sequence of steps:
Certain parts of the configuration are the same as in my former blog “Enriching Readings from MQTT Sensors with the Applicable Manufacturing Context”. I’ll refer to it where appropriate to avoid duplication of content.
Create Production Process Design with Global Variable
In the “Design Production Processes” app create a new production process design “Sensor2DC”. Clicking on the chevron icon opens the global variables area of the canvas. Then, create a new design-wide global variable “Sensor_Reading” of Data Type “Structure” and select the schema “Sensor_Reading”, which I have described in my former blog. It consists of the two fields “temperature” and “timestamp”. Select as runtime environment your production connector that is connected to your MQTT broker.
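As a JSON object, the “Sensor_Reading” structure can be pictured as follows; a hedged sketch (the two field names come from this blog, while the example values and the ISO 8601 timestamp format are assumptions for illustration):

```python
import json

# Illustrative shape of the "Sensor_Reading" structure from the referenced
# blog; field types and the timestamp format are assumed here.
reading = {
    "temperature": 23.4,                  # numeric sensor value
    "timestamp": "2024-06-01T10:15:00Z",  # assumed ISO 8601 string
}
payload = json.dumps(reading)
print(payload)
```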
Create automation sequence “Write Global Variable”
Click on the “Create” button and create a new production process with the selections shown in the screenshot below:
Click on the pencil icon on the left pane and drag the Start control into the center canvas. Select the start control and click on the “Manage Parameters” button on the right pane. Create the two start parameters “in_Temperature” and “in_Timestamp” as shown in the screenshot below:
In the left pane select the “Write Global Variable” service you find under “Built-in Services” and drag it into the central canvas next to the Start control. Next, drag the “Stop” control from the left pane into the central canvas and connect the “Start” control with the “Write Global Variable” service and this service with the “Stop” control by drawing arrows. Select the service and click on “Add” in the right pane. In the popup window select the previously created global variable “Sensor_Reading”.
In the next popup window map the start parameters as values to the fields of the global variables as shown in the screenshot below:
Now, click “Save All” and “Quick Deploy”. Once deployed, you can already do a quick test by clicking on the “Run” button. Enter values for temperature and timestamp in the correct format shown below and hit “Run”. Watch out for the success message shown in the screenshot below:
Note that automation sequences running production connectors cannot be monitored with the “Monitor Production Processes” app. Once you have completed the next step of the configuration you can read the values from the global variable that you’ve written here.
Create automation sequence “Read Global Variable”
Click on Edit to create a new version of your production process design. Then, click on the “+” on the right edge of the canvas area to create a new production process. Make the selection as shown in the screenshot below:
Like in the previous configuration step draw an automation sequence that consists of a “Start” control, the built-in service “Read Global Variable”, and a “Stop” control connected by arrows. Then, select the built-in service and click “Add” in the right pane. In the popup window select the global variable “Sensor_Reading” as shown below:
You don’t need start parameters for this automation sequence but output parameters. Select the “Stop” control on the canvas and click “Manage Parameters” in the right pane. Create the two parameters shown in the screenshot below:
Click “Save” on the popup, then map the respective fields of the global variable to the output parameters as shown below:
Once done, click “Save All” on the top followed by “Quick Deploy”. Now, click “Run” to test the automation sequence by checking the temperature value and timestamp you have written into the global variable during the test of the previous configuration step:
Create production process “Log Data Collection Values” in the cloud
Create a new production process “Log Data Collection (DC) Values” in the cloud and make sure you have checked “Visible to Production Connector / Plant Connectivity Runtime” and you have also set the toggle to “Publish to Service Registry”.
Start by dragging the “Start” control into the canvas. Select it and click “Manage Parameters” on the right pane to create the start parameters as shown in the screenshot below:
Then, click on “POD Connection” in the right pane and select the POD Type. In our example we are using a customized POD of type “Order”. Map the respective start parameters as shown in the screenshot below:
Drag the production process “Read Global Variable” from the left pane into the canvas and place it next to the “Start” control. Then, drag the service “Log DC Values” into the canvas and place it next to the subprocess “Read Global Variable”. You find the service in the left pane under “DMC_Cloud > Data Collection”. Lastly, place the “Stop” control next to it in the canvas and connect the 4 objects with arrows:
Now, you need to configure the payload of the “Log DC Values” service. Select the service and work through the payload parameters listed in the right pane:
1. Click on the structure “group” and enter the name of your data collection group and its version:
In our example we want to automate the data collection “0_TEMPERATURE_READING” with the parameters “TEMPERATURE” and “TIMESTAMP”, which is assigned to the work center “MIXING”:
2. Click on the structure “operation” and map the start parameter “in_Phase” to the field “operation” and enter as version “ERP001”. Your settings may differ here depending on the details of your use case:
3. Continue with the structure array “parameterValues”. The popup shows only the fields needed for the first parameter “TEMPERATURE” of our data collection. For the field “name” enter the string “TEMPERATURE” and map the respective output parameter of the subprocess “ReadGlobalVariable” to the field “value”. Then click “Switch to Code” and select the code that belongs to the structure “TEMPERATURE”:
Copy and paste and click “Switch to Form”:
In the form view appear now the fields needed for the second parameter “TIMESTAMP” of the data collection. Enter as “name” the name of the parameter “TIMESTAMP” and as value map again the respective output parameter of the subprocess “ReadGlobalVariable” as shown below. You can ignore the warning “The variable is of the different data type” when you click save.
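After both parameters are configured, the “parameterValues” portion of the payload amounts to an array of name/value pairs; a sketch (the literal values stand in for the output parameters mapped from the “ReadGlobalVariable” subprocess at runtime):

```python
import json

# Sketch of the "parameterValues" array of the "Log DC Values" payload.
# At runtime the "value" fields are filled from the "Read Global Variable"
# subprocess; the literals here are for illustration only.
parameter_values = [
    {"name": "TEMPERATURE", "value": 23.4},
    {"name": "TIMESTAMP", "value": "2024-06-01T10:15:00Z"},
]
print(json.dumps(parameter_values, indent=2))
```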
4. Click on the next structure array “sfcs” in the right pane and map the start parameter “in_Charge” to it:
5. Map the remaining payload fields “plant”, “resource”, “stepid” and “workCenter” to the respective start parameters as shown in the screenshot below. Click “Save All” and “Quick Deploy”:
Note that it is easier to test the production process “Log DC Values” from the custom order POD once you have completed the next configuration step.
Customize POD for automated data collection
In our example we are using the custom-built order POD that was built from scratch with the POD Designer. It consists of the pages “Main Page”, “Activities”, “Goods Receipt”, “Work Instructions”, and “Quality Instruction Results”. In the screenshot below, we have selected the page “Activities”. This page is composed of the “Order Card” plugin on the top followed by the “Operation/Phase List” plugin. Lastly the page includes an icon tab bar with multiple plugins appearing as tabs. Among them is the “Data Collection” plugin. We have configured the properties of this plugin in the right pane to include a button “Data Collection” that appears under the icon tab bar in the central canvas.
Selecting this button in the canvas shows its configuration properties in the right pane. They include a button “Assign Actions”:
Clicking on it opens the “Configure Action Button” popup. In there, click “Add”. In the “Add Action” popup choose “Production Process” from the “Type” dropdown, select as service in our example “P_Sensor2DC_LogDCValues”, and click “Create”:
Click on the gear wheel icon in the “Configuration” column so that the configuration settings on the right become visible and map the respective POD variable to each listed start parameter of the selected production process as shown in the screenshot below and click “Close”:
Now, on the top bar of the POD Designer open the POD Notifications popup by clicking the icon between the information and the copy icon. Ensure that subscriptions to “Work Center” and “Resource” are checked, and the event “Data Collection List” is checked. This activates the progress bar in the data collection list plugin when clicking on the “Collect” button in the POD at runtime:
Close the popup and click “Save”. Now, you can test whether POD and production process work properly together. Click on the glasses icon on the top bar. In our example, this will bring up the main page of our custom order POD with the order selection list. We select our work center “MIXING” and a list of process orders matching the selected planned start data range shows up:
We select the last order on the list and move on to the “Activities” page:
Once we have started Phase ID “0020”, we click in the icon tab bar on “Data Collections” and then on the “Collect” button. While the buffered sensor readings are read from the global variable and reported back, the progress bar in the “Parameters” section moves from left to right:
Also check the production process in the “Monitor Production Processes” app:
Create MQTT subscription
For the creation of the MQTT subscription I’d like to refer to the section “Configuration of Component 2: MQTT Subscription” of my former blog mentioned above. You need to configure all 4 subcomponents mentioned there. Only the condition and the action of the MQTT subscription differ. As condition you simply choose “always true”, and as action you select “Production Process” as action type and select from the production process design “Sensor2DC” the production process “Write Global Variable”. Finally, map the MQTT payload fields “temperature” and “timestamp” to the respective start parameters of this production process as shown in the screenshot below:
As mentioned above, we are using the MQTTX client tool instead of a sensor to publish a temperature and timestamp on the topic “dmc/temperature” of our 3rd-party MQTT broker. This requires a respective connection to be set up in MQTTX.
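The message body we publish with MQTTX is a small JSON object; a sketch of building it (the topic “dmc/temperature” and the field names come from this blog, while the timestamp format is an assumption):

```python
import json
from datetime import datetime, timezone

# Build the JSON message body to publish on the topic "dmc/temperature".
# The ISO 8601 timestamp format is an assumption for illustration.
message = json.dumps({
    "temperature": 22.4,
    "timestamp": datetime(2024, 6, 1, 10, 15, tzinfo=timezone.utc).isoformat(),
})
print(message)
```

In MQTTX you would paste such a JSON body into the publish panel with “dmc/temperature” as the topic.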
Each time the MQTTX client publishes a new temperature reading to the topic dmc/temperature the automation sequence “Write Global Variable” is triggered by the MQTT subscription and updates the global variable “Sensor_Reading”. You can test your setup by clicking the “Collect” button in your POD as explained in the previous section.
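Putting the runtime flow together, here is a minimal end-to-end sketch in plain Python, standing in for the MQTT subscription, the global variable, and the POD button (all names are illustrative stand-ins for the DM artifacts configured in this blog):

```python
import json

# End-to-end sketch: MQTTX publishes -> subscription runs "Write Global
# Variable" -> the POD "Collect" button runs "Read Global Variable" and
# logs the DC values. Everything here is a simulation, not DM code.

global_variable = {}   # stands in for the plant-level "Sensor_Reading"
logged_dc_values = []  # stands in for the cloud-side "Log DC Values"

def on_mqtt_message(raw_payload):
    """MQTT subscription action: run "Write Global Variable"."""
    payload = json.loads(raw_payload)
    global_variable.update(temperature=payload["temperature"],
                           timestamp=payload["timestamp"])

def on_collect_button_clicked():
    """POD button action: read the global variable and log the DC values."""
    logged_dc_values.append([
        {"name": "TEMPERATURE", "value": global_variable["temperature"]},
        {"name": "TIMESTAMP", "value": global_variable["timestamp"]},
    ])

# Two readings arrive on dmc/temperature; only the latest stays buffered.
on_mqtt_message('{"temperature": 21.9, "timestamp": "2024-06-01T10:14:55Z"}')
on_mqtt_message('{"temperature": 22.4, "timestamp": "2024-06-01T10:15:00Z"}')

# The operator clicks "Collect" once; the latest buffered reading is logged.
on_collect_button_clicked()
print(logged_dc_values[0][0])  # the TEMPERATURE entry with value 22.4
```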