Enterprise Resource Planning Blogs by Members
In this series I would like to take you on a journey: starting with the marketing term "Intelligent Enterprise" and ending with a concrete technical architecture, plus a recipe for implementing it at low cost.

Problem definition

How does a manager get the status of his manufacturing plant? He will likely open a report with the daily production figures. This is far better than in the past, when he got the numbers in printed form once a month, but it is still far from perfect. The main issue is the time-to-react.

One aspect is the latency of the information. If the data is a day old, an entire business day might be lost before the problem is even known. S/4HANA solves this part: because its technical foundation allows operational reports directly on the ERP data, and the ERP data is current, the manager always sees the current situation. Nice.

The other aspect is the triggering event. What happens in reality? Does the manager refresh the report every minute? Of course not. Very likely he will open the report in the morning, see that all is green, and then work on his many other tasks. As a consequence, an entire business day might be lost before taking action, before even knowing about the problem.

What is the root problem here? We could call it: push vs. pull of the information.

In the setup above, the manager triggers the action "open report" and pulls the information out. What is needed instead is a push of the information: the manager should be notified as soon as there is a problem.
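The difference can be sketched in a few lines of Python. All class and method names here are made up purely to illustrate the principle:

```python
class PlantStatus:
    """Toy stand-in for the plant report, illustrating pull vs. push."""

    def __init__(self):
        self.status = "green"
        self._subscribers = []

    def pull(self):
        # Pull: the manager must actively ask, so a problem stays
        # unnoticed until the next time he happens to look.
        return self.status

    def subscribe(self, callback):
        # Push: interested parties register once ...
        self._subscribers.append(callback)

    def set_status(self, new_status):
        # ... and are notified immediately on every change.
        self.status = new_status
        for notify in self._subscribers:
            notify(new_status)


alerts = []
plant = PlantStatus()
plant.subscribe(lambda s: alerts.append(f"plant status changed to {s}"))

morning_view = plant.pull()   # the one time the manager looks: "green"
plant.set_status("red")       # hours later a problem occurs ...
print(alerts)                 # ... and the subscriber already knows, without polling
```

The pull side is not wrong, it is just too slow on its own; the push side closes the gap between the change and the reaction.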

We have the same problem in communication between employees. Do we call our subordinates every five minutes asking "Is the status still green?"? Of course not. We rather create a company culture where problems are communicated proactively. Why not do the same in our IT processes?

Information Push

The unique selling point of the SAP ERP is the tight integration between its modules. A sales order is entered in the Sales & Distribution module; this causes a material to be reserved for that sales order in the Materials Management module, and in parallel a production order is triggered. Everything is one large workflow in which one change causes other actions to happen. In that sense, the proactive information flow already exists within the ERP system.

The process breaks apart, however, as soon as external systems are involved. No wonder there are so many customer complaints about missing data integration between the ERP system and SAP's own cloud offerings today. And it gets worse when integrating non-SAP systems.

Yes, such integrations can be built, and are built, but they are complex and expensive.

Now imagine the S/4HANA system broadcast every kind of change. Always. Nothing to configure or build; every change is pushed immediately. Changes flow through an array of pipes, and everybody can tap into the stream at their own discretion.
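As a minimal in-process sketch of this idea (in reality a durable, distributed log such as Apache Kafka would play this role; all names below are invented):

```python
from collections import defaultdict


class ChangeBus:
    """Minimal in-process sketch of 'broadcast every change'."""

    def __init__(self):
        self._taps = defaultdict(list)

    def tap(self, topic, handler):
        # Anyone can tap into a stream at their own discretion ...
        self._taps[topic].append(handler)

    def publish(self, topic, change):
        # ... while the source simply broadcasts unconditionally,
        # with zero per-consumer configuration or custom build work.
        for handler in self._taps[topic]:
            handler(change)


bus = ChangeBus()
seen_by_alerting, seen_by_warehouse = [], []
bus.tap("sales-orders", seen_by_alerting.append)
bus.tap("sales-orders", seen_by_warehouse.append)

# one change in the source system reaches every interested consumer
bus.publish("sales-orders", {"order": 4711, "material": "M-01", "qty": 10})
```

The crucial property is that the publisher does not know or care who listens; adding a new consumer costs the source system nothing.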

What would be the impact of that?


Business value

Is such a solution needed today? Partly. But boy, the things it would enable if built properly! Ask yourself, for your own job: "What could I do if all systems reported their changes proactively?"

Some examples:

Alerting Services

In the manager's use case above, he would get an alert telling him there is a problem, plus a link to the report. He might even have a dedicated screen visualizing the plant status in realtime, like in a mission control center. A single timely reaction enabled by this solution might be worth a lot.
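Such an alerting rule can be tiny once the change events are available. The field name, threshold and report link below are all made up for illustration:

```python
REPORT_URL = "https://example.com/plant-report"  # placeholder link


def alert_on_change(change, threshold=1000):
    """Hypothetical alerting rule: fire when daily output drops below
    a threshold, and include the link to the full report."""
    produced = change.get("produced_units", 0)
    if produced < threshold:
        return f"ALERT: only {produced} units produced (< {threshold}), see {REPORT_URL}"
    return None  # all green, stay silent


print(alert_on_change({"produced_units": 1200}))  # no alert
print(alert_on_change({"produced_units": 800}))   # alert with link
```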


Business Rule Service

Coming to the Intelligent Enterprise, additional small services could be attached to these change events. The manager above only has time to look at the grand scheme of things. He might see that a satisfactory number of parts is being produced, but miss that the build quality is subpar because one robot is misaligned. Adding a person to monitor just that aspect would be expensive. But since all changes are now available, a small piece of code could be added that listens to the interesting data and signals as soon as there is a problem. This type of service could be categorized as a business rule service: well-defined if-then conditions.
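Sticking with the misaligned-robot example, such a rule might be no more than a few lines listening on the change stream. The event fields and the tolerance are invented for illustration:

```python
def robot_alignment_rule(event):
    """Well-defined if-then condition attached to a change event.
    Field names and the 0.5 mm tolerance are hypothetical."""
    if event.get("station") == "robot-7" and event.get("deviation_mm", 0.0) > 0.5:
        return "signal: robot-7 misaligned, build quality at risk"
    return None


# fed with every change event, the rule stays silent until it matters
print(robot_alignment_rule({"station": "robot-3", "deviation_mm": 0.9}))  # None
print(robot_alignment_rule({"station": "robot-7", "deviation_mm": 0.7}))  # signal
```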


Machine Learning Services

Each rule usually monitors just one or two variables. But what is a "good outcome"? A slightly lower manufacturing speed with perfect output quality? If the shipping backlog is high, the balance might tend towards superb quality combined with significantly higher speed instead. If cost is important, a slower manufacturing takt, and hence low wear on the tools, is the desired optimum. Higher-order mathematical methods, currently all summarized under the term Artificial Intelligence, enable exactly that. Such machine learning services can tap into the various data streams and monitor the complex, interdependent variables to summarize the current situation.
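A trained model is beyond the scope of a blog snippet, but the idea of weighing interdependent variables into one summary can be caricatured like this. All weights and thresholds are invented; a real service would use a learned model instead of hand-set numbers:

```python
def summarize_situation(speed, quality, backlog):
    """Toy stand-in for a machine learning service: combine several
    interdependent variables into one assessment. speed and quality
    are normalized to 0..1, backlog is the number of open shipments."""
    w_speed, w_quality = 0.3, 0.5
    if backlog > 100:
        # a high shipping backlog shifts the balance towards speed
        w_speed, w_quality = w_quality, w_speed
    score = w_speed * speed + w_quality * quality - 0.2 * min(backlog / 100, 1.0)
    return "ok" if score >= 0.6 else "attention"


print(summarize_situation(speed=0.9, quality=0.95, backlog=10))   # balanced day
print(summarize_situation(speed=0.6, quality=0.95, backlog=200))  # backlog piling up
```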


Data Integration Services

Connecting other systems would be trivial in such a world. All change events the other system is interested in are simply loaded into it. The only thing left to do is transform the data from one format into the other.
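For example, mapping a sales order change event onto a receiving system's schema might look like this. VBELN, MATNR and KWMENG are the classic SAP sales order fields (document number, material, ordered quantity); the target schema is invented:

```python
def to_target_schema(change):
    """Transform one change event from SAP's field names into the
    (hypothetical) schema of the receiving system - the only piece
    of integration work left in this architecture."""
    return {
        "orderId": change["VBELN"],           # sales document number
        "materialNumber": change["MATNR"],    # material number
        "quantity": float(change["KWMENG"]),  # ordered quantity
    }


event = {"VBELN": "0000004711", "MATNR": "M-01", "KWMENG": "10.000"}
print(to_target_schema(event))
```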



Nobody knows what the future holds. But a realtime message bus for all change events can be a backbone for future developments.


What is new?

These ideas are nothing new. In fact, they have been tried multiple times already: Enterprise Data Model, Enterprise Information Integration, Service-Oriented Architecture, Enterprise Data Integration, Enterprise Service Bus, Pub/Sub, API-driven architecture, ... Do any of these ring a bell?

All of these initiatives failed to deliver their full value, mostly for technical reasons. Looking at it from today, it is obvious why: contradictions all over the place.

  1. Data needs to flow through in parallel for performance reasons but sequentially for consistency reasons.

  2. Data needs to be well structured but flexible.

  3. Data needs to be easily accessible but the volumes are huge.

  4. Data needs to be available to many services, but the TCO should be low.

  5. Data needs to be offered but not forced.

  6. Data needs to support security and privacy regulations without adding burdens.

  7. Data needs to be processed where it resides, yet the source system is strained for resources already.

The main thing that has changed is the technology invented for the Big Data space. It has little to do with the challenges above, yet it provides all the ingredients to build the perfect solution.

The key component, Apache Kafka, is what we will look at next: what it is, how it is different, and how it resolves the contradictions above.