We’ll begin with Data Federation, which, in this proof of concept, is demonstrated conceptually without an actual implementation. As mentioned earlier, this process involves federating waste data from smart bins and cost data from SAP S/4HANA Cloud into SAP Datasphere. This integration aims to enable comprehensive, informed analytics and forecasting: for Bob, the sustainability analyst, on the sustainability metrics of medical waste, and for John, the financial analyst, on its financial metrics.
If you are interested in learning more about IoT integrated with SAP BTP, please refer to our session on the Flexible Energy Grid to explore further insights into the IoT solution approach.
The waste cost data coming from SAP S/4HANA Cloud looks like this:
Machine learning can help here in two ways: it helps Bob, the sustainability analyst, keep track of waste sustainability KPIs such as waste volume in kilograms and the greenhouse gas emissions produced by the waste, and it helps John, the financial analyst, better allocate the budget for waste-processing costs in the coming months. Concretely, we will forecast the waste metrics for the next 12 months based on the historical data.
Next, we’ll move on to SAP Federated Machine Learning, learning how to build a medical waste forecasting model on Amazon SageMaker with the help of SAP FedML.
Before delving into the how-to, we recommend reading the blog post introducing SAP Federated Machine Learning (FedML) by sangyrak1, which covers the why and what of SAP FedML, as well as Federated Machine Learning using SAP Datasphere and Amazon SageMaker 2.0 by karishma_kapur.
Amazon SageMaker offers a broad selection of purpose-built tools that cover every step in ML development, encompassing data preparation, model building, training, deployment, and model management. However, our focus remains on the integration of SAP FedML with Amazon SageMaker.
Now let’s have a look at the process illustration of the waste metrics forecast using Federated Machine Learning on SAP Datasphere and Amazon SageMaker.
Next, we’ll go through the end-to-end process of building a medical waste forecast model with Amazon SageMaker using the SAP Federated Machine Learning library.
We'll start with some initial configurations, such as the creation of a technical data user in the designated space of SAP Datasphere. This user will be granted appropriate read-write privileges, enabling SAP FedML to retrieve and write data seamlessly from and to SAP Datasphere.
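For reference, the FedML `DbConnection` class typically reads the Datasphere connection details for this technical user from a `config.json` file. A sketch could look like the following; all values are placeholders, and the exact field names may vary between FedML library versions, so check the documentation of your installed release:

```json
{
  "address": "<your-datasphere-host>",
  "port": "443",
  "user": "<technical-user-name>",
  "password": "<technical-user-password>",
  "schema": "<your-space-schema>"
}
```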
In this step, let’s see how to retrieve the data from SAP Datasphere into Amazon SageMaker via SAP FedML to prepare for the training.
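As a minimal sketch of this step (the space name, view name, and return-type handling are assumptions for illustration, not taken verbatim from this PoC), fetching the federated data with FedML's `DbConnection` could look like this:

```python
# Sketch of reading federated waste data from SAP Datasphere via SAP FedML.
# WASTE_SPACE and V_WASTE_TRANSACTIONS are hypothetical names, and the exact
# return type of execute_query may differ between FedML library versions.
import pandas as pd

try:
    from fedml_aws import DbConnection  # pip install fedml-aws
except ImportError:
    DbConnection = None  # library not installed; the function below still shows the flow


def load_waste_transactions():
    """Fetch the exposed waste transaction view as a pandas DataFrame."""
    db = DbConnection()  # reads the technical user's credentials from config.json
    rows = db.execute_query('SELECT * FROM "WASTE_SPACE"."V_WASTE_TRANSACTIONS"')
    return pd.DataFrame(rows)
```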
The Waste Transaction data has been successfully loaded into Amazon SageMaker as a pandas dataframe. Given the goal of forecasting waste sustainability and financial metrics for the next 12 months, it becomes essential to preprocess the waste transaction data by aggregating the target metrics based on calendar months, facilities, and waste categories.
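The aggregation step can be sketched with plain pandas. The column names below are illustrative assumptions, not the exact schema of the federated view:

```python
import pandas as pd

# Illustrative waste transaction records standing in for the federated data.
tx = pd.DataFrame({
    "disposal_date": pd.to_datetime(
        ["2023-01-05", "2023-01-20", "2023-02-11", "2023-02-28"]),
    "facility": ["Hospital A", "Hospital A", "Hospital A", "Hospital B"],
    "waste_category": ["Sharps", "Sharps", "Sharps", "Pharmaceutical"],
    "weight_kg": [120.0, 80.0, 95.0, 40.0],
    "co2_kg": [30.0, 20.0, 25.0, 12.0],
    "cost_eur": [240.0, 160.0, 190.0, 100.0],
})

# Aggregate the target metrics per calendar month, facility, and waste category.
monthly = (
    tx.assign(month=tx["disposal_date"].dt.to_period("M"))
      .groupby(["month", "facility", "waste_category"], as_index=False)
      [["weight_kg", "co2_kg", "cost_eur"]]
      .sum()
)
print(monthly)
```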
Now the waste data has been aggregated by calendar month. Next, we’ll apply time series forecasting to it to predict the next 12 months.
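To make the shape of this step concrete, here is a deliberately small, self-contained forecasting sketch using only a linear trend. The actual PoC trains a proper time-series model on Amazon SageMaker; the data and function below are illustrative assumptions:

```python
import numpy as np
import pandas as pd


def forecast_next_12(history: pd.Series) -> pd.Series:
    """Toy forecast: fit a linear trend to the monthly history and
    extrapolate 12 months ahead. A real model (e.g. one trained on
    Amazon SageMaker) would also capture seasonality and uncertainty."""
    t = np.arange(len(history))
    slope, intercept = np.polyfit(t, history.to_numpy(), deg=1)
    future_t = np.arange(len(history), len(history) + 12)
    future_index = pd.period_range(history.index[-1] + 1, periods=12, freq="M")
    return pd.Series(intercept + slope * future_t, index=future_index)


# 24 months of synthetic monthly waste weights with a mild upward trend.
hist = pd.Series(
    100 + 2.0 * np.arange(24),
    index=pd.period_range("2022-01", periods=24, freq="M"),
)
fc = forecast_next_12(hist)
print(fc.head())
```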
The forecast is complete; next, let’s check it by plotting it.
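A quick visual check can be done with matplotlib; the synthetic history and forecast below are placeholders for the real waste metrics:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # render off-screen, e.g. inside a SageMaker notebook job
import matplotlib.pyplot as plt

# Synthetic actuals (24 months) and forecast (12 months) for illustration.
months = np.arange(36)
history = 100 + 2.0 * months[:24]
forecast = 100 + 2.0 * months[24:]

fig, ax = plt.subplots(figsize=(8, 4))
ax.plot(months[:24], history, label="actual waste weight (kg)")
ax.plot(months[24:], forecast, "--", label="12-month forecast")
ax.set_xlabel("month index")
ax.set_ylabel("waste weight (kg)")
ax.legend()
fig.savefig("waste_forecast.png")
```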
We are happy with the forecast results, so let’s write them back to SAP Datasphere.
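The write-back can be sketched with FedML as well. The table and column names here are hypothetical, and the exact method signatures may differ between FedML library versions:

```python
# Sketch of writing the forecast results back to the Open SQL schema in
# SAP Datasphere via SAP FedML. WASTE_METRICS_FORECAST and its columns
# are illustrative names, not the schema used in this PoC.
try:
    from fedml_aws import DbConnection  # pip install fedml-aws
except ImportError:
    DbConnection = None  # library not installed; the function below still shows the flow


def write_forecast(forecast_df):
    """Create the forecast table (once) in the Open SQL schema and load it."""
    db = DbConnection()  # reads the technical user's credentials from config.json
    db.create_table(
        "CREATE TABLE WASTE_METRICS_FORECAST "
        "(MONTH NVARCHAR(7), FACILITY NVARCHAR(100), "
        "WEIGHT_KG DOUBLE, CO2_KG DOUBLE, COST_EUR DOUBLE)"
    )
    db.insert_into_table("WASTE_METRICS_FORECAST", forecast_df)
```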
Now that the forecast results have been written back to the Open SQL schema of SAP Datasphere, let’s see how to use them for data modeling.
Well! Through the previous sections, we have seen the vital role of SAP Datasphere in data ingestion and federation across multiple data sources, and we have explored how to leverage Datasphere, the FedML library, and a hyperscaler AI platform to source medical waste metrics data and to build, train, and deploy ML models for forecasting.
Below is our solution architecture again; here we will focus particularly on data modeling in Datasphere:
Datasphere offers multiple modeling capabilities addressing different personas, from business analysts with deep business understanding to tech-savvy developers and power users, providing powerful no-code/low-code built-in graphical editors for modeling.
In our use case, we will leverage the Datasphere Data Builder since it is the central place for data modeling, where you can find various editors to create artifacts in the data layer.
In addition, we created a fact table to represent the attributes of the transactional table (waste disposal orders) along with its relevant IoT-federated waste measurements such as CO2 emissions, weight, location, and incurred cost. On top of the fact table, we built a fact view to project the waste transactions and their sustainability and cost metrics, and then exposed it for consumption.
We also created another fact view to project the waste metrics forecasts for weight, CO2 emissions, and cost. This view is based on the Open SQL schema that was created in our Datasphere space and, as mentioned earlier, populated with the waste metrics forecast data for the upcoming 12 months, produced by inferencing the waste forecasting ML model (which was trained on the historical data of the last 5 years).
In our use case scenario, consumption happens through an SAP Build app as well as SAP Analytics Cloud!
This is where the SAP Analytics Cloud stories will be built on top of our analytic models, covering both the actual medical waste metrics and the forecast data.
In our use case, we created a live data connection from SAP Analytics Cloud to SAP Datasphere in order to retrieve the waste-related metrics data live; we will analyze it in a demo video later on.
It’s time to pack all of those nice features together in the business user app that we designed with SAP Build Apps and the Cloud Application Programming Model. This is the scope of the implementation for the app:
Ok, so let’s get started with the first leg, that is, binding the Datasphere view containing the smart bins released for disposal to the app itself.
Ok, so the transactional part is covered. Let’s move on to the analytical part, that is, how do we embed the cool analytical stories into our Build app?
With the SAP Analytics Cloud, embedded edition, you can build and embed reports, dashboards, and visuals into your business application to make confident decisions. You can explore your business data via live connection between your SAP Analytics Cloud tenant and the remote SAP HANA database on SAP BTP.
It is available as a service on SAP BTP under CPEA. This variant is meant for application developers who embed SAC and make analytics available to end users within the context of the business application’s UI. From a features point of view, it offers only the BI capabilities (no planning, predictive, or analytics designer capabilities are available with this variant).
On the other side, the SAC Enterprise Edition is the complete analytics license, supporting 360-degree analytics with the full set of capabilities: BI, planning, predictive, and analytics designer, with both live and import data connections.