In this blog post, we'll demonstrate how to create a custom data product from a replication flow that can be Delta Shared with SAP Databricks or via BDC Connect.
There are a few steps to follow:
1- Create a SAP HANA Data Lake Files (HDLF) Space in Datasphere
2- Create a replication Flow from your source (in this blog we'll use SAP S/4HANA) to the created HDLF Space.
3- Create Data Provider Profile in Datasphere
4- Create a Data Product
5- List the Data Product
6- Share the Data product from the BDC cockpit
7- Check the shared Data Product in SAP Databricks
From Space Management, you need to create a new HDLF space.
This step can be skipped if you already have an HDLF space.
Once the space is created, you need to establish a connection to the source system you need (one that supports replication flows).
Once the space and the right connection are in place, we can start creating our replication flow.
For this scenario, I will use the CDS view I_CUSTOMER:
As you can see, I enabled Initial and Delta, so my data product will be refreshed as soon as a delta is loaded (it may take some time for the refresh to be reflected in the data product).
As a result of this replication flow, I will get a local table:
This table will be the source of our Data Product.
In this step, we need to create our Data Provider Profile.
Data Sharing Cockpit -> My Data Provider Profile -> Create Data Provider Profile
You can follow this help page to create your Data Provider Profile: https://help.sap.com/docs/SAP_DATASPHERE/e4059f908d16406492956e5dbcf142dc/4d298f8654fe4a6c9b6a4399a9...
In order to be able to share the data product within BDC, you MUST set the "Data Provider / Data Product Visibility" to Formations.
Now, you will be able to create Data Products.
In this step, we'll use our Data Provider Profile to create the Data Product.
Data Sharing Cockpit -> My Data Products -> Create Data Product
This link is helpful to create the Data Product: https://help.sap.com/docs/SAP_DATASPHERE/e4059f908d16406492956e5dbcf142dc/bbcbf42b0cb541529e63628d95...
Here are some of the important fields that you need to fill:
You have the possibility to add multiple tables to your data product.
You will see the artifacts added to your Data Product (DP).
Finally, you need to save the changes to your data product.
After saving the Data product, you will be able to see your DP like this:
In order to share the DP within BDC, you need to list it:
The status will change to:
The Data Product will be available in the BDC cockpit after the next synchronisation job completes.
Now we will switch to the BDC cockpit.
From the menu, we navigate to "Catalog & Marketplace" -> Search.
And we can search for our Data Product "CustomerDataP".
From here we can share our Data Product and see its lineage.
- Share -> Add Target :
- Choose your target (SAP BDC or BDC Connect). In this scenario, we'll choose SAP Databricks:
Then we add a Share Name: "customerdp", specify the Workspace in SAP Databricks, and click Share.
Once shared, you will see a notification telling you so.
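As an aside, if the consumer reads the share through BDC Connect or any client of the open Delta Sharing protocol (rather than the native SAP Databricks integration shown here), the shared table can also be loaded with the open-source `delta-sharing` Python client. This is only a sketch: the profile file path, share, schema, and table names below are placeholders, not values from this scenario.

```python
# Sketch: reading a Delta-Shared table with the open-source
# `delta-sharing` Python client (pip install delta-sharing).
# All names (profile path, share/schema/table) are placeholders.

def table_url(profile_path: str, share: str, schema: str, table: str) -> str:
    """Build the table URL the delta-sharing client expects:
    <profile-file>#<share>.<schema>.<table>"""
    return f"{profile_path}#{share}.{schema}.{table}"

def load_sample(profile_path: str):
    # Imported lazily so the sketch runs even without the package installed.
    import delta_sharing
    url = table_url(profile_path, "customerdp", "default", "I_CUSTOMER")
    # Load the shared table into a pandas DataFrame.
    return delta_sharing.load_as_pandas(url)

print(table_url("config.share", "customerdp", "default", "I_CUSTOMER"))
# config.share#customerdp.default.I_CUSTOMER
```

The profile file (`config.share` here) is the JSON credentials file that the data provider hands to the recipient when the share is created.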
In this step we'll check our shared data product in SAP Databricks.
From the previous screen, once the share is completed, we can open our data product in SAP Databricks:
In SAP Databricks' Catalog, we can see our shared data product
From here we can visualize some sample data:
From here, it is up to your creativity in Databricks 😉
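For example, a first query in a SAP Databricks notebook could look like the sketch below. Note the assumptions: `spark` is the SparkSession that Databricks predefines in notebooks, and the catalog, schema, and table names are placeholders derived from the share name used above, not values confirmed by this scenario.

```python
# Sketch: querying the shared data product from a SAP Databricks notebook.
# `spark` is the SparkSession Databricks provides in notebooks;
# catalog/schema/table names below are placeholders.

def sample_query(catalog: str, schema: str, table: str, limit: int = 10) -> str:
    """Build a simple preview query against the shared table,
    using Unity Catalog's three-level <catalog>.<schema>.<table> naming."""
    return f"SELECT * FROM {catalog}.{schema}.{table} LIMIT {limit}"

sql = sample_query("customerdp", "default", "I_CUSTOMER")
print(sql)  # SELECT * FROM customerdp.default.I_CUSTOMER LIMIT 10

# In the notebook, run it and display the result:
# display(spark.sql(sql))
```

Any Spark SQL on top of this (joins with local tables, aggregations, dashboards) works the same way once the shared catalog is visible in the workspace.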
In this blog, we saw how to create custom Data Products in SAP Business Data Cloud. We started by creating a replication flow in Datasphere to an SAP HANA Data Lake Files space, then created a Data Provider Profile that allowed us to create a Data Product from the local table (the result of the replication flow). We then listed the data product so it appears in the BDC cockpit catalog, and finally shared it with SAP Databricks.
I hope this helps you create your custom data products from Datasphere.
Kind regards,