Data and Analytics Blog Posts

In this blog, we'll demonstrate how to create a custom data product from a replication flow that can be delta-shared with SAP Databricks or BDC Connect.

There are a few steps to follow:

1- Create a SAP HANA Data Lake Files (HDLF) Space in Datasphere

2- Create a replication Flow from your source (in this blog we'll use SAP S/4HANA) to the created HDLF Space.

3- Create Data Provider Profile in Datasphere

4- Create a Data product

5- List the Data product

6- Share the Data product from the BDC cockpit

7- Check the Data product in SAP Databricks

1- Create a SAP HANA Data Lake Files (HDLF) Space in Datasphere

From the space management, you need to create a new space with SAP HANA Data Lake Files (HDLF) storage.

2.png

This step can be skipped if you already have an HDLF space.

Once the space is created, you need to establish a connection to the source system you need (one that supports replication flows).

2- Create a replication Flow from your source (in this blog we'll use SAP S/4HANA) to the created HDLF Space.

Once the space and the right connection are in place, we can start creating our replication flow.

For this scenario, I will use the CDS view I_CUSTOMER:

3.png

As you can see, I enabled Initial and Delta, so my data product will be refreshed as soon as a delta is loaded (it may take some time for the refresh to be reflected in the data product).

As a result of this replication flow, I will get a local table:

 

4.png

This table will be the source of our Data Product.

3- Create Data Provider Profile in Datasphere

In this step, we need to create our Data Provider Profile.

Data Sharing Cockpit -> My Data Provider Profile -> Create Data Provider Profile

5.png

You can follow this help page to create your Data Provider Profile: https://help.sap.com/docs/SAP_DATASPHERE/e4059f908d16406492956e5dbcf142dc/4d298f8654fe4a6c9b6a4399a9...

In order to be able to share the data product within BDC, you MUST set the "Data Provider / Data Product Visibility" to Formations.

 

6.png

Now, you will be able to create Data Products.

4- Create a Data product

In this step, we'll use our Data Provider Profile to create the Data Product.

Data Sharing Cockpit -> My Data Products -> Create Data Product

7.png

 

This link is helpful to create the Data Product: https://help.sap.com/docs/SAP_DATASPHERE/e4059f908d16406492956e5dbcf142dc/bbcbf42b0cb541529e63628d95...

 

Here are some of the important fields that you need to fill:

  1. Artifact Space: the HANA Data Lake Files space created in the first step and used for the replication.

    8.png
  2. Data category: you need to choose a category for your data product. In this example, I chose "Company Data".

    9.png

  3. Product Artifacts: this is where we add our local table to the data product.

10.png

 

You can add multiple tables to your data product.

You will see the artifacts added to your data product.

13.png

Finally, save the changes to your data product.

 

5- List the Data product

After saving the Data Product, you will be able to see it like this:

 

14.png

In order to share the Data Product within BDC, you need to list it:

  • Switch Status -> List

15.png

The status will change to:

16.png

The Data Product will be available in the BDC cockpit after the next synchronisation job is done.

6- Share the Data product from the BDC cockpit.

Now we will switch to the BDC cockpit.

From the menu, we navigate to "Catalog & Marketplace" -> Search.

 

And we can search for our Data Product "CustomerDataP".

18.png

From here, we can share our Data Product and see its lineage.

19.png

- Share -> Add Target:

 

- Choose your target (SAP Databricks or BDC Connect). In this scenario, we'll choose SAP Databricks:

Then we add a Share Name: "customerdp", specify the Workspace in SAP Databricks, and click Share.

Once shared, you will see a notification telling you so.

7- Check the Data Product in SAP Databricks

In this step we'll check our shared data product in SAP Databricks.

From the previous screen, once the share is completed, we can open our data product in SAP Databricks:

22.png

In SAP Databricks' Catalog, we can see our shared data product.

23.png
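
If you prefer a notebook over the Catalog UI, you can also check what arrived with the share using a bit of PySpark. This is only a minimal sketch: the catalog name "customerdp" and the schema name "customerdatap" are assumptions based on the names used in this example, so replace them with whatever you see in your own Catalog Explorer.

    # In a Databricks notebook, `spark` (the SparkSession) is already defined.
    # The catalog and schema names here are assumptions for this walkthrough --
    # use the names shown in your own Catalog Explorer instead.
    spark.sql("SHOW SCHEMAS IN customerdp").show(truncate=False)
    spark.sql("SHOW TABLES IN customerdp.customerdatap").show(truncate=False)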

From here, we can visualize sample data:

 

24.png
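
You can do the same thing programmatically. Below is a minimal PySpark sketch that reads the shared customer table and runs a first aggregation; the three-level table name and the "Country" column are assumptions (they depend on how the share is mounted and on the fields exposed by I_CUSTOMER), so adjust them to what you actually see in the catalog.

    from pyspark.sql import functions as F

    # The three-level name (catalog.schema.table) is an assumption for this
    # walkthrough -- replace it with the name shown in your Catalog Explorer.
    customers = spark.table("customerdp.customerdatap.i_customer")

    # Peek at the structure and a few rows
    customers.printSchema()
    display(customers.limit(10))

    # Example: customers per country (assumes the shared I_CUSTOMER view
    # exposes a Country column -- adjust to your actual column names)
    display(
        customers.groupBy("Country")
                 .agg(F.count("*").alias("customer_count"))
                 .orderBy(F.desc("customer_count"))
    )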

From here, it is up to your creativity in Databricks 😉

 

Summary

In this blog, we saw how to create custom Data Products in SAP Business Data Cloud. We started by creating a replication flow in Datasphere to an SAP HANA Data Lake Files space, then created a Data Provider Profile that allowed us to create a Data Product from the local table (the result of the replication flow). We then listed the Data Product so it could be found in the BDC cockpit catalog, and finally shared it with SAP Databricks.

 

I hope this will help you create your custom data products from Datasphere.

Kind regards,

 
