Technology Blog Posts by SAP
biswajit_biswas
Product and Topic Expert

In today’s data-driven organizations, turning enterprise data into shareable, consumable products is critical. With the introduction of SAP Business Data Cloud (BDC), SAP plans to expand in this area and help customers achieve it while reducing data footprints. This blog walks through a practical, end-to-end process for creating a Data Product from SAP BW and exposing it securely to Databricks using SAP Datasphere (DSP) and the BDC Cockpit.

In this article we expose SAP BW 7.5 artefacts as custom Data Products in SAP Business Data Cloud (BDC) and then share those Data Products with an SAP Databricks environment for consumption.

Prerequisites

The Data Product Generator for SAP Business Data Cloud is available for the releases SAP BW/4HANA 2023 SP0 or higher, SAP BW/4HANA 2021 SP4 or higher, and SAP BW on SAP NetWeaver 7.50 SP24 or higher. For the configuration of the Data Product Generator for SAP Business Data Cloud, see SAP Note 3590400.

 
The DW Administrator should create at least one of the following types of scoped roles and assign users to them:
  • DW Space Administrator: to manage users and do monitoring

  • DW Integrator: to monitor data integration in the space

  • DW Modeler: to share the tables to other spaces for consumption

You should have access to a Databricks environment.

Creating Data Products from SAP BW System

Step 1: Log on to your SAP BW 7.5 system

Start by connecting to the BW system using SAP Logon. The environment should be preconfigured to work with SAP BDC, and access credentials are typically shared by an administrator.

biswajit_biswas_0-1765124665002.png

Step 2: Create BW Subscription

To create a BW subscription to the SAP BDC object store:

 

  • Launch the transaction RSDPS.

  • Choose Create and select the relevant InfoProvider (InfoCubes, DSOs, ADSOs, etc.) as the source.

  • Select the Extraction Mode; the options available are "Full", "Delta" and "Streaming".
  • The Target is automatically set to "Datasphere Local Table".
  • Save the subscription with a clear description, as the description is a mandatory field.
 

 

biswajit_biswas_0-1765136971422.png
  • Activate it using Activate Subscription. Once the subscription is "Active", run it to generate a local table in DSP.
biswajit_biswas_1-1765137526813.png

Once active, this subscription creates the underlying data structure that DSP will use.

 

Step 3: Validate the Data Products (Tables) in SAP BDC Datasphere

Connect to SAP BDC Datasphere and open the Data Builder in the object store space to see the table newly created by the subscription; the table name should match the description entered in the subscription.

biswajit_biswas_2-1765138005363.png

Deploy the table and preview the data to ensure everything looks correct.

biswajit_biswas_3-1765138069200.png

The source from the BW system is now exposed as tables in the object store (HANA Data Lake) space in SAP BDC Datasphere; the next step is to share these tables with Databricks.

 

Step 4: Create a Data Provider Profile in SAP BDC Datasphere

Before creating Data Products, you must set up a Data Provider Profile:

In SAP BDC Datasphere navigate to Data Sharing Cockpit → My Data Provider Profile, and Create Data Provider Profile.

biswajit_biswas_5-1765139971147.png
Fill in the required fields, enable Formations, complete the namespace, provide an email address, and save the data provider profile.
 
biswajit_biswas_8-1765140438119.png

Step 5: Build Your Data Product

  • In SAP Datasphere select “Data Sharing Cockpit” → “My Data Products” and select the blue + sign to create a new “Data Product”.
  • Make sure you select the correct space as "Artifact Space" and complete the required fields.
biswajit_biswas_11-1765141075480.png

 

  • Scroll down the product creation page to Product Artefacts, choose Add an Artefact, and select the local table created in the previous step.
  • Save the Data Product.
  • Go to Switch Status and select “List” to make the Data Product available in the BDC Cockpit.
biswajit_biswas_12-1765141187325.png

 

Step 6: Share the Data Product with Databricks

To make your Data Product available to Databricks, follow the steps below:

  • Open the BDC Cockpit.

  • Search for your Data Product in Catalog & Marketplace.

 

biswajit_biswas_13-1765141404153.png
 
  • Select the Data Product and click on the share button as shown below.
    biswajit_biswas_14-1765141650791.png
  • Select the appropriate target in the list; here we select the DBX server associated with this environment. If it is not listed, click “Add Target” and select it from the list.

 

biswajit_biswas_15-1765141804420.png
  • Once the Data Product is successfully shared, you can open it in Databricks using the Open icon in the sharing panel (in red above).
  • The Databricks login page opens; use your email and “Continue with SSO”, as we are using the SAP Databricks landscape.
  • Databricks opens and displays the catalog objects created for the Data Product.
biswajit_biswas_16-1765141965773.png
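Once the catalog objects appear, the shared table behaves like any other catalog table in the workspace. A minimal sketch in a Databricks SQL cell, where the catalog name sap_bdc_sales_product and the table bw_sales_adso are hypothetical (your names will reflect the Data Product and artefact you created):

```sql
-- Browse what the share delivered (names below are illustrative)
SHOW SCHEMAS IN sap_bdc_sales_product;
SHOW TABLES IN sap_bdc_sales_product.default;

-- Query the shared table like any other catalog table
SELECT *
FROM sap_bdc_sales_product.default.bw_sales_adso
LIMIT 100;
```

No data is imported or copied into the workspace by these queries; they read the shared Data Product in place.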

The steps above create a secure, governed data-sharing channel with no manual file exports or insecure data copies, reducing the data footprint across landscapes.
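As a sketch of downstream use, the shared table can be combined directly with data already managed in the Databricks workspace without copying the BW data. All table and column names below are hypothetical:

```sql
-- Hypothetical names: join the shared BW table with a local dimension table
SELECT d.material_group,
       SUM(s.net_amount) AS revenue
FROM sap_bdc_sales_product.default.bw_sales_adso AS s
JOIN main.reference.material_dim AS d
  ON s.material = d.material_id
GROUP BY d.material_group;
```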

 

 

 

Summary

This process transforms raw BW data into a governed, shareable Data Product that business and engineering teams can immediately use in Databricks. With proper subscriptions, profiles, and sharing, you can create a seamless data pipeline from SAP to modern analytics platforms. 

This end-to-end flow demonstrates how organizations can:

  • Govern enterprise data
  • Package it as reusable products
  • Share it securely across platforms
  • Enable faster analytics and innovation

Instead of copying data between systems, this approach builds a single, trusted pipeline from SAP BW to SAP BDC to Databricks.
