In today’s data-driven organizations, turning enterprise data into shareable, consumable products is critical. With the introduction of SAP Business Data Cloud (BDC), SAP is expanding into this area and helping customers achieve it while reducing their data footprint. This blog walks through a practical, end-to-end process for creating a Data Product from SAP BW and exposing it securely to Databricks using SAP Datasphere (DSP) and the BDC Cockpit.
In this article we expose SAP BW 7.5 artefacts as custom Data Products in SAP Business Data Cloud (BDC) and then expose those Data Products to an SAP Databricks environment for consumption.
The Data Product Generator for SAP Business Data Cloud is available for SAP BW/4HANA 2023 SP0 or higher, SAP BW/4HANA 2021 SP4 or higher, and SAP BW on SAP NetWeaver 7.50 SP24 or higher. For the configuration of the Data Product Generator for SAP Business Data Cloud, see SAP Note 3590400.
In SAP BDC Datasphere, you need the following roles:
DW Space Administrator: to manage users and monitor the space
DW Integrator: to monitor data integration in the space
DW Modeler: to share the tables to other spaces for consumption
You should also have access to a Databricks environment.
Step 1: Log on to your SAP BW 7.5 system
Start by connecting to the BW system using SAP Logon. The environment should be preconfigured to work with SAP BDC, and access credentials are typically shared by an administrator.
Step 2: Create BW Subscription
To create a BW subscription to the SAP BDC object store:
Launch the transaction RSDPS.
Choose Create and select the relevant InfoProvider (InfoCube, DSO, ADSO, etc.) as the source.
Once active, this subscription creates the underlying data structure that DSP will use.
Step 3: Validate the Data Products (Tables) in SAP BDC Datasphere
Connect to SAP BDC Datasphere and open the Data Builder in the object store space to see the table newly created by the subscription. The table name should match the “description” entered in the subscription.
Deploy the table and preview the data to ensure everything looks correct.
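If you prefer a scripted sanity check over the Data Builder preview, a small Python snippet using the SAP HANA client (hdbcli) can confirm that the replicated table contains data. This is a minimal sketch, assuming you have a Datasphere database user with read access to the object store space; the host, credentials, schema, and table name below are placeholders.

```python
# Optional sanity check of a replicated table via the SAP HANA Python client.
# Assumption: a Datasphere database user with read access exists; the host,
# credentials, schema, and table name are placeholders for your environment.
from hdbcli import dbapi

conn = dbapi.connect(
    address="<your-datasphere-host>",  # SQL endpoint of your Datasphere tenant
    port=443,
    user="<DB_USER>",
    password="<DB_PASSWORD>",
    encrypt=True,
)

try:
    cur = conn.cursor()
    # Count rows in the table created by the BW subscription (placeholder names).
    cur.execute('SELECT COUNT(*) FROM "<OBJECT_STORE_SPACE>"."<SUBSCRIPTION_TABLE>"')
    row_count = cur.fetchone()[0]
    print(f"Replicated table contains {row_count} rows")
finally:
    conn.close()
```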
The source from the BW system is now exposed as tables in the object store (HANA Data Lake) space in SAP BDC Datasphere. The next step is to share these tables with Databricks.
Step 4: Create a Data Provider Profile in SAP BDC Datasphere
Before creating Data Products, you must set up a Data Provider Profile:
In SAP BDC Datasphere, navigate to Data Sharing Cockpit → My Data Provider Profile, and choose Create Data Provider Profile.
Step 5: Build Your Data Product
Step 6: Share the Data Product with Databricks
To make your Data Product available to Databricks, follow the steps below; a sample consumption query in Databricks is shown afterwards.
Open the BDC Cockpit.
Search for your Data Product in Catalog & Marketplace.
The above steps create a secure, governed data-sharing channel with no manual file exports or insecure data copies, reducing the data footprint across the landscape.
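Once the Data Product has been shared and activated in your Databricks workspace, the shared tables can be queried directly from a notebook. The snippet below is a minimal sketch; the catalog, schema, table, and column names are hypothetical and depend on how the share is surfaced in Unity Catalog in your environment.

```python
# Query a shared Data Product from a Databricks notebook (PySpark).
# Assumption: the share from SAP BDC appears in Unity Catalog under a catalog
# named "sap_bdc_share"; catalog/schema/table/column names are placeholders.
# The `spark` session object is provided automatically in Databricks notebooks.

# Explore what the share exposes.
spark.sql("SHOW SCHEMAS IN sap_bdc_share").show()
spark.sql("SHOW TABLES IN sap_bdc_share.sales").show()

# Read the shared table into a DataFrame and run a quick aggregation.
df = spark.table("sap_bdc_share.sales.bw_sales_orders")
df.groupBy("CALMONTH").count().orderBy("CALMONTH").show()
```

Because the tables are shared rather than exported, no additional copies of the data are created on the Databricks side, and queries reflect the current state of the Data Product.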
Summary
This process transforms raw BW data into a governed, shareable Data Product that business and engineering teams can immediately use in Databricks. With proper subscriptions, profiles, and sharing, you can create a seamless data pipeline from SAP to modern analytics platforms.
This end-to-end flow demonstrates how organizations can:
✅ Govern enterprise data
✅ Package it as reusable products
✅ Share it securely across platforms
✅ Enable faster analytics and innovation
Instead of copying data between systems, this approach builds a single, trusted pipeline from SAP BW to SAP BDC to Databricks.