In March 2023, at the SAP Data Unleashed event, SAP announced the evolution of SAP Data Warehouse Cloud into SAP Datasphere, which expands the scope beyond data warehousing into the entirety of data management as a true business data fabric, especially for data integration, data governance, and self-service use cases.


With these expanded technical capabilities, how users and their data are organized within the stack becomes even more important on the way to a data-driven intelligent enterprise. Within SAP Datasphere, spaces remain the central architectural paradigm, representing both the organizational structure (Finance space, Marketing space, Country ABC space) and the data value chain (Integration space, Harmonization space, etc.). With growing adoption, the need for collaboration mechanisms and change management increases significantly, as emphasized by the growing importance of data mesh principles.


As the picture above shows, the data marketplace marks a central component in this setup. It started with the external data integration use case, which now offers more than 3,000 data products from over 100 data providers on a public data marketplace, comprehensively described in this successor blog.

The same technology will now serve as the foundation for two additional use cases, which are the focus of this blog:

  • Public & private data sharing with a company's ecosystem, whether it runs on SAP Datasphere or on third-party technology stacks, as planned for Catena-X

  • Internal data sharing within the same tenant or in a decentralized multi-tenant landscape.




This becomes possible with the introduction of Context Management, which allows data product owners to define the visibility of a data product by assigning it to one or several contexts. A context can be the public one owned by SAP, or any user can create a private context to which the users of a specific tenant are invited, or to which selected users are invited via a license key. As a consequence, a consuming user always gets the full search result across all contexts and can pick between internal and external data products based on their needs.
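
To make this visibility model a bit more tangible, here is a minimal Python sketch of how a search across contexts could be reasoned about. All class names, fields, and keys are illustrative assumptions for this blog post and do not reflect the actual SAP Datasphere implementation or API.

```python
from dataclasses import dataclass, field

# Illustrative model only: names and structure are assumptions for this blog,
# not the actual SAP Datasphere implementation.

@dataclass(frozen=True)
class Context:
    name: str
    owner: str                       # e.g. "SAP" for the public context, or a tenant
    is_public: bool = False
    license_key: str | None = None   # private contexts may hand out a license key

@dataclass
class DataProduct:
    name: str
    provider: str
    contexts: list = field(default_factory=list)  # one or several contexts

def visible_products(products, user_contexts, license_keys=()):
    """A consumer sees products from the public context, from contexts their
    tenant belongs to, and from contexts unlocked via a license key."""
    visible = []
    for product in products:
        for ctx in product.contexts:
            if ctx.is_public or ctx in user_contexts or (
                ctx.license_key and ctx.license_key in license_keys
            ):
                visible.append(product)
                break
    return visible

# Example: one public SAP-owned context and one private corporate context.
public = Context("SAP Public", owner="SAP", is_public=True)
corp = Context("ACME Corporate", owner="ACME IT", license_key="ACME-KEY-123")

catalog = [
    DataProduct("Weather Data", "External Provider", [public]),
    DataProduct("Org Structure", "ACME IT", [corp]),
]

# The search result always spans all contexts the consumer can see.
hits = visible_products(catalog, user_contexts=[], license_keys=["ACME-KEY-123"])
print([p.name for p in hits])   # ['Weather Data', 'Org Structure']
```

The point of the sketch: the data product owner only assigns contexts, and the consumer's search result is simply the union of everything reachable through the public context, the contexts of their tenant, and any license keys they hold.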


For the internal data marketplace use case, this opens significant new opportunities for architecting your data analytics stack. While it previously made sense to keep everything in one single (productive) tenant, you can now run in a more decentralized fashion: one IT-owned tenant acts as a stable reporting backend and publishes its assets on the data marketplace, where LoB-owned tenants consume them and enhance them with the local data and perspectives they need. This can be an interesting option for organizations with more decentralized structures and decision processes, making them more agile while keeping the data connected.

To get started with an internal data marketplace, we would like to share five easy first steps for a quick start into the data democracy journey with SAP Datasphere (see the sketch after this list):

  1. Start with one corporate-wide context with non-sensitive data that can be accessed without a license key to facilitate a low-barrier consumption experience, e.g. company locations, org structure, product hierarchy.

  2. Create data products for (at least) the most important dashboards to deliver both analytics and data to internal customers and foster data democratization. These are proven use cases that other employees may want to extend for their own purposes.

  3. Promote public / open data products that are free of charge, or consider company-wide licenses for commercial data products.

  4. Establish an MS Teams / SharePoint site where processes are described and LoB users can get in touch and raise requests for the data products they need.

  5. Create a data provider profile for each LoB space that is created, and share the access key(s) with the space admin(s), so that any space can share its artifacts within the enterprise instead of building silos.
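
To keep track of such a rollout, a simple script can check a planned setup against steps 1 and 5. Everything below (space names, profile fields, and the checks themselves) is a hypothetical planning aid for this blog, not SAP Datasphere functionality.

```python
# Illustrative rollout check: all names and fields are assumptions for this blog,
# not SAP Datasphere objects or APIs.

corporate_context = {
    "name": "ACME Corporate",
    "requires_license_key": False,          # step 1: low-barrier consumption
    "data_products": ["Company Locations", "Org Structure", "Product Hierarchy"],
}

lob_spaces = [
    {"space": "FINANCE",   "provider_profile": "Finance Data Office"},
    {"space": "MARKETING", "provider_profile": "Marketing Analytics"},
    {"space": "SALES",     "provider_profile": None},   # still missing (step 5)
]

def rollout_findings(context, spaces):
    """Return a list of gaps in the planned internal marketplace setup."""
    findings = []
    if context["requires_license_key"]:
        findings.append("Corporate context should be accessible without a license key.")
    if not context["data_products"]:
        findings.append("Corporate context has no non-sensitive starter data products yet.")
    for s in spaces:
        if not s["provider_profile"]:
            findings.append(f"Space {s['space']} has no data provider profile yet.")
    return findings or ["Rollout plan looks complete for steps 1 and 5."]

for finding in rollout_findings(corporate_context, lob_spaces):
    print("-", finding)
```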


These steps ensure that the data-savvy consumer base within an organization immediately finds relevant content and can request the right additions to the internal data product portfolio. Many customers already run internal data exchanges, to which you can connect the SAP Datasphere Data Marketplace via API/CLI. Even then, it is still recommended to test and validate your data products with your SAP Datasphere user base first.
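
As a rough illustration of such an API-based connection, the following Python sketch pushes data product metadata from an existing internal exchange into a marketplace listing. The endpoint URL, payload fields, and token handling are placeholders assumed for this example; please refer to the SAP Datasphere documentation for the actual Data Marketplace API and CLI.

```python
# Minimal sketch of syncing data product metadata from an existing internal
# exchange into a marketplace listing. The endpoint URL, payload fields, and
# token handling are hypothetical placeholders, NOT the actual SAP Datasphere
# Data Marketplace API; check the official documentation for the real interface.
import requests

MARKETPLACE_API = "https://<your-tenant>.example.com/marketplace/api/dataproducts"  # placeholder
TOKEN = "<oauth-token>"  # obtained via your identity provider (placeholder)

def publish_listing(product: dict) -> None:
    """Create or update a marketplace listing for one internal data product."""
    payload = {
        "name": product["name"],
        "description": product["description"],
        "context": "ACME Corporate",   # the corporate-wide context from step 1
        "deliveryMode": "full",        # illustrative field
    }
    response = requests.post(
        MARKETPLACE_API,
        json=payload,
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=30,
    )
    response.raise_for_status()

# Example: metadata pulled from an existing internal data exchange.
internal_products = [
    {"name": "Org Structure", "description": "Company organizational hierarchy"},
]
for product in internal_products:
    publish_listing(product)
```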

When running an internal data marketplace in a multi-tier landscape (Dev, Test, Prod), you should familiarize yourself with the opportunities and implications of working with data products; a blog on this topic will be posted soon and linked here.


The mentioned API capabilities of the SAP Datasphere Data Marketplace become even more important when establishing public & private data exchanges with your ecosystem, which can run on SAP Datasphere or where certain members run on a different stack. One interesting example is the German automotive data network Catena-X, which will be based on the EDC connector; an SAP Datasphere user will be able to connect to it, so that data products in SAP Datasphere become listed and deliverable in the Catena-X network.
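
To give a feeling for what such an interaction could look like on the EDC side, here is a hedged Python sketch that requests the catalog of a partner connector via the EDC management API. Endpoint paths, payload fields, and authentication differ between EDC versions and deployments, so treat this purely as an illustration, not as the SAP Datasphere or Catena-X integration itself.

```python
# Hedged sketch of asking an EDC connector for the catalog of a partner
# connector. Paths, payload fields, and auth vary between EDC versions and
# deployments; everything below is illustrative only.
import requests

EDC_MANAGEMENT_URL = "https://<edc-connector>/management"    # placeholder
PROVIDER_DSP_URL = "https://<provider-connector>/api/dsp"    # placeholder

catalog_request = {
    "@context": {"edc": "https://w3id.org/edc/v0.0.1/ns/"},
    "counterPartyAddress": PROVIDER_DSP_URL,
    "protocol": "dataspace-protocol-http",
}

response = requests.post(
    f"{EDC_MANAGEMENT_URL}/v3/catalog/request",      # path may differ per EDC version
    json=catalog_request,
    headers={"x-api-key": "<management-api-key>"},   # placeholder auth
    timeout=30,
)
response.raise_for_status()

# Each dataset in the catalog would correspond to a listable data product
# (response shape simplified here).
datasets = response.json().get("dcat:dataset", [])
if isinstance(datasets, dict):   # a single dataset may come back as an object
    datasets = [datasets]
for dataset in datasets:
    print(dataset.get("@id"))
```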

The following roadmap items are of interest for these scenarios, where the Data Marketplace interacts with other non-SAP sharing infrastructures:

With this evolution of the Data Marketplace, data products become a central piece of the data value chain within SAP Datasphere, and, naturally, very interesting questions are currently being discussed internally and with customers, such as:

  • Should data products also turn into content products that ship analytical artifacts or even stories?

  • Should data products be findable in the released data catalog and be creatable out of metadata assets?

  • How should organizations help their data consumers if they do not find what they are looking for?


It is an interesting time for people who are passionate about taking data and its value to the next level. Let us know what you think and need, for example by submitting your ideas in the SAP Influence Tool for SAP Datasphere.

Stay Hungry. Stay Foolish. Stay Tuned.