
This is the second blog post about the SAP Datasphere and Confluent integration. Here is the link to the first one, where I show how to connect the two tools.
I have been working at SAP as a Customer Advisor since 2015. Previously, I was a consultant for SAP Data Services, so I am familiar with the SAP integration tools. Since March 8th, 2023, with the announcement of SAP Datasphere, there is a new aspect to data integration: the Business Data Fabric approach. The benefits of a Business Data Fabric are described in this blog post.
The idea is not to have lots of ETL jobs in between, but rather to leave the data inside the source applications and, only if necessary, store it in SAP Datasphere and use the capabilities there. Data products modeled and created inside SAP Datasphere can be accessed by external tools. In some cases this is not enough, and customers also need to push their data to external systems. One way to do this is the "Replication Flow" in SAP Datasphere, which enables you to replicate data from SAP Datasphere to specific targets, or directly from SAP source systems to several targets.
You can find the corresponding information on how to create a Replication Flow on SAP Help.
The list of available Replication Flow sources is shown on SAP Help:
Replication Flow sources
And here is an overview of the currently available Replication Flow targets (SAP Help):
Replication Flow targets
This is the current status, and it can and will change. If you want to see what comes next, please have a look at the SAP Datasphere Roadmap Explorer:
Roadmap
There you can see that Confluent is planned to be available as a source in Q2 2024! This means you can even get your streams FROM Confluent INTO SAP Datasphere, which opens up many more possible sources for ingesting data into SAP Datasphere.
If you want to know the difference between Confluent Cloud and Apache Kafka, just have a look here.
Now back to Confluent as a target. On their website you can see all the targets they offer themselves:
Confluent connectors
As you can see, there are lots of targets available, including several that I often hear about from customers who want to connect them with SAP data. Mainly, they want to get their SAP data into these targets:
The following targets were also requested by some customers:
As you can see, the hyperscalers are the most common targets for SAP data. With SAP Datasphere, customers are already able to connect to the three hyperscalers within the Replication Flow. But there are still some targets that are not available in SAP Datasphere. Luckily, in combination with Confluent we can already replicate the data in real time and with CDC functionality to all the missing targets, so there is no longer a need for an additional tool in between: SAP Datasphere together with Confluent can feed all relevant systems with SAP data directly.
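To make this a bit more tangible, here is a minimal sketch of what a downstream consumer could look like once a Replication Flow has delivered SAP records into a Confluent Cloud topic. It uses the confluent-kafka Python client; the cluster endpoint, the credentials, the topic name (sap_sales_orders) and the JSON payload format are assumptions for illustration, not values from the setup described in this blog.

```python
import json
from confluent_kafka import Consumer

# Connection settings for Confluent Cloud (placeholders, replace with your own cluster values).
conf = {
    "bootstrap.servers": "<your-cluster>.confluent.cloud:9092",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<API_KEY>",
    "sasl.password": "<API_SECRET>",
    "group.id": "sap-datasphere-demo",
    "auto.offset.reset": "earliest",
}

consumer = Consumer(conf)
# Hypothetical topic filled by the SAP Datasphere Replication Flow.
consumer.subscribe(["sap_sales_orders"])

try:
    while True:
        msg = consumer.poll(1.0)  # wait up to 1 second for a new record
        if msg is None:
            continue
        if msg.error():
            print(f"Consumer error: {msg.error()}")
            continue
        # Assuming the replicated records arrive as JSON; adjust if you use Avro / Schema Registry.
        record = json.loads(msg.value().decode("utf-8"))
        print(record)
finally:
    consumer.close()
```

From here, any of the Confluent sink connectors shown above could push the same topic onward to the targets that are not (yet) available as Replication Flow targets in SAP Datasphere.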
So why use SAP Datasphere in combination with Confluent?