Introduction:
For this blog post I worked together with Michael Sun on the usage of ABAP function module calls in SAP Data Intelligence using Custom ABAP Operators.
You may have already seen a similar blog post from Britta Thoelking about ho...
Hi Ramakrishnan,
you may want to check out the Custom ABAP Operator we provide, which could fit your scenario depending on your exact requirements. At the moment it exists as a Generation 1 operator.
Kind regards,
Daniel
Hi Chandra,
there is already a Part 2 available in case you want to have a look:
https://blogs.sap.com/2022/02/15/sap-data-intelligence-abap-integration-with-generation-2-part-2-sample-scenario-to-replicate-data-from-a-cds-view-to-google-cloud-stor...
Hi Rajarshi,
you may want to try out the pause/suspend and resume mechanisms provided by generation 2 pipelines to achieve your goal.
Kind regards,
Daniel
Hi Frank,
if you want to perform 1:1 data replication with minor projections or filters, I would recommend checking out the Replication Flow functionality instead of using pipelines.
Kind regards,
Daniel