This blog is part of a blog series from SAP Datasphere product management focusing on the Replication Flow capabilities in SAP Datasphere: Replication Flow Blog Series Part 1 – Overview | SAP Blogs, Replication Flow Blog Series Part 2 – Premium ...
This blog is part of a blog series from SAP Datasphere focusing on the integration of SAP Datasphere and Databricks. In this blog, we provide an overview of the integration between SAP Datasphere and Databricks using the premium outboun...
This is the second part of my blog series about SAP Data Intelligence ABAP integration with generation 2 operators. After going through a lot of theory, including prerequisites and concepts for generation 2 ABAP integration, in my blog post part 1 – Ov...
In this blog post, I want to describe the scope and usage of the new generation 2 ABAP operator "Read Data from SAP System" as part of ABAP Integration, starting with SAP Data Intelligence 3.2 and SAP Data Intelligence Cloud 2110. To demonstrate the ...
Hi @AnkurGoyal03, we do not have such a push-based approach available, and depending on when the last delta was replicated, it can impact the source system if the logging tables grow fast. But one option as a "workaround" could be th...
Hi all, a couple of comments and remarks from my end regarding some of the topics discussed in the threads above. In the meantime, certain changes can be made to a running replication flow without restarting the entire replication flow, e.g. add...
@qisu: When using HANA Cloud as the target in Replication Flows, costs of Premium Outbound Integration (POI) do not apply in this scenario, and you only need to consider the data integration / execution hours as part of the SAP Datasphere sizing. The pr...
@DJSmith: At the moment it would push the data directly into HANA Cloud, and if you want to push the data to another HANA engine, you would need to select a different target connection, e.g. HANA Data Lake Files in case you want to push the data into...
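As a rough illustration of that file-based path, here is a minimal PySpark sketch of how files written by a replication flow to HANA Data Lake Files could be consumed downstream, e.g. from a Spark cluster. Everything environment-specific here is an assumption: the hdlfs:// scheme presupposes SAP's HANA Data Lake Files driver is available on the cluster, and the endpoint, folder layout, object name SALES_ORDERS, and Parquet file type are hypothetical placeholders, not fixed product behavior.

```python
# Minimal sketch: reading files a replication flow has written to
# HANA Data Lake Files (HDLF) with PySpark.
# ASSUMPTIONS: the hdlfs:// filesystem driver is installed on the cluster;
# the path, object name, and Parquet format are illustrative only.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("read-replicated-files").getOrCreate()

# Hypothetical HDLF location; replication flows typically write one
# folder per replication object.
source_path = "hdlfs://<your-files-endpoint>/replication/SALES_ORDERS/"

df = spark.read.parquet(source_path)

# Simple sanity checks on the replicated data.
df.printSchema()
print(df.count())
```

The same read would work against any other file target (e.g. an object store) by swapping the path and scheme; the replication flow itself does not change.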
Hi Ramakrishnan,
you may want to check out the Custom ABAP Operator we have that could fit your scenario, depending on your exact requirements. At the moment it exists as a Generation 1 operator.
Kind regards,
Daniel
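For context on what "Generation 1 operator" means on the pipeline side: while the Custom ABAP Operator itself runs custom code in the ABAP source system, the Gen1 wiring pattern in SAP Data Intelligence can be illustrated with the Gen1 Python operator scripting API. This is a generic sketch, not the Custom ABAP Operator's implementation; the port names and the placeholder transformation are hypothetical, and the `api` object is injected by the pipeline runtime rather than imported.

```python
# Sketch of a Generation 1 Python operator script in SAP Data Intelligence.
# The `api` object is injected by the runtime, so no import is needed inside
# the operator script. Port names "input"/"output" are hypothetical and must
# match the ports defined on the operator.

def on_input(msg):
    # msg is an api.Message carrying .body and .attributes.
    rows = msg.body  # e.g. records delivered by an upstream operator
    processed = [r for r in rows if r]  # placeholder transformation
    # Forward the result, keeping the original message attributes.
    api.send("output", api.Message(processed, msg.attributes))

# Register the callback for the inbound port.
api.set_port_callback("input", on_input)
```

In a graph, such an operator sits between a source operator (e.g. the ABAP reader) and a target writer; the Gen1 message-passing style shown here is what distinguishes it from the Generation 2 operators discussed in the blog posts above.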