Read CDS View with millions of records in SAP Data intelligence


Hey all,

The aim here is to read a CDS view that contains 100 million+ records. I want a full load first, followed by subsequent daily changes (delta).

We then want to write the data into an S3 bucket. The problems are:

1) The maximum batch size of the "ABAP CDS Reader" is 1 million records per roundtrip, and S3 does not support appending data to an existing object. How can we gather all the data first and then write it to S3 as a single file?

2) I built a pipeline following the scenario above, and it always fails with "roundtrip failed when logging: abap: Operator health check failed". The logs say the connection to the source system timed out; we increased the timeout, but the issue persists.

3) The same error comes up even when I write one separate file per batch (records per roundtrip).

4) There is no way to detect the last batch in "Replication" transfer mode of ABAP CDS Reader v2, as the last-batch flag is never set to true in replication or delta mode.
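For problem 1, one common workaround (outside SAP Data Intelligence itself) is to land each 1-million-record batch as its own file in a staging location, then concatenate the batches into a single object before (or during) the upload to S3 — for example via a local merge and a single `upload_file` call, or via S3's multipart upload API. Below is a minimal local-merge sketch; the file names and the `merge_batches` helper are purely illustrative, not part of any SAP or AWS API:

```python
import os
import tempfile

def merge_batches(batch_paths, merged_path, skip_header=True):
    """Concatenate per-batch CSV files into one file, keeping
    only the first file's header line (illustrative helper)."""
    with open(merged_path, "wb") as out:
        for i, path in enumerate(batch_paths):
            with open(path, "rb") as part:
                if skip_header and i > 0:
                    part.readline()  # drop the repeated header row
                out.write(part.read())

# Stand-in batch files, simulating three roundtrips from the reader
tmp = tempfile.mkdtemp()
paths = []
for i in range(3):
    p = os.path.join(tmp, f"batch_{i}.csv")
    with open(p, "w") as f:
        f.write("id,name\n")
        f.write(f"{i},row{i}\n")
    paths.append(p)

merged = os.path.join(tmp, "full_load.csv")
merge_batches(paths, merged)
print(open(merged).read())
```

The merged file can then be pushed to S3 in one operation (e.g. boto3's `upload_file`, which handles multipart uploads for large objects internally). For a 100M+ record load, streaming each batch directly into an S3 multipart upload avoids holding the full file on local disk.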

Can anyone suggest how to resolve this scenario? Hoping for good guidance.

Thanks,

Samarth


Hey James,

Thank you for your inputs.

I tried the option you suggested and am getting the same "roundtrip failed when logging: abap: Operator health check failed" error.

Can you help me with this strange error?

Thanks,

Samarth

jimgiffin
Product and Topic Expert

Have you checked the logs in your source ABAP system? Look under t-codes ST22 and SLG1. I would search for DHAPE* for CDS scenarios and LHAPE* for SLT replication scenarios.