on 2021 Nov 02 1:12 PM
Hey all,
The aim here is to read CDS views that contain 100 million+ records. I want a full load first and then the subsequent changes (delta) every day.
We then want to write the data into an S3 bucket. The problems are:
1) The maximum batch size of the "ABAP CDS Reader" is 1 million records at a time, and S3 buckets do not support appending data to an existing file. So how can we gather all the data first and then write it to S3 as a single file?
2) I tried to build a pipeline following the scenario above, and it always fails with "roundtrip failed when logging: abap: Operator health check failed". The logs say it times out against the source system; we changed that as well but are still seeing the issue.
3) This error comes up even if I write one separate file per batch (records per roundtrip).
4) There is no way to monitor the last batch for the "Replication" transfer mode in ABAP CDS Reader v2, as the last-batch indicator is never set to true in replication or delta mode.
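On point 1: while S3 objects themselves cannot be appended to, the S3 multipart-upload API lets you stream many batches into a single final object. A minimal sketch of that idea is below, assuming the per-roundtrip payloads arrive as byte strings; the bucket and key names are placeholders, and the boto3 calls (`create_multipart_upload`, `upload_part`, `complete_multipart_upload`) are the standard S3 multipart API:

```python
import io

# S3 requires each multipart part to be >= 5 MiB, except the last one.
MIN_PART_SIZE = 5 * 1024 * 1024

def buffer_batches(batches, min_part_size=MIN_PART_SIZE):
    """Group per-roundtrip payloads into chunks big enough for one multipart part."""
    buf = io.BytesIO()
    for payload in batches:
        buf.write(payload)
        if buf.tell() >= min_part_size:
            yield buf.getvalue()
            buf = io.BytesIO()
    if buf.tell():
        yield buf.getvalue()  # final part may be smaller than 5 MiB

def upload_as_single_object(batches, bucket, key):
    """Stream all batches into ONE S3 object via a multipart upload."""
    import boto3  # assumed available in the runtime environment
    s3 = boto3.client("s3")
    mpu = s3.create_multipart_upload(Bucket=bucket, Key=key)
    parts = []
    for n, chunk in enumerate(buffer_batches(batches), start=1):
        resp = s3.upload_part(Bucket=bucket, Key=key, PartNumber=n,
                              UploadId=mpu["UploadId"], Body=chunk)
        parts.append({"PartNumber": n, "ETag": resp["ETag"]})
    s3.complete_multipart_upload(Bucket=bucket, Key=key,
                                 UploadId=mpu["UploadId"],
                                 MultipartUpload={"Parts": parts})
```

This avoids holding all 100 million+ records in memory at once: each part is flushed as soon as it crosses the size threshold, and S3 stitches the parts into one object on completion.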
Can anyone suggest how to resolve this scenario? Hoping for some good guidance.
Thanks,
Samarth
Hey James,
Thank you for your inputs.
I tried the option that you suggested and am getting the same "roundtrip failed when logging: abap: Operator health check failed" error.
Can you help me with this strange error?
Thanks,
Samarth