We need to move 200,000 records daily from a Databricks environment on GCP to a custom table in SAP.
Since Databricks can't expose workbook APIs given its current capabilities, SAP will provide an OData API with a POST operation for the data transfer. With this volume of data, performance can be a bottleneck. Below are the questions -
a) Is the OData API intended for large data transfers? What are its limitations?
b) What are the best practices for transmitting high-volume data through an OData REST API? We are already aware of pagination/batching.
c) What other interfacing methods are possible with Databricks, other than FTP and without an additional integration layer?
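For (b), the client-side batching we have in mind looks roughly like the sketch below. The batch size, the `results` payload shape, and the `post_fn` callable are assumptions, not the actual SAP service contract - the real call would wrap `requests.post` with authentication and CSRF-token handling against whatever endpoint SAP exposes.

```python
# Sketch: split the daily 200k records into fixed-size batches so each
# POST stays small enough to avoid gateway timeouts. Payload shape and
# batch size are assumptions to be tuned against the real OData service.
import json
from typing import Callable, Iterator


def chunk(records: list[dict], size: int) -> Iterator[list[dict]]:
    """Yield fixed-size slices of the record list."""
    for i in range(0, len(records), size):
        yield records[i:i + size]


def post_batches(records: list[dict],
                 post_fn: Callable[[str], object],
                 batch_size: int = 5000) -> int:
    """POST records in batches; post_fn is the hypothetical HTTP call
    (e.g. requests.post wrapped with auth). Returns records sent."""
    sent = 0
    for batch in chunk(records, batch_size):
        payload = json.dumps({"results": batch})  # assumed deep-insert shape
        post_fn(payload)
        sent += len(batch)
    return sent
```

With 200,000 records and a batch size of 5,000 this would issue 40 POSTs per day; the right size depends on the gateway's timeout and payload limits.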
P.S.: Exploring PyRFC from Databricks to call an SAP RFC as an alternative.
We don't use SAP Datasphere.
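The PyRFC alternative mentioned in the P.S. would look roughly like this. The RFC name `Z_UPLOAD_RECORDS`, its table parameter `IT_RECORDS`, and the connection parameters are placeholders - only the general pattern (uppercase field names, `Connection.call` with a table parameter) reflects how PyRFC is actually used, and the SAP NW RFC SDK must be installed on the Databricks cluster nodes.

```python
# Sketch: pushing rows from Databricks into a custom RFC via PyRFC.
# Z_UPLOAD_RECORDS / IT_RECORDS are hypothetical names for illustration.
def to_rfc_rows(rows: list[dict]) -> list[dict]:
    """RFC table parameters expect UPPERCASE ABAP field names."""
    return [{k.upper(): v for k, v in row.items()} for row in rows]


def upload(rows, ashost, sysnr, client, user, passwd):
    # Imported lazily: pyrfc needs the SAP NW RFC SDK libraries at load time.
    from pyrfc import Connection
    conn = Connection(ashost=ashost, sysnr=sysnr, client=client,
                      user=user, passwd=passwd)
    try:
        return conn.call("Z_UPLOAD_RECORDS", IT_RECORDS=to_rfc_rows(rows))
    finally:
        conn.close()
```

One RFC call per batch (rather than one per record) keeps the round-trip count comparable to the OData batching approach.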