Overview
Do you want to perform DML operations on your DWC objects from outside the tenant, but can't decide which method to choose? This blog walks through the pros and cons of each method and its use cases, to help you decide how to perform DML operations on DWC.
For context, the DWC application layer lets you create tables locally and edit them within that same layer. To load data from an external source that is not covered by the standard DWC connections, however, you must go through the HANA database layer of DWC. There are several options for accessing that HANA DB layer, and this blog compares them.
Proposed solutions and their use cases
1. OData API calls directly on the views of DWC
a. Reference
- Using the Data Warehouse Cloud OData API with SAP Analytics Cloud
b. Summary
👍 Pros:
- An OData URL is available to read SQL view or graphical view data in JSON format
- Any view with consumption enabled can be accessed via the URL
- No coding required; you only need to tweak the URL of your DWC tenant (see the sketch after this section)
👎 Cons:
- Not possible to perform CRUD operations on DWC application layer tables
- DML is not possible on SQL or graphical views
- The URL cannot be added as a destination in BTP when the DWC tenant differs from the BTP tenant
- The OData service cannot be consumed directly like the public Northwind service
🎯 Used For:
- External analytics platforms such as Excel, Power BI, etc.
- Consumable in the ABAP layer once the required Basis configuration is added
- Exposing the DWC tenant as an external API
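To give an idea of how little code is involved, here is a minimal sketch of reading a consumption-enabled view through its OData URL from Python. The tenant host, space, and view names are placeholders, and the URL pattern and OAuth setup are assumptions you should verify against the referenced blog and your own tenant.

```python
import requests

# Placeholder values – replace with your own tenant, space, and view names
TENANT = "https://my-tenant.eu10.hcs.cloud.sap"
URL = f"{TENANT}/api/v1/dwc/consumption/relational/MY_SPACE/MY_VIEW/MY_VIEW"

# Assumes an OAuth bearer token obtained for the DWC tenant beforehand
headers = {"Authorization": "Bearer <access_token>"}

response = requests.get(URL, headers=headers, params={"$top": 10})
response.raise_for_status()

# The view data comes back as JSON, like any other OData service
for row in response.json().get("value", []):
    print(row)
```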
2. Python/Node.js application in the SRV layer for CRUD operations
a. Reference
- SAP HANA Client Interface Programming Reference for SAP HANA Service
- Deployment of Python Application on BTP for API Integration & Creation
b. Summary
👍 Pros:
- Can read graphical and SQL views, and can perform CRUD operations on the HANA database layer of DWC
- A Python application can connect directly to the HANA database of the DWC tenant using standard connectors (see the sketch after this section)
- A Python application can expose an API hosted in your BTP account, or it can be called internally from a CAPM project using Node.js
- The connection is secured; DWC credentials are stored as user-provided variables in BTP
- A standalone application can also be created for services
👎 Cons:
- The connection is not optimized: every call must go through the hdbcli library, which becomes a bottleneck for very large data volumes, say millions of rows
- DML operations can be performed on DWC remote tables, but not on DWC application layer tables
🎯 Used For:
- ETL applications and scheduled data-load jobs
- Custom API creation
- UI applications (configuration is a little more complex than with a CAPM application)
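As a concrete illustration of the hdbcli route, here is a minimal sketch of connecting from Python to the HANA DB layer of a DWC space and running a simple DML statement. Host, credentials, and table names are placeholders; in BTP they would come from the user-provided variables mentioned above rather than being hard-coded.

```python
from hdbcli import dbapi

# Placeholder credentials – in BTP these come from user-provided variables
conn = dbapi.connect(
    address="<dwc-hana-host>",   # HANA endpoint of the DWC space
    port=443,
    user="<db-user>",
    password="<db-password>",
    encrypt=True,                # DWC accepts only encrypted connections
    sslValidateCertificate=True,
)

cursor = conn.cursor()
try:
    # DML on a table in the DB layer (hypothetical table name)
    cursor.execute(
        "INSERT INTO MY_SCHEMA.MY_TABLE (ID, NAME) VALUES (?, ?)",
        (1, "example"),
    )
    conn.commit()

    # Reading a consumption-enabled view works the same way
    cursor.execute("SELECT TOP 5 * FROM MY_SCHEMA.MY_VIEW")
    for row in cursor.fetchall():
        print(row)
finally:
    cursor.close()
    conn.close()
```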
3. CAPM approach using a cross-container service (recommended)
a. Reference
- SAP-recommended, optimized approach: Expose SAP Data Warehouse Cloud artefacts as OData v4 service
b. Summary
👍 Pros:
- Can read graphical and SQL views; CRUD operations are possible only on the HDI container created from BAS (DML is performed on tables created in BAS, and those objects are consumed in DWC via the cross-container service)
- Uses the standard CAPM model with a cross-container reference, which accesses from BAS the DWC HDI container that is typically added to the DWC space
- The connection is optimized and smooth once the OData service is established
- DWC schema credentials are stored as a user-provided service in BTP for access to the DWC DB layer
👎 Cons:
- DML operations can be performed on DWC remote tables, but not on DWC application layer tables
- Complex initial configuration of the multi-target application
🎯 Used For:
- SAPUI5 applications for CRUD operations
- ETL and BTP workflow management
- Simple JS logic and smart tables with entity binding
- Simple coding in the DB and SRV layers, configured using the Cloud Application Programming model (CAPM)
- For complicated scenarios, use functions for parameterized inputs, actions for PUT and POST operations, and entity sets for GET operations in the SRV layer; the SRV layer then talks to the backend procedures (see the sketch after this list)
- The SRV layer coding is done in Node.js, with SRV-layer validations incorporated
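To illustrate how such a CAP service is used once deployed, here is a hypothetical sketch of calling the exposed OData v4 endpoints from Python. The service path, entity set, and action names are invented for illustration and are not taken from the referenced tutorial; your own CDS model defines the actual entities and actions.

```python
import requests

# Hypothetical CAP service deployed to BTP – all names are placeholders
SERVICE = "https://my-cap-app.cfapps.eu10.hana.ondemand.com/odata/v4/catalog"
headers = {"Authorization": "Bearer <access_token>"}

# GET via an entity set
rows = requests.get(f"{SERVICE}/Books", headers=headers, params={"$top": 5})
rows.raise_for_status()
print(rows.json()["value"])

# Create via POST on the entity set; CAP runs the SRV-layer validations first
created = requests.post(f"{SERVICE}/Books", headers=headers,
                        json={"ID": 101, "title": "example"})
created.raise_for_status()

# Invoke an unbound action for a more complex write scenario
resp = requests.post(f"{SERVICE}/submitOrder", headers=headers,
                     json={"book": 101, "quantity": 2})
resp.raise_for_status()
```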
Conclusion
After reading this blog, you should be able to decide which method to use for DML operations on DWC.
While exploring DML operations I found many separate blogs, but I couldn't figure out which method was more useful than the others; that is when I got the idea to write up this point of view. Do let me know if this blog helps anyone who needs to decide on an approach.
Hope you have enjoyed the blog!