This blog post is part of a series on SAP Datasphere and SAP HANA Cloud CI/CD. I recommend checking the follow-on blog "SAP Datasphere SAP HANA Cloud HDI Automation CI/CD Pipelines Details" for implementation details, including a code walkthrough.
Introduction
In this blog post, I'll go over a lifecycle management approach for an end-to-end scenario involving SAP Datasphere, an SAP HANA Cloud HDI container, and SAP Analytics Cloud. For illustration, I'll walk through a scenario that leverages database artifacts from an existing SAP HANA Cloud system, models and enriches them in SAP Datasphere, and visualizes the results in SAP Analytics Cloud. Along the way, I'll analyze some of the typical questions and challenges faced by the IT department, as listed below.
- How can I leverage my existing footprint and investments in the SAP HANA Cloud and SAP SQL Data Warehousing?
- Can we build a CI/CD pipeline for use with SAP Datasphere and SAP HANA Cloud?
- Can Git be used with SAP Datasphere?
- Can multiple developers work on artifacts in SAP Datasphere?
- Can I use my local development environment and a command-line interface (CLI) to develop on SAP Datasphere and SAP HANA Cloud? Similarly, can I leverage SAP Business Application Studio to develop on SAP Datasphere and SAP HANA Cloud?
This blog post builds on some of the earlier blogs listed below from the SAP HANA Database & Analytics, Cross Product Management, and Product Management teams. It is not intended as a showcase of SAP best practices, an all-in-one guide, or a hands-on follow-through tutorial. The goal is to analyze challenges, point out gaps, and discuss possible solutions or mitigations, where they exist.
The following paragraphs cover the use case scenario and its illustration, followed by the CI/CD automation approach and a conclusion. Let's jump in.
Use Case Scenario
- The marketing team in your organization wants to launch a promotional campaign to help with strategic organizational goals. How are products doing in the Americas region? What do the sales numbers look like per product? Which product should we develop? These are the questions they have for the data team, and they want reports and visualizations for further analysis.
- The IT department wants to address the marketing team's requirements quickly while taking care of software lifecycle and CI/CD aspects, laying a platform for agile innovation to meet similar needs from other lines of business (LOBs).
Scenario Illustration
Figure (a) shows the three-step solution approach for the illustration scenario, and Figures (b) and (c) elaborate on those steps.
Figure (a) Illustration scenario solution approach
The first step is to review and reuse an existing Salesorder model, tables, or a ready-made calculation view from an SAP HANA Cloud HDI container. Using SAP Business Application Studio or VS Code, deploy (push the design-time model artifacts to runtime) on the SAP HANA Cloud DB tenant. This HDI container will be linked to the SAP Datasphere space.
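To make the deploy step concrete, here is a minimal sketch of the db module's package.json for such an HDI project. The module name and version pin are illustrative, but delegating `npm start` to @sap/hdi-deploy is the standard pattern:

```json
{
  "name": "salesorder-db",
  "dependencies": {
    "@sap/hdi-deploy": "^4.0.0"
  },
  "scripts": {
    "start": "node node_modules/@sap/hdi-deploy/deploy.js"
  }
}
```

With the HDI container binding available to the module (for example, via a default-env.json during local development), running `npm install` followed by `npm start` pushes the design-time artifacts in src/ to the runtime container.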
Figure (b) Solution Approach Model
In the second step, enhance the SAP Datasphere model with product-relevant information sourced from a CRM system and expose it as an analytical dataset with the relevant measures per the requirements. Finally, in the third step, use the SAP Datasphere model to publish an SAP Analytics Cloud story answering the marketing department's questions with two charts: a table showing product-wise sales and discount analysis under the North America sales org, and a linked heat map visualizing the gross sales of each product.
Figure (c) SAP Analytics Cloud Story using the SAP Datasphere Model
Automation CI/CD Pipelines
Let's start with the transport landscape and then jump into the pipelines. Figure (d) depicts the transport landscape. The main point is that both the DEV and QA HDI containers reside under the same subaccount (linked Cloud Foundry org) and space. This can be extended to a three-system landscape with DEV, QA, and PRD, with production on a separate SAP HANA Cloud tenant or a similar approach. Currently, SAP Analytics Cloud artifacts can be promoted only through the Analytics Content Network (ACN), and no public APIs are available for automation; hence, I have not included SAP Analytics Cloud in the transport landscape.
Figure (d) Transport Landscape setup
Figures (e) and (f) show the tooling stack and flow used to realize the automation. There are two pipelines linked to two separate Git repos, one for the HDI container and one for the SAP Datasphere artifacts. For automation on the SAP Datasphere side, we will leverage @sap/dwc-cli. Project "Piper" and its Jenkins build server are used for automation execution, coordination, and sequencing. SAP Continuous Integration & Delivery is not included in the diagram, as Piper serves that purpose and offers more flexibility with SAP Datasphere and coordination. The SAP Datasphere pipeline's build and deploy steps are executed from within a Docker container, ensuring the @sap/dwc-cli dependencies are met; an additional Docker repo is used for maintaining the image with the @sap/dwc-cli dependencies.
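As a sketch of that Docker image, the Dockerfile can be as small as a Node base image with the CLI installed globally. The base image tag below is an assumption; align it with the Node.js version that @sap/dwc-cli supports:

```dockerfile
# Minimal image for running @sap/dwc-cli in pipeline steps
FROM node:16-slim
# Install the CLI globally; pin an exact version in practice for reproducible builds
RUN npm install -g @sap/dwc-cli
```

The pipeline then runs its SAP Datasphere build and deploy steps inside this container, so every build agent gets an identical CLI environment. A pipeline step can, for example, read and deploy space definitions with commands along the lines of `dwc spaces read` and `dwc spaces create`; check the CLI's own help output for the exact commands and options.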
In general, for each development, one would create a feature branch in the Git repo; then, based on the process setup, finish development, unit testing, and validation, and create a pull request to merge the branch into the main branch. While merging, a rebase and adjustments may be required if the same entities were changed in the main branch. A webhook then triggers the automation and transport through the landscape.
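The branching flow can be sketched with plain Git commands. The repo paths and branch name below are placeholders, and a throwaway local origin is set up first so that the commands run end to end:

```shell
# Throwaway local setup so the flow below is runnable (placeholder paths)
rm -rf /tmp/dwc-demo-origin.git /tmp/dwc-demo
git init -q --bare /tmp/dwc-demo-origin.git
git clone -q /tmp/dwc-demo-origin.git /tmp/dwc-demo
cd /tmp/dwc-demo
git config user.email dev@example.com && git config user.name dev
git commit -q --allow-empty -m "initial commit"
git branch -M main && git push -q -u origin main

# 1. Create a feature branch off main
git checkout -q -b feature/product-sales
# 2. Develop, unit test, validate, then commit and push the branch
git commit -q --allow-empty -m "feat: product sales model"
git push -q -u origin feature/product-sales
# 3. If main moved in the meantime, rebase before raising the pull request
git fetch -q origin && git rebase -q origin/main
```

The pull-request merge itself happens in the Git hosting service, whose webhook then kicks off the pipeline.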
Figure (e) DevOps Architecture and tooling stack
Figure (f) Automation flow
As shown in Figure (f), the flow can start from either the HDI container pipeline or the SAP Datasphere pipeline. Suppose it involves committing HDI container artifacts via VS Code or SAP Business Application Studio. In that case, it triggers the HDI pipeline, which builds, deploys, validates, and uploads the MTA archives to SAP Cloud Transport Management, which in turn moves the MTA archives through the landscape. If all the earlier steps succeed, the SAP Datasphere pipeline is triggered; it flows through build, deploy, and validation of the SAP Datasphere artifacts, deploying them into the QA space.
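For the HDI side, a Jenkinsfile using project "Piper" library steps could look roughly like the sketch below. The stage names, TMS node name, and credentials ID are assumptions for illustration, and the exact step parameters should be checked against the Piper documentation:

```groovy
// Hypothetical Jenkinsfile sketch for the HDI container pipeline (Piper open-source library)
@Library('piper-lib-os') _

node {
    stage('Checkout') {
        checkout scm
        setupCommonPipelineEnvironment script: this
    }
    stage('Build') {
        // Builds the MTA archive from mta.yaml
        mtaBuild script: this
    }
    stage('Upload to TMS') {
        // Hands the MTA archive to SAP Cloud Transport Management
        // (nodeName and credentialsId are placeholders)
        tmsUpload script: this, nodeName: 'DEV', credentialsId: 'tms-service-key'
    }
}
```

SAP Cloud Transport Management then routes the uploaded archive through the landscape per the configured transport routes.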
Conclusion
This blog introduced the use case scenario, the automation flow, the challenges, and an approach that can be used for CI/CD automation with SAP Datasphere and SAP HANA Cloud. With SAP Cloud Transport Management, project "Piper", and @sap/dwc-cli, CI/CD automation can be realized. Check the follow-on blog "SAP Datasphere SAP HANA Cloud HDI Automation CI/CD Pipelines Details" for implementation details, including a code walkthrough.
Let me know what you think about the approach, and feel free to share this blog. All your feedback and comments are welcome. If you have any questions, please do not hesitate to ask in the Q&A area as well.