CRM and CX Blog Posts by SAP
StefanM-AU
Product and Topic Expert

The ability to analyse customer-related data in near real-time enables businesses to make informed decisions, personalise customer experiences, and respond swiftly to changing market trends. To achieve this, data integration into a central analytics platform must be triggered immediately as transactional events occur. In other words, an alternative to the typical batch-triggered ETL replication processes from the transactional system to the analytical system is desirable.

In this article, we'll explore an option to build an event-driven integration between SAP Sales and Service Cloud V2 ("SSCV2") and SAP Datasphere, enabling near real-time analytics in the context of a centrally consolidated enterprise data model. While this article is specific to SAP Datasphere, the approach of processing and transforming the SSCV2 events using SAP Cloud Integration can also be used as a template for other integration scenarios. 

The second part of this article series, which dives deeper into integrating different types of references based on the initial scenario setup, can be accessed under the following link:

Event-Driven Data Integration from SAP Sales and Service Cloud V2 to SAP Datasphere (Part 2 of 2)

Let's start with a straightforward Case integration scenario that introduces the major building blocks of the integration architecture.

The integration is established through a message-based integration flow (iFlow) on SAP Cloud Integration, leveraging the JDBC receiver adapter to send data records directly into the underlying SAP HANA Cloud database of SAP Datasphere. The iFlow handles the transformation of multiple different types of SSCV2 data events into JDBC update messages.
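To make the transformation step more concrete, here is a minimal Python sketch of the mapping the iFlow performs: an incoming event payload is turned into a parameterized HANA UPSERT statement. The field and table names are illustrative assumptions, not the official SSCV2 event schema or the repository's DDL.

```python
import json

# Hypothetical SSCV2 case event payload - field names are illustrative,
# not the official event schema.
event = json.loads("""
{
  "type": "sap.crm.caseservice.caseCreate",
  "data": {"id": "ab12", "displayId": "542",
           "subject": "Printer broken", "status": "01"}
}
""")

def to_upsert(evt: dict) -> tuple:
    """Map an SSCV2 case event to a parameterized HANA UPSERT statement."""
    case = evt["data"]
    sql = ('UPSERT "CASES" ("ID", "DISPLAYID", "SUBJECT", "STATUS") '
           "VALUES (?, ?, ?, ?) WITH PRIMARY KEY")
    return sql, (case["id"], case["displayId"], case["subject"], case["status"])

sql, params = to_upsert(event)
print(sql)
print(params)
```

Using UPSERT rather than INSERT lets the same mapping serve both the "created" and "updated" event variants, since HANA's `UPSERT ... WITH PRIMARY KEY` inserts or updates depending on whether the key already exists.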

To keep the architecture simple, we’re skipping the use of an event broker like SAP Advanced Event Mesh for now. For a production configuration, it is highly recommended to use SAP Advanced Event Mesh or a similar event/messaging service to improve throughput and resiliency.

For this example, SSCV2 events are sent directly to the iFlow, which streamlines the process while maintaining near real-time data transfer. An asynchronous message queue, recommended for decoupling the sender from the receiver while keeping events in order, is established within the iFlow through an intermediate Data Store write and read step.
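The decoupling idea behind the Data Store steps can be sketched as follows - a minimal in-memory stand-in, not the actual Cloud Integration Data Store API: the receiving branch writes each incoming event and returns immediately, while a separate polling branch later reads and processes the entries in arrival order.

```python
from collections import deque

# In-memory stand-in for the iFlow's Data Store (illustrative only).
data_store = deque()

def write_step(event: dict) -> None:
    """Receiving branch: persist the event and return immediately."""
    data_store.append(event)

def poll_step() -> list:
    """Polling branch: drain and process entries in FIFO order."""
    processed = []
    while data_store:
        processed.append(data_store.popleft())
    return processed

write_step({"id": 1})
write_step({"id": 2})
write_step({"id": 3})
print([e["id"] for e in poll_step()])  # events come out in arrival order
```

The key property is that the sender is never blocked by the receiver, yet the FIFO read preserves event order - which matters when a "created" event must reach the database before the corresponding "updated" event.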

(Architecture overview: SSCv2-to-Datasphere.png)

For additional background, a basic architecture for capturing business events in SAP Datasphere (using SAP Advanced Event Mesh as event broker) has previously been outlined in this SAP Community blog:

Exploring Capturing Business Events in SAP Datasphere using SAP Integration Suite

Configuration steps are required across three environments and are described in detail in the following chapters.

  • SAP Datasphere Configuration Steps
  • SAP Cloud Integration Configuration Steps
  • SAP SSCV2 Configuration Steps

To be able to configure the integration scenarios described in this article (part 1 and part 2), some artefacts are required: 

  • The iFlow definition file 
  • DDL statement files for the creation of the required DB tables in the SAP HANA Cloud DB of SAP Datasphere

These artefacts can be obtained from the following GitHub repository:

https://github.com/sapstefan/SSCV2-Datasphere-Integration 

 

SAP Datasphere Configuration Steps

Prerequisite: An SAP Datasphere tenant has been provisioned for your SAP BTP Sub-Account. Details about using the Datasphere service, including free tier options, can be found in the Discovery Center:

https://discovery-center.cloud.sap/serviceCatalog/sap-datasphere 

The main activities required, before data transfer to the Datasphere tenant can happen, are:

  • Create a space that will contain the SSCV2 related content, namely database tables and analytical views
  • Create a database user for the new space, which will also lead to the creation of a DB schema ("Open SQL Schema") for that user
  • Create DB tables for the supported SSCV2 data objects from DDL statements (provided in the above-mentioned GitHub repo as a zip file). We start with the "CASES" table to establish a simple integration scenario and will then add further tables / data sources to establish relationships between data sources on Datasphere.

Detailed steps are as follows.

Create a Space.

Example name here: CNSSANDBOX

Add a space administrator user to the space. The user must have a scoped role assigned, where the scope relates to the space just created.

Create a database user for the space.

Provide the DB user suffix (here "SM").

Set "Enable Write Access", which is required to be able to send data with the DB user to the HANA Cloud database (open sql schema) via the JDBC adapter.

The "Enable Read Access" flag is optional. It provides read access to the analytics content on space from external systems.


Open the Database Explorer with the newly created DB user.

Open the SQL Console to create the CASES table. The required DDL script is provided with the file cases_create_table.sql on the GitHub repo.

At this stage, all other HANA Cloud tables, which represent additional SSCV2 entities (Individual Customer, Employee, etc.), can already be created through the other DDL statements / files provided in the GitHub repository.

Verify that the CASES table has been created.

 

SAP Cloud Integration Configuration Steps

On the Integration Suite / Cloud Integration tenant, two configuration steps are required:

  • Create JDBC Material
  • Create and Deploy iFlow

Create JDBC Material

A "JDBC Material" configuration is required on Cloud Integration, to provide connection parameters to the HANA cloud database of the DataSphere tenant. Steps are as follows.

In your SAP Cloud Integration tenant, navigate to "Monitor" --> "Integrations and APIs" --> "JDBC Material"


Add an entry to the "JDBC Data Source" list. The user is the Datasphere space name + DB user suffix as created previously.

Provide the DB user password.

The JDBC URL format is:

jdbc:sap://<hana db hostname>:443/?encrypt=true&validateCertificate=true

--> Don't forget the "/" between "443" and "?" 🙂
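A small helper can make the URL format (and the easily forgotten "/") explicit. The hostname below is a made-up placeholder; use the hostname of your own Datasphere HANA Cloud database.

```python
def hana_jdbc_url(host: str) -> str:
    """Assemble the JDBC URL for a Datasphere HANA Cloud database.

    Note the mandatory "/" between the port and the query string.
    """
    return f"jdbc:sap://{host}:443/?encrypt=true&validateCertificate=true"

# Placeholder hostname - substitute your tenant's actual DB host.
url = hana_jdbc_url("abc123.hana.prod-eu10.hanacloud.ondemand.com")
print(url)
```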


Create and Deploy iFlow

The example iFlow named "SCV2 to DataSphere - Standard Events" can be downloaded from the GitHub repository as a zip file. Create an integration package in the "Design" section of your SAP Cloud Integration tenant and import the iFlow into it.

The iFlow includes multiple message mappings based on the supported SSCV2 events, including Case, Individual Customer, Employee, Registered Product, etc. Source message schemas are OpenAPI 3.0 specifications based on the SSCV2 event JSON messages. Target message schemas are XML schema definitions (XSD), which match the Datasphere HANA Cloud table DDLs provided in the GitHub repository.


On the JDBC receiver adapter, provide the JDBC Data Source name, and set "Batch Mode" to active with "Batch Operation" as "Atomic". Batch Mode will be required for one of the advanced topics described later (using relationship tables for managing complex references).
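The effect of an "Atomic" batch operation can be illustrated conceptually: all statements of one message are committed together, or none are. This sketch uses SQLite purely to demonstrate the all-or-nothing semantics - it is not the adapter itself.

```python
import sqlite3

# Conceptual illustration of an "Atomic" batch (all-or-nothing semantics).
conn = sqlite3.connect(":memory:")
conn.execute('CREATE TABLE "CASES" ("ID" TEXT PRIMARY KEY, "SUBJECT" TEXT)')

# First batch: all statements succeed, so all are committed together.
batch = [("c1", "Printer broken"), ("c2", "Display flickers")]
with conn:  # transaction: commit on success, rollback on error
    conn.executemany('INSERT INTO "CASES" VALUES (?, ?)', batch)

# Second batch: the duplicate key makes one statement fail,
# so the whole batch is rolled back - "c3" is not inserted either.
bad_batch = [("c3", "New case"), ("c1", "Duplicate key")]
try:
    with conn:
        conn.executemany('INSERT INTO "CASES" VALUES (?, ?)', bad_batch)
except sqlite3.IntegrityError:
    pass

rows = conn.execute('SELECT COUNT(*) FROM "CASES"').fetchone()[0]
print(rows)  # still 2: no partial writes from the failed batch
```

This is exactly the behavior you want when one event expands into several dependent statements (e.g. a header row plus relationship rows): either the complete set lands in the database, or nothing does.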


Save and deploy the iFlow. Then navigate to the deployed artefact in the Monitoring view. Copy the iFlow endpoint URL for later usage in the SSCV2 event communication configuration.


 

SSCV2 Configuration Steps

The official help guide on how to configure outbound (standard) events using the Event Bridge service in SSCV2 can be found under the following link:

https://help.sap.com/docs/CX_NG_SVC/56436b4e8fa84dc8b4408c7795a012c4/6baf9c8623ed410f9df51d48b33a6f6...

Following is a more detailed description, including connectivity to SAP Cloud Integration.

Activate Standard Events

Log on to your SSCV2 tenant and navigate to the "Settings" application.

Find the "Standard Events" item.

Activate all required standard events - choose the "created" and "updated" events for all relevant entities.

Create Communication System

A Communication System must be created in SSCV2 to establish connectivity for integration with external systems - in this case, the Cloud Integration tenant, which at this point should already be available (the iFlow itself can be deployed later).

Before the Communication System can be created, we need the Process Integration Runtime credentials, which can be obtained from the Service Instance of the relevant BTP Sub-Account.

Open the BTP Sub-Account of the Cloud Integration tenant and navigate to the Service Instances view. Then open the Service Key ("Credentials") of the relevant integration-flow entry.

Copy the values of clientid, clientsecret and url into a notepad for later usage in the SSCV2 Communication System definition.

Back in the SSCV2 tenant, navigate to the Settings app and there to the Communication Systems view.

Press the "+" button to create a new Communication System.


Enter a reasonable display ID (here "CI_TENANT") and description.

Then navigate to the "Outbound" tab and press the "Create" button to edit the outbound connectivity parameters.


Choose Authentication Method as "Basic".

Paste the parameters as obtained in the previous step from the Cloud Integration / Process Integration Runtime service key:

  • URL (without https:// prefix) --> Host Name
  • Client id --> User Name
  • Client Secret --> Password
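The mapping above can be sketched in a few lines - extracting the outbound parameters from the service key JSON and stripping the "https://" prefix from the URL. The service key values below are made-up placeholders, not real credentials.

```python
import json

# Hypothetical Process Integration Runtime service key (placeholder values).
service_key = json.loads("""
{
  "clientid": "sb-xyz!b123|it!b456",
  "clientsecret": "s3cr3t",
  "url": "https://my-tenant.it-cpi001.cfapps.eu10.hana.ondemand.com"
}
""")

# Map the service-key fields onto the SSCV2 outbound parameters;
# the host name is the URL without the "https://" prefix.
outbound = {
    "hostName": service_key["url"].removeprefix("https://"),
    "userName": service_key["clientid"],
    "password": service_key["clientsecret"],
}
print(outbound["hostName"])
```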

Finally, press "Save and Activate".


Create Communication Configuration

In the Settings app of SSCV2, navigate to the "Communication Configuration" view. Then find the scenario with the name "Send Events to External Systems".

This is a template communication configuration, which has to be copied for activation.

Open this template communication configuration and press the "Copy" button.

Change the name of the scenario to something meaningful, like "Send Events to DataSphere", and assign the previously created communication system.

Set the relative URL path of the iFlow as created previously.

Here: "/http/scv2/datasphere/standard-event"


 

End-to-End Integration Test

The initial basic configuration steps are now complete. At this point, a straightforward Case integration can be tested.

In your SSCV2 tenant, navigate to the Cases app and create a simple new case.


Save the Case record. 

Verify in SSCV2 that an event has been triggered for case creation, by checking in the Settings app "Standard Events Monitoring":

(Screenshot: event-case-created.png)

The event ID is equal to the case ID. The event type is "sap.crm.caseservice.caseCreate". Subsequent updates of the same case will raise an event of type "sap.crm.caseservice.caseUpdate".
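The iFlow's internal routing follows directly from these event types: the type string selects which message mapping is applied. A minimal sketch of that dispatch (the mapping names are hypothetical, only the event type strings come from the Standard Events monitor):

```python
# Event-type-to-mapping dispatch, as the iFlow's router performs it.
# Mapping names are illustrative; event type strings match the monitor.
MAPPINGS = {
    "sap.crm.caseservice.caseCreate": "Case_to_CASES_upsert",
    "sap.crm.caseservice.caseUpdate": "Case_to_CASES_upsert",
}

def route(event_type: str) -> str:
    """Return the message mapping for a given SSCV2 event type."""
    try:
        return MAPPINGS[event_type]
    except KeyError:
        raise ValueError(f"No mapping for event type {event_type}") from None

print(route("sap.crm.caseservice.caseCreate"))
```

Because the mapping emits an upsert, both the "created" and "updated" event types can share one target mapping; new entities added later only extend the dispatch table.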

You may also verify that the message sent to the Cloud Integration iFlow has been processed successfully and, if tracing has been activated, that the right mapping was selected based on the event type and the message was successfully delivered to Datasphere:

(Screenshot: iFlow-executed.png)

Finally, verify that a new record has been created in the Datasphere / SAP HANA Cloud DB table for cases:

(Screenshot: db-case-created.png)

This concludes part 1 of the tutorial. In part 2, we will explore how to manage different types of relations between the data objects: 

Event-Driven Data Integration from SAP Sales and Service Cloud V2 to SAP Datasphere (Part 2 of 2)

This will also include a basic description on how the SAP HANA Cloud DB data can be exposed in SAP Datasphere Data Builder, and then joined with other data sources to create views for consumption in analytical applications.

 
