TiagoRibeiro
Product and Topic Expert

This blog post provides an introduction to the SAC Data Import API and how it can be leveraged to replicate data to Public Dimensions. It also describes a use case implemented recently to fulfill particular requirements without using the standard communication scenario.

Data Import APIs can be leveraged to replicate transactional and master data to SAP Analytics Cloud.

The API includes five different services that can be used via API endpoints. These can be tested using an API client such as Postman or Insomnia.

  • Models service
  • PublicDimensions service
  • CurrencyConversions service
  • Jobs service
  • Import service

For more information about these five services, consult this document in the SAP Help Portal.

Also available in the SAP Business Accelerator Hub.
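As a quick illustration of how these services are consumed, the sketch below (Python, standard library only) builds the PublicDimensions endpoint and issues a plain GET against it. The tenant host and OAuth token are placeholders, not values from this scenario.

```python
import json
import urllib.request

def public_dimensions_url(host: str) -> str:
    """Build the Data Import API endpoint for the PublicDimensions service."""
    return f"https://{host}/api/v1/dataimport/publicDimensions"

def list_public_dimensions(host: str, token: str) -> dict:
    """GET the available Public Dimensions; requires a valid OAuth bearer token."""
    req = urllib.request.Request(
        public_dimensions_url(host),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

The same pattern applies to the other four services; only the path segment after `/dataimport/` changes.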

Use Case (Example): Replicate an Enterprise Project individually to SAP Analytics Cloud after creation or change in S/4HANA Public Cloud.

Purpose of Use Case:

Replicate Enterprise Projects based on a creation (or change) action in S/4HANA Public Cloud.

Keep in mind this is just an example originating from a real requirement. It can be adapted, with the corresponding complexity, to any other object, provided the necessary conditions and systems are in place to replicate that object on an individual basis from S/4HANA Public Cloud to SAC.

Ultimately, the same approach is available for any other middleware or for point-to-point consumption, if the sender system allows it.

A high-level diagram with the components involved is available below:

  • One SAP BTP subaccount where Event Mesh and Integration Suite are subscribed, with the respective instances and applications. For more information on how to set up Event Mesh from SAP Integration Suite, check the following document.
  • S/4HANA Public Cloud
  • SAP Analytics Cloud
  • SAP Cloud Identity Services is a nice-to-have in this example, but for productive scenarios it should be included in the overall architecture.

 

Image Caption: Technical Architecture Design


 

 

After understanding which components are used in this scenario, the next-level diagram explains the data flow between each SAP component to leverage the Data Import API. Remember that it can be adapted to use one of the other services, although this might require adjustments, especially in the interface in SAP Integration Suite.

Pre-Requisites:

  • SAP Event Mesh and SAP Integration Suite integrated
  • S/4HANA Public Cloud integrated with SAP Integration Suite, using, for example, OAuth Client Credentials
  • S/4HANA Public Cloud integrated with SAP Event Mesh. Consult the following document in case more information is needed to perform this step.
  • SAP Analytics Cloud configured with one Public Dimension and with OAuth authentication configured to be consumed by SAP Integration Suite. For more information on how to configure the authentication, consult this document.
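A minimal sketch of the OAuth client-credentials token request referenced in the prerequisites (Python, standard library). The token URL, client ID, and secret are placeholders; the actual values come from the OAuth client configured for the SAC tenant.

```python
import json
import urllib.parse
import urllib.request

def token_request_body(client_id: str, client_secret: str) -> bytes:
    """Encode the client-credentials grant as a form-urlencoded body."""
    return urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
    }).encode()

def fetch_token(token_url: str, client_id: str, client_secret: str) -> str:
    """POST the grant to the token endpoint and return the access token."""
    req = urllib.request.Request(
        token_url,
        data=token_request_body(client_id, client_secret),
        headers={"Content-Type": "application/x-www-form-urlencoded"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["access_token"]
```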

 

Image Caption: Process Flow Architecture Diagram


 

 

As a legend for the above diagram, explaining the data flow process:

A business user executes a create or change action on the respective business object; in this use case, the Enterprise Project.

  1. A create or change event in S/4HANA Public Cloud releases the outbound event message towards the previously configured Event Mesh;
  2. Event Mesh receives the event message, which is subsequently consumed by SAP Integration Suite;
  3. Once the event is received, SAP Integration Suite extracts the Enterprise Project ID and then executes a GET call against the Enterprise Project OData API to retrieve the object data payload to deliver to SAP Analytics Cloud;
  4. The SAP Integration Suite interface needs to transform the payload into the structure the Public Dimension expects in order to replicate it successfully. This has been done via a message mapping element but can also be accomplished with a script;
  5. Finally, delivering the message to the Data Import API first requires authentication, for example via an OAuth mechanism, to post the data into the respective Public Dimension, which is defined in the HTTP endpoint.
    1. At this final step, as described in the documentation, SAC demands a CSRF token, which needs to be retrieved first and sent as a header in the final call. This step is described in this document.
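The CSRF handshake in step 5.1 can be sketched as follows (Python, standard library): a preliminary GET with the header `x-csrf-token: fetch`, then the actual POST echoing the returned token back. The exact endpoint URL and response shape come from the documentation; the helper names here are illustrative.

```python
import json
import urllib.request

def csrf_fetch_headers(token: str) -> dict:
    """Headers for the preliminary GET that asks SAC to issue a CSRF token."""
    return {"Authorization": f"Bearer {token}", "x-csrf-token": "fetch"}

def post_with_csrf(url: str, token: str, payload: dict) -> dict:
    """Fetch a CSRF token first, then POST the payload with it (step 5.1)."""
    # Step 1: GET with x-csrf-token: fetch; SAC returns the token in a header.
    get_req = urllib.request.Request(url, headers=csrf_fetch_headers(token))
    with urllib.request.urlopen(get_req) as resp:
        csrf = resp.headers["x-csrf-token"]
    # Step 2: POST the payload, echoing the CSRF token back as a header.
    post_req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "x-csrf-token": csrf,
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(post_req) as resp:
        return json.load(resp)
```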

 

The following diagram presents a simple example of an interface executing all the steps mentioned above from SAP Integration Suite perspective.

 

Image Caption: The main integration flow with all the steps to accomplish a simple scenario without complex error handling mechanism.


 

Image Caption: Local Integration Process to Post the Payload into SAP Analytics Cloud.


One important caveat for success is to understand the Public Dimension data structure. All the necessary attributes, whether mandatory or optional, need to be included in the message mapping; otherwise, the execution will fail.
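As an illustration of that mapping step, the sketch below transforms a hypothetical Enterprise Project payload into the flat member structure a Public Dimension import expects. The field names on both sides are assumptions for illustration; the real names come from the OData API response and the dimension's attributes.

```python
def map_project_to_member(project: dict) -> dict:
    """Map selected Enterprise Project fields to dimension member attributes.

    Field names here are illustrative; align them with the attributes of the
    Public Dimension (all mandatory ones must be present, or the import fails).
    """
    return {
        "ID": project["Project"],  # the dimension key
        "Description": project.get("ProjectName", ""),
        "ResponsibleCostCenter": project.get("ResponsibleCostCenter", ""),
    }

sample = {"Project": "EP-1000", "ProjectName": "Rollout Wave 1"}
member = map_project_to_member(sample)
```

In the integration flow this logic lives in the message mapping element; the script variant mentioned in step 4 would look much like this function.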

This can be assessed from the SAC UI by going into the Public Dimension and checking its properties.

Image Caption: SAP Analytics Cloud – Modeler – Public Dimension


 

Alternatively, this information can also be retrieved via a GET call (using Postman, for example) against the Public Dimension metadata, using the following URL:

https://{host}/api/v1/dataimport/publicDimensions/{publicDimensionID}/metadata

This procedure is explained in this document. In the sample response, all attributes that are part of the Public Dimension are returned with their respective data types and key identification.
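To make the attribute check programmatic, a small sketch that walks a metadata response and lists attribute names with their types. The response structure used here is simplified for illustration; adjust the keys to the actual sample response in the documentation.

```python
def list_attributes(metadata: dict) -> list:
    """Return (name, type) pairs for every attribute in a metadata response.

    Assumes a simplified shape with a top-level "attributes" list; the real
    shape is shown in the documentation's sample response.
    """
    return [(a["name"], a["type"]) for a in metadata.get("attributes", [])]

sample = {"attributes": [
    {"name": "ID", "type": "string", "isKey": True},
    {"name": "Description", "type": "string", "isKey": False},
]}
```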

Image Caption: Public Dimension Metadata Response Sample


 

In case of failed executions/messages, SAC also provides a mechanism to verify and understand the job failure. As mentioned earlier, the Data Import API includes five services, one of which is the Jobs service.

For example, /jobs/{jobID}/invalidRows can be used to assess what failed during the data import into the respective Public Dimension.
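A short sketch of querying the Jobs service for the rows that failed an import (Python, standard library); the job ID is whatever the import call returned, and host/token are placeholders.

```python
import json
import urllib.request

def invalid_rows_url(host: str, job_id: str) -> str:
    """Build the Jobs service endpoint that lists rejected rows for a job."""
    return f"https://{host}/api/v1/dataimport/jobs/{job_id}/invalidRows"

def get_invalid_rows(host: str, job_id: str, token: str) -> dict:
    """GET the invalid rows of a finished import job for troubleshooting."""
    req = urllib.request.Request(
        invalid_rows_url(host, job_id),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```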

There are additional examples available in the SAP Business Accelerator Hub, such as S/4HANA Pricing Information to SAP Analytics Cloud.

Hopefully this particular scenario can be of service for similar situations and requirements.

Happy testing and continuous learning. Share any interesting use cases you have implemented that could support others, and reach out in case of any issues.

Tags: SAP Analytics Cloud, SAC Data Import API, SAP S/4HANA Cloud Public Edition, SAP Integration Suite