zili_zhou, Product and Topic Expert

Motivation:


[Updated on 27.11.2023 with common mistakes and tips from customer projects]

Because planning data in SAC is frequently changed by business users, these changes need to be replicated to target systems such as BW/BPC, S/4HANA, or Data Warehouse Cloud. This is a feature many customers have asked for.

The Data Export Service API has been generally available for full data since Q2 2022. With SAC QRC4 2022, the Data Export Service delta feature will also become generally available, without any delta toggle. Currently, if you are on a fast-track tenant, you can request that the feature be toggled on in your system. For further information, please have a look at the help documentation for the delta feature of this API.

This blog focuses on how to use BW/4HANA (as target) integrated features (Smart Data Integration) to pull delta data from SAC (source) using the Data Export Service. Please be aware that this is in contrast to a live connection or an import connection, where SAC is the target and BW is the source.

Content


Architecture and Prerequisites

Steps

Step 1: Install the DP Agent and connect BW/4 to SAC.
Purpose: prepare the connections.

Step 2: Use the SAC planning model as a BW source.

- Verification on the HANA side

- Create the BW source system

- Create the DataSource

- Understand the logic of changing data

- Understand the logic of deleting data

Purpose: understand the delta logic from SAC to the replicated table at the BW source.

Step 3: Set up the BW data flow.

- Create an ADSO Z1DETADSO with change log

- Create a DTP and a simple mapping from the DataSource to the ADSO

- How changes on the SAC side are reflected in the ADSO inbound table, change log, and active data

Purpose: understand the delta logic from the replicated table to the BW ADSO.


FAQ

Further Links

Call to action

Architecture and Prerequisites:


Below is the architecture we are going to use. Compared to using ABAP and the API to write to a specific table, the advantages here are: 1) it makes the best use of HANA artifacts and memory for real-time replication with SDI technology; 2) no programming is required.

 

Architecture


A DP Agent of version 2.5.3 or higher is necessary. It is recommended to use the latest DP Agent.

The target system can be a BW/4HANA, DWC, or HANA on-premise system (as of the time of publication of this article). Please check the latest HANA SDI PAM to see whether more target systems are supported.

 

SDI CloudDataIntegrationAdapter PAM



Step 1: Install DP Agent and connect BW/4 to SAC


There is already a blog introducing this. Please refer to it for the details of how to install the DP Agent and connect it to the HANA system under BW/4: Leverage the SAP Analytics Cloud Data Export Service to extract your planning data to SAP HANA, SAP ...


Important: If you do not change the DP Agent default settings, the agent is only allowed to use a maximum of 4 GB of memory, even if it is installed on a machine with much more memory. We have seen this cause performance issues in many customer cases. In most cases, a small or medium DP Agent sizing will meet your requirements for this SAC --> BW integration. It is therefore recommended to set the Xmx parameter in the dpagent.ini to at least 8192m or higher, and to increase Xms to the same or a similar value.

More details can be found in SAP Note 2688382 - SAP HANA Smart Data Integration Memory Sizing Guideline.
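For illustration, the relevant entries in the dpagent.ini file (in the DP Agent installation directory) would look roughly like this for a small sizing; take the exact values from the sizing overview below:

-Xms8192m
-Xmx8192m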

SMALL

Use case: a small scenario with:

- One source system

- Up to 40 tables

- A weighted table size category of S-M

- Federation (initial load) of tables balanced based on HANA target capacity

- Modification rate less than 1,500,000 records/hour

- The example in this blog fits here

DP Agent server:

- Hardware: 8-16 CPU cores, 16-32 GB of main memory, 2-3x disk space based on main memory

- dpagent.ini updates: increase Xmx to 8192m or higher; increase Xms to the same or a similar value

- Ensure 6-8 GB of free RAM for the OS and JVM variations, above and beyond the Xmx setting

SAP HANA target system (for replication only):

- Single remote source

- ~1 additional CPU core

- < 1 GB memory (not including memory growth over time as data volume increases)

MEDIUM

Use case: a midrange scenario with:

- Approximately 1-3 different source systems

- And/or up to 100 tables in total

- A weighted table size category of M-L

- Federation (initial load) of tables done sequentially across sources and balanced based on HANA target capacity

- Modification rate less than 5,000,000 records/hour

DP Agent server:

- Hardware: 16-32 CPU cores, 32-64 GB of main memory, 2-3x disk space based on main memory

- dpagent.ini updates: increase Xmx to 16384m or higher; increase Xms to the same or a similar value

- Ensure 8-12 GB of free RAM for the OS and JVM variations, above and beyond the Xmx setting

SAP HANA target system (for replication only):

- Separate remote source(s) for high-volume modification rate tables

- ~2-4 additional CPU cores

- 1-2 GB memory (not including memory growth over time as data volume increases)

LARGE

Use case: an upper mid-range scenario with:

- Up to 6 different source systems

- And/or up to 300 tables in total

- A weighted table size category of M-XL

- Federation (initial load) of tables done sequentially across sources and balanced based on HANA target capacity

- Modification rate less than 10,000,000 records/hour

DP Agent server:

- Hardware: 32-64 CPU cores, 64-96 GB of main memory, 2-3x disk space based on main memory

- dpagent.ini updates: increase Xmx to 32768m or up to 65536m; increase Xms to the same or a similar value

- Ensure 12-24 GB of free RAM for the OS and JVM variations, above and beyond the Xmx setting

SAP HANA target system (for replication only):

- Separate remote source(s) for high-volume modification rate tables

- ~4-8 additional CPU cores

- 2-4 GB memory (not including memory growth over time as data volume increases)

 

Step 2: Using SAC Planning model as BW source

 

Verification at HANA side


Before doing anything in BW, you can verify this on the HANA side under Provisioning --> Remote Sources; it should look like below. The planning models appear under the SAC folder. If the folder is empty but you do not get an error, in many cases the result was blocked by a firewall in your local network.
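As an alternative to the Studio UI, you can run a quick check with SQL (a sketch using the standard HANA system view REMOTE_SOURCES):

-- list the remote sources created through the Data Export Service adapter
SELECT REMOTE_SOURCE_NAME, ADAPTER_NAME FROM REMOTE_SOURCES WHERE ADAPTER_NAME = 'CloudDataIntegrationAdapter';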

 

Verify the connection in HANA Studio



Create BW source system in BW Modeling Tool (BWMT)


Now we switch to the BW Modeling Tool to create a source system of type Smart Data Access. We call it SACDES here. Attention: all SDA and SDI connection types are grouped under Smart Data Access. We are in fact using SDI here; it is just still called SDA in the UI.

 

BW Source System Type: Smart Data Access


 

Create Data Source in BWMT


Now we create a DataSource under SACDES. All planning transaction data is delta-enabled; master data does not support delta. Thus we search for “FactData” here, which is the transaction data. Attention: it is going to take a while to retrieve all the metadata from SAC, and you need to check your SAC modeler URL to find the models.

My model is https://<mySACURL>sap/fpa/ui/app.html#/modeler&/m/model/C9ZOQZN2GI2L4HV6S4MK8ULFK

 

Choose the SAC model from BW


 

Here we created the data source called “Z1DELTAS”

 

Create a new datasource


Go to the Extraction tab and make the changes below for the delta-enabled transaction data. In BW terms, the real-time replication type UPSERT is like an after-image, while INSERT is similar to a before/after image. We will use UPSERT here and see later how deltas are reflected.

Here is more information about the difference between UPSERT and INSERT.

 

Delta enabled extractions


 

Activate your DataSource first, then click the “Manage“ button next to “Remote Subscription Type UPSERT”. You should see the status Initial here. With this step, BW generates a virtual table, a replication table, and a subscription, which we will examine in detail later.

 

Manage HANA Remote Subscription


 

Tip: if you choose "without data transfer" above, the initial load will normally still happen (if it is the first initial load with the same filters for this HANA DataSource). For huge data volumes, this can save initial loading time. "With data transfer" will transfer the data twice in most cases, but it does not lead to wrong data, as the second transfer does not actually insert into the target table (/BIC/CMT...).

 

Now you execute the delta initialization in the background.

 

Execute the delta initialization


After the jobs are done successfully, you can check how many records are replicated into the replication table.

Here is how to find the virtual table, replication table, and subscription. Click Overview and you will see all subscriptions generated by BW in the system. For a delta-enabled DataSource, BW generates a virtual table (/BIC/CMV<Datasourcename>0000X000001), a replication table (/BIC/CMT<Datasourcename>0000X000001), and a subscription (/BIC/CMU<Datasourcename>0000X000001).
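You can also list these generated subscriptions with SQL (a sketch using the HANA monitoring view M_REMOTE_SUBSCRIPTIONS):

-- BW-generated subscriptions follow the /BIC/CMU naming pattern
SELECT SUBSCRIPTION_NAME, STATE FROM M_REMOTE_SUBSCRIPTIONS WHERE SUBSCRIPTION_NAME LIKE '/BIC/CMU%';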

 

Overview of HANA Remote Subscriptions created by BW


In our case the generated replication table is /BIC/CMTZ1DELTA00001000001, and our BW/4 runs in the SAPHANADB schema, so you can run SQL directly in HANA. We have replicated 508 records.

 

Check number of records at replicated table
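The check in the screenshot corresponds to a simple count query of this form:

-- count the records replicated from SAC so far
select count(*) from "SAPHANADB"."/BIC/CMTZ1DELTA00001000001"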



Understand the logic of changing data


We will change a record (GLAccount = “Cost of Goods Sold”, Products = “City”, Date = “Jan (2022)”) from 9.000k to 6.500k.

First we check it in the replicated table; it looks like below. SignedData is 9.000k.

 

How the record looks like at replication table


 

Now we modify the record in an SAC story: change this SignedData from 9.000k to 6.500k and click Publish. Important: only the public version can be pulled via the API.

 

Publish changed data in SAC


The virtual table points to SAC, so the change is visible there immediately. Verify with the query below:

 

-- virtual table (/BIC/CMV...): reads live from the SAC model
select * from "SAPHANADB"."/BIC/CMVZ1DELTA00001000001" where "GLAccount" = 'Cost of Goods Sold' and "Products" = 'City'

 

 

 

Check the virtual Table pointing to SAC planning model


 

Depending on the volume, after a few seconds or maybe several minutes you will see that the replicated table has also changed. SDI_CHANGE_TYPE "A" (as in autocorrect) marks inserted or updated records.

 

-- replication table (/BIC/CMT...): updated asynchronously by SDI
select * from "SAPHANADB"."/BIC/CMTZ1DELTA00001000001" where "GLAccount" = 'Cost of Goods Sold' and "Products" = 'City'

 

 

 

SAC changed data is automatically replicated to BW



Understand the logic of deleting data


Now I delete this record (GLAccount = “Cost of Goods Sold”, Products = “City”, Date = “Jan (2022)”) from SAC and publish.

 

Publish data after deletion


Here is the result from the virtual table: nothing can be found now.

 

Record is deleted at SAC


 

In the replicated table you will see that the record's SDI_CHANGE_TYPE now has the value "D" for deleted.

 

SDI marks the record as D and cleans up the transaction data
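If you want to list all records flagged as deleted, a query of this form on the same replication table works:

-- deleted records remain in the replication table, flagged with change type 'D'
select * from "SAPHANADB"."/BIC/CMTZ1DELTA00001000001" where "SDI_CHANGE_TYPE" = 'D'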


Similarly, you can test a remote subscription of type INSERT; it keeps the complete delta history in the replication target table.

Maybe you are already wondering how to clean up the replicated table after some time. The good news is that BW has also implemented housekeeping for the delta tables, as shown below. Here is the LINK.

 

House keeping of replicated table



Step 3: Setup BW data flow


 

Create an ADSO Z1DETADSO with change log.


 

Create a DTP and simple mapping from data source to the ADSO

 

 

Create DTP in BW


 

How changes on the SAC side are reflected in the ADSO inbound table, change log, and active data


Here are the three tables generated behind this ADSO.

 

Inbound Table, change log and Active Data of ADSO


We are looking at the inbound table /BIC/A<ADSO>1, as below.

 

Check the Inbound Table


Here is the active data table after the request is activated. The one deleted record is no longer active, and only 507 records remain.

 

-- active data table of the ADSO after request activation
select * from "SAPHANADB"."/BIC/AZ1DETADSO2"

 

 

 

Check active table after deletion in SAC


After this, I changed one record in SAC; in the ADSO requests you will see that only 1 record is loaded and 1 is activated.

 

Loaded request and activated request


Here is the change log for this ADSO.

 

Changelog in ADSO



FAQ

  • Does the delta API support the classic account model?

A: Yes, it supports both the SAC classic account model and the new model. It also supports both analytic models and planning models. All transaction data has delta; master data does not support delta.

 

 

  • Do I need to filter out private versions while using the API?

A: No, only published data is exposed by the API.

 

  • How do I troubleshoot subscriptions on the HANA side?

A: Download the SQL statement collection from SAP Note 1969700 - SQL statement collection for SAP HANA and search for the keyword "SmartDataIntegration". There are around 9 SQL statements. For example, "HANA_SmartDataIntegration_Agents" checks the status of the DP Agents connected to your HANA system, and "HANA_SmartDataIntegration_RemoteSubscriptions" checks the remote subscriptions.
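If you only need a quick check without the full statement collection, the SDI monitoring views can also be queried directly (a minimal sketch; the available columns vary by HANA revision):

-- DP Agents known to this HANA instance
SELECT * FROM M_AGENTS;
-- remote subscriptions and their current state
SELECT SUBSCRIPTION_NAME, STATE FROM M_REMOTE_SUBSCRIPTIONS;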

 

  • How do I check the trace on the DP Agent side?

A: SAP Note 3102531 - Where is dpagent framework trace located? - SDI

 

  • How do I check whether my SAC OData API works correctly?

A: You can test your SAC OData API in Postman or via the link below: Try Out | Data Export Service | SAP API Business Hub

 

  • I have a Dev/QA/Production landscape for my on-premise systems; can they share the same DP Agent?

A: No, a DP Agent can only have one target system. Thus, you have to install a DP Agent for each of your BW/4HANA, S/4HANA, or HANA systems.

 

  • Can I create more than one subscription to one SAC planning model?

A: Yes, one consumer (BW, HANA, and so on) can create several subscriptions to the same model.



Further Links


Further blogs and links on the usage of the SAC Data Export Service API:

Call to action:


This method is a seamless combination of the SAC Data Export Service OData API and BW/4, utilizing SDI technology. There are still a few things you can tune for performance in case of huge data volumes. For example, the “pageSize” parameter of the HANA remote source is 10,000 by default, and you can increase it 10 times or even more (see the ALTER REMOTE SOURCE example in the common issues section below). The “Poll Period” parameter (in minutes) is 5 minutes by default; you can reduce it if you want the adapter to poll for changes more frequently.

Feel free to share your test results or any other comments here.

In case there is a need for guidance on how to use this architecture for HANA on-premise or Data Warehouse Cloud, please also let me know.

Common issues customers face:


1. Using HDB Studio to change SDI parameters, after which the remote source no longer works.

We have noticed that when customers use HDB Studio to change parameters like pageSize, other HANA remote source properties are sometimes changed as a side effect.

Solution: use SQL commands directly, or one of the tools below, for the changes:

SAP HANA Web-based Development Workbench, Web IDE, or SAP Business Application Studio

SQL example

 

ALTER REMOTE SOURCE "SACDatasourcename" ADAPTER "CloudDataIntegrationAdapter" AT LOCATION AGENT "SDIAGENTNAME" CONFIGURATION '<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<ConnectionProperties displayName="Configurations" name="configurations">
    <PropertyEntry name="host">SAC Host</PropertyEntry>
    <PropertyEntry name="port"></PropertyEntry>
    <PropertyEntry name="protocol">HTTPS</PropertyEntry>
    <PropertyEntry name="servicePath">/api/v1/dataexport/administration</PropertyEntry>
    <PropertyEntry name="pageSize">100000</PropertyEntry>
    <PropertyGroup name="connection">
        <PropertyEntry name="require_csrf_header">true</PropertyEntry>
        <PropertyEntry name="auth_mech">OAuth2</PropertyEntry>
    </PropertyGroup>
    <PropertyGroup name="oauth2">
        <PropertyEntry name="oauth2_grant_type">client_credentials</PropertyEntry>
        <PropertyEntry name="oauth2_token_request_content_type">url_encoded</PropertyEntry>
        <PropertyEntry name="oauth2_token_endpoint">your token URL</PropertyEntry>
    </PropertyGroup>
</ConnectionProperties>'
WITH CREDENTIAL TYPE 'PASSWORD' USING
'<CredentialEntry name="oauth2_client_credential">
    <user>Client ID</user>
    <password>Client Secret</password>
</CredentialEntry>';

 


2. SDI exceptions prevent the replication of new data.

Please make sure the exceptions are cleared up; otherwise they will prevent all DataSources from replicating data or creating new subscriptions.

The reasons, and how to process the exceptions, can be found below:
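A minimal sketch for inspecting and processing the exceptions directly in SQL (assuming the standard SDI monitoring view; take the exception ID from the query result):

-- list pending remote subscription exceptions with their IDs
SELECT * FROM M_REMOTE_SUBSCRIPTION_EXCEPTIONS;
-- then retry or ignore a specific exception by its ID
PROCESS REMOTE SUBSCRIPTION EXCEPTION <exception_id> RETRY;
-- or: PROCESS REMOTE SUBSCRIPTION EXCEPTION <exception_id> IGNORE;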

 



3. The DP Agent does not have enough memory configured.

Make sure that you have done a proper DP Agent sizing; the memory (parameter -Xmx) defaults to 4 GB. Please check the document below and change it accordingly.

Sizing document: SAP Note 2688382 - SAP HANA Smart Data Integration Memory Sizing

 

 
