Supply Chain Management Blogs by SAP
Hi Everyone,

We are establishing a new blog series around SAP Integrated Business Planning (IBP) and integration via SAP Cloud Platform Integration – Data Services (CPI-DS). You can think of it as a "tip of the month" series in which we publish posts on how to best integrate data to and from SAP IBP. This is the tip of the month for December; in case you missed the previous one, you can find the link here.

Sometimes Data Integration tasks can get stuck on the SAP IBP side. You can see this, for example, in the Data Integration app (see the screenshot below).

There can be fatal errors that result in no data being processed at all, or other problems where only part of the data batch is processed successfully. There are multiple possible reasons for this. Here are four tips on how to troubleshoot them:

1.) What is the status of the Task?


First, check the IBP Data Integration job and compare it to the CPI-DS task history. There can be situations where the post-processing status is captured incorrectly on the CPI-DS side; these are caused by temporary network interruptions.
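If you do this cross-check regularly, it can help to capture the decision logic in a tiny helper script. Below is a minimal Python sketch; the status values are assumed examples for illustration, not the exact strings shown in either app.

    # Minimal sketch: compare the status shown in the IBP Data Integration app
    # with the status shown in the CPI-DS task history. The status values used
    # here are illustrative assumptions, not official status codes.
    def cross_check(ibp_status: str, cpids_status: str) -> None:
        if ibp_status.upper() == cpids_status.upper():
            print(f"Statuses match: {ibp_status}")
        else:
            # e.g. IBP shows 'Finished' while CPI-DS still shows 'Failed':
            # the post-processing status may have been captured incorrectly
            # on the CPI-DS side after a temporary network interruption.
            print(f"Status mismatch: IBP={ibp_status}, CPI-DS={cpids_status}")

    cross_check("Finished", "Failed")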

2.) Do you have the right global variables for the Task?


Problem: Post-processing finished with zero records processed


Typical cause: Maybe you want to load data into a different planning area or planning version and have copied an existing CPI-DS task. You exchanged the target data store name but forgot to adjust the global variables for the version and planning area.

How to solve?



  • check the Shared Global Variables of this task:
    $G_PLAN_AREA = '<Planning Area Name>' /*target planning area*/
    $G_SCENARIO = '' /*empty string = load into base version*/
    $G_TIME_PROFILE = '-1' /*time profile ID*/
    $G_BATCH_COMMAND = 'INSERT_UPDATE' /*insert new and update existing records*/
    $G_LOAD_DATE = SYSUTCDATE() /*current UTC timestamp as load date*/

    SAP Cloud Platform Integration Guide - Shared Global Variables at: https://help.sap.com/viewer/eab8fd1726934516a89eabced318b210/1911/en-US/6ab81455e0803e6ae10000000a44...

  • check whether you are loading into the right staging table and whether this matches the Shared Global Variables


  • check whether your data file provides the corresponding root attributes for the key figure data

  • Are there differences between the fields you upload and the target fields? See the sketch below for a quick comparison.
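The last check can be scripted. The following minimal Python sketch compares the header of an upload file against the fields expected by the target staging table; the file name and the target field list are assumptions you would replace with your own values.

    # Minimal sketch: compare the columns of an upload file with the fields
    # expected by the target staging table. The file name and target field
    # list are assumptions -- replace them with your own values.
    import csv

    UPLOAD_FILE = "keyfigure_upload.csv"             # hypothetical upload file
    TARGET_FIELDS = {"PRDID", "LOCID", "CUSTID",     # example staging table fields
                     "PERIODID", "CONSENSUSDEMAND"}

    with open(UPLOAD_FILE, newline="") as f:
        upload_fields = set(next(csv.reader(f)))     # header row of the upload file

    print("Expected by target but missing in upload:", sorted(TARGET_FIELDS - upload_fields))
    print("In upload but not in target:", sorted(upload_fields - TARGET_FIELDS))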


3.) What is the data volume on the task?


Error Message: “There was an unhandled exception in the data integration execution.”


Typical cause: During post-processing, data is moved from the staging tables into the IBP data model. Within that task it can happen that temporarily more than 2 billion records are kept on the SAP HANA side, which results in the above error message. How to solve that?
For certain, this needs further analysis by SAP Support. Please follow the steps below to create a support message. However, there are some workarounds you can try first:

  1. Reduce the number of records in your data integration task
    e.g. if you are trying to upload 25 million records, split the data volume into two or more uploads. You can filter the data by time or by other attributes such as locations or product families; see the splitting sketch after this list.

  2. Deactivate change history
    If the post-processing touches change-history-enabled key figures, the change history records are updated as well. This can have a massive performance impact and can also cause temporary DB table overloads. You can deactivate this behavior in the app Settings for Change History.

  3. Purge data
    To solve this issue, please try to execute the purge jobs below in your system, as per KBA 2728485.

    • Purge Change History Job: This deletes the change history records from the system for the respective planning area. Change history records keep increasing if the planning area is change-history enabled, because with every job run the system updates the time series with a new entry while keeping the previous records (history records) as well. So a periodic clean-up is required.

    • Purge Key Figure Job: To remove the null entries, schedule this job and select the "Null" option so that records where no value exists for any key figure across a period for a certain combination are deleted.

    • Purge Key Figure Data Outside Planning Area Planning Horizon Job: This job permanently deletes key figure data that is outside the planning area's planning horizon. This key figure data is not used by any planning functions.

    • Purge Non-Conforming Data Job



  4. Modelling & planning area configuration
    Attribute as key figure: restrict the usage of attributes as key figures and/or the number of time buckets into which the attribute value needs to be updated by post-processing.
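As mentioned in workaround 1, splitting a large upload into smaller portions is often the quickest relief. Below is a minimal Python sketch that splits an extract file into one file per period before loading them in separate runs; the file name and the split column are assumptions, so adjust them to your own extract. For very large files you would stream the split instead of holding all rows in memory, but the idea stays the same.

    # Minimal sketch: split a large upload file into smaller files, one per
    # period, so that each data integration run processes fewer records.
    # File name and split column are assumptions -- adjust to your own extract.
    import csv
    from collections import defaultdict

    SOURCE_FILE = "keyfigure_upload.csv"   # hypothetical large extract
    SPLIT_COLUMN = "PERIODID"              # could also be a location or product family attribute

    rows_by_key = defaultdict(list)
    with open(SOURCE_FILE, newline="") as f:
        reader = csv.DictReader(f)
        fieldnames = reader.fieldnames
        for row in reader:
            rows_by_key[row[SPLIT_COLUMN]].append(row)

    for key, rows in rows_by_key.items():
        out_name = f"keyfigure_upload_{key}.csv"
        with open(out_name, "w", newline="") as out:
            writer = csv.DictWriter(out, fieldnames=fieldnames)
            writer.writeheader()
            writer.writerows(rows)
        print(f"{out_name}: {len(rows)} records")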


4.) How many records are failing?


If some records were not uploaded successfully, please check the rejected records log: go to the IBP Data Integration Fiori app and download the rejection log report corresponding to the CPI-DS job. There you can find the reason for each rejected record.

In KBA 2436131 you can find the common errors and a detailed explanation.
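If the rejection log is large, it can help to summarize it before looking at single records. The following minimal Python sketch counts how often each rejection reason occurs; the file name and column name are assumptions, so adjust them to the layout of the log you actually downloaded.

    # Minimal sketch: summarize a downloaded rejection log by rejection reason.
    # File name and column name are assumptions -- check the layout of the log
    # you downloaded from the IBP Data Integration app and adjust as needed.
    import csv
    from collections import Counter

    REJECTION_LOG = "rejection_log.csv"    # hypothetical file name
    REASON_COLUMN = "RejectionReason"      # hypothetical column name

    reason_counts = Counter()
    with open(REJECTION_LOG, newline="") as f:
        for row in csv.DictReader(f):
            reason_counts[row.get(REASON_COLUMN, "UNKNOWN")] += 1

    for reason, count in reason_counts.most_common():
        print(f"{count:>8}  {reason}")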

If you could not figure out what the issue is, please open an incident on component

SCM-IBP-INT-TS-POP

Please mention the following information for fast ticket processing:

Time of occurrence, CPI-DS task name, IBP task name, CPI-DS org name, planning area, and type of data load (key figure or master data)

 

I’m interested in your feedback; please let me know.

Kind regards,
Michael

Further Links: SAP IBP Integration Guide at https://help.sap.com/viewer/eab8fd1726934516a89eabced318b210/1911/en-US/178b12559c5d3d6ae10000000a44...