Technology Blogs by Members


Hello everyone, I am back with another idea that might help you with your day-to-day processes.

Let me start by quickly walking you through what this post is about. Every Business Warehouse consultant gets loading errors in process chains, no matter how stable they are, because the data ultimately comes from user input and is subject to change and error. This may not happen frequently, except for time-dependent characteristics.

What are time-dependent characteristics? In data warehousing terms, they are slowly changing dimensions. Let me explain with an example.

Imagine a normal characteristic, store, with attributes like longitude, latitude, and size, plus another attribute that changes over time: the store manager. Last year the store manager was John Doe; this year the role was assigned to Jane Smith.

Let's quickly jump to the problem. If you run a report ranking the performance of your store managers over the past year, who should appear: Doe or Smith? If store manager is a normal attribute, Jane Smith is the one that appears, which is wrong, as the credit should go to John Doe, who was the store manager back then. This is where time-dependent characteristics come into action, and SAP Business Warehouse makes maintaining a situation like this very easy. However, time-dependent characteristics can cause a lot of errors during loading, due to overlapping time periods: in our example, if Doe and Smith have overlapping Date From / Date To intervals, the load fails immediately, and you cannot handle them as duplicate records, because excluding either one would be wrong.
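To make the "who should appear" question concrete, here is a minimal Python sketch (the data and field names are illustrative, not taken from any real system) of how a time-dependent attribute lookup works: each value carries a validity interval, and the report's key date selects the matching record.

```python
from datetime import date

# Hypothetical master data: each store-manager assignment is valid
# for a [date_from, date_to] interval.
manager_history = [
    {"store": "1000", "manager": "John Doe",
     "date_from": date(2018, 1, 1), "date_to": date(2018, 12, 31)},
    {"store": "1000", "manager": "Jane Smith",
     "date_from": date(2019, 1, 1), "date_to": date(9999, 12, 31)},
]

def manager_on(store, key_date):
    """Return the manager whose validity interval covers key_date."""
    for rec in manager_history:
        if rec["store"] == store and rec["date_from"] <= key_date <= rec["date_to"]:
            return rec["manager"]
    return None

print(manager_on("1000", date(2018, 6, 1)))   # John Doe
print(manager_on("1000", date(2019, 6, 1)))   # Jane Smith
```

With non-overlapping intervals, a report keyed to last year correctly credits John Doe.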

This is an example of the overlapping time period error:

Date From    Date To      Personnel No.   Per. Name    Plant
21/01/2019   31/12/9999   1               John Doe     1000
22/01/2019   31/12/9999   2               Jane Smith   1000

The issue is plain to see: the same plant has two effective store managers in the system, which is wrong. Either one of them left the store and needs the "Date To" changed, or one was accidentally assigned the store manager role and must be removed. Either way, the load fails with the error: "The time interval ['99991231'/'20190122'] & (from/to) for the data records 78 & and 44 & overlaps in characteristic XXXX".
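The check behind that error can be sketched in a few lines of Python (an illustration of the rule, not BW's actual implementation): for the same characteristic value, no two records may have intersecting validity intervals.

```python
from datetime import date
from itertools import combinations

# Records matching the example table above (illustrative data).
records = [
    {"plant": "1000", "name": "John Doe",
     "date_from": date(2019, 1, 21), "date_to": date(9999, 12, 31)},
    {"plant": "1000", "name": "Jane Smith",
     "date_from": date(2019, 1, 22), "date_to": date(9999, 12, 31)},
]

def overlapping(recs):
    """Return name pairs for the same plant whose validity intervals overlap."""
    errors = []
    for a, b in combinations(recs, 2):
        if (a["plant"] == b["plant"]
                and a["date_from"] <= b["date_to"]
                and b["date_from"] <= a["date_to"]):
            errors.append((a["name"], b["name"]))
    return errors

print(overlapping(records))  # [('John Doe', 'Jane Smith')] -> the load would fail
```

Both open-ended intervals (ending 31/12/9999) intersect, so the pair is flagged, which is exactly the situation the load rejects.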

Each time this happens, I have to notify the responsible business area, provide them with the errors that occurred, and then wait for their action, losing time I can't afford to lose.

In this blog I will walk you through building a report that gets sent to the responsible users only when the errors happen.

Step by step guide:

Business Warehouse:

Creating the Error DTP:

Let's start with the data flow. First you need to activate the Error DTP: navigate to the DTP of the load you are running and, under Update, change Error Handling to "Update Valid Records, Reporting Possible (Request Green)". Then press Create Error DTP.


This means that all records without issues are loaded successfully and available for reporting, while the erroneous ones are moved to the error stack, where you can tend to them later on. This is the table our report will need to read from.

Finding the Error Stack table:

You can find the error stack table by navigating to the Error DTP; you will find it under your original DTP, with the naming convention Error DTP: <same name as the original one>. From there, navigate to Extras, then Customizing for Error Stack.

You will see the table name listed here:

Important note: PSA table names may vary from one system to another, so it's not recommended to migrate the DataSource; you will have to recreate it on each system.

Creating a Datasource for the Error Stack Table:

Navigate to transaction RSO2, select the type of the DataSource, give it a name, and click Create.

Assign it to an application component, give it a description, then paste your table name in Extraction from DB View, as shown:

Click Save, then choose your selection fields. Now navigate to RSA13, select your BW system, click Replicate DataSources, then activate the DataSource.

Modelling the target cube:

In this scenario I always choose a virtual InfoCube as the target: the data will always be of small volume, and it saves me the hassle of adding the flow to a process chain.

Sticking with the given example, you will need 4 characteristics and 1 dummy key figure (a cube can't be created without key figures).

The characteristics will be Date From (custom), Date To (custom), Employee, and Plant, plus the dummy key figure.

Create the transformation, map the corresponding fields, then activate and save.

Creating the BEX Query:

Create a query with the 4 characteristics, or any other characteristics you find relevant to helping the business unit understand the error. In most cases you won't need any selections, as the error stack gets cleared with each load, after executing the Error DTP.

Creating the report:

This is the tricky part. To achieve my goal of sending the report only when errors occur, I couldn't use Web Intelligence; I needed some sort of alert on the BI Platform server that gets triggered based on a certain condition. After searching, I found out I had to use Crystal Reports for Enterprise — not a big deal if it serves the purpose. I had used Crystal Reports 4 years ago, and my requirement was very simple, so I wasn't worried. I will go into a bit more detail here, as this is what the post is all about.

Open Crystal Reports for Enterprise. The shortest way to do so is from the BI Platform, by clicking this icon. Note that you should have the version compatible with your BI Platform; mine was 4.1 SP05.

Once you click it, you will be prompted to download a file; click Open. This saves you the hassle of establishing the connection with your BI Platform manually.

It has a much friendlier interface than classic Crystal Reports. On the start page, click New from Data Source.

Note that it doesn't connect directly to SAP tables (I would have avoided the whole data flow effort if it did), and classic Crystal Reports will not work either, as it doesn't create events in the CMC; you will understand more in the next steps.

Select SAP BusinessObjects Business Intelligence platform (this means it will read the connections on the BI Platform), and specify the data source type as SAP BEx Query.

Locate the OLAP BW connection, connect, and type in the query name. Make sure you have allowed external access in the query properties to be able to see it.

Upon selecting the query, the application automatically inserts all the fields into the report's rows and columns. Make whatever adjustments you want, switching between the Structure tab and the Page tab to see how the report will look. This is my structure.

Now to the purpose of using Crystal Reports for Enterprise: navigate to Data > Alerts.

  1. The Alerts tab will pop up; click New.

  2. Give it a name, and make sure Enable is ticked.

  3. Click the Condition Formula bar.

  4. Type in the condition formula; if it evaluates to true, the alert is triggered. In this case I need to check whether the report returns any data, so I used a simple count on the employee key field: if it is greater than 0, the alert is triggered, meaning errors were found in the error stack.

  5. Type in a message to be displayed (we will not use this).
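The condition in step 4 boils down to a single row-count check. The actual formula is written in Crystal's formula language against your query fields, but the logic is just this (a trivial Python sketch for clarity, not the Crystal syntax itself):

```python
# The alert fires exactly when the error-stack query returns any rows;
# error_rows stands in for the records the BEx query delivers.
def alert_triggered(error_rows):
    """True when at least one error record exists (count > 0)."""
    return len(error_rows) > 0

print(alert_triggered([]))                   # False -> no event, nothing sent
print(alert_triggered([{"employee": "1"}]))  # True  -> event fires in the CMC
```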

Publish the report to the BI Platform in the directory you wish.

Now navigate to the CMC; on the home page, under Define, click Events:

Navigate to Crystal Reports Events, and voilà! Our event has been created in the CMC.


Now to the final part. We created this event as a trigger signaling that there are errors, but then what? This is where the decent event-based scheduling feature of the BI Platform comes into action. We will schedule the report twice, to run at the same exact time every day. The first instance is a dummy instance that runs either to the default Enterprise location (the Output File Repository), or, for a cleaner approach, to a File System destination pointing at a directory that gets emptied every day by a batch file or a shell script.
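For the cleaner approach, the daily cleanup the post mentions (a batch file or shell script) amounts to deleting everything in the dummy-instance directory. Here is an equivalent Python sketch; the directory path is an assumption you would replace with your own:

```python
import os

# Assumed destination directory for the dummy instances (hypothetical path).
DUMP_DIR = r"D:\BI\dummy_instances"

def clean_directory(path):
    """Delete every file directly inside path; return how many were removed."""
    removed = 0
    for name in os.listdir(path):
        full = os.path.join(path, name)
        if os.path.isfile(full):
            os.remove(full)
            removed += 1
    return removed

# Scheduled daily (e.g. via Task Manager or cron), this keeps the
# dummy-instance directory from growing:
# clean_directory(DUMP_DIR)
```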

The first scheduled run is completely normal: select the format and the desired instance name, and make it run every day at 10:00 AM, for example.

The second run will also be at 10:00 AM, with the destination Email, addressed to the stakeholders who will be responsible for correcting these errors. Under the Events tab, add the event we created, SM Error Check.


Those are the 2 runs:

The second schedule always waits for the first report's trigger in order to run. If there are no errors, our count formula returns false, the second report is not triggered, and nothing is sent to the users!

This is the end result, you can add tips on how to solve the error in the email body:


You can do a lot with triggers: I use them to schedule other reports, and to retry sending the same report in case it fails.

That's everything. I hope it's clear enough and that it helps you as much as it helped me; a stable, automated environment gives you more time to innovate and learn.

If you like this blog post you can check my other one through this link:

Please feel free to post any questions you have.