
APO DP - loading from InfoCube to planning area

Former Member

I am using APO DP V5.

I have the following situation:

1. I am extracting sales history data at the end of each day from a connected ECC system into an InfoCube, using delta logic, so in the Cube I just have the new sales history transactions (despatches) for the day

2. I am then loading data from the Cube to my DP planning area via transaction /SAPAPO/TSCUBE

I assume I must have the 'add data' flag set for each key figure in this transaction, to ensure I see the consolidated sales history in the planning area.

Is this the 'best practice' approach for regular loading of sales history? I assume it's better to have a 'delta' approach for the Cube, for improved performance.

Thanks, Bob Austin

Former Member

Hi Bob,

I have two questions

1. You said you are only loading delta data into the cube. Does that mean you are deleting all previous data from the cube, restricting it via a selection, or setting the indicator "Delete entire content of data target"?

2. If you set the "add data" indicator, the system ignores existing orders in the same SNP version for the same product, location and periods. This means that if you release to SNP twenty times, twenty separate orders are created in SNP. Does your scenario really need duplicate orders for a single sale?

I think the best way is to load just the deltas into the cube, and load the historical data to the PA using a variant without the "add data" selection.

You can find more best practices per scenario here.

Hope this helps.

Former Member


Thanks for your reply.

I should have added the following for my design:

1. We are extracting data from ECC on a daily basis

2. We have weekly buckets in APO DP

3. When we load the daily data into the APO DP planning area we will be 'adding' to the data for the 'week to date'

This is why I proposed the 'add data' flag.

Do you agree?



Former Member


Our production system uses a similar approach. Instead of "add data", we use a full load from the last week onwards. First we take the delta (all the orders changed in the last 2 days; we use 2 days to make sure we don't miss any orders due to a batch failure or other issues) and consolidate it in a cube. From the cube, we load the data to DP from the last week through the future horizon (in our case, 13 weeks).
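A minimal sketch of why the 2-day overlap in this approach is safe (illustrative Python only, not SAP code; the order numbers and field names are made up): because the cube is keyed by order number, re-extracting an order that was already loaded simply overwrites it, so the safety window never double-counts.

```python
from datetime import date, timedelta

def extract_delta(source_orders, since):
    """All orders changed on or after `since` (the 2-day window)."""
    return {oid: o for oid, o in source_orders.items() if o["changed_on"] >= since}

def load_to_cube(cube, delta):
    cube.update(delta)  # upsert by order id: overlap overwrites, never adds

today = date(2007, 5, 16)
source = {
    "4711": {"qty": 50, "changed_on": today - timedelta(days=1)},
    "4712": {"qty": 30, "changed_on": today},
}

cube = {}
# Run the extract twice with the 2-day overlap: the result is identical,
# which is why the safety window does not inflate quantities.
for _ in range(2):
    load_to_cube(cube, extract_delta(source, today - timedelta(days=2)))

print(sum(o["qty"] for o in cube.values()))  # 80, not 160
```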



Thanks for rewarding points if this is helpful.

Former Member


That's a very common procedure, and using deltas is the best way to go. In your case, you need not use "add data"; instead, give a selection for the time period under the Period tab.

I will explain why I said this:

Let's take an example: You have 1 million records in ECC system.

You get about 1000 records every day.

Using delta or full load:

A full load is going to load all 1 million + 1,000 records every day, which is very resource intensive. If you use deltas instead, you load just the 1,000 new records and it takes seconds.

While loading using TSCUBE:

If you load data initially, you will have, let's say, 1 million records. Then for deltas, you have just 1,000 on 05/16. Then give a time frame in the "Period" tab as from 05142007 to 05162007.

I'm taking one day as safety to make sure I do not miss any records.

Now, let's say customer 1000 has a qty of 50 initially (loaded as part of the 1 million records). Then he orders 10 more, so the record in the BW cube becomes 60. You load the data, and this record for customer 1000 is now 60 in the PA. But if you check "add data", then essentially what you are telling the system is that it should not overwrite: you will have 50 + 60 = 110 as the qty. Isn't this a problem? Also, the 1-day safety overlap adds up in the same way!

I would just use deltas as you are doing, give a time frame when loading to the PA, and not use the "add data" flag.
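The worked example above can be sketched in a few lines (illustrative Python, not SAP code; the key and quantities are the ones from the example, and the two functions stand in for the two /SAPAPO/TSCUBE modes):

```python
# One key-figure cell: customer 1000 in one weekly bucket.

def load_overwrite(planning_area, cube_value, key):
    planning_area[key] = cube_value  # without "add data": replace the cell

def load_add(planning_area, cube_value, key):
    planning_area[key] = planning_area.get(key, 0) + cube_value  # "add data"

key = ("1000", "2007-W20")

pa = {}
load_overwrite(pa, 50, key)  # initial load
load_overwrite(pa, 60, key)  # delta after the order grew to 60
print(pa[key])               # 60 - the correct consolidated quantity

pa = {}
load_add(pa, 50, key)
load_add(pa, 60, key)        # same delta with "add data" checked
print(pa[key])               # 110 - the double count warned about above
```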

Hope this helps. Please let me know if you have any questions.

Former Member


Thanks for this good explanation!

I have two questions:

1. What does the 'period' really refer to? Is it referring to the date of a particular key figure? Or the 'date of data being added to Cube'?

2. Suppose original despatch qty = 100 (two weeks ago), and 'today' the customer returns a qty of 60; how does system handle this - I would want this posted to the original date of two weeks ago.

Thanks, Bob.

Former Member


Good questions!

1. What does the 'period' really refer to? Is it referring to the date of a particular key figure? Or the 'date of data being added to Cube'?

A: Both are the same.

The date is generally the date in your cube, like the calendar day, month etc. This date is based on a time stamp in the format DDMMYYYYHHMMSS; the calendar day is part of this field as DDMMYYYY. The system also recognizes changes by the same time stamp. So, if a customer changes the qty on 05/15 at 3:30 pm, the time stamp is 15052007153000. The calendar day in your cube (your key figure date) is 15052007, and the delta is recognized by changes in the time stamp between the last load and the current system time. So you are talking about the same time field.

Check whether this is the case in your system; let me know if not.
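A small sketch of the timestamp-based delta recognition described above (assumed behaviour modelled in Python, not actual BW code; the record fields and load times are invented): a record is picked up as delta when its change timestamp falls between the last load and the current one, and the calendar day is just the date part of that stamp.

```python
from datetime import datetime

records = [
    {"customer": "1000", "qty": 60, "changed": datetime(2007, 5, 15, 15, 30)},
    {"customer": "2000", "qty": 20, "changed": datetime(2007, 5, 10, 9, 0)},
]

last_load = datetime(2007, 5, 14, 22, 0)
this_load = datetime(2007, 5, 16, 22, 0)

# Delta = everything changed since the previous load.
delta = [r for r in records if last_load < r["changed"] <= this_load]

for r in delta:
    # The calendar day stored in the cube is the date part of the stamp.
    print(r["customer"], r["changed"].date())  # 1000 2007-05-15
```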

2. Suppose original dispatch qty = 100 (two weeks ago), and 'today' the customer returns a qty of 60; how does system handle this - I would want this posted to the original date of two weeks ago.

A: The data from your ECC system is generally brought to an ODS first. The reason is that we overwrite the data there whenever a record has the same key. If your key for the ODS is customer and division, then you overwrite the customer qty for that division whenever the value changes. If you need it by time, let's say per month, include the time in the key: the system then overwrites the value for that month only, and the next month is a new record.

In your case, if the qty was 100 two weeks back and now it's 60, and the time stamp is not in the key, the system overwrites it to 60, and you have only 60 in your ODS and thereby in your PA. You would have to delete the delta in your ODS to see the 100 again and load that to the PA, which is not a good option. The alternative is to include a time characteristic like calweek in your ODS key and load it over to the cube. That way you have two records.
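The ODS key rule above can be sketched as follows (illustrative Python, not SAP code; the records mirror the dispatch/return example, and the field names are made up): with customer alone as the key, the later record overwrites the original; adding calweek to the key keeps both.

```python
def ods_load(ods, record, key_fields):
    key = tuple(record[f] for f in key_fields)
    ods[key] = record["qty"]  # same key -> overwrite; new key -> new row

dispatch  = {"customer": "1000", "calweek": "2007-18", "qty": 100}
after_ret = {"customer": "1000", "calweek": "2007-20", "qty": 60}

# Key = customer only: the later value overwrites the original 100.
ods = {}
for rec in (dispatch, after_ret):
    ods_load(ods, rec, ("customer",))
print(sorted(ods.values()))  # [60] - history lost

# Key = customer + calweek: two records survive.
ods = {}
for rec in (dispatch, after_ret):
    ods_load(ods, rec, ("customer", "calweek"))
print(sorted(ods.values()))  # [60, 100]
```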

I hope I answered all your questions. Please feel free to ask more; I would love to answer, as that way I can also brush up on things that have gone unused.


Former Member


Thanks for your excellent detailed replies - much appreciated!

On the subject of the 'returns'...

If I understand you correctly, let's say we extract to ODS, and then to InfoCube.

We will have two records in the InfoCube, one for the original qty of 100 for two weeks ago, another for the 'return' qty of 60 for 'today'.

I assume that the 'delta' will just contain the qty of 60, but if we now load to the planning area, I do not understand how this will reduce the qty for two weeks ago from 100 to 40.

Regards, Bob

Former Member


One other question regarding the 'delta' loads from InfoCube to planning area.

In my situation I have 'weeks' in my planning area, not days.

So, in a particular week, if I have "delta" loads each day, for example a qty of 100 on Monday and a qty of 200 on Tuesday, then I think I need the "add data" flag to ensure a total week-to-date qty of 300.
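A quick sketch of this week-to-date arithmetic (illustrative Python only; the week key and the `tscube_load` helper are invented): with weekly buckets and daily deltas, "add data" accumulates the dailies, while an overwrite load would leave only Tuesday's 200.

```python
week_bucket = {}

def tscube_load(bucket, week, qty, add_data):
    if add_data:
        bucket[week] = bucket.get(week, 0) + qty  # accumulate into the week
    else:
        bucket[week] = qty                        # overwrite the week

tscube_load(week_bucket, "2007-W20", 100, add_data=True)  # Monday's delta
tscube_load(week_bucket, "2007-W20", 200, add_data=True)  # Tuesday's delta
print(week_bucket["2007-W20"])  # 300 - the week-to-date total
```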

Do you agree? Or is there a better way?

Regards, Bob.

Former Member

Hi Bob

I would also add to what Visu said by pointing out that it's a strategic decision, a fine balance between "data loading time" and "accuracy of data".

While you do a delta load periodically, say every day, to the InfoCube, you can also do a less frequent (once a month) mass update or full load (without "add data") to the cube to correct the errors that creep in like this. You can divide this load into periodic blocks of 1 month or so and do multiple loads.

The same logic applies to the planning area. (You need to optimize the period so it falls within the time you have allocated to the loading process, and also make sure it will cover as many changes as possible.)

The assumption is that history is not going to change drastically due to these order corrections, so your forecast and disaggregation (usually the affected figures) will not be much affected.

Once in a while you can reload the planning area with full history (or as much as is relevant), which would then change the dependent processes slightly.

A lot of the loading process depends on the time it takes versus the time you have available to run the whole planning process. For example, if you have 8 hours just for loading and loading all the history takes only 5 hours, then why not?

But that's rarely the case. You probably get only 1 hour for this, and hence you need to optimize on the fraction of the total history that you are going to load.

Former Member


Thanks for this.

I am assuming the following approach:

1. I will do a 'delta' extract from ECC each day to my InfoCube

2. My InfoCube will be 'cumulative' (i.e. contain 3 years' worth of sales history)

3. My load to the DP planning area will be 'delta' based.

Does this sound sensible?

Thanks, Bob.

Former Member

Pretty much. A lot will again depend on the planning cycle and on the volume and criticality of the new data.

0. Load initial extract of all data to start with

1. Daily delta load to Cube

2. Daily/weekly load (depending on planning cycle and amount of data) from cube to planning area for the period that is relevant to planning

3. Weekly/monthly load of data into the cube to rectify errors that arise from corrections not captured by the deltas

4. Step 2 scheduled to follow step 3, so that corrections are passed on to the planning process
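The schedule above can be sketched end to end (illustrative Python with made-up weekly figures; `pa` stands for the planning area and the dicts stand in for the cube loads): daily deltas can miss a late correction outside their window, and the periodic full load without "add data" reconciles the planning area back to the source.

```python
source = {"W1": 100, "W2": 200}      # true sales history per week
pa = dict(source)                    # step 0: initial full load

source["W2"] = 250                   # new orders this week
source["W1"] = 40                    # late correction outside the delta window
daily_delta = {"W2": source["W2"]}   # steps 1-2: delta only covers the recent period
pa.update(daily_delta)

print(pa["W1"])  # 100 - drift: the correction was missed by the deltas

pa.update(source)                    # steps 3-4: periodic full refresh
print(pa["W1"])  # 40 - corrections passed on to the planning process
```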

All of this needs to be tested and thought through to make sure it doesn't cause the planning results to fluctuate a lot.

Remember, all this makes sense when juxtaposed with the rest of the planning activities, e.g. a sales history load just before you run a proportional factor job, followed by a forecasting job, etc.

Also, you need to take into consideration the process of CVC generation between loading the cube and loading the planning area (I am not sure how much your CVCs will change based on the new data).