
BPS0 - very long runtime

Former Member

Hi gurus,

During manual planning in BPS0 we experience very long runtimes.

FOX formulas are used.

A lot of data is selected, but this is required by the business.

Memory looks fine in ST02 (typically only 10-15% of resources are used) and there are no dumps, but the runtime is still very long.

I have examined the hardware, system, and database with various methods and found nothing unusual.

Could you please give me more advice on additional checks I can run, preferably from a Basis point of view?

BW 3.1 - patch 22

SEM-BW 3.5 - patch 18

Thanks in advance

Elena

Accepted Solutions (1)


former_member93896
Active Contributor

Hello Elena,

You need to take a structured approach. "Examining" things is fine, but it usually does not lead to results quickly.

Performance tuning works best as follows:

1) Check statistics or run a trace

2) Find the slowest part

3) Make this part run faster (or better yet, eliminate it)

4) Back to #1 until it is fast enough

For the first round, use the BPS statistics. They will tell you whether BW data selection or the BPS functions are the slowest part.

If BW is the problem, use aggregates and apply the usual techniques to speed up BW (see course BW360).

If BPS is the problem, check the webinar I did earlier this year: https://www.sdn.sap.com/irj/sdn/webinar?rid=/webcontent/uuid/2ad07de4-0601-0010-a58c-96b6685298f9 [original link is broken]

The BPS performance guide is also a must-read: https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/7c85d590-0201-0010-20b5-f9d...

Next would be an SQL trace and an ABAP performance trace (ST05, SE30). Check the traces for any custom coding or custom tables at the top of the runtime measurements.

Finally, you can often see from the program names in the ABAP runtime trace which BPS components are the slowest. See if you can match these to the configuration used in BPS (variables, characteristic relationships, data slices, etc.).

Regards

Marc

SAP NetWeaver RIG

Answers (3)


Former Member

Hello,

I am running a planning sequence which copies data between two cubes. I configured the sequence so that each package always contains the same number of data records, but the job runs slower and slower the longer it runs. That is, the time to finish each package keeps increasing, even though the amount of data stays the same.

Does anybody have an explanation for this behaviour?

Thanks,

Regards, Lars.

Former Member

Hello,

Thank you very much, these are very helpful answers.

Best regards, Elena

Former Member

Elena,

I can tell you that Marc's webinar, which was first presented at the ASUG forum in Dallas last year, was excellent.

Marc gave about 12 tips. I knew roughly half of them from working with BPS for a number of years, but the other half were new to me. I thought it over, reorganized some planning levels, and rewrote some of my FOX formulas that had been causing performance issues in testing/QA. As a result, I was able to reduce a series of FOX processing steps that runs once a month from about 8 hours in the QA box to 30 minutes.

Hope this helps

Mary

Former Member

Hello Elena,

If you think the volume of data cannot be restricted, then here are a few options to reduce the runtime:

1. See if you can modify your FOX formula to reduce IF statements, or otherwise make the code leaner and more efficient.

2. If you have already done the above, try creating more packages for the planning function, which reduces the amount of data processed each time the function is executed.
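
To illustrate option 1, here is a minimal hedged FOX sketch of the kind of rewrite meant. The key figure (0AMOUNT), characteristic (0VERSION), and version values are placeholders, not from this thread; whether the restriction can be moved out of the code depends on your planning level and package design.

```
* Less efficient pattern: loop over every record in the package
* and filter per record with IF statements.
DATA V_VERS TYPE 0VERSION.
FOREACH V_VERS.
  IF V_VERS = 'V01' OR V_VERS = 'V02'.
    {0AMOUNT, V_VERS} = {0AMOUNT, V_VERS} * 1.1.
  ENDIF.
ENDFOR.

* Leaner pattern: restrict 0VERSION to V01/V02 in the planning
* package (or level) instead, so the IF disappears and the loop
* only touches the records that are actually changed.
FOREACH V_VERS.
  {0AMOUNT, V_VERS} = {0AMOUNT, V_VERS} * 1.1.
ENDFOR.
```

The second variant is faster mainly because BPS selects fewer records into the function in the first place, not because the FOX code itself is shorter.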

Keep us posted if the above two options do not work out.

I might have some other options.

Sunil