
BIA sizing

Former Member

Dear All,

I want to see whether the program ZZ_BIAMEMCONSUMPTION_BW3X produces results similar to the real memory consumption.

In our development system I ran the program for one cube and got the results below:

Grand Totals

InfoCubes: 1
Total Memory [KB]: 24558
InfoCube Memory [KB]: 22689
Master Data Memory [KB]: 1869

Then I built the BIA index of that InfoCube and checked the index sizes in TREXADMIN (Index Admin tab):

INDEX ID              INDEX TYPE   MEMORY SIZE TOTAL (KB)
b4q_bic:fzmk_cid01    physical     16,061
b4q_bic:dzmk_cid017   physical     103
b4q_bic:dzmk_cid01b   physical     70
b4q_bic:dzmk_cid01a   physical     52
b4q_bic:dzmk_cid011   physical     2
b4q_bic:dzmk_cid019   physical     2
b4q_bic:dzmk_cid01t   physical     2
b4q_bic:dzmk_cid012   physical     1
b4q_bic:dzmk_cid013   physical     1
b4q_bic:dzmk_cid018   physical     1
b4q_bic:dzmk_cid016   physical     0
b4q_bic:dzmk_cid01p   physical     0
b4q_bic:dzmk_cid01u   physical     0
b4q_zmk_cid01         BIA          0
Total                              16,295

According to the SAP note, the estimated memory usage should be doubled. So:

22689 * 2 = 45378 KB planned for the InfoCube, while the actual memory usage is 16,295 KB.
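For reference, here is the comparison spelled out as a minimal Python sketch (the numbers are just the figures quoted above):

estimated_kb = 22689           # InfoCube memory from ZZ_BIAMEMCONSUMPTION_BW3X
planned_kb = estimated_kb * 2  # the SAP note says to double the estimate
actual_kb = 16295              # sum of the index sizes shown in TREXADMIN

print(f"planned: {planned_kb} KB, actual: {actual_kb} KB")
print(f"planned / actual: {planned_kb / actual_kb:.2f}x")

This prints a planned figure of 45378 KB against an actual 16,295 KB, a ratio of about 2.8, which is what prompts my question.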

Do you think I am using a correct method to compare planned and actual usage?

Thanks in advance.

Rgds,

Berna

Edited by: bberna on Jun 12, 2009 1:38 PM

Accepted Solutions (0)

Answers (1)


Former Member

The reason they suggest doubling the memory is that many other temporary structures get created and deleted, so you need space for those as well. Also, the degree of compression will vary from cube to cube depending on the nature of the data.
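To illustrate, here is a toy Python sketch of dictionary encoding (an illustration only, not SAP's actual algorithm) showing why the achievable compression depends on the data itself:

import math

def dict_encoded_kb(values):
    # Toy model: one fixed-width code per row, sized by the number of
    # distinct values; ignores the dictionary itself and any further
    # compression steps.
    bits_per_code = max(1, math.ceil(math.log2(len(set(values)))))
    return len(values) * bits_per_code / 8 / 1024

rows = 1_000_000
low_card = [i % 10 for i in range(rows)]   # only 10 distinct values
high_card = list(range(rows))              # every value distinct

print(f"low cardinality:  {dict_encoded_kb(low_card):.0f} KB")   # ~488 KB
print(f"high cardinality: {dict_encoded_kb(high_card):.0f} KB")  # ~2441 KB

The same number of rows compresses very differently depending on how many distinct values they contain, which is why no single fixed ratio holds for every cube.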

Former Member

Hi Pizzaman,

Can you suggest a way to determine this degree of compression?

Thanks,

Berna

Former Member

Your best bet is to use the sizing recommendations SAP provides; they have the experience of implementing BWA at many customer sites. To make your own estimate, you would have to fully understand the compression algorithm being used and then be intimately familiar with the data values and the distribution patterns of those values for each cube.

Any sizing should include a fairly generous cushion, since your data volumes are almost certainly growing and you will continue to add new InfoCubes to your BW, so there really isn't much point in trying to be that precise, is there?

Former Member
0 Kudos

Thanks Pizzaman,

Cheers,

Berna

Vitaliy-R
Developer Advocate

Hi Berna,

The typical assumption for compression is 8-12 times. But SAP keeps refining its compression algorithms, e.g. optimized handling of sparse data.
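As a back-of-the-envelope sketch in Python (the 8-12x factor is just the rule of thumb above, and the 2 GB fact table is a hypothetical figure):

def bwa_memory_range_kb(uncompressed_kb, low=8, high=12):
    # Range implied by the assumed 8-12x compression; not an SAP guarantee.
    return uncompressed_kb / high, uncompressed_kb / low

lo, hi = bwa_memory_range_kb(2 * 1024 * 1024)  # hypothetical 2 GB fact table
print(f"roughly {lo:,.0f} - {hi:,.0f} KB in BWA")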

A good assessment of the space requirements is the corresponding RSRV test (unfortunately, it covers the fact table only).

Cheers,

-Vitaliy