
Performance issue due to large data flow in BPC 10.0


Good afternoon!

We have a real-time cube with massive data in it (over 500 million records).

During the busy period, users generate about 20-30 million records per day. Every night we have a cube optimization scheduled, but we still face performance issues during the business day: for example, data manager packages run two or more times longer than usual. InfoCube optimization helps, and right after it completes, packages run in acceptable time for 3-4 hours; then the data volume in the F-table rises so fast that performance degrades again.
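To put rough numbers on how short that window is, here is a back-of-envelope sketch. The 20-30 million records/day figure is from the post; the 10-hour loading window and the midpoint daily volume are illustrative assumptions, not stated by the author:

```python
# Rough estimate of uncompressed F-table growth between nightly
# compressions, using figures from the post plus stated assumptions.
records_per_day = 25_000_000   # midpoint of the reported 20-30M/day
business_hours = 10            # ASSUMED length of the loading window

hourly_rate = records_per_day // business_hours  # 2,500,000 records/hour

# The post reports acceptable runtimes for only 3-4 hours after an
# optimization; by hour 4 the F-table would already hold roughly:
records_at_degradation = hourly_rate * 4  # ~10,000,000 uncompressed records

print(hourly_rate)
print(records_at_degradation)
```

Under these assumptions the F-table re-accumulates tens of millions of uncompressed records well before the next nightly compression, which is consistent with the degradation the author observes after 3-4 hours.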

Running an additional InfoCube optimization in the middle of the working day is not the best option, as it kills all ongoing user packages.

Maybe someone has helpful advice on actions we should consider to prevent such performance issues.

We are using BPC 10.0 on NW 7.4.

Thanks for your responses.

Accepted Solutions (0)

Answers (2)

former_member186338
Active Contributor

Options:

1. Move historical data to another environment or model

2. Migrate to HANA

peter_warren_uk
Explorer

Would it be possible to move some of the data to a standard cube and keep only the most recent data in the transactional cube?

We used to have this kind of solution on a BCS consolidation system. Search for:

Delta load based MultiProvider scenario for BCS

See if you could consider a similar solution for BPC.

former_member186338
Active Contributor

Sorry, but that's useless for BPC.