on 2022 May 27 1:25 PM
Good afternoon!
We have a real-time cube with a massive amount of data in it (over 500 million records).
During busy periods, users generate about 20-30 million records per day. Every night we have a scheduled cube optimization. Even so, we face performance issues during the business day; for example, Data Manager packages run twice as long as usual, or longer. InfoCube optimization helps: right after it finishes, packages run with acceptable times for 3-4 hours, but then the data volume in the F-table rises so fast that performance degrades again.
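The reported growth rate can be sketched numerically. This is only back-of-the-envelope arithmetic based on the figures above; the evenly spread 10-hour business day is an assumption, and real load is likely peakier.

```python
# Rough arithmetic on uncompressed F-table growth between optimizations.
records_per_day = 25_000_000   # midpoint of the stated 20-30 million records/day
business_hours = 10            # assumed working window (not stated in the post)
rows_per_hour = records_per_day // business_hours

# Performance is reported as acceptable for roughly 3-4 hours after
# compression; by the 4-hour mark the F-table has accumulated about:
f_table_rows_at_degradation = 4 * rows_per_hour
print(f"~{rows_per_hour:,} rows/hour -> ~{f_table_rows_at_degradation:,} "
      f"uncompressed rows before slowdown")
```

Under these assumptions, slowdown sets in once roughly 10 million uncompressed rows have piled up in the F-table, which suggests the nightly compression alone cannot keep pace with intraday load.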
Running an additional InfoCube optimization in the middle of the working day is not a good option, as it kills all ongoing user packages.
Does anyone have helpful advice on actions we should consider to prevent these performance issues?
We are using BPC 10.0, NW 7.4.
Thanks for your responses.
Options:
1. Move historical data to another environment or model
2. Migrate to HANA
Would it be possible to move some of the data to a standard cube and keep only the most recent data in the transactional cube?
We used to have this kind of solution on a BCS consolidation system. Search for
"Delta load based MultiProvider scenario for BCS"
and see if you could apply a similar solution to BPC.