on 2019 Nov 24, 1:10 PM
Hi experts,
It's been more than two years since go-live on BPC 10.1 NW, and we have approximately 30k data-load requests on the cube. Last week we ran the 'Lite Optimize' package for the first time and hit runtime issues. Per KBA 1935969, we manually collapsed 98% of the data-load requests in several batches and then executed the 'Lite Optimize' package again. With fewer records left to compress, Lite Optimize finally ran successfully without any issues.
Next month we plan to execute 'Lite Optimize with Zero Elimination' and are unsure whether we would run into similar issues, where the package fails with a runtime error and the cube becomes unavailable for data loads. Can you please guide us on the following?
1. Will executing the 'Lite Optimize with Zero Elimination' package consider the full cube for zero elimination and run into runtime issues? If so, how can we avoid that situation?
2. Similar to collapsing the data-load requests on the cube as a workaround for the Lite Optimize compress step, is there any workaround for zero elimination?
3. Is there any negative impact from setting the system runtime parameter to unlimited (0) in RZ11? Right now we maintain 10 minutes.
Thank you
Sorry, but if you have already run Lite Optimize without zero elimination, zeros will not be eliminated by a later run of Lite Optimize with zero elimination: zeros are eliminated only during compression of uncompressed requests.
To correct this, run the standard report RSCDS_NULLELIM via transaction SE38.
read my answer here: https://archive.sap.com/discussions/thread/3694991
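As a hedged sketch only: RSCDS_NULLELIM can be scheduled from a small wrapper program rather than started manually in SE38. The selection parameter name used below (`p_infocub`) and the placeholder cube name are assumptions, not confirmed names — check the report's actual selection screen in SE38 on your release before using anything like this.

```abap
* Hedged sketch: trigger the standard zero-elimination report
* RSCDS_NULLELIM for a single InfoCube, e.g. as a step in a
* background job chain. The selection parameter name p_infocub
* is hypothetical -- verify it on the report's selection screen.
REPORT z_run_nullelim.

PARAMETERS p_cube TYPE c LENGTH 30 LOWER CASE.  " InfoCube technical name

START-OF-SELECTION.
  SUBMIT rscds_nullelim
    WITH p_infocub = p_cube   " hypothetical parameter name
    AND RETURN.
```

Running the report in a quiet window (no parallel loads or compression) is the safer pattern, since it works on the compressed E fact table directly.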