2024 Aug 28 3:23 AM - edited 2024 Aug 28 12:35 PM
Dear Experts,
I am facing some performance issues with Datasphere modeling. Here is my design structure:
Background: We have to create an SAC report based on COPA Actual/Plan data as well as some external data.
Modeling steps (referring to the design above):
1. Preload COPA actual data (20 M records), budget data (1 M records), and external data (10 K records) into local tables and store the table data in memory. Then create views on the local tables and use projections to reduce the number of columns. Filters and aggregations are also applied to reduce the total number of records.
2. Use input parameters to limit the amount of data accessed.
3. Create a UNION of actual and plan data.
4. Create a UNION of the external data and the result of step 3.
5. Aggregate based on the final data structure.
6. Create various calculated columns (simple calculations).
7. Create the fact source view (measures: 110 / attributes: 45) and use it in the Analytic Model. By this point, the number of records has been reduced to 2 M.
SAC then uses this Analytic Model for reporting.
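To make the structure concrete, here is a rough SQL sketch of the view stack, shown as a single statement for readability (in Datasphere it is built as a stack of views). All table, column, and parameter names (COPA_ACTUAL, COPA_PLAN, EXTERNAL_DATA, IP_FISCAL_YEAR, amount_k) are simplified placeholders, not the real objects, and only a handful of the 110 measures / 45 attributes are represented:

-- Rough sketch only: all names below are simplified placeholders.
SELECT company_code, fiscal_year, fiscal_period, category,
       SUM(amount)        AS amount,
       SUM(amount) / 1000 AS amount_k          -- step 6: simple calculated column
FROM (
      -- steps 1-2: projection, filter and input parameter per source
      SELECT company_code, fiscal_year, fiscal_period, 'ACTUAL' AS category, amount
      FROM   COPA_ACTUAL
      WHERE  fiscal_year = :IP_FISCAL_YEAR
      UNION ALL                                -- step 3: actual + plan
      SELECT company_code, fiscal_year, fiscal_period, 'PLAN' AS category, amount
      FROM   COPA_PLAN
      WHERE  fiscal_year = :IP_FISCAL_YEAR
      UNION ALL                                -- step 4: + external data
      SELECT company_code, fiscal_year, fiscal_period, 'EXTERNAL' AS category, amount
      FROM   EXTERNAL_DATA
      WHERE  fiscal_year = :IP_FISCAL_YEAR
     ) AS unioned
GROUP BY company_code, fiscal_year, fiscal_period, category   -- step 5: aggregation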
Now here comes the trouble:
1. When I preview data in the Analytic Model, it takes 30-40 seconds to show the result; the acceptable response time is within 10 seconds.
2. When the model is consumed by SAC, SAC crashes as we create more calculations (YTD, last year YTD, etc.).
Is there any suggestion for performance improvement?
Thanks a lot
Hi,
# SAC crashes - out of memory (OoM)?
# In DSP you can get a PlanViz file, which you can analyze in HANA Studio. That could help, but not for the SAC crash.
# For SAC there is the Performance Analysis Tool. But before you deep-dive there, go to DSP. A 40-second runtime is quite a lot for 20 M entries. First you need to get the runtime in DSP down - you can bypass it by persisting the view and consuming it in SAC via an analytic model.
I would:
1. Lower the complexity. Remove the calculations in step #6 / remove step #6 entirely and check the runtime. I assume the calculations between step #2 and step #3 are necessary for the union in step #3; if not, remove them as well. Check whether that gets the runtime below 10 seconds.
2. Persist this view. In the same task chain where you load the COPA source tables, add a step to persist this view, then re-use it in the analytic model that you consume in SAC.
3. How do you connect DSP to SAC? What do you consume? Try an analytic model - that worked best for me.
Martin
We finally cut the SAC reporting time from 2 minutes to around 10 seconds.
The root cause was the complex logic on the SAC side, which generated a huge number of MDS statements that Datasphere could not consume efficiently.
We overcame this by moving the complex logic upstream into ERP with ABAP and storing the pre-processed data in an add-on table for Datasphere.
With a direct connection (SAC to Datasphere), it is better to avoid putting too much logic into SAC.
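Roughly, the Datasphere-side view now looks like the sketch below (ZCOPA_PREAGG and the column names are placeholders for illustration): the YTD and last-year-YTD figures are already computed by ABAP in ERP and stored in the add-on table, so Datasphere and SAC only read them as plain measures instead of calculating them at query time.

-- Sketch only: ZCOPA_PREAGG and the column names are placeholders.
SELECT company_code,
       fiscal_year,
       fiscal_period,
       category,
       amount,
       amount_ytd,            -- pre-computed in ERP by ABAP
       amount_ytd_prev_year   -- pre-computed in ERP by ABAP
FROM   ZCOPA_PREAGG
WHERE  fiscal_year = :IP_FISCAL_YEAR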
Hello Charlielin,
I would recommend you have a look at this blog: https://community.sap.com/t5/technology-blogs-by-members/performance-analysis-in-the-sap-datasphere/...
After reading this blog, you should be familiar with how to analyze the performance of each step executed by the SAP HANA database underneath SAP Datasphere.
Best Regards
Julian
Hi,
I would suggest referring to SAP Note 2511489 for troubleshooting performance issues in SAC.
Thanks,
Raj