
Content Refresh performance in SAP Disclosure Management?

Former Member

How does the 'Content Refresh' in SAP Disclosure Management perform, and how is it processed?

Let's say I have 2 datasets that I put in the datacache and then update in a report using a content refresh. There are two approaches to doing this.

Approach 1) Having separate Excels in the datacache:

Report (datacache)           Report (to update)   Content refresh
Datacache_chapter1 (Excel)   Chapter 1 (Excel)    Content refresh done at the report level (updating both chapters)
Datacache_chapter2 (Excel)   Chapter 2 (Excel)

Approach 2) Having one common Excel in the datacache:

Report (datacache)           Report (to update)   Content refresh
Datacache_Common (Excel)     Chapter 1 (Excel)    Content refresh done at the report level (updating both chapters)
                             Chapter 2 (Excel)

Questions:

  • Which of the two approaches has the better performance, and why?
  • What impacts the performance of a content refresh the most?
  • Recommendations?

My own observation is that the content refresh of 'Approach 1' is faster - but the downside is that I have more datacache chapters to check in/out. This means more manual labor.

Does anyone have an explanation? Does it have something to do with serialization of processing in the Task Engine?

Accepted Solutions (1)

Marc_Kuipers
Product and Topic Expert

Hello Mikkel

Server-side content refresh is probably the slowest part of the DM application.

The most time-consuming part of a content refresh in DM is 'Loading data from datacaches' (not creating the tables and formula fields).

Small improvements can be made by hiding unused worksheets; however, the performance gain is minimal.
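As a side note, hiding unused worksheets can be scripted outside DM. Below is a minimal sketch using openpyxl; the workbook and sheet names are made-up examples, and nothing here is part of DM itself:

```python
# Hide worksheets that no data link references before checking the
# workbook into the datacache. Names are illustrative only.
from openpyxl import Workbook

wb = Workbook()
wb.active.title = "Chapter1_Data"
wb.create_sheet("Chapter2_Data")
wb.create_sheet("Scratch_Calculations")  # not referenced by any data link

used_sheets = {"Chapter1_Data", "Chapter2_Data"}
for ws in wb.worksheets:
    if ws.title not in used_sheets:
        ws.sheet_state = "hidden"  # openpyxl accepts visible/hidden/veryHidden

hidden = [ws.title for ws in wb.worksheets if ws.sheet_state == "hidden"]
print(hidden)  # ['Scratch_Calculations']
```

In a real datacache workbook you would load the existing file with `openpyxl.load_workbook` instead of building one from scratch.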

The content refresh is scheduled to be addressed (a re-write of the code) in DM 10.1 SP07, which is planned for September 2016.

Thanks

Marc

Former Member

Hi Marc,

Thank you for the answer.

As of now we have a workbook with 13 worksheets. The workbook size is approx. 3.4 MB.

In the report to update, we have created chapters with Excel documents. Each chapter has a data link to a specific worksheet of the workbook.

The total runtime for the content refresh of the report is approx. 45 minutes.

Then we tried splitting the workbook into 5 workbooks, where the 4 new workbooks hold the largest worksheets from the original workbook. This actually reduced the runtime of the content refresh to approx. 8 minutes. The downside is that it also takes about 5 minutes to check each workbook out of the datacache and update it, so the net time improvement is not that big. But the content refresh itself is much faster.

Using approach 2

1 Workbook containing 13 worksheets (workbook size 3.4 MB):

Content refresh takes 45 min

Total: 55 minutes

Using approach 1

5 workbooks

- workbook size 0.23 MB (6 sheets)

- workbook size 0.36 MB (1 sheet)

- workbook size 0.86 MB (2 sheets)

- workbook size 1.20 MB (2 sheets)

- workbook size 1.40 MB (2 sheets)

Extra work for updating the datacache workbooks: 4 x 5 minutes (20 minutes)

Content refresh takes 8 minutes

Total: 28 minutes

Net time improvement = 27 minutes

Content refresh improvement = 37 minutes
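The figures above can be sanity-checked with a few lines of arithmetic. This is only a back-of-the-envelope sketch; the per-workbook check-out/update time is taken as roughly 5 minutes, as quoted:

```python
# Back-of-the-envelope check of the two approaches (all times in minutes).
approach2_refresh = 45           # one 3.4 MB workbook, 13 worksheets
approach2_extra = 2 * 5          # checking the single workbook out and back in
approach2_total = approach2_refresh + approach2_extra

approach1_refresh = 8            # five smaller workbooks
approach1_extra = 4 * 5          # extra datacache work quoted above
approach1_total = approach1_refresh + approach1_extra

print(approach2_total)                        # 55
print(approach1_total)                        # 28
print(approach2_total - approach1_total)      # net time improvement
print(approach2_refresh - approach1_refresh)  # content refresh improvement
```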

Note that the net time improvement comes at the cost of more manual work. The interesting part is that the content refresh itself was much faster.

So when you say the most time-consuming part in DM is 'Loading the data from the datacache': is it reasonable to think the time is spent loading the entire Excel workbook for every data link, and that this is why 'Approach 2' performs worse?
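If the refresh really does load the whole workbook once per data link (an assumption on my side, not confirmed DM behaviour), the cost would scale roughly as workbook size times number of data links. A toy model with the numbers above reproduces the observed ratio fairly well:

```python
# Toy cost model for the hypothesis that DM loads the *entire* workbook
# once per data link. Cost is in arbitrary units (1 unit = loading 1 MB
# once); purely illustrative, not measured DM behaviour.

def refresh_cost(workbooks):
    """workbooks: list of (size_in_mb, number_of_data_links) pairs."""
    return sum(size * links for size, links in workbooks)

# Approach 2: one 3.4 MB workbook, 13 data links (one per worksheet)
common = refresh_cost([(3.4, 13)])

# Approach 1: the split described above (sizes and sheet counts from the post)
split = refresh_cost([(0.23, 6), (0.36, 1), (0.86, 2), (1.20, 2), (1.40, 2)])

print(round(common, 2))  # 44.2
print(round(split, 2))   # 8.66
# common/split is about 5, roughly the 45 min vs 8 min ratio observed above
```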

Regards

Mikkel

Answers (0)