Introduction


An often-asked question from SAP Analytics Cloud customers (in particular planning customers) is how to back up planning model data (facts and master data) and related content such as models, stories, data actions, multi actions etc.

The rationale is simple: plan data is valuable to business processes. When you lose it, you lose direction and, in the worst case, your users need to re-enter their data. When you lose your stories, data actions etc., you lose time in your planning processes or your users can no longer report their data. Trust me, I was a planning process administrator before and I have painful memories…

Note that we are aware of a corresponding idea in the Customer Influence Portal. While such a product enhancement is under long-term consideration, this blog post outlines the options that are available today to back up and restore planning scenarios.

Content Network


Our first approach is based on the SAP Analytics Cloud Content Network. The Content Network's primary use case is transport management, but we can also create content packages and use them as content-level snapshots for later recovery (with some restrictions, as outlined below).
These Content Network capabilities have recently been enhanced with two options for external storage and management of content packages, which allow you to build up a history of snapshots:

    • Public API for downloading/uploading content network packages within the same tenant or across tenants
    • Enabling the export of content packages to SAP Cloud Transport Management on SAP BTP

So, how does this work? First of all, you create packages of the content you want to back up. Be aware that this is a manual exercise that you must repeat for every desired snapshot. Using the new capabilities described above, you can automate the download of content packages. These can be exported to a managed storage location or to SAP Cloud Transport Management on SAP BTP. You can include all relevant objects (folders, stories, models, data/multi actions etc.) and data in these packages. In case of disaster, you can selectively roll back to a previous content snapshot.
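To illustrate the automation idea, here is a minimal Python sketch that downloads a content package on a schedule and stores it with a timestamp. This is a conceptual sketch only: the endpoint path, OAuth URLs, and parameter names are placeholders and assumptions for illustration, so please verify them against the official API documentation for your tenant.

```python
import datetime
import requests

# --- Assumptions: adjust to your tenant and your registered OAuth client ---
TENANT = "https://<your-tenant>.<region>.sapanalytics.cloud"          # placeholder
TOKEN_URL = "https://<your-tenant>.authentication.<region>.hana.ondemand.com/oauth/token"  # placeholder
CLIENT_ID = "<client-id>"
CLIENT_SECRET = "<client-secret>"
PACKAGE_ID = "<content-package-id>"   # the package you created in the Content Network

def get_token() -> str:
    """Fetch an OAuth token via the client-credentials flow (assumed OAuth client setup)."""
    resp = requests.post(
        TOKEN_URL,
        data={"grant_type": "client_credentials"},
        auth=(CLIENT_ID, CLIENT_SECRET),
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]

def download_package_snapshot() -> str:
    """Download one content package and save it with a date stamp (hypothetical endpoint)."""
    token = get_token()
    # NOTE: this path is illustrative only; check the Content Network API documentation.
    url = f"{TENANT}/api/v1/contentNetwork/packages/{PACKAGE_ID}/download"
    resp = requests.get(url, headers={"Authorization": f"Bearer {token}"}, timeout=300)
    resp.raise_for_status()
    filename = f"sac_package_{PACKAGE_ID}_{datetime.date.today():%Y%m%d}.zip"
    with open(filename, "wb") as f:
        f.write(resp.content)
    return filename

if __name__ == "__main__":
    print("Saved snapshot:", download_package_snapshot())
```

Run by a scheduler of your choice, a script like this gives you a rolling external history of content snapshots that you can re-import in case of disaster.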

Figure 1: Content Network options

These are some aspects to be aware of when evaluating this approach:

    • Manual content package creation leads to maintenance efforts
    • Data back-ups may be required more frequently than content back-ups; the other options described below are better suited for frequent data back-ups
    • Including data will fill up your Content Network storage space
    • Restrictions / not included:

        • Predictive scenarios

        • Comments

        • Object and folder level security information

        • Activity logs

        • Scheduling/Publishing information

More info:

Back-up model in SAP Analytics Cloud

A way to back up your planning data inside your SAP Analytics Cloud tenant is to copy the data into a back-up model that is (almost) identical to your main model. There is a blog post by Jef Baeyens that gives detailed instructions. In a nutshell, the architecture is as follows:

Figure 2: Back-up model architecture


You create a copy of your model and add a back-up date dimension to it. Then, you build a data action with an advanced formula that transfers the data from your main model to the back-up model and removes the previous back-up. A data action in the opposite direction serves as the restore job.
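To make the mechanics more tangible, here is a small conceptual sketch in Python (pandas). It only illustrates the copy-with-backup-date logic; in SAP Analytics Cloud itself this is implemented with data actions as described above, and the dimension names used here are purely illustrative.

```python
import pandas as pd

# Conceptual illustration only: in SAP Analytics Cloud this logic lives in a
# data action, not in Python. Dimension/column names are made up for the example.
# Main model facts: Account, CostCenter, Date, Version, Value
main_model = pd.DataFrame([
    {"Account": "Revenue", "CostCenter": "CC10", "Date": "202406", "Version": "public.Plan", "Value": 1200.0},
    {"Account": "Travel",  "CostCenter": "CC10", "Date": "202406", "Version": "public.Plan", "Value": -80.0},
])

def take_snapshot(source: pd.DataFrame, backup_date: str) -> pd.DataFrame:
    """Build the new snapshot with an extra BackupDate dimension; assigning the result to
    the back-up model replaces the previous snapshot, mirroring the 'delete previous
    back-up, then copy' behaviour of the data action."""
    snapshot = source.copy()
    snapshot["BackupDate"] = backup_date
    return snapshot

def restore(backup: pd.DataFrame, backup_date: str) -> pd.DataFrame:
    """The opposite direction: copy the chosen snapshot back into the main model structure."""
    restored = backup[backup["BackupDate"] == backup_date].drop(columns=["BackupDate"])
    return restored.reset_index(drop=True)

backup_model = take_snapshot(main_model, "2024-06-30")
print(restore(backup_model, "2024-06-30"))
```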
Jef also mentioned some challenges with this approach:

    • Tenant memory can become an issue in case of very large data volumes
    • Version handling
    • Performance impact if multiple data actions run in parallel (use multi actions and calendar to orchestrate)
    • Fact data only

I’d like to add the following:

    • Continuous maintenance as you need to keep an additional model in sync, update the data action etc.

More info:

Data Export Service

You can use the Data Export Service to back up your model data outside of your SAP Analytics Cloud tenant. In addition to fact data, it also supports the extraction of master data and audit data (audit data cannot be re-imported, but it may still be useful later as a reference).
You can back up your data in SAP BW, SAP HANA, SAP Datasphere, or other third-party or custom storage. We have plenty of blog posts that explain some of the options. You can make this a low-touch recurring process with limited load on the system, as the Data Export Service also supports delta extraction for facts. As an example, setting up an export to SAP Datasphere takes only a few minutes.
To restore, you can create import jobs from wherever you chose to store the data (provided a connection is available). You can also use our Data Import API.
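As a rough idea of a custom extraction, the sketch below pages through the fact data of one model via the Data Export Service OData API in Python. Treat it as a sketch under assumptions: the tenant URL, OAuth client, model ID, and the exact endpoint paths are placeholders that you should verify against the official Data Export Service documentation for your tenant.

```python
import requests

# --- Placeholders / assumptions: adjust to your tenant, OAuth client and model ---
TENANT = "https://<your-tenant>.<region>.sapanalytics.cloud"
TOKEN_URL = "https://<your-tenant>.authentication.<region>.hana.ondemand.com/oauth/token"
CLIENT_ID, CLIENT_SECRET = "<client-id>", "<client-secret>"
MODEL_ID = "<planning-model-id>"

def get_token() -> str:
    """Client-credentials OAuth flow (assumes an OAuth client configured in the tenant)."""
    resp = requests.post(TOKEN_URL, data={"grant_type": "client_credentials"},
                         auth=(CLIENT_ID, CLIENT_SECRET), timeout=30)
    resp.raise_for_status()
    return resp.json()["access_token"]

def export_fact_data() -> list[dict]:
    """Page through the FactData entity set of the Data Export Service (OData)."""
    headers = {"Authorization": f"Bearer {get_token()}"}
    # Endpoint pattern assumed from the Data Export Service docs; verify for your tenant.
    url = f"{TENANT}/api/v1/dataexport/providers/sac/{MODEL_ID}/FactData"
    rows: list[dict] = []
    while url:
        resp = requests.get(url, headers=headers, timeout=300)
        resp.raise_for_status()
        payload = resp.json()
        rows.extend(payload.get("value", []))
        # Follow server-driven paging; depending on the service the link may be relative.
        url = payload.get("@odata.nextLink")
    return rows

if __name__ == "__main__":
    facts = export_fact_data()
    print(f"Exported {len(facts)} fact records")
```

From here you could write the records to CSV or a database, or hand them to the Data Import API when a restore is needed.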

The architecture below illustrates three common scenarios. The third scenario can be useful if you bring your data into aDSOs anyway, to be reused in other planning processes or reported on in BEx queries. Otherwise, you would probably build everything in HANA Studio as well.

Figure 3: Data Export Service architecture


Some topics to be aware of:

    • Hierarchies are not exported (we plan to add this in 2024)
    • A Data Provisioning (DP) agent is required in many scenarios
    • A HANA full-use license may be required depending on the chosen architecture

More info:

Model export job (data management in modeler) - DEPRECATED as of 2025.QRC1!

Please note that the approach described below will be deprecated as per SAP note 3401679:

With 2025.QRC1, the option to create new schedules for exporting model data to the file repository will no longer be available. Existing schedules will also stop running. This only impacts scheduled exports, not individual file exports.


The last option to be mentioned here is probably the most straightforward, but also the least sophisticated one: using the export job in the Data Management tab of the planning model. Here, you can export fact data into CSV format and store it in the SAP Analytics Cloud file repository or on a file server.

However, there is a limit to the number of records you can export in one job (6 million when exporting facts only). So, if your model is bigger than that, you will need to split it into multiple export jobs.

Downsides of this approach:

    • Row limit per export job
    • File management
    • Limited possibilities to transform the data (if needed, this has to be done manually, e.g. in Excel)

More info:

Conclusion

The table below provides an overview and comparison of the available options. In my opinion, having a solid restore option for your fact data is most important, as there are many ways such a disaster could happen (import of content, data management, version deletion…). Given the delta capability and low-touch nature of the Data Export Service, I would personally prefer this approach. However, the back-up model offers a straightforward option that does not require any work in another system.

If you want to take it further, look at the Content Network to safeguard your objects and at the Data Export Service to back up your master data as well. Combining the Content Network and the Data Export Service, you will have a solid and comprehensive disaster recovery plan in place!

Option | Covers content? | Covers facts* | Covers master data (incl. properties etc.) | How to restore?**
Content Network APIs | X*** | X | X | Re-import via Content Network
Back-up model in SAP Analytics Cloud | - | X | - | Cross-model copy
Data Export Service | - | X + delta | X | Import job (from data source), Import API
Model export job (data management in modeler) | - | X (multiple jobs may be needed) | - | Import job (from file)


*Excluding private versions
**Only the most obvious option to restore
*** Restrictions apply as described

Note that I did not list legacy export options (OData push), as they will not be enhanced further and the Data Export Service offers more features and better performance.
