Financial Management Blogs by SAP
Get financial management insights from blog posts by SAP experts. Find and share tips on how to increase efficiency, reduce risk, and optimize working capital.
Former Member
System performance may be affected by large data storage

without a proper data aging strategy in place. To reduce administration costs and effort, and to optimize system performance, we advise implementing data archiving processes.

This helps classify data based on Information Lifecycle Management recommendations.

Knowing how current data can be archived and stored in near-line storage can significantly improve performance.

Data archiving keeps the data complete and consistent rather than simply deleting it.

The data is first moved to an archive or near-line storage and then deleted from the SAP BW/4HANA system. The archived data can either be accessed directly or loaded back for reporting purposes.
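As a rough illustration of the archive-then-delete flow described above, here is a minimal Python sketch. All class and function names are hypothetical stand-ins, not an SAP API:

```python
# Illustrative sketch only: simulates the archive-then-delete flow.
# NearLineStore and InfoProviderTable are hypothetical stand-ins,
# not SAP objects.

class NearLineStore:
    """Simplified stand-in for a near-line storage (NLS) target."""
    def __init__(self):
        self._archive = {}

    def write(self, request_id, rows):
        self._archive[request_id] = list(rows)

    def read(self, request_id):
        # Archived data can still be accessed directly for reporting.
        return self._archive[request_id]


class InfoProviderTable:
    """Simplified stand-in for an InfoProvider's active data table."""
    def __init__(self, rows):
        self.rows = list(rows)


def archive_request(provider, nls, request_id, predicate):
    """Move matching rows to NLS first, then delete them from the provider."""
    to_archive = [r for r in provider.rows if predicate(r)]
    nls.write(request_id, to_archive)                               # step 1: copy to NLS
    provider.rows = [r for r in provider.rows if not predicate(r)]  # step 2: delete
    return len(to_archive)


def reload_request(provider, nls, request_id):
    """Load archived data back into the provider, e.g. for reporting."""
    provider.rows.extend(nls.read(request_id))
```

The key point the sketch mirrors is the ordering: data is written to the near-line store before anything is deleted, so no row is ever lost in between.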

Note that you should not change a near-line storage connection if it is being used in an active data archiving process.


The following prerequisites apply:

  • Data has been loaded into the InfoProvider

  • If an Advanced DataStore Object (aDSO) is used, the data must be activated prior to archiving

  • The aDSO must not include non-cumulative key figures, and one of the modeling properties has to be selected

  • The field length of the aDSO is restricted to a maximum of 18 characters
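The prerequisites above could be validated programmatically along these lines. This is a loose sketch over a plain dictionary, with all field names being assumptions rather than an SAP API:

```python
# Illustrative sketch only: checks the archiving prerequisites listed
# above against a hypothetical aDSO description. The dictionary keys
# are assumptions, not SAP metadata fields.

def check_archiving_prerequisites(adso):
    """Return a list of violated prerequisites for the given aDSO dict."""
    problems = []
    if not adso.get("data_activated"):
        problems.append("data must be activated before archiving")
    if adso.get("has_non_cumulative_key_figures"):
        problems.append("non-cumulative key figures are not supported")
    for field in adso.get("fields", []):
        # Field length is restricted to 18 characters maximum.
        if len(field) > 18:
            problems.append(f"field name '{field}' exceeds 18 characters")
    return problems
```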


Three main process flows should be considered when running data archiving:

  1. Setting up data archiving processes

  2. Scheduling data archiving processes using process chains

  3. Manual request processing: creating and executing archiving requests

Each data archiving process is assigned to exactly one InfoProvider and always has the same name as that InfoProvider. Before a data archiving process can be scheduled using a process chain, it must first be set up. Once a data archiving process has been created and set up for an InfoProvider, you may manually create, launch, and reload archiving requests without using process chains.
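The one-to-one rule above (one archiving process per InfoProvider, sharing its name, set up before it can be scheduled) can be sketched as a small registry. This is purely illustrative; the names are assumptions, not SAP objects:

```python
# Illustrative sketch only: models the rule that each data archiving
# process belongs to exactly one InfoProvider and carries its name.
# ArchivingProcessRegistry is hypothetical, not an SAP API.

class ArchivingProcessRegistry:
    def __init__(self):
        self._processes = {}

    def create_process(self, info_provider):
        """Set up the archiving process for an InfoProvider (at most one)."""
        if info_provider in self._processes:
            raise ValueError(f"archiving process for {info_provider} already exists")
        # The process name is always the InfoProvider name.
        self._processes[info_provider] = {"name": info_provider, "set_up": True}
        return self._processes[info_provider]

    def can_schedule(self, info_provider):
        """A process chain may only schedule a process that was set up before."""
        proc = self._processes.get(info_provider)
        return bool(proc and proc["set_up"])
```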

Using a data transfer process, InfoProvider data that has been archived with a data archiving process can be loaded into other InfoProviders.

All data storage features can be configured using Data Tiering Optimization (DTO).

Data Tiering Optimization helps classify the data in the defined DataStore object (advanced) as hot, warm or cold, depending on how frequently it is accessed.

The Data Tiering Optimization job moves the data to the corresponding storage areas at regular intervals.

The DTO interface offers the following options:


  • Standard Tier (hot): The data is stored in SAP HANA.

  • Extension Tier (warm): The data is stored in an SAP HANA extension node.

  • External Tier (cold): The data is stored externally (in SAP IQ).

Note that Hadoop-based external storage is currently only possible via SAP NLS data archiving processes, not via DTO.
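The hot/warm/cold classification described above can be sketched as a simple access-age rule. The thresholds (90 and 365 days) are illustrative assumptions, not SAP defaults, and the functions are not an SAP API:

```python
# Illustrative sketch only: classifies partitions into hot/warm/cold
# tiers by access age, mirroring the DTO idea. The day thresholds are
# assumptions chosen for the example.

from datetime import date

def classify_tier(last_accessed, today=None,
                  warm_after_days=90, cold_after_days=365):
    """Return 'hot', 'warm', or 'cold' by how recently data was accessed."""
    today = today or date.today()
    age = (today - last_accessed).days
    if age >= cold_after_days:
        return "cold"    # External Tier: e.g. SAP IQ
    if age >= warm_after_days:
        return "warm"    # Extension Tier: SAP HANA extension node
    return "hot"         # Standard Tier: SAP HANA

def run_dto_job(partitions, today=None):
    """Periodic job: assign each partition to its target storage tier."""
    return {p["id"]: classify_tier(p["last_accessed"], today)
            for p in partitions}
```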


The Standard and Extension Tiers together form a logical unit for high availability, whereas the External Tier is managed separately. With the External Tier, data is stored in an external near-line storage (NLS) solution, either SAP IQ or Hadoop.

Editing and customizing an aDSO with NLS (Near-Line Storage):

  • Run transaction code RSDAP to start editing the built Advanced DataStore Object (aDSO)

  • Select Edit and choose your defined Near-Line Storage (NLS) connection under the "General Settings" tab

  • Click on "Selection Profile" to include "Primary Partition Characteristic" definitions based on your archiving strategy

  • Under "Nearline Storage", define the maximum size and number of data objects for NLS, and make sure the associated near-line object and near-line connection are selected before saving the changes

Note: The data archiving process itself is also edited with transaction code RSDAP.

For DataStore objects (advanced), the data archiving process is used with near-line storage. Storing historical data in near-line storage reduces the data volume of InfoProviders, but the data remains available for queries. Together, the database partitions and the near-line storage partitions for an InfoProvider consistently reflect the total dataset.
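The idea that database and near-line partitions together reflect the total dataset can be sketched as a query that unions both partition sets. This is an illustrative sketch only; the function and its parameters are assumptions:

```python
# Illustrative sketch only: reporting sees the union of database
# partitions (hot) and near-line partitions (archived), so the total
# dataset stays consistent. Not an SAP API.

def query_total(db_partitions, nls_partitions, predicate=lambda r: True):
    """Return all rows matching predicate across DB and NLS partitions."""
    rows = []
    for part in list(db_partitions) + list(nls_partitions):
        rows.extend(r for r in part if predicate(r))
    return rows
```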

In BW/4HANA, the BPC (Standard) time characteristic and any other BPC dimensions can be included in the data archiving process. Before BW/4HANA, only BW time characteristics could be used in a data archiving process (this was not applicable to BPC).

More info about Creating Data Archiving Processes