Introduction

In the ever-evolving landscape of enterprise resource planning (ERP), SAP S/4HANA stands tall as a game-changer, promising unparalleled efficiency and innovation. One of the key features that contributes to its prowess is Data Aging. In this blog post, we delve into the intricacies of Data Aging in S/4HANA, understanding its significance and exploring how it can shape the data management strategy of modern businesses.

Understanding Data Aging:

Data Aging is a Suite-tailored data management concept for reducing the SAP HANA memory footprint, based on the Data Aging Framework provided by SAP NetWeaver ABAP. It is available for SAP Business Suite on HANA and SAP S/4HANA applications and offers the option of moving large amounts of data within SAP HANA to gain more working memory. Data Aging differentiates between operationally relevant data (hot/current) and data that is no longer accessed during normal operation (cold/historical). The data temperature can be used to horizontally partition the tables taking part in Data Aging, optimizing resource consumption and performance by moving data between the different partitions (i.e., from hot to cold partitions). Hot data resides in SAP HANA main memory, whereas cold data stays primarily on disk but remains accessible via SQL on request.

Benefits

The goal of aging is both to reduce the main memory footprint and to speed up database queries by keeping only operationally relevant (hot) data in main memory. Cold data, in contrast, is placed primarily on less expensive but usually slower secondary storage and remains accessible via SQL on request.

Data Archiving and Data Aging in S/4HANA

In SAP HANA, Data Aging is different from Archiving in the sense that cold data is still kept within the SAP HANA database and remains accessible via SQL in the very same table as the hot data (though in a different partition). Archived data, in contrast, is strictly read-only: it is written to an archive file, deleted from the database, and needs additional access paths (address information or archive indexes) to be read. Aging targets main memory footprint reduction, whereas archiving is the basis for ILM, covering the full life cycle up to the destruction of information.
Data Aging offers you the option of moving operationally less relevant data within a database so as to gain more working memory. You use the relevant SAP applications, particularly Data Aging objects, to move data from the current area to the historical area. The move influences the visibility of the data when it is accessed. This means that queries over large amounts of data in the current area run in a shorter time. It also means that careful testing is required to ensure that the change of visibility is in line with business process requirements.

To be able to apply Data Aging to your data, you need to fulfil certain requirements regarding the database and the application.
Data Archiving, by contrast, is used to remove data from the database and store it outside the database in a consistent and secure manner. The archived data is stored in a file system and from there can be moved to other, more cost-efficient long-term storage systems via the ArchiveLink interface or the ILM interface.

Hot and Cold Data (Buzzwords…)

Hot data: Current data is the data relevant to the operations of application objects, needed in day-to-day business transactions. The application logic determines when current data turns historical by using its knowledge of the object’s life cycle. It validates the conditions at the object level from a business point of view, based on the status, execution of existence checks, and verification of cross-object dependencies. Current data is stored in the current area.

Examples of current data: open FI items, items cleared only a few months ago, undelivered purchase orders, sales documents of a sales cycle that is still in progress, documents for an ongoing project, and IDocs that still need to be processed.

Cold data: Historical data is data that is not used in day-to-day business transactions. By default, historical data is not visible to ABAP applications, and it is no longer updated from a business point of view. As with hot data, the application logic determines when current data turns historical by using its knowledge of the object’s life cycle, validating the conditions at the object level based on the status, execution of existence checks, and verification of cross-object dependencies. Historical data is stored in the historical area.

Examples of historical data: Cleared FI items posted two years prior to the current fiscal year, material documents one period older than the current closed period, processed IDocs, and application logs after X number of days.

Data Aging Technical Process in S/4 HANA

The data aging mechanism for ABAP applications is based on a data aging framework provided by SAP NetWeaver ABAP. ABAP developers use this framework to specify the data aging objects that are aged as one unit, to identify the involved tables, and to implement the logic for determining the data temperature. The data temperature is set via an additional data temperature column '_DATAAGING' (type DATA_TEMPERATURE with ABAP date format “YYYYMMDD”), which is added to all participating tables.
The data temperature can be used to horizontally partition the application data with time selection partitioning (a.k.a. "aging") on the column '_DATAAGING' for optimizing resource consumption and performance. Only one partition contains the hot data (represented by the value “00000000”) and the other partition(s) contain cold data with different data temperature ranges.
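Conceptually, this corresponds to a range partitioning on the '_DATAAGING' column. The sketch below is only an illustration using generic SAP HANA range-partitioning syntax with a hypothetical FI table; in a real system the partitions are created and managed via the Data Aging framework (transaction DAGPTM), and the exact DDL generated there may differ:

    -- Illustrative only: the hot partition holds '00000000', cold partitions hold closed periods.
    -- In practice this partitioning is generated via transaction DAGPTM, not written by hand.
    ALTER TABLE "BKPF" PARTITION BY RANGE ("_DATAAGING")
    (
      PARTITION '00000000' <= VALUES < '00000001',   -- hot (current) data
      PARTITION '20000101' <= VALUES < '20230101',   -- cold data up to end of 2022
      PARTITION OTHERS                               -- catch-all for newer cold data
    );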
By default, only hot data is accessed. As the hot data is located in a separate partition, SAP HANA should only load that partition into memory during normal operation. If required, ABAP developers can set the data temperature context to switch between accessing only hot data, all data, and all data above a specified data temperature.
The SAP HANA-specific database shared library (DBSL) in the ABAP server adds a corresponding clause to the SQL statements that are sent to SAP HANA. By adding the clause WITH RANGE RESTRICTION ('CURRENT') to a SQL statement, SAP HANA restricts the operation to the hot data partition only. Instead of 'CURRENT', a concrete date value can also be specified; this restricts the operation to all partitions with data temperatures at or above the specified value. The clause WITH RANGE RESTRICTION ('20100101'), for example, tells SAP HANA to search the hot partition and all cold partitions that contain values greater than or equal to '20100101'. Range restriction can be applied to SELECT, UPDATE, UPSERT, and DELETE statements and to procedure calls.
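For illustration, this is roughly what the statements look like once the DBSL has appended the clause cited above; the table (BKPF) and the filter values are placeholders chosen for this example:

    -- Default behaviour: restrict the query to the hot (current) partition only.
    SELECT * FROM "BKPF"
     WHERE "BUKRS" = '1000' AND "GJAHR" = '2023'
      WITH RANGE RESTRICTION ('CURRENT');

    -- Also read cold partitions whose data temperature is >= '20100101'.
    SELECT * FROM "BKPF"
     WHERE "BUKRS" = '1000' AND "GJAHR" = '2023'
      WITH RANGE RESTRICTION ('20100101');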
Any other client that wants to access these Data Aging tables with the proper filtering can use the same generic syntax extension.
The application knows which business objects are closed and may hence be moved to the cold partitions. It therefore actively sets this column to a date value to indicate that the object is closed and that the row should be moved to the cold partition(s) during an Aging Run. Since the table is partitioned by the temperature column, the rows are then automatically moved to a cold partition. The move influences the visibility of the data and its accessibility. Several configuration steps and prerequisites need to be in place before a Data Aging Run can be executed.
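Purely as a conceptual sketch (in reality the Data Aging object's runtime class performs this step during an Aging Run, not a manual SQL statement), stamping a closed document and thereby moving it to a cold partition would look roughly like this; table and key values are placeholders:

    -- Conceptual only: mark one closed accounting document as cold.
    -- Because the table is range-partitioned on "_DATAAGING", setting the column to a
    -- date value moves the row out of the hot ('00000000') partition into a cold one.
    UPDATE "BKPF"
       SET "_DATAAGING" = '20221231'
     WHERE "MANDT" = '100'
       AND "BUKRS" = '1000'
       AND "BELNR" = '0100000001'
       AND "GJAHR" = '2022'
      WITH RANGE RESTRICTION ('CURRENT');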

When does the data move from the hot partition to cold partition?

The application logic determines when current data turns historical by using its knowledge about the object’s life cycle. The application logic validates the conditions at the object level from a business point of view, based on the status, execution of existence checks, and verification of cross-object dependencies.
The data is moved during a Data Aging Run. To set up an Aging Run, several tasks need to be completed upfront:

  • Determining the data: The application-specific runtime class can be used to determine the data for which Data Aging is intended. The SAP application assigns these runtime classes to the relevant Data Aging object so that the runtime class can be called and processed in a Data Aging run.
  • Managing Partitions: To be able to move data from the hot partition to the cold partition(s) according to the specified partitioning objects and partitioning groups, all participating tables must be partitioned for Data Aging. For each system, you need to define the partitions for the tables of a Data Aging object via transaction DAGPTM; this setting is not transportable. If the conditions are not fulfilled, the Data Aging run is not started. There must be at least one cold partition covering today's date, and if there are multiple cold partitions on one table, the intervals must have no gaps.
  • Activating Data Aging Objects: After the partitions have been defined, choose transaction Data Aging Objects (DAGOBJ) to activate the Data Aging object. The system runs through various checks on each table belonging to the Data Aging Object so that the Data Aging object can be used for the run.
  • Maintaining specific settings for the Data Aging objects, where applicable.
  • Managing Data Aging Groups: Define Data Aging Groups via transaction DAGOBJ -> Goto -> Edit Data Aging Groups and select all Data Aging Objects to be processed in one Group.

To schedule Data Aging Runs, go to transaction DAGRUN and select a Data Aging Group, a maximum runtime, and a start date/time. The same transaction can be used to monitor Data Aging Runs: the initial screen shows a list of runs with details such as the Data Aging Group, start date/time, duration, and job name.
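Once runs have been executed, one way to verify the effect is to check the per-partition load status and memory footprint in SAP HANA. A small monitoring query against the standard M_CS_TABLES view could look like the following (the schema and table name need to be adapted to your system); after an Aging Run, the cold partitions should normally not be fully loaded into memory:

    -- Per-partition load status and memory usage of an aged table (example: BKPF).
    SELECT table_name,
           part_id,
           loaded,                 -- hot partition loaded, cold partitions ideally not
           record_count,
           memory_size_in_total
      FROM m_cs_tables
     WHERE schema_name = 'SAPHANADB'   -- adjust to your SAP schema
       AND table_name  = 'BKPF'
     ORDER BY part_id;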

Guidance, Technical Considerations, and TCO

Data Aging in SAP S/4HANA can be used to actively remove part of the data that is stored within the SAP HANA database from HANA’s main memory. This is different from the “column unload” functionality in SAP HANA. Aged data remains technically addressable within the database; within the ABAP system, however, visibility of aged data is restricted by default. The technology to store and process aged data is designed for sporadic transactional (OLTP) access, so access must be limited to the part of the data for which longer response times caused by disk access are acceptable.
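For contrast, the explicit column unload mentioned above can be triggered with plain SQL; unlike aging, unloaded data is simply reloaded into memory the next time it is accessed (schema and table name are placeholders):

    -- Explicitly unload a table from column store main memory; HANA reloads it on next access.
    UNLOAD "SAPHANADB"."BKPF";

    -- Explicitly load it back into memory.
    LOAD "SAPHANADB"."BKPF" ALL;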

To utilize Data Aging in an application component, a corresponding Data Aging object must exist; see Available Data Aging Objects in the SAP documentation for the list.

Technical Considerations:

Data Aging does not reduce the size of data in the SAP HANA database; it can only reduce the amount of memory required to store and operate on parts of that data. Consequently, Data Aging is not a technology that fulfils expectations such as:

  • Reduce the amount of disk space required by the database.
  • Minimize size of database backups; minimize the runtime of database backup and recovery operations.
  • Improve system copy processes.

To reach those goals, the following methodologies and technologies are most relevant:

  • Avoid creation of unnecessary data
  • Housekeeping (i.e., deletion of temporary technical data e.g. spool, jobs, application logs)
  • Data archiving
  • SAP ILM

Technically, aged data is accessible in the SAP HANA database and the SAP S/4HANA system. There are no additional authorizations to control access to aged data or other means to block access to aged data. Therefore, Data Aging does not play a role when designing system aspects related to data protection and privacy (DPP). The use of archiving with ILM is one important means to achieve DPP blocking.

If use of Data Aging and archiving is combined for the same objects, the archiving runs will need to process aged data. This impacts the runtime of archiving processes.

TCO Considerations

To understand the potential impact of Data Aging on TCO, consider:

  • Project costs of introducing Data Aging. These include analysis and potentially optimization of custom code for Data Aging, ensuring that OLAP processing of aged data does not take place, and testing and planning of Data Aging runs. Depending on the memory reduction targets you define for your system, you may also need to consider extending the existing Data Aging objects or creating additional ones.
  • Comparison of Aging Residence Times to established Archiving Residence Times or ILM Retention Times: do reasonable settings lead to notable memory savings through Data Aging?
  • In some countries, data privacy regulations require the physical removal of data from the primary database or the destruction of data, which can be achieved using SAP ILM. Consequently, in many customer systems there will in any case be a point in time when selected data must be removed from the database by executing data archiving or data destruction jobs. Data Aging would then be just an additional, intermediate step and a temporary solution.

Other Considerations:

Because of technology advances, use of SAP HANA Native Storage Extension (NSE) without Data Aging is preferable in the case of technical and logging data. As a simpler alternative to Data Aging, memory footprint reduction for this kind of data can be achieved by using NSE directly.
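As a rough sketch of how NSE is applied (available as of SAP HANA 2.0 SPS 04; the Z-table name below is a placeholder, and column- or partition-level load units are also possible, see the NSE documentation):

    -- Make a suitable logging/technical table page loadable, i.e. managed by the NSE buffer
    -- cache instead of being kept fully in column store main memory.
    ALTER TABLE "ZLOG_ENTRIES" PAGE LOADABLE CASCADE;

    -- Easily reversible: make it fully column loadable (in-memory) again.
    ALTER TABLE "ZLOG_ENTRIES" COLUMN LOADABLE CASCADE;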

Benefits of using NSE directly are:

  • No adaptation of custom code is required, because the visibility of the data does not change.
  • Freedom to choose the level of NSE usage (entire table, individual columns, or selected table partitions).
  • Easily reversible.
  • Does not require an aging object and can, for example, easily be used for suitable Z-tables.
  • No need to schedule and monitor Data Aging runs.

Conclusion
For technical and logging data, direct use of NSE is recommended.

For application data, if you are already using Data Archiving, use of Data Aging can normally not be recommended.

Data Archiving (with or without ILM) remains the most important means to manage the information lifecycle.

To implement Data Aging (prerequisites, procedure, etc.), refer to the SAP documentation: Data Aging | SAP Help Portal
