Archiving financial data with the archiving object FI_DOCUMNT is an important piece of functionality for ERP customers. FI_DOCUMNT archives FI header data (table BKPF) and FI line items (table BSEG). The standard archiving process with FI_DOCUMNT involves:
1. Writing data to archive file
2. Deleting data from database
3. Post-processing of data
Why is data archived? To reduce disk space usage and free up room for new postings. Most customers deal with millions of records year on year; if the data keeps building up, there would eventually be no server space left for the next set of data.
Against this background, a customer raised an alarm: they had millions of records for a single posting period, and archiving ran for hours on end, most of the time not completing and ending in short dumps. (The posting period can be defined by the customer; in this case it was one month.)
This is where the principle of design thinking comes in. To learn more about design thinking and its primary steps, go through this wonderful blog written by Karthikeyan: click here.
We had faced a similar situation with another customer and had changed the functionality to give customers the flexibility to archive data date-wise. Typically customers use fiscal year and fiscal period (along with company code) to archive data; with this change, they could instead provide a date range. This functionality was delivered through note 1554093.
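The difference between the two selection modes can be sketched as follows. This is a minimal Python illustration, not the actual ABAP coding: the dictionaries, field names (borrowed from BKPF: `bukrs`, `gjahr`, `monat`, `budat`) and both helper functions are stand-ins for the real selection logic.

```python
from datetime import date

# Hypothetical document headers, mimicking a few BKPF fields:
# company code, fiscal year, fiscal period, posting date.
documents = [
    {"bukrs": "1000", "gjahr": 2012, "monat": 1, "budat": date(2012, 1, 5)},
    {"bukrs": "1000", "gjahr": 2012, "monat": 1, "budat": date(2012, 1, 20)},
    {"bukrs": "1000", "gjahr": 2012, "monat": 2, "budat": date(2012, 2, 3)},
]

def select_by_period(docs, bukrs, gjahr, monat):
    """Classic selection: company code + fiscal year + fiscal period."""
    return [d for d in docs
            if d["bukrs"] == bukrs and d["gjahr"] == gjahr and d["monat"] == monat]

def select_by_date_range(docs, bukrs, start, end):
    """Date-wise selection: any posting-date range, independent of period."""
    return [d for d in docs
            if d["bukrs"] == bukrs and start <= d["budat"] <= end]
```

A date range can be much narrower than a full period (for example, a single day), which is what made the next experiments possible.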
In the current scenario, the above changes were implemented and tested. Yet even for a single day of data the archiving job would terminate. This would be tough on the customer. Four of us were involved in this project across locations: myself, two colleagues from the CoE, and one consultant who was on-site.
We then thought about a further change and introduced time slices: the customer could enter a time range within the day for archiving data. Entering time frames was painful for the customer, but it looked like a possible solution, so a prototype was developed and tested. We were still not happy with the results, because certain time ranges of a day carried far more postings than others; a standardized time slice looked difficult.
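The skew problem can be made concrete with a small sketch. The posting timestamps below are invented, but they reflect the typical pattern where batch interfaces post heavily in a few hours of the day, so fixed time slices produce wildly uneven workloads:

```python
from collections import Counter

# Hypothetical posting hours for one posting day; the volume is
# skewed because batch interfaces post heavily around mid-morning.
posting_hours = [2] * 5 + [10] * 900 + [11] * 850 + [15] * 40 + [22] * 5

def slice_sizes(hours, slice_len=6):
    """Count postings per fixed time slice (00-06, 06-12, 12-18, 18-24)."""
    return Counter(h // slice_len for h in hours)

print(slice_sizes(posting_hours))
# One six-hour slice carries 1750 of the 1800 postings, so a fixed
# slicing scheme barely reduces the size of the worst package.
```

Whatever slice length is chosen, the busiest slice still dominates the runtime, which is why this approach was abandoned.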
Given these limitations, this solution was set aside.
The next round of ideation led to integrating the Parallel Process Framework (PPF) with the FI archiving program. To know more about PPF, click here.
With this integration, another step was introduced into the archiving process:
1. Pre-processing of data
2. Writing data to archive
3. Deleting data from database
4. Post-processing of data
In the first step, pre-processing, variants are created: data is read from the FI header table and packages are built into a variant-specific table. Through the customizing feature of the PPF, the customer can define how many parallel processes may run simultaneously. In the write step, archiving jobs are scheduled based on this customizing and the created variants, followed by the deletion jobs.
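The packaging and parallelization idea can be sketched like this. It is a simplified Python stand-in, not the PPF itself: `header_keys`, `build_packages`, `archive_package`, and `max_parallel` are all hypothetical names, and a thread pool stands in for the parallel background jobs.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical header keys read in the pre-processing step
# (stand-ins for BKPF document numbers).
header_keys = list(range(1, 101))

def build_packages(keys, package_size):
    """Pre-processing: split the header keys into fixed-size packages,
    analogous to filling the variant-specific table."""
    return [keys[i:i + package_size] for i in range(0, len(keys), package_size)]

def archive_package(package):
    """Stand-in for one write job: 'archive' every document in the package
    and report how many documents were written."""
    return len(package)

# Customizing: maximum number of parallel processes allowed.
max_parallel = 4

packages = build_packages(header_keys, package_size=25)
with ThreadPoolExecutor(max_workers=max_parallel) as pool:
    written = list(pool.map(archive_package, packages))

print(sum(written))  # prints 100: all documents archived across the packages
```

The key design point is that package sizes are fixed up front, so each parallel job gets a comparable amount of work regardless of how postings are distributed over time, which is exactly what the time-slice approach could not guarantee.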
Later, the three of us who had been working from different locations met at the customer site to implement the prototype and test the solution. We did some more iterations and loads of testing.
Note 1779727 provides this solution (beta phase).
What is worth noting is that with this approach, the total time for archiving a fiscal year and fiscal period came down to around two hours, where previously the job would always terminate. Repeated test runs gave similar completion times.
In the end, everyone involved, including the customer, was happy.
A situation where design thinking helped us arrive at a solution that brought smiles.