In this blog, I would like to share my experience with expiring close to a million SPL (Sanctioned Party List) entries.
Let me give some background before going into the details. The SPL file the business had subscribed to from its data provider was a huge list that included all sorts of blacklisted entries, added for many different reasons, including trivial ones that would normally not require screening. The SPL data was very large and was causing a lot of false blocks, especially for banks during payment screening. The business therefore decided to subscribe to a much more specific set of SPL entries that actually made sense for its needs. That meant we had to get rid of all the redundant entries in the system while keeping the required SPL entries.
The current volume of SPL data was close to a million (more precisely, around 750,000 entries), which was going to be reduced to around 12,000 entries; the 12,000 were a subset of the 750,000. Due to legal requirements and audit best practices, it is not advisable to physically delete SPL records from the table. So the option we were left with was to expire these SPL entries, which would automatically mark them for deletion from the system. It was then important to screen all the blocked business partners and documents against this delta list to see whether some of the business partners could be released. We also needed to re-screen the existing positive and negative lists to see whether they were still required. But the very first step was to expire around 750,000 SPL entries.
We decided to create an LSMW program, using a recording, to expire the SPL entries. The recording had merely three screens to mark an SPL record as expired by setting its valid-to date to the current date. Using the standard transaction to expire SPL entries meant that all the relevant table entries would get updated automatically. To estimate the time needed to complete this activity for all the records, we measured the throughput, which came to around three SPL records per second: roughly 200 records per minute and 12,000 records per hour. At that rate, 750,000 records would need around 62 hours. But we had only a two-day weekend as the cutover window, in which we had to expire around 750,000 SPL entries, load the delta SPL, and screen all the BPs in the system, including blocked BPs and blocked documents.
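The extrapolation above can be sketched in a few lines; the figures are the dry-run measurements from this project, not universal LSMW throughput numbers.

```python
# Back-of-the-envelope extrapolation of the single-run LSMW measurement:
# roughly 200 records expired per minute (about 3 per second).
RATE_PER_MINUTE = 200            # measured during the dry run
TOTAL_RECORDS = 750_000          # SPL entries to expire

rate_per_hour = RATE_PER_MINUTE * 60       # 12,000 records/hour
hours_needed = TOTAL_RECORDS / rate_per_hour

print(rate_per_hour)   # 12000
print(hours_needed)    # 62.5
```

At 62.5 hours, a single sequential run clearly could not fit a two-day cutover window, which is what forced the parallel approach described next.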
We were fortunate that our team was big enough, so we decided to assign a number of resources to complete the expiry in around 8 to 10 hours and then do the remaining activities in the next 8 to 10 hours. Another important consideration was system load: we could not process too many files in parallel in one system, or on the server overall, without risking performance issues. We had to pick an appropriate number of SPL records per file and an appropriate number of files to execute in parallel. After a few rounds of testing and a dry run, we settled on 15,000 records per file and two files at a time per system. It took around 75 to 90 minutes to complete a set of two files of 15,000 records each running in parallel. So with eight resources executing in parallel, we could complete around 240,000 records in approximately 2 hours, and in 7 to 8 hours we planned to complete the first phase of the cutover, i.e. expiring the SPL records.
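The batch plan above can be checked with a small sketch. All figures come from the dry-run numbers in this post; the script simply verifies that the chosen batch size and parallelism fit the planned window.

```python
# Hypothetical cutover planner: does the chosen batch size and
# parallelism fit the weekend window?
TOTAL_RECORDS = 750_000
RECORDS_PER_FILE = 15_000     # records per LSMW input file
FILES_PER_RESOURCE = 2        # files run in parallel per resource/system
RESOURCES = 8                 # team members executing in parallel
BATCH_MINUTES = 120           # 75-90 min measured, padded to 2 hours

records_per_batch = RECORDS_PER_FILE * FILES_PER_RESOURCE * RESOURCES
batches_needed = -(-TOTAL_RECORDS // records_per_batch)  # ceiling division
total_hours = batches_needed * BATCH_MINUTES / 60

print(records_per_batch)   # 240000
print(batches_needed)      # 4
print(total_hours)         # 8.0
```

Four padded 2-hour batches give the 7 to 8 hour estimate for phase one, leaving the second half of the weekend for the delta load and screening.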
The second phase of the cutover was to upload the delta SPL files (one full load file and three delta files). Once that was complete, the screening activity could start for all the BPs, including blocked BPs and blocked documents.
So the sequence of activities was as follows:

1. Expire around 750,000 SPL entries using the LSMW recording, executed in parallel batches across the team.
2. Upload the delta SPL files (one full load file and three delta files).
3. Screen all BPs, including blocked BPs and blocked documents, and re-screen the existing positive and negative lists.
Fortunately, the cutover went really smoothly and as planned. We shared various pre- and post-cutover reports with the business.
I would welcome your feedback or suggestions if there is a simpler or more efficient and economical way to get this done.
Regards,
Kul Vaibhav