Supply Chain Management Blogs by Members

At one of my recent clients we are responsible for monitoring daily and weekend Planning Run results, set up in SAP APO as multiple background jobs managed through Process Chains. The Production Support Team was tasked with proactively monitoring the jobs and also reporting on key Planning Run results from the SNP Heuristics, Deployment and TLB application logs. The vital statistics (application log parameter values such as the number of location-products planned, the number of Purchase Requisitions and Planned Orders created, the number of Stock Transfer Orders created, etc.) and the runtimes are compared to determine if anything is amiss.

The different batch jobs take time to run, and you need to wait for one step to finish before analysing the corresponding application log and extracting the required data to fill in a pre-defined Excel Weekly Reporting template file. Doing this repeated activity week after week was quite a drag on resources, including senior consultant time and effort. So I started to look at how this activity could be automated to significantly reduce the effort required to manually check the different application logs (the entire set of planned products is broken out into multiple job variants for optimised performance, but this generates separate application logs) and prepare the weekly reporting file in Excel.

So how can this be done? Here is the high-level solution design. Create a custom report program that reads the APO Application Logs and Process Chain runtimes, extracts the relevant data and stores it in a staging table. The same report program can read data from the staging table and create a week-on-week comparison; if thresholds are maintained, it highlights breaches through traffic lights in the report output. A custom control table holds the various job variants with their monitored parameters and threshold values (a possible structure is sketched below). It should also be possible to automatically create an Excel file in a pre-defined format that can be sent out as an email attachment when the report is executed in background mode. The aim was to put this custom program in the weekly batch, with suitable variants interspersed after the regular weekend planning run (SNP Heuristics, Deployment, TLB, PPDS Heuristics) job variants, so that runtime and application log analysis and data extraction are completely automated and the production support team member gets an email notification with the run results rather than continuously monitoring jobs during the weekend.
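As an illustration only, the control table line type could look something like the sketch below. All names here (ZSNP_RUN_MONITOR, the field names, the integer thresholds) are hypothetical placeholders; the actual design should follow your own naming conventions and reporting needs.

* Hypothetical line type for a custom control table, e.g. ZSNP_RUN_MONITOR:
* one entry per process chain step variant and monitored log parameter.
TYPES: BEGIN OF ty_monitor_ctrl,
         chain_id    TYPE rspclogchain-chain_id,   "Process chain technical name
         variante    TYPE rspcprocesslog-variante, "Process step / job variant
         subobject   TYPE balsubobj,               "HEU / DEP / TLB
         param_name  TYPE c LENGTH 30,             "Monitored parameter, e.g. Planned Orders created
         threshold_y TYPE i,                       "Deviation triggering a yellow traffic light
         threshold_r TYPE i,                       "Deviation triggering a red traffic light
       END OF ty_monitor_ctrl.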

Here is the ideal business flow: Weekend / Daily jobs run -> Logs are generated in the system -> Custom report is executed to extract the relevant data from the application logs and insert it into the custom table -> Summary / monitoring report is generated and emailed to the support team as an attachment -> Ad-hoc (manual) batch job run comparison reporting.
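For the email step in background mode, the standard BCS classes are one option. The following is only a minimal sketch, assuming the report output has already been converted into a binary spreadsheet table lt_attachment; the recipient address, subject and texts are placeholders.

DATA: lo_send_request TYPE REF TO cl_bcs,
      lo_document     TYPE REF TO cl_document_bcs,
      lo_recipient    TYPE REF TO if_recipient_bcs,
      lt_body         TYPE bcsy_text,
      ls_body         TYPE soli,
      lt_attachment   TYPE solix_tab.   "binary content of the Excel file (assumed to be filled)

TRY.
    lo_send_request = cl_bcs=>create_persistent( ).

    ls_body-line = 'Weekly planning run monitoring report attached.'.
    APPEND ls_body TO lt_body.

    lo_document = cl_document_bcs=>create_document(
                    i_type    = 'RAW'
                    i_text    = lt_body
                    i_subject = 'SNP Planning Run Monitoring' ).

    lo_document->add_attachment(
      i_attachment_type    = 'XLS'
      i_attachment_subject = 'PlanningRunReport'
      i_att_content_hex    = lt_attachment ).

    lo_send_request->set_document( lo_document ).

    lo_recipient = cl_cam_address_bcs=>create_internet_address( 'support.team@example.com' ).
    lo_send_request->add_recipient( i_recipient = lo_recipient ).

    lo_send_request->send( ).
    COMMIT WORK.
  CATCH cx_bcs.
    "Handle / log the send error as appropriate
ENDTRY.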


Now coming to some useful technical objects that can be used to build the custom report program.

First let's see how Process Chain to Application Log mapping is possible through a set of tables.

RSPCLOGCHAIN table – enter the Process Chain technical name in CHAIN_ID to get the LOG_ID, matching on DATUM (date of the Process Chain run) and ZEIT (time of the run), or on CCMS_REPORTED (a UTC timestamp such as 20140811180341.0255260).

Then RSPCPROCESSLOG table – enter the LOG_ID of the Process Chain run and the process TYPE to get the INSTANCE.

Followed by RSPCINSTANCE table – enter TYPE, VARIANTE (matching the process step variants) and INSTANCE to get the LOG_HANDLE (a GUID low value).
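In ABAP, that chain of lookups could be sketched roughly as follows. Here p_chain, p_datum, p_type and p_vari stand for assumed selection-screen parameters of the custom report, and the field names follow the description above.

DATA: lv_log_id     TYPE rspclogchain-log_id,
      lv_instance   TYPE rspcprocesslog-instance,
      lv_log_handle TYPE balloghndl.

* Process chain technical name + run date -> LOG_ID of that particular run
SELECT SINGLE log_id FROM rspclogchain
  INTO lv_log_id
  WHERE chain_id = p_chain
    AND datum    = p_datum.

* LOG_ID + process type + step variant -> INSTANCE of the process step
SELECT SINGLE instance FROM rspcprocesslog
  INTO lv_instance
  WHERE log_id   = lv_log_id
    AND type     = p_type
    AND variante = p_vari.

* TYPE + VARIANTE + INSTANCE -> LOG_HANDLE (GUID) pointing to the application log
SELECT SINGLE log_handle FROM rspcinstance
  INTO lv_log_handle
  WHERE type     = p_type
    AND variante = p_vari
    AND instance = lv_instance.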

To find the relevant log number, the function module BAL_DB_SEARCH can be used. Pass the object APO_SNP, the sub-object HEU / DEP / TLB and the LOG_HANDLE in the I_S_LOG_FILTER import parameter to get the LOGNUMBER in the E_T_LOG_HEADER export parameter table.
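A sketch of that call, continuing from the LOG_HANDLE found above (the sub-object HEU is just one example; use DEP or TLB for the other runs):

DATA: ls_filter     TYPE bal_s_lfil,
      ls_r_object   TYPE bal_s_obj,
      ls_r_subobj   TYPE bal_s_sub,
      ls_r_handle   TYPE bal_s_logh,
      lt_log_header TYPE balhdr_t,
      ls_log_header TYPE balhdr.

* Restrict the search to the APO SNP log of the process step found above
ls_r_object-sign = 'I'.  ls_r_object-option = 'EQ'.  ls_r_object-low = 'APO_SNP'.
APPEND ls_r_object TO ls_filter-object.

ls_r_subobj-sign = 'I'.  ls_r_subobj-option = 'EQ'.  ls_r_subobj-low = 'HEU'.
APPEND ls_r_subobj TO ls_filter-subobject.

ls_r_handle-sign = 'I'.  ls_r_handle-option = 'EQ'.  ls_r_handle-low = lv_log_handle.
APPEND ls_r_handle TO ls_filter-log_handle.

CALL FUNCTION 'BAL_DB_SEARCH'
  EXPORTING
    i_s_log_filter = ls_filter
  IMPORTING
    e_t_log_header = lt_log_header
  EXCEPTIONS
    log_not_found      = 1
    no_filter_criteria = 2
    OTHERS             = 3.

IF sy-subrc = 0.
  READ TABLE lt_log_header INTO ls_log_header INDEX 1.
  "ls_log_header-lognumber is the application log number used in the next step
ENDIF.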

Next, pass the LOGNUMBER to the BALHDR table to get the LOG_HANDLE GUID value, and in turn pass that to the BALDAT table to get the cluster table entries.
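A minimal lookup for that step (g_lognumber holds the log number returned by BAL_DB_SEARCH):

DATA: g_lognumber  TYPE balognr,     "log number from BAL_DB_SEARCH
      g_log_handle TYPE balloghndl.

* Application log number -> log handle (GUID) in the log header table
SELECT SINGLE log_handle FROM balhdr
  INTO g_log_handle
  WHERE lognumber = g_lognumber.

* The log detail records are stored as cluster entries in BALDAT under this handle;
* they can be read with the standard application log function modules (e.g. BAL_DB_LOAD)
* or, as shown in the snippet below, imported directly from the BAL_INDX cluster.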


To extract the Application Log detail data, use the following code snippet in your program (confirm with your ABAPer):

IMPORT gt_log_det
  FROM DATABASE bal_indx(al)
  ID g_lognumber.


When the Application Log number is passed as G_LOGNUMBER, the output GT_LOG_DET is a 3 x 8 table containing the necessary log detail data: the Object (APO_SNP), the Sub-object (HEU / DEP / TLB), the program variant parameter values in IPARAM (including SELE_PAR for the Selection Profile), and the internal tables POCO and PODET, which contain the Order Detail data by Product, Source and Destination Location.


DISCLAIMER: I no longer consult for the client for whom this development was originally designed, but I understand it has been put into practice.
