Observability is key to the successful long-term maintenance of any system. One aspect we considered worth monitoring in our landscape was Adobe Document Services (ADS) and its usage from connected systems. For observability we use the ELK (Elasticsearch - Logstash - Kibana) stack. This blog article explains how we connected ELK to ADS and built a dashboard over the ADS logs.

One of the widgets built in Kibana

Enabling statistics in ADS

To have statistics written to the NWA logs, you need to enable IFbA statistics logging. This feature is available starting with Adobe Document Services 7.50 SP09; the required SP and patch levels are listed in Note 2714231 – IFbA: Configuration for collection of ADS performance data. The note explains that, if you have a compatible ADS release, a log entry under location "com.adobe.Monitoring" is generated every time a PDF form is printed, but it is not written to the log by default.

If you are not keen on using the ELK stack for observability and want plain TXT/Excel files for further study, you can follow the note's instructions. They outline the steps required to write ADS print statistics entries to separate files in the server filesystem, and the note also includes a Python script that generates an Excel file based on those files.

Below we will look only at getting these events into the ELK stack.

Enable writing statistics entries

First you need to enable writing statistics entries to the default location. For this you need to change the logging severity level for the "com.adobe.Monitoring" tracing location from the default ERROR to INFO. We did not need to change the logging location (as described in Note 2714231) since we already ingest developer trace files directly from NWA using Logstash.

Log configuration for com.adobe.Monitoring

After that, a new entry is added to the logfile every time a print request is sent to ADS.

Log entry as viewed in NWA Log Viewer

Information is written as comma-separated values with the columns described in Note 2714231. Here are some of the useful ones:

  • template: name of the print form

  • adsTime: time it took ADS to fulfil the request, in milliseconds

  • requestSize: size of the HTTP request to ADS in bytes

  • responseSize: size of the HTTP response from ADS in bytes

  • system: ID of the ABAP system which requested printing

  • client: ABAP client number which requested printing

  • user: ABAP user who triggered PDF printing

  • interactivePDF: true if the form was requested to be printed as an Interactive Form

  • success: false if the ADS subsystem call fails for any reason (ADS warnings are not considered a failure)
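To make the CSV layout concrete, here is a minimal Python sketch that parses one such log entry into typed fields. The column order matches the Logstash configuration shown later in this article; the sample log line and its values are invented purely for illustration:

```python
import csv

# Column order as listed in Note 2714231 and used in the Logstash filter below
COLUMNS = ["template", "adsTime", "requestSize", "responseSize", "system",
           "client", "user", "transactionType", "renderOutputType",
           "numberOfPages", "interactivePDF", "xmlFormTime", "templateUuid",
           "success"]

INT_FIELDS = {"adsTime", "requestSize", "responseSize", "numberOfPages", "xmlFormTime"}
BOOL_FIELDS = {"interactivePDF", "success"}

def parse_ads_entry(line):
    """Parse one com.adobe.Monitoring CSV entry into a typed dict."""
    values = next(csv.reader([line]))
    entry = dict(zip(COLUMNS, values))
    for field in INT_FIELDS:
        entry[field] = int(entry[field])
    for field in BOOL_FIELDS:
        entry[field] = entry[field].lower() == "true"
    return entry

# Hypothetical sample entry (all values invented for illustration)
sample = ("Z_INVOICE_FORM,412,18432,52311,PRD,100,JDOE,"
          "sync/usage,pdf/print,3,false,87,0a1b2c3d,true")
print(parse_ads_entry(sample)["adsTime"])  # → 412
```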

Export to ELK stack

The following section assumes that Logstash is already connected to the NWA logs. The following assumptions about the ELK setup are also made:

  • Log messages marked with "NWA" tag are sourced from on-premise NWA installations

  • Field "sap_location" contains "location" field from NWA log entry

  • Field "message_text" contains "message" field from NWA log entry

Here is a Logstash filter configuration we used for processing ADS monitoring log entries:
filter {
  # parsing of Adobe monitoring info
  if "NWA" in [tags] and [sap_location] == "com.adobe.Monitoring" {
    csv {
      # mark entry with ADS tag for easy search
      add_tag => ["ADS"]

      # split CSV message into fields
      autogenerate_column_names => false
      target => "ads"
      columns => ["template", "adsTime", "requestSize", "responseSize", "system", "client", "user", "transactionType", "renderOutputType", "numberOfPages", "interactivePDF", "xmlFormTime", "templateUuid", "success"]
      convert => {
        "adsTime" => "integer"
        "requestSize" => "integer"
        "responseSize" => "integer"
        "numberOfPages" => "integer"
        "interactivePDF" => "boolean"
        "xmlFormTime" => "integer"
        "success" => "boolean"
      }
      separator => ","
      skip_header => true
      source => "message_text"
    }

    mutate {
      # concat system and client into one field
      update => {
        "[ads][system]" => "%{[ads][system]}/%{[ads][client]}"
      }

      # remove client field
      remove_field => ["[ads][client]"]

      # split fields
      split => {
        "[ads][renderOutputType]" => "/"
        "[ads][transactionType]" => "/"
      }
    }
  }
}

Elasticsearch will receive the new fields under the "ads" field group. These fields are then available for discovery and analytics in Elasticsearch.

List of ADS-related fields available in ELK stack

Table view of ADS statistics in Kibana

We built a whole dashboard in Kibana using ADS logs

Use cases for ADS statistics

ADS statistics give us a good understanding of which print forms are being used by end users and who those users are.

This information is vital for:

  • Updating regression testing scenarios to include the print forms currently in use

  • Phasing out old, no-longer-used print forms, especially custom ones

  • Not spending effort on SAP updates to forms which are not in use

  • Estimating ADS subsystem performance and load profile and tuning it accordingly

  • Performing unit tests on all currently in-use print forms whenever major changes to the ADS configuration are made
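For the performance and load-profile use case, the "ads" fields can also be aggregated directly in Elasticsearch, outside of Kibana. Below is a hedged sketch of a query body (the index pattern, the ".keyword" sub-field, and the exact field paths are assumptions; adapt them to your own mapping):

```python
import json

# Average ADS response time and total pages per template over the last 7 days.
# Index pattern "logstash-*" and field names under "ads" are assumptions.
query = {
    "size": 0,
    "query": {"range": {"@timestamp": {"gte": "now-7d"}}},
    "aggs": {
        "per_template": {
            "terms": {"field": "ads.template.keyword", "size": 20},
            "aggs": {
                "avg_ads_time_ms": {"avg": {"field": "ads.adsTime"}},
                "total_pages": {"sum": {"field": "ads.numberOfPages"}},
            },
        }
    },
}

# This body would be POSTed to an endpoint such as
# http://elasticsearch:9200/logstash-*/_search (host and index assumed)
print(json.dumps(query, indent=2))
```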

Enabling ADS statistics also allowed us to identify the print forms that generated warning messages in the production environment. Once we knew the source of these warnings, we were able to create a well-defined task for developers to fix the print form issues. Below are snapshots of both the NWA Log Viewer and Kibana views of the same data.

NWA Log Viewer with ADS warnings and print template name

Kibana with ADS warnings and print template name (same as above)
