ABAP Blog Posts
Greltel

Introduction

Logging in ABAP is one of those things that sounds boring until you've done it five times and realized you've written five different solutions. How many custom loggers have you written in your ABAP career? I've lost count.

The scenario keeps repeating. One developer needs structured messages, another insists on SLG1, and the business wants a JSON dump for a dashboard. Three projects later, you're deep in the function modules BAL_LOG_CREATE and BAL_LOG_MSG_ADD, maintaining a massive Z-class that is somehow completely different from the one you wrote for the last project.

Then came ABAP Cloud. Those old function modules weren't released anymore. SAP swapped out the BAL_* family for the cl_bali_* classes (like cl_bali_log and cl_bali_header_setter). The new API is better: cleaner, properly object-oriented, and it makes more sense once you get it. The problem is getting it. There's a learning curve, and almost no community tooling around it yet. The only one I have found is this. Shoutout to the author (Björn Schulz) for the inspiration.
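To give a feel for that learning curve, here is roughly what writing and saving a single free-text message looks like with the raw cl_bali_* API. This is a sketch from the released classes; verify the exact signatures in the SAP documentation before relying on it:

```abap
TRY.
    " Create the log and attach a header (object / subobject identify it in the log display)
    DATA(log) = cl_bali_log=>create( ).
    log->set_header( header = cl_bali_header_setter=>create(
                                object    = 'Z_MY_LOG_OBJ'
                                subobject = 'STEP_1' ) ).

    " Every message is its own "item setter" object
    log->add_item( item = cl_bali_free_text_setter=>create(
                            severity = if_bali_constants=>c_severity_status
                            text     = 'Process started.' ) ).

    " Persisting is a separate database service class
    cl_bali_log_db=>get_instance( )->save_log( log = log ).
  CATCH cx_bali_runtime.
    " every one of these calls can raise cx_bali_runtime
ENDTRY.
```

Four classes and an exception handler for one message. That is the boilerplate the logger below wraps away.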

So, I decided to build a new one from scratch. With a few rules:

  • ABAP Cloud compatible. No legacy types, no unreleased APIs. Has to run on Steampunk and Embedded Steampunk natively.

  • One entry point for everything. Free text, symsg, exceptions, BAPIRET2 tables, or an arbitrary structure I want dumped to JSON — I shouldn't have to think about which method to call.

  • A fluent API. The kind where the code reads almost like a sentence.

  • Multiton instances keyed on object / subobject / external number. Two unrelated parts of the same transaction should be able to log to the same place without passing instances around, and without paying the cost of recreating loggers every time.

  • All those small things I keep hardcoding anyway. Built in, once.

    What came out of all this is ZCL_CLOUD_LOGGER, a fully open-source logger built specifically for ABAP Cloud.

    In this post, I’ll show you exactly how it works, dive into some of the architectural choices (and bugs I hit along the way), and explain how you can easily drop it into your own systems.

    Installation

    If you've used abapGit before, you already know the drill. If not, this is the moment to install it: abapGit is the standard tool for distributing ABAP open source.

    The GitHub project for the Cloud Logger can be found here.

    The simplest possible example

    Once installed, here's what the absolute minimum looks like:

    DATA(log) = zcl_cloud_logger=>get_instance(
      object    = 'Z_MY_LOG_OBJ'
      subobject = 'STEP_1' ).
    
    log->log_string_add( 'Process started.' ).
    
    " ... do stuff ...
    
    log->log_string_add( 'Process finished.' )
       ->save_application_log( ).

    Logging anything

    The standard SAP API forces you to juggle completely different calls depending on whether you're dealing with a BAPIRET2 table, an exception class, or just a standard sy-msg. My class still gives you specific methods for those, but behind the scenes everything is routed through the exact same core logic. This guarantees your log output stays consistent, no matter what kind of data you just tossed at it. You can also log arbitrary data, serialized to JSON.

    " A symsg-style message
    log->log_message_add( symsg = VALUE #( msgty = 'E'
                                           msgid = 'ZMY_MSGS'
                                           msgno = '042'
                                           msgv1 = 'Order 100' ) ).
    
    " An exception
    TRY.
        do_something_risky( ).
      CATCH cx_root INTO DATA(err).
        log->log_exception_add( exception = err
                                severity  = 'E' ).
    ENDTRY.
    
    " Whole BAPIRET2 table — and only the errors
    log->log_bapiret2_table_add( bapiret2_t   = lt_bapi_messages
                                 min_severity = 'E' ).
    
    " Log serialized to JSON automatically
    log->log_data_add( data = ls_my_payload ).

    Sticky context

    Say you're looping through 50 sales orders, and each one goes through multiple validations. If one fails, your log needs to show exactly which order caused it. Without sticky context, you're forced to manually concatenate the order ID into every single string before logging it. With sticky context, you just set it once per iteration:

    log->set_context( |Order { order_id }| ).
    
    log->log_string_add( 'validation started' ).
    log->log_string_add( 'price check OK' ).
    log->log_string_add( 'stock check failed' ).
    
    log->clear_context( ).
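
    In the 50-orders scenario, that becomes one set_context per loop pass. A minimal sketch (sales_orders, order-id, and validate_order are placeholder names, not part of the logger API):

```abap
LOOP AT sales_orders INTO DATA(order).
  " Tag every message in this iteration with the order number
  log->set_context( |Order { order-id }| ).

  validate_order( order ).   " any log calls inside inherit the context

  log->clear_context( ).
ENDLOOP.
```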

    Built-in timing

    Every project eventually gets a requirement like "how long did this specific step take?". I always end up writing the same timer logic anyway, so I built it directly into the logger:

    log->start_timer( ).
    
    call_external_service( ).
    
    log->stop_timer( 'External service call' ).
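
    Internally this boils down to the timestamp arithmetic I kept rewriting by hand. A hypothetical sketch of what the two methods can look like using the released utclong built-ins (attribute and method names assumed for illustration):

```abap
" Hypothetical internal sketch, not the actual implementation
METHOD start_timer.
  timer_start = utclong_current( ).   " instance attribute, TYPE utclong
  result = me.                        " keep the fluent chain going
ENDMETHOD.

METHOD stop_timer.
  " utclong_diff returns the elapsed time in seconds as a decfloat
  DATA(seconds) = utclong_diff( high = utclong_current( )
                                low  = timer_start ).
  log_string_add( |{ label }: { seconds } s| ).
  result = me.
ENDMETHOD.
```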

    The internal error trail

    I'll admit, almost every logger I've ever written has the same dirty secret somewhere in the code: CATCH cx_bali_runtime INTO DATA(swallowed). " ignore. You obviously don't want a simple read method like log_contains_error to trigger a dump and blow up the calling program, so you just swallow the exception. That works fine until something actually breaks and you have no trace of what happened. This logger swallows too, but records every swallowed exception in an internal error trail you can inspect:

    DATA(errors) = log->get_internal_errors( ).
    " Returns a table of timestamp / method_name / error_text
    " for every cx_bali_runtime that was caught silently.
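
    The underlying pattern is simply "catch, record, carry on". A hypothetical sketch of how the save method can apply it (bali_log, internal_errors, and the structure fields are assumed names, matching the table returned above):

```abap
" Hypothetical internal sketch of the catch-and-record pattern
METHOD save_application_log.
  TRY.
      cl_bali_log_db=>get_instance( )->save_log( log = bali_log ).
    CATCH cx_bali_runtime INTO DATA(error).
      " Record the failure instead of dumping or silently dropping it
      INSERT VALUE #( timestamp   = utclong_current( )
                      method_name = 'SAVE_APPLICATION_LOG'
                      error_text  = error->get_text( ) )
             INTO TABLE internal_errors.
  ENDTRY.
  result = me.
ENDMETHOD.
```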

    Multiton config validation

    There is a specific flaw with the multiton pattern. If two callers ask for the exact same (object, subobject, ext_number) key but pass different db_save flags, the second caller just silently gets the first caller's instance. Then they waste time debugging why their db_save = abap_false log ended up being saved to the database anyway. To prevent this, the logger checks the config and throws if there's a mismatch.

    " First caller
    zcl_cloud_logger=>get_instance(
      object = 'Z_LOG' subobject = 'STEP' db_save = abap_true ).
    
    " Second caller, later in the same transaction
    zcl_cloud_logger=>get_instance(
      object = 'Z_LOG' subobject = 'STEP' db_save = abap_false ).
    " → raises zcx_cloud_logger_error: config_mismatch
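
    Conceptually, the lookup plus validation is just a keyed table read before instantiation. A hypothetical sketch (the instances table, its fields, and the constructor signature are assumed names, not the actual source):

```abap
" Hypothetical sketch of the multiton lookup with config validation
METHOD get_instance.
  READ TABLE instances ASSIGNING FIELD-SYMBOL(<entry>)
       WITH TABLE KEY object = object subobject = subobject ext_number = ext_number.
  IF sy-subrc = 0.
    IF <entry>-db_save <> db_save.
      " Same key, different config: fail loudly instead of silently reusing
      RAISE EXCEPTION NEW zcx_cloud_logger_error( ).
    ENDIF.
    result = <entry>-logger.
  ELSE.
    result = NEW zcl_cloud_logger( object = object subobject = subobject
                                   ext_number = ext_number db_save = db_save ).
    INSERT VALUE #( object = object subobject = subobject ext_number = ext_number
                    db_save = db_save logger = result ) INTO TABLE instances.
  ENDIF.
ENDMETHOD.
```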

    Full Example

    Here is a full example based on a real-world project:

    DATA(log) = zcl_cloud_logger=>get_instance(
      object               = 'Z_INVOICE'
      subobject            = 'POSTING'
      enable_emergency_log = abap_true ).
    
    log->set_context( |Invoice { invoice_id }| )
       ->start_timer( ).
    
    TRY.
        post_invoice( invoice_id ).
        log->log_string_add( string = 'Posted successfully'
                             msgty  = log->c_message_type-success ).
    
      CATCH cx_root INTO DATA(err).
        log->log_exception_add( exception = err severity = 'E' ).
    ENDTRY.
    
    log->stop_timer( 'Invoice posting' )
       ->clear_context( )
       ->save_application_log( ).

    Conclusion

    Logging is the kind of code nobody notices when it works. The whole point of this class was to stop solving the same problem on every project and have something I can drop in and forget about. Fluent API, multiton, sticky context, timing, JSON dumps for the weird requirements — all the stuff I kept rebuilding by hand.

    The project is open source, MIT-licensed, and runs in ABAP Cloud. I've worked through some bugs to get it stable, and it's now in real use. If you give it a try, tell me what's missing. Feel free to open an issue, submit a pull request, or just send me a message. Find it on GitHub.