Technology Blogs by Members
This is the second blog post in the series on Centralised Transport Naming Standards.

See also blog posts:

Centralised Transport Naming Standards

Centralised Transport Naming Standards - Branch By Abstraction

Centralised Transport Naming Standards – Service Now Integration

As mentioned in my previous blog post, I was interested in understanding how the solution could be ported to the SCP ABAP Environment. In theory at least, it is a logical place for it, since the solution isn't dependent on any core SAP logic such as FMs or global classes. I therefore thought it would be a useful exercise, for several reasons:

  1. To update my skills to use the latest language concepts to then further document development standards for this environment

  2. Understand the scope and also the current limitations of the environment

  3. Start to form a strategy around other migrations to SCP ABAP Environment that had been already suggested internally

This blog post combines points 2 and 3: it is a step-by-step account of the migration, identifying the major issues along the way, but also applies some benefit of hindsight as to how I would approach this a second time around.

Step One - Check the code in the current environment

As per olga.dolinskaja's blog post How to check your custom ABAP code for SAP Cloud Platform ABAP Environment, the first step was to run the SAP_CP_READINESS_CHECK on the current code. I was hopeful there would be a decent number of 'Quick Fixes', but unfortunately there were only two 😞


These were:

  1. To change the DESCRIBE keyword to use LINES instead (example above)

  2. To change a SELECT SINGLE to specify the full key
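To illustrate the two quick fixes, here is a minimal before/after sketch against the statistics table shown later in this post (the variable names are invented for illustration):

```abap
" Before: flagged by the readiness check
DESCRIBE TABLE lt_transports LINES lv_count.
SELECT SINGLE * FROM zcau_tr_nm_stats
  WHERE system_id = @lv_sysid              " partial key only
  INTO @DATA(ls_stats).

" After: use the lines( ) built-in and specify the full key
lv_count = lines( lt_transports ).
SELECT SINGLE * FROM zcau_tr_nm_stats
  WHERE system_id        = @lv_sysid
    AND transport_number = @lv_trkorr
    AND id               = @lv_id
  INTO @DATA(ls_stats_full).
```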

There were also a multitude of other errors and warnings that primarily fell into two camps:

  1. Test for Restricted Language Scope (ABAP Language Version) - 11 Errors, 3 warnings

  2. Whitelist Check (on-premise) - 34 Errors, 37 Warnings

Plus there were two other errors related to table maintenance dialogs that appeared under the heading 'Check for allowed object types in SAP Cloud Platform ABAP Environment'. These are obviously due to dynpros not being allowed in the target environment, which was fair enough.

Hindsight tip #1 - I should have copied the code to a new package and made as many adjustments in the current environment as possible.

Step Two - Use abapGit to Export from On Premise and Import to Cloud

Next was the task to enact another blog post by olga.dolinskaja, in this case How to bring your ABAP custom code to SAP Cloud Platform ABAP Environment, as well as studying the tutorial Use abapGit to Transform ABAP Source Code to the Cloud that she references at the bottom of that post. In our case, rather than GitHub, I used Azure DevOps, but the procedure is pretty much the same.

Here is the import log where you can see the table maintenance dialogs I mentioned before.


Here they are listed as successful, but clearly they cannot be used in the new environment and unfortunately I cannot delete them there either!

Hindsight Tip #2 - I should have paid more attention at this point, specifically around the objects that are supported in Steampunk. Coupled with Hindsight Tip #1, the package ready for upload to Steampunk should contain only the supported objects.


Step Three - Remediate the objects in Steampunk

After import there were a lot of errors. I decided to work from the bottom up, especially since the top was completely broken anyway (dynpro) and had to be deleted (somehow).

The main areas for remediation were:

  • DDIC - creating domains and data elements

Very few domains and data elements exist within Steampunk (93 data elements and 22 domains at the time of writing), and the advice from SAP is to create your own anyway. Even for this small piece of development, I had to create 17 data elements and 17 domains. Not difficult, of course, just time consuming. Naturally, as you build up your own environment, it would be wise to have a strategy for how/when you create DDIC objects versus use your existing namespace ones.

  • DDIC - table definitions

Now that the required data elements and domains are in place, the table definitions needed to be adjusted accordingly, refactoring the typing of fields to the newly created data elements. Also, system fields that we would ordinarily take for granted are not available in their standard guise: sy-mandt, for example, does not exist, so use abap.clnt instead. Thankfully, sy-subrc is still there!

An example of one table is below:

@EndUserText.label : 'Transport Naming Standards Statistics'
@AbapCatalog.enhancementCategory : #NOT_EXTENSIBLE
@AbapCatalog.tableCategory : #TRANSPARENT
@AbapCatalog.deliveryClass : #A
@AbapCatalog.dataMaintenance : #ALLOWED
define table zcau_tr_nm_stats {
  key mandt            : abap.clnt not null;
  key system_id        : zcau_dev_sysid not null;
  key transport_number : zcau_trkorr not null;
  key id               : zcau_id not null;
  transport_type       : zcau_transport_type;
  transport_title      : zcau_transport_title;
  transport_user       : zcau_transport_user;
  transport_date       : zcau_transport_date;
  transport_time       : zcau_transport_time;
  error_code           : zcau_tr_err_cde;
}


  • Table maintenance

No ABAP generator as per on premise, so this involves CDS views (should be two now, I believe: an interface view and a consumption view?), plus the Behaviour Definition and Implementation, then the Service Definition and Binding. The Fiori Elements UI is then accessed from the Service Binding. (No FLP currently, I understand, although it is on the roadmap.)
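As a rough sketch of the top of that stack, the Service Definition simply exposes the consumption view; the view and service names below are invented for illustration, and the Service Binding itself is then created via a wizard in ADT:

```abap
@EndUserText.label: 'Landscape Configuration Maintenance'
define service zcau_ui_lndscpe_cfg {
  expose zcau_c_lndscpe_cfg as LandscapeConfig;
}
```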

 I've aligned the objects alongside the RAP model in the diagram below as best as I understand them.

I don't pretend to think that this is the best practice way to implement this, so if anyone can steer me to the correct way, feel free 🙂
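For context, the Behaviour Definition for an unmanaged business object declares the operations whose implementation you then code yourself. A hedged sketch follows (the entity, alias and class names are assumptions, and the exact syntax has varied between Steampunk releases):

```abap
unmanaged implementation in class zbp_cau_lndscpe_cfg unique;

define behavior for zcau_i_lndscpe_cfg alias landscape_config
{
  create;
  update;
  delete;
}
```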

I'm also assuming that I have created an unmanaged business object, as I have created the CRUD operations myself. The outline structure can be seen expanded out below:

and then in greater detail showing the Behaviour Implementation structure:

and a snippet of the coding within is below. For reference, I used the code from the Flight Model - package /DMO/FLIGHT - that I had previously imported via the Downloading the ABAP Flight Reference Scenario page on the ABAP RESTful Application Programming Model standard SAP help page, also available as a .pdf.
* 1) define the data buffer and the handler for the CRUD operations
CLASS lcl_handler DEFINITION FINAL INHERITING FROM cl_abap_behavior_handler.
  PRIVATE SECTION.
    TYPES: BEGIN OF ty_buffer.
             INCLUDE TYPE zcau_lndscpe_cfg AS data.
    TYPES: END OF ty_buffer.

    TYPES tt_config TYPE SORTED TABLE OF ty_buffer WITH UNIQUE KEY dev_system.

    CLASS-DATA mt_buffer TYPE tt_config.

    " 2) declare the handler methods for create/update/delete, read and lock
    METHODS modify FOR BEHAVIOR IMPORTING
      roots_to_create FOR CREATE landscape_config
      roots_to_update FOR UPDATE landscape_config
      roots_to_delete FOR DELETE landscape_config.

    METHODS read FOR BEHAVIOR
      IMPORTING it_config_key FOR READ landscape_config RESULT et_config.

    METHODS lock FOR BEHAVIOR
      IMPORTING it_config_key FOR LOCK landscape_config.
ENDCLASS.


Naturally, this is quite an onerous task just to maintain a table, especially when we are used to creating this facility in a few clicks. Glad to hear that SAP have it on the roadmap to address this.

The last item that I wish to highlight is:

  • Lock Objects

This was an imported object that, at least on the surface, was successful.

However, when I tried to use it, I encountered issues. Although the object was there, my code couldn't 'see' it. I couldn't create it using the same name either, so a bit of a stalemate. Therefore, eventually I created a new object and deleted the old one.
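For reference, in the cloud environment a lock object is consumed through the released factory API rather than the generated ENQUEUE_/DEQUEUE_ function modules from on premise. A hedged sketch (the lock object name and field name are assumptions):

```abap
TRY.
    " obtain the lock object instance and request an exclusive lock
    DATA(lo_lock) = cl_abap_lock_object_factory=>get_instance(
                      iv_name = 'EZCAU_LNDSCPE_CFG' ).
    lo_lock->enqueue(
      it_parameter = VALUE #( ( name  = 'DEV_SYSTEM'
                                value = REF #( lv_dev_system ) ) ) ).
  CATCH cx_abap_foreign_lock.
    " another user already holds the lock
  CATCH cx_abap_lock_failure.
    " technical problem talking to the enqueue server
ENDTRY.
```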


Step Four - ABAP Unit Testing

As I had only copied the unit tests from the previous on-premise code, this did not involve too much apart from type changes to reflect those already made in the main class. Later on I had to refactor them again but this was a result of the complete restructuring exercise (more details to come in the next blog).

What I would like to say though is that I was glad I had the tests there as they had a fundamental role in me knowing whether my code was sufficiently working before attempting to test via the remote system.


Step Five - Testing via a Remote System

As Inbound RFCs are currently not available in Steampunk (this is promised for Q3), I had to completely change the structure of the inbound access in favour of an HTTP Service. The creation of the service was easy enough, but the surrounding creation of the Communication Scenario, Communication Arrangement, Inbound Service, Communication User and Communication System seemed quite complex. I followed the tutorial and also had a lot of help from SAP Expert andreas.zimmermann, but still this took a while to come together. I am led to believe this will be simplified in later versions. I certainly hope so!
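On the Steampunk side, an HTTP service is backed by a handler class implementing IF_HTTP_SERVICE_EXTENSION. A hedged sketch of the shape of such a class follows (the class name and payload handling are assumptions; the real implementation calls the naming standard check):

```abap
CLASS zcl_cau_check_tr_title DEFINITION PUBLIC FINAL CREATE PUBLIC.
  PUBLIC SECTION.
    INTERFACES if_http_service_extension.
ENDCLASS.

CLASS zcl_cau_check_tr_title IMPLEMENTATION.
  METHOD if_http_service_extension~handle_request.
    " read the JSON payload posted by the on-premise system
    DATA(lv_body) = request->get_text( ).
    " ... validate the transport title against the naming standards here ...
    response->set_status( i_code = 200 i_reason = 'OK' ).
    response->set_text( i_text = lv_body ).
  ENDMETHOD.
ENDCLASS.
```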

Also, as a consequence of no Inbound RFC, I had to move the local class (and local test class) implementations out of the original function module/group and into a global class. This was actually a good thing and something I wanted to do in the original on-premise version anyway. It certainly showed me the limitations of having the local class and is something to take on board in the future.

As a reminder, this was the original setup from part one (with the RFC mentioned this time)

Now on Steampunk, the complexity rises somewhat as shown by the diagram below.


Firstly, this was tested out just using the console logging functionality featured in the Create Your First ABAP Console Application tutorial, which basically boils down to implementing the IF_OO_ADT_CLASSRUN interface, the method IF_OO_ADT_CLASSRUN~MAIN, and the out->write( ) statement.
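For completeness, such a console check class looks roughly like this (the class name and output text are invented for illustration):

```abap
CLASS zcl_cau_console_check DEFINITION PUBLIC FINAL CREATE PUBLIC.
  PUBLIC SECTION.
    INTERFACES if_oo_adt_classrun.
ENDCLASS.

CLASS zcl_cau_console_check IMPLEMENTATION.
  METHOD if_oo_adt_classrun~main.
    " run the check logic and log the outcome to the ADT console
    out->write( |Transport title check completed| ).
  ENDMETHOD.
ENDCLASS.
```

Run it with F9 in ADT and the output appears in the ABAP Console view.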

Then I tested via a program provided by andreas.zimmermann:

constants: lc_use_rfc type abap_bool value abap_false.

data: lo_http_client  type ref to if_http_client,
      lo_rest_client  type ref to cl_rest_http_client,
      lo_response     type ref to if_rest_entity,
      lo_request      type ref to if_rest_entity,
      lv_rfc_dest(20) type c,
      lv_http_status  type string,
      lv_body         type string,
      lv_resp_data    type string,
      lv_resp_reason  type string,
      lv_resp_status  type string,
      lv_resp_code    type i,
      lv_resp_length  type i,
      lv_resp_type    type string,
      lt_resp_headers type tihttpnvp,
      lv_msg          type string,
      lv_url          type string.

lv_url = `https://<enter your URL here>/sap/bc/http/sap/z_cau_check_transport_title/?sap-client=100`.

"create the HTTP client for the service URL
cl_http_client=>create_by_url(
  exporting  url                = lv_url
             ssl_id             = 'ANONYM'
             sap_client         = '100'
  importing  client             = lo_http_client
  exceptions argument_not_found = 1
             plugin_not_active  = 2
             internal_error     = 3
             pse_not_found      = 4
             pse_not_distrib    = 5
             pse_errors         = 6
             others             = 7 ).
if sy-subrc ne 0.
  lv_msg = 'Failed to create HTTP client: RC=' && sy-subrc.
  write: / lv_msg color col_negative.
  return.
endif.

lo_http_client->propertytype_accept_cookie   = if_http_client=>co_enabled.
lo_http_client->propertytype_accept_compress = if_http_client=>co_enabled.
lo_http_client->authenticate( username = '<the username you setup>'
                              password = '<the password you setup>'
                              client   = '100' ).
lo_http_client->request->set_version( if_http_request=>co_protocol_version_1_1 ).
lo_http_client->request->set_header_field( name = 'connection' value = 'keep-alive' ).
"set other header values (optional)
lo_http_client->request->set_header_field( name  = 'Content-Type'
                                           value = 'text/xml; charset=utf-8' ).
"send GET/POST request (with default timeout)
"lo_http_client->request->set_method( if_http_request=>co_request_method_get ).
lo_http_client->request->set_method( if_http_request=>co_request_method_post ).
lo_http_client->request->set_content_type( 'application/json;odata=verbose' ).
lo_http_client->request->set_cdata( `{ "tr_number":"PF2K000815", "tr_description":"My invalid test object"}` ).

lo_http_client->request->set_form_field( name = 'REQUEST' value = 'SYSK123456' ).
lo_http_client->request->set_form_field( name = 'TYPE'    value = 'K' ).
lo_http_client->request->set_form_field( name = 'SYSID'   value = 'NKP' ).
lo_http_client->request->set_form_field( name = 'TEXT'    value = 'CAU:R1:INC0001234:Steampunk:1' ).
lo_http_client->request->set_form_field( name = 'OWNER'   value = '<your userid>' ).

lo_http_client->send( timeout = if_http_client=>co_timeout_default ).
"read response
lo_http_client->receive( ).

"read the HTTP status code and reason of the response
lo_http_client->response->get_status(
  importing code   = lv_resp_code
            reason = lv_resp_reason ).
lv_resp_status = lv_resp_code.

"dump response header
skip 1.
write: / 'HTTP Response Header' color col_heading.
lo_http_client->response->get_header_fields( changing fields = lt_resp_headers ).
loop at lt_resp_headers assigning field-symbol(<f>).
  write: / `[`, <f>-name, `]`, at 40 <f>-value.
endloop.

"read response data
lv_resp_data = lo_http_client->response->get_cdata( ).

skip 1.
write: / 'Raw Response' color col_heading.
write: / lv_resp_data.

This then enabled me to at least test the connection between the two systems before then slotting the code into the BAdI to trigger on transport creation/release. This is only in a sandbox environment of course.


Step Six - Refactoring

As well as extracting the local class out of the FM and creating the global class, I also took the opportunity to further enhance the solution to enable easier changes to the code down the line, by introducing a concept known as Branch By Abstraction (covered in detail by the legendary Martin Fowler). I'll cover this in detail in Part Three.


Closing Remarks

Overall this was a great experiment in migrating a fairly simple solution to Steampunk, one that can now pave the way for a couple of more substantial use cases. These are being considered as they don't logically belong on the system they currently reside on, and are also quite memory intensive. However, before venturing too far down the Steampunk track, we'll also need to understand the consumption model, i.e. what drives cost within the new environment? Perhaps this will drive efficiency in terms of coding, but will it also stop us migrating code that is memory or CPU intensive in the first place? We'll have to see.

So for now, I will continue to invest some time in Steampunk as currently I personally see an immediate future where we will certainly migrate code but also prioritise developing solutions natively. This could be the perfect bridge to enable us to take on new functionality available in an evergreen environment, coupled to the older 'backends' of perhaps a NW 7.31/7.02 system used as a system of record. Features are being added all the while so there will be refactoring along the way but that's just the flipside of being an early adopter and taking on new functionality.

I'm also pleased to see that there are new tutorials being created all the time that will help to hone my knowledge in this environment. Keep them coming!

