Application Development Blog Posts
Learn and share on deeper, cross technology development topics such as integration and connectivity, automation, cloud extensibility, developing at scale, and security.
Did you know that CL_SALV_TABLE is not officially editable? And that making changes to existing programs is often quite difficult? You can find out how you can address both problems by reading this long rambling incoherent blog.



For many years now I have been posting an annual blog bemoaning the fact that CL_SALV_TABLE has less functionality than its “father” CL_GUI_ALV_GRID – the most important point being that the CL_SALV_TABLE is not editable.

This is in fact the ultimate example of banging one’s head against a brick wall – for whatever reason it seems to be part of the human condition that if one makes a mistake, and that mistake is pointed out, then the reaction is to move heaven and earth to explain that this is not a mistake at all, it was deliberate, and it is a really good thing. People are willing to spend one million billion times the effort and cost to defend a mistake that it would have taken to fix it.

Whilst I was writing this blog it came out that in ABAP 7.56 all existing workarounds for making the CL_SALV_TABLE editable had been removed and a new workaround had been created (which will most likely be removed in 7.57 now it has been discovered). The point is that editing is not officially supported in any ALV technology, and that is a mistake because end users love it, and that mistake is not acknowledged. If the whole concept is so horrible, why is it allowed in UI5?

This refusal to admit something is wrong is not unique to SAP, and not even to the software industry. However, since my focus is SAP, here is another example – FOR ALL ENTRIES. As you know, if the selection table is empty then a full table scan is done, i.e. every row is read. That is an obvious bug and an easy fix, but it will never be fixed because fixing the bug is not downward compatible and apparently some people might be relying on the bug (!) and so fixing it might cause their programs to break.

That last argument never made any sense to me. Let us say you put a row of streetlights in XYZ street and because of a design fault none of them have ever worked. Due to budget constraints there is no money available to fix them until 12 months after they were installed. Should the council say, “Oh we cannot possibly fix them now, people have become used to the lights not working, if we fix them it might disturb them, maybe they did not bother to install curtains in their bedroom as there was no streetlight to disturb them”.


A man is in a restaurant. When the meal arrives it is terrible. He demands to see the chef. The waiter says, “I am sorry sir – at the moment the chef is at the restaurant next door having his lunch”.

You have probably heard the phrase “we eat our own dog food”, meaning that a company uses its own products, e.g. a chef eats at his own restaurant. SAP has changed that phrase to “we drink our own champagne”, which completely and utterly misses the point.

The point is that to a human dog food looks and smells disgusting. So the originator of the phrase was saying that HIS dog food was so incredibly wonderfully good that even a human would love it and in fact he ate it himself.

So – translate that to “Our champagne is just about good enough that it is possible a human could drink it! In fact I drank some of our champagne myself just prior to being violently sick!” and it does not come out as such a ringing endorsement of the product.

In this example the dog food that I eat is the way to make a CL_SALV_TABLE editable, which I describe in my book “ABAP To the Future” and have discussed in assorted blogs here on the SCN. It is the subject I get the most correspondence about of all the chapters – to put this in context, for every query I get about UI5 I get about 20 about the ALV. This indicates to me that report programs in the SAP GUI are not going away anytime soon.

Now despite the fact that an SAP Press Author gets roughly ten million dollars in royalties paid in gold bullion for every book sold, and slightly less for an E-Bite, this is not my only source of income. I actually have a real job in that I am an ABAP programmer for global building materials company Heidelberg Cement.

Naturally at work every so often there is a requirement to write some sort of interactive program in which the users see an ALV grid and are able to change some of the values.

Do I do the sensible thing and use CL_GUI_ALV_GRID or do I do what I tell people to do in my books and use CL_SALV_TABLE plus the assorted workarounds I describe to make some columns editable?

The first would be a case of “do as I say, not as I do”, but as it turns out I do the second – I actually do eat my own dog food. One benefit of this is that it makes what I write realistic – another is that if a new business requirement comes along then I can improve the code to accommodate that new requirement and update my miracle solution.

This is what this blog is actually about.


If I can call anything a life-changing experience, it was when there was a blog on the SCN about the book “Antifragile”.

That book is not about software development – the examples are generally about biological or political entities, both of which have the capacity to actually turn out stronger when something really bad happens to them.

I will give you some examples (not from the book) which are close to me. I have shares in the UK pub company “Fuller, Smith and Turner”. Circa 2009 you may well recall the “Global Financial Crisis”. I put that in quotation marks because not every country was affected. Anyway the UK certainly was but in the end FST came out of it in a stronger position than before because they were able to buy ten iconic pubs in London that they would never ever in a million years have been able to afford before the crisis.

You might say that is just pot luck and taking advantage of someone else’s misfortune (the people who had to sell those pubs at a fire sale price). So let us take another example using the same company.

In the year 2019 FST sold their brewery business to the Japanese brewer Asahi and just retained the pubs. In the same year UK company Whitbread sold off Costa Coffee and just retained their hotel business.

After all, it was not as if something would come along early 2020 that would shut down every single pub and hotel in the world for at least a year, sometimes two years, but allow breweries and coffee companies to keep operating. Nothing like that could possibly happen!

As we know, that is exactly what happened, and that is about as big a disaster as you could imagine happening to a pub company. If they still had the brewery they would at least have still had some money coming in (in the same way the airline Qantas still had quite a bit of money coming in from reselling pet insurance and other strange things like that); as it actually was, they had nothing.

Now there was no way the UK government was going to let the pub industry be destroyed so there was a tax holiday and what not, but most pub companies went into hibernation – after all, how can you spend money on anything when your revenue has shrunk to zero with no light at the end of the tunnel?

Instead of going into hibernation FST decided to borrow a load of money and embark on major refurbishments at many, if not most, of their pubs – specifically installing covered areas (pagodas and the like) in the beer gardens to keep the rain/snow off anyone sitting outside. Construction work was still allowed during the pandemic to a greater or lesser extent.

The UK had to open and shut the pubs several times as the pandemic waxed and waned but as I recall one such opening was in the middle of winter and no-one was actually allowed inside a pub, you had to sit in the garden in the rain/snow. Naturally everyone still wanted to go to the pub – obviously - so given the choice you might choose a pub which had covered areas in the garden as opposed to one that did not.

Even when people were allowed to sit inside once again, even when the “new normal” started, it seems highly unlikely that the UK rain and snow is ever going to go away, even with global warming – and even with the record-breaking 40 degree heat this year the new structures in the gardens still work, as they provide shade. So all that work gave FST some sort of permanent advantage.

The point is that the worst thing possible happened to the company and yet at the end it came out stronger than before – stronger than if the Really Bad Thing had not happened in the first place.


As it transpired in one of the latest SAP Mentor Meetings the guest speaker was Sam Watson, a professional Eventing rider, an Irish Olympian, World Games Silver Medallist, athlete, and owner of a data analytics business (all about horsey statistics).

And guess what he said on one of his slides?

“Progress = Stress + Recovery”

He was talking about horses jumping over high fences, but I made the point that software is also like that. If you do not have any stress at all you will never get better over time, and if you let the stress get to you then it is game over, but if the recovery bit leaves you better off than before then the more stress you encounter the better off you become.


OK now let us move onto an SAP example from my work. About a year ago everyone in IT got new job titles. Usually such changes are purely cosmetic, and I assumed this was to be the case again. So I changed from “Senior ABAP Programmer” to “Senior ABAP/Integration Specialist”. Integration? I thought that was meaningless but as it turned out from early 2022 we have had big customers lining up to do point-to-point integrations of their and our ERP systems. You are probably thinking “what about a hub solution like Ariba or one of the sixty-three SAP CRM companies?”. However those concepts do not seem to have caught on yet in Australia, despite having been pushed since the year 2000.

I think what is finally driving the change is that the Australian government has a law coming in on 1st July 2023 whereby if your company is over a certain size then if a customer says they will not pay you unless you send them an electronic invoice you are legally required to say, “yes that is fine, bonzer, no worries mate”. Government departments are bound by that from 1st July 2022 and the NSW government had already voluntarily adopted these rules before that. Over time the minimum size of organisations bound by this will reduce until there are no more paper invoices.

The framework the Australian Tax Office has chosen to implement is called Peppol, which I can never help reading as PEPPA PIG.

Peppa Pig the E-Invoicing Champion

Anyway, dealing with a variety of ERP systems makes me realise how spoilt we are in the SAP world in that we have DDIC validations on quantity, currency, date and time fields, so you cannot enter THE AFTERNOON in a time field or TOMORROW in a date field or BANANA in a quantity field and so on. In many ERP systems I have been dealing with there are either no restrictions at all (everything is a string) or limited restrictions.

In this case the customer system could specify that a field was either a number or string. The time field was set as a number.

Based on examples my mapping program could convert 930 into 9:30AM and 1630 into 4:30PM and I had unit tests to prove this mapping worked.

Then of course one day someone entered 11 (meaning 11AM) and my program could not handle that. The underlying problem was me not foreseeing that all too likely event, but anyway I now had a problem which needed to be solved.

This is a crisis – everyone is running around like headless chickens screaming (which is difficult to do without a head) that the sky is falling.

Now what I could do is just make the code change – it is easy and would take less than a minute – and then manually test this using my B2B simulator program. However even if that works it adds a little bit more conditional logic to the program – sort of like taking a Jenga block out of the tower and putting it on top. The tower is now a little higher but also a little bit more unstable.

What I do instead is to use TDD – I write a unit test in which the value of “11” is passed in and I expect the result to be 11:00:00 AM. The test fails which mirrors what is happening in reality.

I then change the production code until the required result occurs. I know when I am finished because all the unit tests give green lights. I can also be sure I have not broken anything else as a result because all the previous tests pass.
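By way of illustration, a test for the new case might look something like this. (The class under test, ZCL_B2B_MAPPER, and its CONVERT_TIME method are invented names for this blog – the real ones are customer specific.)

```abap
"Hypothetical ABAP Unit test for the new requirement: a bare hour
"like '11' must come out as a valid ABAP time
CLASS ltc_time_mapping DEFINITION FOR TESTING
  RISK LEVEL HARMLESS DURATION SHORT FINAL.
  PRIVATE SECTION.
    METHODS bare_hour_becomes_valid_time FOR TESTING.
ENDCLASS.

CLASS ltc_time_mapping IMPLEMENTATION.
  METHOD bare_hour_becomes_valid_time.
    "GIVEN the customer system sends the number 11, meaning 11AM
    "WHEN the mapping program converts it
    DATA(ld_result) = zcl_b2b_mapper=>convert_time( '11' ).
    "THEN the result should be 11:00:00
    cl_abap_unit_assert=>assert_equals( act = ld_result
                                        exp = CONV t( '110000' ) ).
  ENDMETHOD.
ENDCLASS.
```

The test fails first (mirroring reality), then the production code is changed until it passes along with all the existing tests for 930, 1630 and friends.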

In essence I have put a new Jenga block on top of the tower without having to take any blocks out of the existing layers. If anything I have added a missing block to the lower existing levels. The program is stronger and more stable than before – it can do more and has more tests.

Put another way without automated tests every change you make to a program makes it weaker – with automated tests every change you make to a program makes it stronger. This is not just some buzzword driven pipe dream – I see this in real life at work all the time.


The actual key example for this blog is a problem with a productive program using my lovely editable CL_SALV_TABLE. The end users asked for a sort of “shopping cart” functionality whereby they get a list of items on a contract with a blank quantity field at the start of each one. They input quantities against the ones they want and press COPY and the items with quantities against them get copied to the sales order.

All well and good. This works fine. The problem was I did all my testing using units of measure like TON and M3 and the input field appeared with three decimal places, which is what I expected, and all seemed well.

Of course as soon as real people started using this the actual UOM they used for 95% of products was BAG and that was coming out with three decimal places as well. That makes no sense as you cannot order 0.005 bags of something, it has to be a whole number.

How does standard SAP handle this? In table T006 there is a field called ANDEC to specify the number of decimal places for a UOM. For TO and M3 this number is three, for BAG it is zero.
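If you want to see that setting for yourself, reading it is a one-liner (a sketch – this assumes the unit key is already in its internal format):

```abap
"Read the number of decimal places customised for a unit of measure
SELECT SINGLE andec FROM t006
  INTO @DATA(ld_decimal_places)
  WHERE msehi = 'BAG'.
"Per the customising described above, ld_decimal_places would be zero
"for BAG and three for a unit like a cubic metre
```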

When creating a database table or structure in SAP you are forced to point any quantity field at a unit of measure field. For example in table VBAP you store a quantity in field KWMENG, and this points to the VBAP field VRKME which stores the UOM for this quantity. So if VBAP-VRKME was M3 there would be three decimal places on the screen; if VBAP-VRKME was BAG then there would be zero decimal places on the SAP GUI screen, as per the customising.

In the REUSE_ALV_LIST_DISPLAY function and class CL_GUI_ALV_GRID your program passes this sort of information into the field catalogue. In the CL_SALV_TABLE however there is no field catalogue manually built up – the field catalogue definition is automatically generated based on the contents of the internal table. This is seen as the biggest (some would say only) benefit of CL_SALV_TABLE over CL_GUI_ALV_GRID.

So if my internal table was typed as a DDIC structure or a transparent table then the SALV would read all pertinent information out of the DDIC and most likely I would not get the BAGS problem.

But in my experience developers generally do not create a table or structure for every single report program. Instead they define an internal table which gets filled at runtime from assorted database tables. You could in theory define a DDIC structure for the ALV output fields each time but that seems a bit like overkill.

As we know an internal table might contain fields not based on any data element at all, so the SALV has methods to add long and short texts to such fields, define the length and so on, just like you could do with the field catalogue in the older technologies.

In this example my internal table has a field called GAMNG to store the quantity the user inputs and a field called VRKME for the unit of measure, taken from the VBAP field from the contract item.

All well and good, but when the CL_SALV_TABLE generates the field catalogue based on my internal table it has no idea at all that those fields are related. Hence the three decimal places for the BAG field, which I am guessing is the default value for quantity fields.
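To make the problem concrete, imagine the output table is declared with local types along these lines (a cut-down sketch – the real structure has many more fields):

```abap
TYPES: BEGIN OF l_typ_contract_item,
         matnr TYPE matnr,  "Material
         gamng TYPE gamng,  "Quantity the user will enter
         vrkme TYPE vrkme,  "UOM, copied from VBAP-VRKME
       END OF l_typ_contract_item.

DATA: gt_contract_items TYPE STANDARD TABLE OF l_typ_contract_item.

"GAMNG and VRKME are both proper DDIC data elements, but nothing in
"this declaration tells CL_SALV_TABLE that VRKME is the unit of
"measure for GAMNG - hence the default three decimal places
```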


After about two minutes of research I can see that the CL_SALV_TABLE “column” object has a method called SET_QUANTITY_COLUMN which does what it says on the tin. That will obviously solve the problem. However I have two choices, and they demonstrate (sort of) the concept of “technical debt”.

  1. I could add a few lines of code to the program at hand. That would take a few minutes at most and then all would be well.

  2. I could spend quite a lot of time updating my custom framework to add the new functionality, and then change the program at hand to use that new functionality. As you will see there are quite a few steps to this.
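For the record, option one would be something like the following, right after building the ALV. (GO_ALV here stands for the CL_SALV_TABLE instance – an invented variable name; in my real program that object lives inside the framework, which is rather the point of this blog.)

```abap
TRY.
    "Link the quantity column to the column holding its unit of measure,
    "so the SALV renders the decimal places per the T006 customising
    DATA(lo_column) = go_alv->get_columns( )->get_column( 'GAMNG' ).
    lo_column->set_quantity_column( 'VRKME' ).
  CATCH cx_salv_not_found cx_salv_data_error.
    "Column or UOM field does not exist - in real life, fail fast here
ENDTRY.
```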

The first option would seem the obvious way to go – but this is creating a “debt”. In what way you may ask? If this problem exists in a current program and I only fix the current program what happens when someone else writes a new program at some point in the future and gets the same issue? They will have to do the same investigation I just did, and then change their new program. And sometimes the powers that be will decide that is just too much effort and let the problem continue, it is just not that important. Whereas if the functionality was built into the framework the future programmer could fix the issue in the new program in minutes.

This is maybe not the best example of technical debt in the world, but hopefully you get the concept – a bit more work in the here and now can save a lot more work for future programmers, one of whom might actually be yourself. In fact the whole idea of my custom SALV framework was to avoid having to write the same boilerplate code again and again for each new report program.

So I decided to go with option two and update my custom SALV framework.


I have the custom ALV framework all bundled up inside the following Git repository:

As it turns out I cannot even connect to an SAP system today, but eventually the changes I describe below will be “pushed” to that repository via abapGit. That is neither here nor there.

If you have never even heard about my SALV framework the below will not make any sense at all. A picture is worth a thousand words – twenty years ago I would have used VISIO but that costs tons of money nowadays even to companies with Microsoft licences and all the free alternatives seem really weird so here is something I knocked up in Excel.

ALV Flow

In my framework there are loads of things that needed to be changed; here they are listed, in something like a logical order:

ZIF_BC_ALV_REPORT_VIEW – an interface describes the functionality available (public methods and attributes) whilst hiding the actual implementation which could be in any class that implements the interface. The idea is to hide what ALV technology we are using – one of the ALV function modules, CL_GUI_ALV_GRID, CL_SALV_TABLE or whatever comes next. So you can easily roll forward any or all of your programs to the next technology, or more likely roll them backwards from CL_SALV_TABLE to CL_GUI_ALV_GRID when the end users want the cells to be editable.

In this case our new functionality is to enable the calling program to link a quantity field with a unit of measure field. I could go hunting to see if such a structure already exists in standard SAP but that is not very easy due to the total and utter lack of any sort of naming conventions in standard SAP objects, so it is far easier to create my own. There is not even any need to create a custom DDIC structure and table, I can just declare the structure and table as attributes of the interface (which are by definition public). Later in the blog I will change my mind.

TYPES: BEGIN OF m_typ_quantity_uom_mapping,
         quantity_field TYPE lvc_fname,
         uom_field      TYPE lvc_fname,
       END OF m_typ_quantity_uom_mapping.

TYPES: m_tt_quantity_uom_mapping TYPE SORTED TABLE OF m_typ_quantity_uom_mapping
         WITH UNIQUE KEY quantity_field.

So I add a table attribute MT_QUANTITY_UOM_MAPPING to the ZIF_BC_ALV_REPORT_VIEW as the view will need to know what quantity fields are to be displayed based on the UOM field. Later we will add a similar table attribute to the model so that the model can decide what data to place in this table before asking the controller to pass this information to the view. I am sure to people not familiar with the MVC pattern this whole thing seems ridiculous, but this is what the separation of concerns is all about.

There are many ways to get what you want, I use MVC, I use Italy

Imagine an artist like Leonardo da Vinci being commissioned to paint the “Last Supper” in 1495. The “model” is the Duke of Milan, who has the money and thus gets to say what he wants (e.g. no kangaroos and only one Jesus) – that is, he controls the business logic of what is to be in the painting. The controller is the duke’s servant – the go-between who actually commissions the artist, tells him what the duke wants and pays the artist with the duke’s money (i.e. grunt work beneath the dignity of the duke). But once the painting starts, Leonardo is not going to let anyone in the world tell him how to paint his picture, because he is the artistic genius, not anyone else, as in “Do I go around telling birds how to fly or dogs how to bark?”. Thus there is a clear separation of concerns – the duke says what, the artist says how, the go-between servant connects the two. The duke can choose any painter he wants, the painter can choose any patron they want; either can be changed and the process would still work. This is the MVC pattern.

Back to the Example

Now that there is a global table definition in which to store the linkage between a given quantity field and the corresponding UOM field, it is time to add an optional IMPORTING parameter called IT_QUANTITY_UOM_FIELDS to multiple method definitions.

First off we need to change the model class.


First off we add a new (read only) attribute MT_QUANTITY_UOM_FIELDS based on the new internal table definition.

Then we add a new method FILL_QUANTITY_UOM_FIELDS with no parameters. We do not need them because we have the member table to store the data.

Then we update the model in the calling program (which inherits from the abstract ZCL_BC_MODEL_BASE which implements ZIF_BC_MODEL_BASE) such that in the CONSTRUCTOR we fill up the quantity UOM mapping in the inherited method FILL_QUANTITY_UOM_FIELDS.

In my example:

INSERT VALUE #( quantity_field = 'GAMNG'
                uom_field      = 'VRKME' )
       INTO TABLE mt_quantity_uom_fields.

We have now got to the stage where the model knows the UOM mapping rules and since that data is public (read-only) any interested parties (such as the controller) can read that data.

VIEW => CREATE_CONTAINER_PREP_DISPLAY in the view class. This is the “entry point” to the view, as it were. The controller passes the UOM table from the model into this method. The view class then moves the imported table into a member table MT_QUANTITY_UOM_FIELDS in order to hold on to the data.
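That hand-over from the controller might look something like this (a sketch only – the variable names are invented and the real method signature in the framework has many more parameters):

```abap
"Controller: read the (read-only) mapping table from the model
"and hand it over to the view
mo_view->create_container_prep_display(
  it_quantity_uom_fields = mo_model->mt_quantity_uom_fields ).
```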

VIEW => FILL_CONTAINER_CONTENT. This gets called from a function module which is just a DYNPRO screen with a container. I use such a function module because I need a container in order to add custom pushbuttons dynamically (something you may not need to do from ABAP 7.56 onwards, we shall see). Anyway, in this method we pass MT_QUANTITY_UOM_FIELDS into the PREPARE_DISPLAY_DATA method.

VIEW => PREPARE_DISPLAY_DATA. This method needs the new optional importing parameter IT_QUANTITY_UOM_FIELDS to receive the data the last method just sent. This is what is called “tramp data”, in that this current method does nothing at all with the table it has just received except pass it into another method called APPLICATION_SPECIFIC_CHANGES.

VIEW => APPLICATION_SPECIFIC_CHANGES. This method needs the new optional importing parameter IT_QUANTITY_UOM_FIELDS to receive the tramp data the last method just sent. This time though the method actually does something with the imported table, which is to loop through it and process each entry:

* Quantity UOM Fields
LOOP AT it_quantity_uom_fields ASSIGNING FIELD-SYMBOL(<ls_quantity_uom_fields>).
  set_quantity_uom_fields( id_field_name         = <ls_quantity_uom_fields>-quantity_field
                           id_quantity_uom_field = <ls_quantity_uom_fields>-uom_field ).
ENDLOOP.

VIEW => SET_QUANTITY_UOM_FIELDS. Here we finally call the CL_SALV_TABLE specific method to link the quantity field with the UOM field.

METHOD set_quantity_uom_fields.

  TRY.
      mo_column ?= mo_columns->get_column( id_field_name ).
      mo_column->set_quantity_column( id_quantity_uom_field ).
    CATCH cx_salv_not_found INTO DATA(not_found).
      DATA(an_error_occurred) = abap_true.
      "Object = Column
      "Key    = Field Name e.g. VBELN
      zcl_bc_dbc=>require( that             = |{ not_found->object } { not_found->key } must exist|
                           which_is_true_if = boolc( an_error_occurred = abap_false ) ).
    CATCH cx_salv_data_error INTO DATA(data_error).
      an_error_occurred = abap_true.
      zcl_bc_dbc=>require( that             = |{ data_error->object } { data_error->key } must exist|
                           which_is_true_if = boolc( an_error_occurred = abap_false ) ).
  ENDTRY.

ENDMETHOD.


The obvious question is why not just call that standard CL_SALV_TABLE method directly rather than jumping through all those hoops? What I am doing is abstracting away the fact that CL_SALV_TABLE is used as the underlying technology. That way if SAP ever comes up with something to replace CL_SALV_TABLE in the SAP GUI (and don’t you tell me that will never happen because the GUI is obsolete and so on and so forth) I don’t have to change ten billion calling programs, just create a new class implementing the base ALV interface which uses the new technology and then update the factory method which gets the view class.

The whole thing is far too complicated. You are a madman.

It certainly looks that way, does it not? There are so many moving parts, which is what you often get with the “Single Responsibility Principle”, where there are lots of different classes looking after just one thing each.

The idea was that the Z framework gets set up once, which is a bit of a pain to do, but thereafter all you need to do in each new report program is have the model say what it wants in the CONSTRUCTOR in regard to editable fields, technical fields, sort criteria and all the other ALV type data, quantity UOM mapping being the New Kid on the Block.

That works really well. At my organization when a programmer needs to create a new ALV report they can concentrate on the what rather than the how. All the repetitive boilerplate stuff is hidden under the hood, and as I said the design is all SOLID compatible (which may not seem important, but it is) and future proof, even if some may say I am preparing for a future that will never come.

OK, maybe writing programs using this framework is simple. But you have just proved without a doubt that when the framework has to be enhanced you have to change ten billion things, and that is not simple in the least. Therefore you are a madman.

This argument has a lot of merit. The main point of this entire blog (I bet you were wondering if there was one) is that the list of things I had to change in assorted different classes went on forever.

When I made this change in real life, every time I thought I was done, I tested the program and found my UOM change had not worked – I needed to change something else. That happened about ten billion times because there are just so many places a change is needed that one is bound to miss some. And when I finally found (by trial and error) every single place that needed to be changed and everything was working, the change went to QA and I got a return code 8, because I had forgotten to make the new importing parameter optional in ONE of the multiple places I added it – and of course the method I had missed was called directly by a seemingly unrelated program.

Not the first, not the last

Guess what happened next? As soon as the testers were let loose on the new functionality someone made the (very reasonable) observation that in a table control, if you were entering quantities, the field would be blank – but in my editable ALV there was a zero sitting in the field, and the end user had to manually remove that zero before adding the required quantity. It was really easy to stuff that up and end up with the required quantity plus the original zero on the end, i.e. ten times what was actually required.

Is there a standard CL_SALV_TABLE method to suppress zero values? Of course there is. But it was not used in my wonderful framework. So I had to add it – which involved making the same ten billion changes in various places I have just described.
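In case you are wondering, the standard method in question is SET_ZERO on the column object. Used directly (outside any framework, with GO_ALV again standing in for the CL_SALV_TABLE instance) it looks something like this:

```abap
TRY.
    "Show a blank cell instead of 0.000 when the quantity is zero
    DATA(lo_column) = go_alv->get_columns( )->get_column( 'GAMNG' ).
    lo_column->set_zero( abap_false ).
  CATCH cx_salv_not_found.
    "No such column - in real life, fail fast here
ENDTRY.
```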

So you are a madman after all.

So it would seem. Just to further prove it I am going to jump off here to a seemingly unrelated topic – the YAGNI principle. That stands for “You Ain’t Gonna Need It”

What this refers to is that it is very difficult, if not impossible, to predict what changes the end users will want in the future, because they have not asked for those changes yet. You could guess what they are going to ask for and spend ages building that into your program, but then if they never ask for any of it you have wasted a colossal amount of time and money.

In this case there are twenty billion different possible ALV functions. Up front I built into my framework every such function I knew for a fact that was currently used and left out the other 75%. As you have seen I am adding extra ones as requested and it is a right pain.

So what would a sane person do at the point they realise requests for extra ALV features are going to come along every so often and making such a change is really painful? The obvious answer is to change the framework such that when a change comes along you do not have to make changes in twenty billion different places. Just one place would be nice; maybe that is not really feasible, but surely we can make things simpler than the current nightmare.

So what’s the solution, manager?

As you have seen, with every such change I am making the signature of various methods more and more complicated over time by adding more and more optional parameters. That is bad because we do not want to make things more and more complex over time – if anything we want to do the reverse if at all possible.

So how do we change an identical set of importing parameters in twenty billion methods such that:

  • Things are simpler

  • They always stay in sync so you cannot accidentally miss some, and

  • You no longer have to manually change the signature (and code) in every single method

You have probably already worked out the answer. Why not put all those parameters into a structure and then have that as a parameter?
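As a sketch of the before and after (ZTT_QUANTITY_UOM_MAPPING is the table type I create later in the blog; ZST_ALV_SETTINGS and ZTT_FIELD_NAMES are invented names for illustration):

```abap
"BEFORE: the same ever-growing parameter list, repeated in many methods
METHODS prepare_display_data
  IMPORTING it_editable_fields     TYPE ztt_field_names          OPTIONAL
            it_technical_fields    TYPE ztt_field_names          OPTIONAL
            it_quantity_uom_fields TYPE ztt_quantity_uom_mapping OPTIONAL.

"AFTER: one structure travels down the whole call chain, so adding the
"next ALV feature means adding one component in one place
METHODS prepare_display_data
  IMPORTING is_alv_settings TYPE zst_alv_settings OPTIONAL.
```

The signatures can never drift out of sync again, because there is only one thing to pass.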

Just to be 100% safe the best way to do this is to add the structure as an optional parameter to all those methods so as not to stuff up everything that currently exists. Then work from the “inside out” as it were (that is, private methods deep in the bowels first) and see what can be simplified without breaking any existing callers.

I could create the new structure as a public type of the model class interface, or I could create a DDIC structure. Now, in some implementations of the MVC pattern it is fine for the view and the model to know about each other, but I am going with the approach in the “Big Book of Secrets” (1977) by Jolly Gyles Brandreth, in which the Master Spy only communicates with individual spies using couriers, so no individual spy knows who the Master Spy is or what they look or sound like. So a DDIC structure it is.

I will detail the steps here, and when I am done update the GitHub repository for “ABAP to the Future 4” so you can have a look yourself. As I said I cannot connect today and so cannot upload the changes, but I will do shortly.

Here we go, here we go, here we go

Earlier in this blog I said I did not need to create a DDIC structure for the quantity UOM mapping – I could just define that structure/table in the model – but now I have changed my mind, and so I create structure ZST_QUANTITY_UOM_MAPPING and table type ZTT_QUANTITY_UOM_MAPPING.
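If those two DDIC objects were written out as local ABAP type declarations they might look roughly like this (the component names are my guesses for illustration – check the repository for the real definitions):

```abap
TYPES: BEGIN OF zst_quantity_uom_mapping,
         quantity_field TYPE lvc_fname,   " e.g. 'MENGE'
         uom_field      TYPE lvc_fname,   " e.g. 'MEINS'
       END OF zst_quantity_uom_mapping.

TYPES: ztt_quantity_uom_mapping
       TYPE STANDARD TABLE OF zst_quantity_uom_mapping WITH DEFAULT KEY.
```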

As I said I am going to work from the “inside out” and the innermost component is the “application specific changes” method in the view. The interface defines this with a load of IMPORTING parameters. We can change that so that there is just one IMPORTING parameter which is the structure – but first we need to create that structure.


Then I am going to add that as an extra optional IMPORTING parameter and not delete any existing ones as yet – that latter step happens right at the end of the refactoring process. As “application specific changes” is a public method, its definition lives in the interface, so that is where the change needs to be made.

Then I change the code in the concrete class ZCL_BC_VIEW_SALV_TABLE so that only the new incoming structure is referenced as opposed to the multiple individual parameters. That only took about ten minutes.

Now is the time to go one level out, i.e. what calls “application specific changes”? To be honest I have always found the “where-used” list for individual methods to be very flaky in “lower” releases like 7.50, but it seems to work OK in a 7.52 system. Anyway, the next method to be changed is “prepare display data” and once again the new optional parameter is added to the interface.

We are back to the “tramp data” concept again, so the only change in the code is to pass the new incoming structure straight to “application specific changes”. Then we repeat the “inside out” process and see where “prepare display data” is called from. The answer is “all over the place”, i.e. from multiple calling programs, which indicates a design fault.
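The “tramp data” pass-through is as dull as it sounds – something along these lines (a simplified sketch; the real method naturally does other things as well):

```abap
METHOD zif_bc_view~prepare_display_data.
  " This method does nothing with the new settings itself; the
  " structure just tramps straight through to the next method down
  " (other existing logic omitted)
  application_specific_changes( is_alv_settings = is_alv_settings ).
ENDMETHOD.
```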

Put another way, whenever I look at my old code which I thought so wonderful at the time I wrote it, a work of genius, in retrospect I wonder what in the world I was thinking as with the benefit of a few years more experience I can see flaws everywhere. This is a good thing – I hope – as it indicates I am constantly improving. The day I look at some code I wrote five years ago and think “that is just fine” is probably the day I am in trouble because it indicates I have fossilized.

So in the fifth edition of “ABAP to the Future” the SALV code in the calling programs is going to be very different, but that is not really relevant at the moment as I am half way through changing assorted methods and do not want to stop now and change every single other thing in the framework as well (yet, and when I do that will be another blog).

Anyway the next method up the chain is “fill container content” which is used when you want the system to automatically create a screen with a container (so you can make the grid editable and/or add custom user commands).

The view class ZCL_BC_VIEW_SALV_TABLE needs a new member variable MS_ALV_SETTINGS, which is a structure based on ZST_ALV_SETTINGS. Then in “fill container content” that structure can be passed into “prepare display data”. But how does that new structure get filled up? We have to jump outwards one more step.
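In code terms the view class gains something like the following (a heavily simplified sketch – the real “fill container content” obviously does a lot more):

```abap
CLASS zcl_bc_view_salv_table DEFINITION PUBLIC.
  PUBLIC SECTION.
    " All the ALV settings now live in one place
    DATA ms_alv_settings TYPE zst_alv_settings.
ENDCLASS.

* ...and inside the implementation the member variable feeds
* the settings into the display chain:
METHOD fill_container_content.
  prepare_display_data( is_alv_settings = ms_alv_settings ).
ENDMETHOD.
```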

The next method on the block is the view class method CREATE_CONTAINER_PREP_DISPLAY, which passes an instance of the view class into the pretty much empty function module that shows a DYNPRO screen with a container inside it. Here is where we can add a new IMPORTING parameter to get the structure and pass it into the member variable MS_ALV_SETTINGS. This seems to be going on forever but remember: the whole aim of the game is to avoid changing half a hundred methods the next time we want to add extra functionality.

So … where does CREATE_CONTAINER_PREP_DISPLAY get called from? Half a hundred calling programs! That is not good, not good at all. That is a serious design flaw which needs to be fixed. So is the fact that, during a fit of madness, I decided that having an APPLICATION class as well as a CONTROLLER class in all my custom report programs, when both of them did pretty much the same thing, was a good idea. That resulted in half a ton of duplicate code in ten billion calling programs, which is an odd design choice, and also needs to be rectified.

Anyway, one problem at a time. The problem at hand is that the calling program needs to pass the ALV structure from the model to the view, and thus the model needs to have the structure defined as a public member variable. The model will fill that structure up based upon business logic, and the controller will pass that data from the model to the view.

So into ZIF_BC_MODEL goes MS_ALV_SETTINGS. That can be filled up by the various calling report programs, where the model class inherits from ZCL_BC_MODEL, which implements ZIF_BC_MODEL. That will be the source of the river of data which flows through assorted methods of assorted classes and ends up being passed into CL_SALV_TABLE.
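So the model interface ends up with the settings as a public member, and the controller plays courier between model and view – roughly like this (the controller call and the MO_VIEW/MO_MODEL names are illustrative):

```abap
INTERFACE zif_bc_model PUBLIC.
  " Filled by the business logic of each calling report's model class
  DATA ms_alv_settings TYPE zst_alv_settings.
ENDINTERFACE.

* In the controller: pass the settings from model to view, so
* neither one ever needs to know about the other
mo_view->prepare_display_data(
  is_alv_settings = mo_model->zif_bc_model~ms_alv_settings ).
```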

Don’t you Inflection Point that thing at me

It is more than likely that changing half a billion calling programs is not a sensible thing to do no matter how good the new design is. So it is just as well we have such a convoluted design with lots of classes. You can keep the outermost layer i.e. the interface to the calling programs the same, and only change the innermost layers. Thus the outermost layer is known as an “inflection point” which is named after that part of a rollercoaster where you have crawled up to the top and the carriage is about to go back down at a billion miles an hour.

When it comes to designing unit tests the idea (per Michael Feathers) is to have the tests call these “inflection point” outermost methods. Then if you change something deep in the bowels of the framework and that breaks something the tests on the outermost methods will start to fail.

Put another way, the method(s) which are called in multiple places are the “inflection points”, and thus it is problematic to change the interface of those methods. Any lower-level methods (if they only ever get called from one or two places in the entire system) are much easier to change. In this case we do not remove anything from the existing interface to the “entry point” methods, which are PREPARE_DISPLAY_DATA and CREATE_CONTAINER_PREP_DISPLAY, on the grounds they are called by ten zillion programs already.

Instead, inside those methods, the incoming data can be passed to the structure and propagated onwards, and therefore any lower-level methods can have the bulk of their optional parameters removed and replaced with just the structure. That will prevent loads of changes having to be made to all the inner methods when we want to add new functionality.

Naturally there is nothing to stop us adding the structure as an optional parameter to the outermost methods, and for new reports that structure parameter is all that will get filled. That is, the outermost methods are clever enough to use the new structure if it is provided, and otherwise stick with the existing logic.
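The backward-compatibility logic in the entry-point methods is then a simple IF – something like this sketch (the structure components and old parameter names are illustrative, not the real signature):

```abap
METHOD prepare_display_data.
  IF is_alv_settings IS NOT INITIAL.
    " New-style caller: just take the structure as supplied
    ms_alv_settings = is_alv_settings.
  ELSE.
    " Old-style caller: repackage the classic individual parameters
    " into the structure so the inner methods only ever see one thing
    ms_alv_settings-start_in_edit_mode = if_start_in_edit_mode.
    ms_alv_settings-edit_control_field = id_edit_control_field.
  ENDIF.
  " ...existing logic continues, now reading MS_ALV_SETTINGS...
ENDMETHOD.
```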


In a load of presentations I have given I have referred to the “no-one is ever happy with anything” theory. This states that as long as a program is in use the end users will complain about it and ask for fixes/improvements. The B-side of this is that if the end users ever stop asking for changes to your program, that means it probably is not being used any more.

So, every time you have to make a change to an existing ALV program you can simplify the call to the outermost layer to just use the new structure and drop the half a ton of older parameters.

In summary the inner classes/methods get simpler and more maintainable right away, and the outer classes get simpler/more maintainable over time.


I have always said that a good programmer is a lazy programmer. If you have a recurring problem that takes a lot of effort (i.e. lots of changes to the code) to fix, why not change things (if you can) so that the next time the problem comes up it is no longer difficult?

That is not always possible, but more often than not it actually is, and if you go down that path you end up in the Anti-Fragile situation where the more stress (requested changes) your programs get, the more you are forced to improve them.

Put another way instead of your programs getting worse with every requested change, they get better with every requested change.

Cheersy Cheers


SAUG 2022