In certain scenarios, deeply nested data structures, long text messages, and similar payloads need to be passed from the backend to the frontend; in others, complicated filter strings, mass data for processing, and the like are passed from the frontend to the backend. In both scenarios one can work with JSON data.
Examples of the former include data-rich entities with a large number of fields, such as vendor or material data, and backend log operation messages prepared for download.
Examples of the latter include mass user creation via file upload, or a generic property in the underlying entity that handles a fixed set of dynamic filters, such as a ‘Search’ property holding (key:value) pairs for the different properties on which the search should operate (see Code Snippet 3 below).
In the first case, the backend serializes the data and sends it to the frontend, where the frontend code parses it to extract the payload. In the second case, the backend deserializes the data sent by the frontend so that it can be processed in a meaningful way.
As with the standard ID-based XML transformations, the performance hit is negligible, if any.
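For comparison, here is a minimal sketch of JSON serialization through the standard ID transformation; the variable lt_data stands in for any internal table and is an assumption for this sketch.

DATA(lo_writer) = cl_sxml_string_writer=>create( type = if_sxml=>co_xt_json ).

" The ID transformation writes the data as JSON into the sXML writer
CALL TRANSFORMATION id SOURCE data = lt_data
                       RESULT XML lo_writer.

" get_output( ) returns an xstring; convert it to a character string
DATA(lv_json) = cl_abap_codepage=>convert_from( lo_writer->get_output( ) ).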
Sample implementation code using /ui2/cl_json is below.
DATA(lv_string) = /ui2/cl_json=>serialize( data     = lt_data  " any internal table or structure
                                           compress = abap_true ).
Code Snippet 1: Data serialization example
/ui2/cl_json=>deserialize( EXPORTING json = lv_string  " JSON string received from the frontend
                           CHANGING  data = lt_data ). " target internal table or structure
Code Snippet 2: Data deserialization example
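As an illustration of the second scenario, the sketch below deserializes a hypothetical ‘Search’ filter string into a key/value table. The type ty_filter, the variable names, and the sample JSON are assumptions for this example, not part of a standard API.

TYPES: BEGIN OF ty_filter,
         key   TYPE string,
         value TYPE string,
       END OF ty_filter.

DATA lt_filters TYPE STANDARD TABLE OF ty_filter WITH EMPTY KEY.

" Filter string as it might arrive from the frontend
DATA(lv_filter_json) = `[{"key":"vendor","value":"100001"},{"key":"country","value":"DE"}]`.

/ui2/cl_json=>deserialize( EXPORTING json = lv_filter_json
                           CHANGING  data = lt_filters ).

LOOP AT lt_filters INTO DATA(ls_filter).
  " Apply each (key, value) pair to the backend selection logic
ENDLOOP.

Code Snippet 3: Deserializing a dynamic ‘Search’ filter (illustrative)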
PS: While serializing, if the ‘compress’ parameter is set to abap_true, any property with an initial value is skipped and does not appear in the serialized output.
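To illustrate the effect of ‘compress’ (the type and values below are assumptions for this sketch):

TYPES: BEGIN OF ty_vendor,
         id   TYPE string,
         name TYPE string,
       END OF ty_vendor.

DATA(ls_vendor) = VALUE ty_vendor( id = '100001' ).  " NAME stays initial

" compress = abap_true  -> {"ID":"100001"}            (initial NAME skipped)
" compress = abap_false -> {"ID":"100001","NAME":""}  (initial NAME kept)
DATA(lv_json) = /ui2/cl_json=>serialize( data = ls_vendor compress = abap_true ).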
Even though this is a minor topic, I hope it serves as a ready reference for many of us. Feedback is most welcome.