Technology Blogs by SAP
Learn how to extend and personalize SAP applications. Follow the SAP technology blog for insights into SAP BTP, ABAP, SAP Analytics Cloud, SAP HANA, and more.
Ankit_Maskara
Advisor
In certain scenarios, deeply nested data structures, multiple long text messages, and similar payloads need to be passed from the backend to the frontend, while complicated filter strings, mass data for processing, and the like may need to be passed from the frontend to the backend. In both scenarios one can work with JSON data.

Examples of the former include data-rich entities with a large number of fields, such as vendor or material data, and backend log messages made available for download.

Examples of the latter include mass user creation in the system via file upload, or a generic property in the underlying entity that handles a fixed set of dynamic filters, such as a 'Search' property holding (key:value) pairs for the different properties on which the search should operate.

In the first case, the backend serializes the data before sending it, and the frontend code parses it to extract the information.

In the second case, the data sent by the frontend is deserialized by the backend so that it can be parsed and used in a meaningful way.

With the standard ID-based XML transformations there is a negligible performance hit, if any.

Sample implementation code is below.

 
DATA(lv_string) = /ui2/cl_json=>serialize( data = <Any Internal Table> compress = abap_true ).

Code Snippet 1: Data serialization example
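
As an illustration, here is a minimal, self-contained sketch of the same call; the local type ty_vendor, the table tt_vendor, and the sample values are invented for this example and are not part of the standard API:

TYPES: BEGIN OF ty_vendor,
         lifnr TYPE string,
         name1 TYPE string,
         city  TYPE string,
       END OF ty_vendor,
       tt_vendor TYPE STANDARD TABLE OF ty_vendor WITH EMPTY KEY.

" Sample data to serialize
DATA(lt_vendors) = VALUE tt_vendor(
  ( lifnr = '0000100001' name1 = 'ACME Ltd.' city = 'Berlin' )
  ( lifnr = '0000100002' name1 = 'Globex'    city = '' ) ).

" Serialize the internal table into a JSON string
DATA(lv_json) = /ui2/cl_json=>serialize( data     = lt_vendors
                                         compress = abap_true ).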

 
/ui2/cl_json=>deserialize( EXPORTING json = <String property with data> CHANGING data = <Any internal table> ).

Code Snippet 2: Data deserialization example
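
As a counterpart, here is a minimal sketch that deserializes a 'Search' filter string of (key:value) pairs, as described above, into a typed internal table; the JSON content and the type ty_filter are invented for this example:

TYPES: BEGIN OF ty_filter,
         key   TYPE string,
         value TYPE string,
       END OF ty_filter,
       tt_filter TYPE STANDARD TABLE OF ty_filter WITH EMPTY KEY.

" Hypothetical filter string received from the frontend
DATA(lv_search) = `[ { "key": "CITY", "value": "Berlin" }, { "key": "NAME1", "value": "ACME*" } ]`.

DATA lt_filters TYPE tt_filter.

" Deserialize the JSON array into the internal table;
" the component names KEY and VALUE match the JSON property names
/ui2/cl_json=>deserialize( EXPORTING json = lv_search
                           CHANGING  data = lt_filters ).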

PS: While serializing, if the 'compress' parameter is set to abap_true, any property that has its initial value is skipped and is not present in the serialized data.
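
For example, for a structure with an initial field, the effect of the 'compress' flag can be seen directly (the structure and values below are invented for this sketch):

TYPES: BEGIN OF ty_msg,
         msgid TYPE string,
         msgty TYPE string,
         text  TYPE string,
       END OF ty_msg.

DATA(ls_msg) = VALUE ty_msg( msgid = 'ZLOG' msgty = 'S' ).  " TEXT stays initial

" Without compress the initial TEXT field appears with an empty value;
" with compress = abap_true it is omitted from the resulting JSON
DATA(lv_full)       = /ui2/cl_json=>serialize( data = ls_msg ).
DATA(lv_compressed) = /ui2/cl_json=>serialize( data = ls_msg compress = abap_true ).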


Even though this is a minor topic, I hope it serves as a ready reference for many of us. Feedback is most welcome.
5 Comments
joseph_manjiyil
Participant
Hi Ankit,

I used the /UI2/CL_JSON=>deserialize method to convert the JSON response to an ABAP structure. Until I created the SE11 structure matching the JSON fields exactly, the internal table assigned to data was always initial.

In the snippet you have mentioned above, does it always work for you even if the internal table is defined as type ANY?

Another challenge I faced: I was using the HERE geocoding service, and for certain deep structures the JSON response did not contain field names in the table, only values. Hence the /UI2/CL_JSON class was not populating it correctly. However, I noticed that CL_FDT_JSON did populate the values even though the field names were missing from the JSON response.

Any guidance is much appreciated, since I was struggling to make it work until I defined the matching SE11 structure, and more and more external web service providers are using JSON instead of SOAP.

Regards,

Joseph M

 
Ankit_Maskara
Advisor
Hi Joseph,

Yes, you need to create the SE11 structures with exactly matching field names. The reason is that the class/method /UI2/CL_JSON=>deserialize uses standard ID-based XML transformations, which move the JSON field values to structure components based on field names/IDs. This mapping is the default behavior of this API. You can also create custom XML transformations to achieve it.

For the second part, the same API should work with deep structures as well; the property names just need to match between the JSON payload and the SE11 structures.
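
To illustrate, a minimal sketch of a deep target structure whose component names match the keys of a hypothetical JSON payload (the payload and all names below are invented for this example):

TYPES: BEGIN OF ty_header,
         id         TYPE string,
         created_by TYPE string,
       END OF ty_header,
       BEGIN OF ty_item,
         pos  TYPE string,
         text TYPE string,
       END OF ty_item,
       BEGIN OF ty_payload,
         header TYPE ty_header,
         items  TYPE STANDARD TABLE OF ty_item WITH EMPTY KEY,
       END OF ty_payload.

DATA(lv_json_response) = `{"header":{"id":"4711","created_by":"JOSEPH"},"items":[{"pos":"10","text":"First item"}]}`.

DATA ls_payload TYPE ty_payload.

" The component names match the JSON keys, so the values are mapped automatically
/ui2/cl_json=>deserialize( EXPORTING json = lv_json_response
                           CHANGING  data = ls_payload ).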

BR.
Former Member
Very effective solution.
 

Hello Ankit / Joseph,

I tried to use the deserialize method, but it says the method doesn't exist.

I am at EHP7 for SAP ERP 6.0.

Also, should the table type created in SE11 include the header fields too?

Thanks for your help.

-LT-
Ankit_Maskara
Advisor
Hi Linda,

My demonstration is on S/4HANA 1709; I need to check whether it exists in EHP7 for SAP ERP 6.0.

For the second part, yes, you need to create a deep structure with the header fields in it.

Regards.