RichHeilman
Developer Advocate
Hey gang!  This morning on the ABAP_Freak->show(), we introduced you to the little project we've been working on, which we call the ABAP File Uploader tool.  This tool allows you to upload data directly to a database table in SAP Cloud Platform, ABAP Environment, aka Steampunk.  How this came about: I was messing around with RAP (the ABAP RESTful Application Programming Model) a few weeks ago and wanted to seed my tables with some data for testing and prototyping purposes.  Because of the limitations and restrictions of the Steampunk environment, my choices were limited.  In an on-prem scenario, the answer is pretty simple: we use the CL_GUI_FRONTEND_SERVICES class to upload data via a custom program.  But as you may know, this class, and many others that are SAP GUI dependent, are not allowed in Steampunk.  So what I started out doing was simply having a loader class with a method containing a few INSERT statements with hardcoded values.  I did not like this approach almost immediately.  So I pinged my buddy thomas.jung and asked, "Do you know of a good way to get data into a table in Steampunk?"  His answer was, "Yea, INSERT statements in a class/method."  At which point, I believe I responded with the "puke" emoji.  Then he said, "Would be nice to have a generic uploader tool."  I was like, "Hold my Old Monk & Coke...."  A couple of days later I had something working using an HTTP Service, and then Tom threw a nice UI on it for us.
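
To give you an idea of what I wanted to get away from, here is a minimal sketch of that hardcoded loader class.  The class name and the values are made up for illustration; the table and columns are the ones from the example further down.  Fine for three rows, miserable for a thousand.

CLASS zcl_seed_data DEFINITION PUBLIC FINAL CREATE PUBLIC.
  PUBLIC SECTION.
    METHODS load.
ENDCLASS.

CLASS zcl_seed_data IMPLEMENTATION.
  METHOD load.
    " Every single row is spelled out by hand
    INSERT zrh_test FROM TABLE @( VALUE #(
      ( cola = 'A1' colb = 'B1' colc = 'C1' )
      ( cola = 'A2' colb = 'B2' colc = 'C2' )
      ( cola = 'A3' colb = 'B3' colc = 'C3' ) ) ).
  ENDMETHOD.
ENDCLASS.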


The UI is pretty simple: you provide the name of the database table (yes, there is value help there!) that you want to load the data into, as well as the actual file.  You can choose to append the data to the data already in the table, or to replace it, meaning that a DELETE will be performed before the data is uploaded.  At this point, we only support JSON as the format for the import file.  I'm looking to add support for CSV soon; it's not all that much more work, just some nuances with how the request comes in.  Also, it is important to note that this tool assumes a lot and is not all that robust.  We assume that the structure of the JSON, meaning the attributes, is the same as the columns of the database table, in the same order.  Also, I've only tested this with a file of 1,000 rows.  It will probably handle much more, but I assume there is a limit to what it can do.  That being said, this is not your ETL tool for Steampunk, so don't try loading a file with millions of lines in it.  So again, not a whole lot of intelligence built into it right now.

In this example, I have a JSON file with 3 attributes, COLA, COLB, and COLC, which correspond to the columns in my database table called ZRH_TEST.
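
In case you want to prepare your own file, here is a sketch of what such a file might look like.  The values are made up; what matters is that it is a flat array of objects whose attributes match the table's columns in name and order.

[
  { "COLA": "A1", "COLB": "B1", "COLC": "C1" },
  { "COLA": "A2", "COLB": "B2", "COLC": "C2" },
  { "COLA": "A3", "COLB": "B3", "COLC": "C3" }
]
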
When you click the "Upload File" button, the tool creates a dynamic internal table, deserializes the JSON into that internal table, and then simply does an INSERT into the database table.  There is some error handling built in, but again, it is not all that robust at the moment.  It does do the job quite nicely though, as opposed to having a massive method full of INSERT statements.
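
For the curious, the core idea looks roughly like the sketch below.  To be clear, this is not the tool's actual code: the method and parameter names are made up, and I'm assuming the XCO JSON library for the deserialization step; the released tool may do things differently.

METHOD upload_data.
  " iv_table holds the table name from the UI, iv_json the file content,
  " and iv_replace the append/replace choice (all hypothetical names)
  DATA lr_data TYPE REF TO data.
  FIELD-SYMBOLS <lt_data> TYPE STANDARD TABLE.

  " Build an internal table typed like the target database table
  DATA(lo_struct) = CAST cl_abap_structdescr(
    cl_abap_typedescr=>describe_by_name( iv_table ) ).
  DATA(lo_table) = cl_abap_tabledescr=>create( p_line_type = lo_struct ).
  CREATE DATA lr_data TYPE HANDLE lo_table.
  ASSIGN lr_data->* TO <lt_data>.

  " Deserialize the JSON payload into the dynamic internal table
  xco_cp_json=>data->from_string( iv_json )->write_to( REF #( <lt_data> ) ).

  " Replace mode: clear out the existing rows first
  IF iv_replace = abap_true.
    DELETE FROM (iv_table).
  ENDIF.

  " Dynamic INSERT into the target table
  INSERT (iv_table) FROM TABLE @<lt_data>.
ENDMETHOD.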


We are releasing this tool on our SAP-samples GitHub.  Instructions on how to install the tool are provided in the README there.

We hope this is helpful for you guys!  Till next time...
