Application Development Blog Posts
Learn and share on deeper, cross technology development topics such as integration and connectivity, automation, cloud extensibility, developing at scale, and security.


Introduction:

 

We often come across situations where we would like to upload data directly into custom database tables (transparent tables) that we create. Such a direct update is usually acceptable because these tables hold only simple, relevant data and there are no complex validations that need to be performed during the upload.

 

Moreover, we may not have enough time to create a module pool application or a table maintenance generator for data input, and then write an upload program or a BDC program on top of it. In such situations you can use a generic program like the one discussed below. The highlight of this program is that you do not have to change its source code to upload different database tables: the same single program can upload data into any transparent table (only transparent tables qualify, since they are the only tables with a one-to-one relationship between the dictionary fields and the database fields).

 

Concept:

 

There are of course many ways to build a data upload program with this functionality. I am sharing the source code of the program I developed, which I find simple, highly reusable and easy to understand. The main concept behind it is the dynamic creation of data objects via the CREATE DATA statement in ABAP. This statement is used to create a dynamic internal table and the corresponding work area, which form the core of the program. This is how it works:

 

On the selection screen, the user enters the name of the table to be uploaded. For the table name entered, an internal table of the same type is created via the CREATE DATA statement, as shown below.

 

CREATE DATA gt_data TYPE TABLE OF (p_table).

 

Since gt_data is a dynamic data object, we obviously cannot declare it with a fixed database table type. Instead, gt_data is declared as a generic data reference, as shown below:

 

DATA gt_data TYPE REF TO data.

 

Once the dynamic internal table has been created via the CREATE DATA statement as shown above, all you have to do to use it is dereference the data reference (gt_data) into a field symbol, as shown below:

 

ASSIGN gt_data->* TO <ft_data>.

 

Once this assignment is done, the field symbol points to the dynamic internal table, and the table can be processed through <ft_data> just like a normal internal table. The same approach is used to create the work area. The data from the Excel sheet is then transferred into <ft_data> by calling the function module TEXT_CONVERT_XLS_TO_SAP. I prefer this function module because it performs many validations itself: invalid cell values, incorrect date formats and similar problems are caught by the FM, and it even reports the exact cell that caused the error. You may use other function modules as well; the concept remains the same. You could, for example, upload a text file instead and use an FM such as GUI_UPLOAD.
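The work area is created with the same pattern, this time using the structured row type of the table. These are the same statements you will find in the source code below:

CREATE DATA gs_data TYPE (p_table).   " work area with the row type of the entered table
ASSIGN gs_data->* TO <fs_data>.       " <fs_data> can now be used like a normal structure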

 

Finally, the data is written to the database table row by row using the MODIFY statement, and the upload result is displayed with an ALV created via the factory method of CL_SALV_TABLE. Only the records that were uploaded successfully are shown in the ALV, together with the number of records uploaded successfully and the number of records that failed.
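Stripped of the success and error counters, the upload itself is just this loop over the dynamic table (again taken from the source code below):

LOOP AT <ft_data> ASSIGNING <fs_data>.
  MODIFY (p_table) FROM <fs_data>.   " dynamic table name: inserts the row, or overwrites it if the key already exists
ENDLOOP.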

 

Steps for Upload:

 

1. Enter the database table name and the file path on the selection screen.

2. Click the Execute button. The data upload takes place and the result is displayed in ALV format.

3. Click the Back button to see the update counts.



 

Source Code:

 

REPORT ZISU_DATA_MIGRATION.

 

*--- Selection Screen
PARAMETERS: p_table TYPE dd02l-tabname OBLIGATORY.
PARAMETERS: p_file LIKE rlgrap-filename OBLIGATORY MEMORY ID ysm.

 

*--- Declarations for dynamic data
DATA gt_data TYPE REF TO data.
DATA gs_data TYPE REF TO data.

FIELD-SYMBOLS: <ft_data> TYPE STANDARD TABLE,
               <fs_data> TYPE any.

 

*--- Declarations for Result Count
DATA: gv_success TYPE i,
      gv_error   TYPE i.

 

*--- Declaration for factory ALV
DATA gr_table TYPE REF TO cl_salv_table.

*--- File Open Dialog
AT SELECTION-SCREEN ON VALUE-REQUEST FOR p_file.
PERFORM f4_file_name.

 

*--- Validate table name
AT SELECTION-SCREEN ON p_table.
  PERFORM f_validate_table.

 

START-OF-SELECTION.

 

*--- Create dynamic internal table
CREATE DATA gt_data TYPE TABLE OF (p_table).
ASSIGN gt_data->* TO <ft_data>.

 

*--- Create dynamic work area
CREATE DATA gs_data TYPE (p_table).
ASSIGN gs_data->* TO <fs_data>.

 

*--- Get Excel data in internal table
PERFORM f_convert_excel.

 

*--- Upload data to database table
PERFORM f_upload_table.

 

END-OF-SELECTION.

*--- Create Factory ALV and display result
PERFORM f_show_result.

 

 

*&---------------------------------------------------------------------*
*&      Form  F4_FILE_NAME
*&---------------------------------------------------------------------*
*       text
*----------------------------------------------------------------------*
*  -->  p1        text
*  <--  p2        text
*----------------------------------------------------------------------*
FORM f4_file_name .
  CALL FUNCTION 'KD_GET_FILENAME_ON_F4'
*   EXPORTING
*     program_name  = syst-repid
*     dynpro_number = syst-dynnr
*     field_name    = ' '
*     static        = ' '
*     mask          = ' '
*     fileoperation = 'R'
*     path          =
    CHANGING
      file_name     = p_file
*     location_flag = 'P'
    EXCEPTIONS
      mask_too_long = 1
      OTHERS        = 2.

  IF sy-subrc <> 0.
    MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
            WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
  ENDIF.
ENDFORM.                    " F4_FILE_NAME

*&---------------------------------------------------------------------*
*&      Form  F_VALIDATE_TABLE
*&---------------------------------------------------------------------*
*       text
*----------------------------------------------------------------------*
*  -->  p1        text
*  <--  p2        text
*----------------------------------------------------------------------*
FORM f_validate_table .
  DATA lv_tabname TYPE dd02l-tabname.

*--- Upload only tables in the customer namespace
  IF p_table+0(1) NE 'Z' AND p_table+0(1) NE 'Y'.
    MESSAGE 'Only tables in customer namespace can be uploaded' TYPE 'E'.
  ENDIF.

*--- Only transparent tables can be uploaded
  SELECT SINGLE tabname
    FROM dd02l
    INTO lv_tabname
    WHERE tabname  = p_table
      AND tabclass = 'TRANSP'.
  IF sy-subrc NE 0.
    MESSAGE 'Only transparent tables can be uploaded' TYPE 'E'.
  ENDIF.
ENDFORM.                    " F_VALIDATE_TABLE

*&---------------------------------------------------------------------*
*&      Form  F_CONVERT_EXCEL
*&---------------------------------------------------------------------*
*       text
*----------------------------------------------------------------------*
*  -->  p1        text
*  <--  p2        text
*----------------------------------------------------------------------*
FORM f_convert_excel .
  DATA lt_raw_data TYPE truxs_t_text_data.

  CALL FUNCTION 'TEXT_CONVERT_XLS_TO_SAP'
    EXPORTING
*     i_field_seperator    =
      i_line_header        = 'X'
      i_tab_raw_data       = lt_raw_data
      i_filename           = p_file
    TABLES
      i_tab_converted_data = <ft_data>
    EXCEPTIONS
      conversion_failed    = 1
      OTHERS               = 2.

  IF sy-subrc <> 0.
    MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
            WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
  ENDIF.
ENDFORM.                    " F_CONVERT_EXCEL

*&---------------------------------------------------------------------*
*&      Form  F_UPLOAD_TABLE
*&---------------------------------------------------------------------*
*       text
*----------------------------------------------------------------------*
FORM f_upload_table.

  SET UPDATE TASK LOCAL.   " Switch to local update

  LOOP AT <ft_data> ASSIGNING <fs_data>.
    MODIFY (p_table) FROM <fs_data>.
    IF sy-subrc = 0.
      gv_success = gv_success + 1.
    ELSE.
*--- Remove the failed record so the result ALV shows only uploaded rows
      DELETE <ft_data> INDEX sy-tabix.
      gv_error = gv_error + 1.
    ENDIF.
  ENDLOOP.

ENDFORM.                    " F_UPLOAD_TABLE

*&---------------------------------------------------------------------*
*&      Form  f_show_result
*&---------------------------------------------------------------------*
*       text
*----------------------------------------------------------------------*
FORM f_show_result.
*--- Create ALV instance
  CALL METHOD cl_salv_table=>factory
    IMPORTING
      r_salv_table = gr_table
    CHANGING
      t_table      = <ft_data>.  " Only the entries which were updated successfully

*--- Display ALV output
  gr_table->display( ).

  WRITE: / 'No. of records successfully uploaded:', gv_success.
  WRITE: / 'No. of records that failed to upload:', gv_error.

ENDFORM.                    "f_show_result

 

 

Note:

 

1. Even though I have implemented the upload as an executable program, for very large data volumes it is better to put the upload logic into an update function module and call it in the update task (a sketch follows after these notes).

2. If you wish to validate individual records of the file, you can use the ASSIGN COMPONENT ... OF STRUCTURE statement to access a particular field of the dynamic work area and then perform your checks (see the sketch after these notes).

3. You are not restricted to Excel files when using this program; text files or other file types can be used as well. You only need to call the relevant function module to get the data into the internal table <FT_DATA> (see the sketch after these notes).

4. The Excel file must contain all fields of the database table in exactly the same order, including the MANDT field. Uploading only a subset of the fields is not possible with this program.
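For note 1, a minimal sketch of the update-task variant is shown below. The function module Z_UPLOAD_TABLE_UPD and the table ZTABLE are hypothetical placeholders; the update module would contain the MODIFY logic and have a concretely typed table in its interface.

DATA lt_ztable_data TYPE STANDARD TABLE OF ztable.   " ZTABLE is a placeholder for your table

" Fill lt_ztable_data from <ft_data>, then:
CALL FUNCTION 'Z_UPLOAD_TABLE_UPD' IN UPDATE TASK    " hypothetical update function module
  TABLES
    it_data = lt_ztable_data.
COMMIT WORK AND WAIT.                                " triggers the update task

For note 2, a sketch of a field-level check using ASSIGN COMPONENT; the component name 'BUKRS' is only an example and depends on your table.

FIELD-SYMBOLS <fv_value> TYPE any.

LOOP AT <ft_data> ASSIGNING <fs_data>.
  ASSIGN COMPONENT 'BUKRS' OF STRUCTURE <fs_data> TO <fv_value>.
  IF sy-subrc = 0 AND <fv_value> IS INITIAL.
*   Field is empty: handle the error here, e.g. skip the record and raise the error count
  ENDIF.
ENDLOOP.

For note 3, a tab-delimited text file could be read with GUI_UPLOAD instead of TEXT_CONVERT_XLS_TO_SAP. A rough sketch (GUI_UPLOAD expects the file name as a string and does not perform the cell-level validations mentioned above):

DATA lv_file TYPE string.

lv_file = p_file.
CALL FUNCTION 'GUI_UPLOAD'
  EXPORTING
    filename            = lv_file
    filetype            = 'ASC'
    has_field_separator = 'X'
  TABLES
    data_tab            = <ft_data>
  EXCEPTIONS
    file_open_error     = 1
    OTHERS              = 2.
IF sy-subrc <> 0.
  MESSAGE 'Error reading the text file' TYPE 'E'.
ENDIF.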

 

 

Thanks and Regards,

Ajay Bose
