
Convert internal table to text format


Hi experts,

I have some data in an internal table which I have to convert to text format and write to the application server.

I used the FM SAP_CONVERT_TO_TEX_FORMAT, but on the client system this FM terminates with a resource-shortage dump.

Is there a way to achieve this and write to the application server directly, as I don't want to use another internal table?

Thanks in advance.

    DATA: ls_header_ep      TYPE char100,
          ls_header_ep_tech TYPE char100.
    CONCATENATE
            'Plant'
            'Material'
            'Procurement'
            INTO ls_header_ep SEPARATED BY ','.
    CONCATENATE
            'WERKS'
            'MATNR'
            'BESKZ'
            INTO ls_header_ep_tech SEPARATED BY ','.
    CALL FUNCTION 'SAP_CONVERT_TO_TEX_FORMAT'
      EXPORTING
        i_field_seperator    = ','  " Comma separator (parameter name is misspelled in the FM)
      TABLES
        i_tab_sap_data       = gt_table_ep[]
      CHANGING
        i_tab_converted_data = lt_out_temp
      EXCEPTIONS
        conversion_failed    = 1
        OTHERS               = 2.
    IF sy-subrc <> 0.
      MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
              WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
    ENDIF.

    APPEND ls_header_ep TO lt_out.
    APPEND ls_header_ep_tech TO lt_out.
    APPEND LINES OF lt_out_temp TO lt_out.
    PERFORM file_transfer.
    CLEAR: ls_file, lv_file.
  ENDIF.

FORM file_transfer .
  IF p_dir IS NOT INITIAL.

    lv_file = p_dir.

    OPEN DATASET lv_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
    IF sy-subrc = 0.
      LOOP AT lt_out INTO ls_out.
        TRANSFER ls_out TO lv_file.
      ENDLOOP.
      CLOSE DATASET lv_file.
    ENDIF.
  ENDIF.
ENDFORM.

Accepted Solutions (1)

Sandra_Rossi
Active Contributor

Before doing any fix, you should analyze why you get the "resource shortage" dump (we can't do that for you).

Answers (1)

RaymondGiuseppi
Active Contributor

Did you consider splitting your (huge) internal table (gt_table_ep) and processing the data in batches of 'n' records, appending each batch to the end of the dataset?
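
A minimal sketch of that batching idea, reusing the variable names from the question (the batch size `lc_batch` and slice table `lt_slice` are illustrative additions; error handling omitted for brevity):

    " Open the dataset once, then convert and transfer the table in
    " slices so lt_out_temp never holds more than lc_batch rows.
    CONSTANTS lc_batch TYPE i VALUE 10000.
    DATA: lt_slice LIKE gt_table_ep,
          lv_from  TYPE i VALUE 1.

    OPEN DATASET lv_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
    TRANSFER ls_header_ep TO lv_file.
    TRANSFER ls_header_ep_tech TO lv_file.

    WHILE lv_from <= lines( gt_table_ep ).
      CLEAR: lt_slice, lt_out_temp.
      " Copy the next slice of n rows.
      APPEND LINES OF gt_table_ep FROM lv_from
             TO lv_from + lc_batch - 1 TO lt_slice.
      CALL FUNCTION 'SAP_CONVERT_TO_TEX_FORMAT'
        EXPORTING
          i_field_seperator    = ','
        TABLES
          i_tab_sap_data       = lt_slice
        CHANGING
          i_tab_converted_data = lt_out_temp.
      " Write the converted slice straight to the file.
      LOOP AT lt_out_temp INTO ls_out.
        TRANSFER ls_out TO lv_file.
      ENDLOOP.
      lv_from = lv_from + lc_batch.
    ENDWHILE.

    CLOSE DATASET lv_file.

This keeps the peak memory bound by the batch size instead of the full table, which should avoid the resource-shortage dump if the FM's output table was the culprit.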