‎2017 Jan 04 7:04 PM
I want to split an internal table into multiple internal tables, because the main internal table contains a huge number of records, which causes the system to generate a dump.
If the number of records is smaller, it processes them without problems.
For example:
ITAB1 = 100,000 records. Split this table into multiple internal tables:
TAB1 = 50,000 records
TAB2 = 50,000 records
I don't want to hard-code the logic, i.e. check sy-tabix = 50000 and append to TAB1 or TAB2 accordingly.
Can this be done dynamically, so that the system detects it has reached the size limit and automatically starts filling another internal table?
Hope my requirement is clear.
Please guide me in resolving this issue.
‎2017 Jan 04 7:07 PM
I guess the first question is: what in the world are you trying to read into an internal table that could be that large to begin with (and not better suited for another toolset that handles "big data", for instance)? Are you absolutely sure you should be pulling that much data?
‎2017 Jan 04 8:22 PM
As Christopher said, I doubt this approach is the right way, and also: wouldn't the dump already be generated when your first table exceeds the limit?
I gave it a try anyways:
TYPES: BEGIN OF lty_line,
         column1 TYPE i,
         column2 TYPE c LENGTH 4,
       END OF lty_line.

CONSTANTS: lc_test_data_amount TYPE i VALUE 100000,
           lc_split_at_amount  TYPE i VALUE 10000.

DATA: lt_big_table    TYPE STANDARD TABLE OF lty_line,
      lv_string       TYPE string,
      lt_small_tables TYPE STANDARD TABLE OF REF TO data,
      lr_small_table  TYPE REF TO data.

FIELD-SYMBOLS: <lg_target> TYPE STANDARD TABLE.

" Generate test data
DO lc_test_data_amount TIMES.
  CALL FUNCTION 'GENERAL_GET_RANDOM_STRING'
    EXPORTING
      number_chars  = 4          " Specifies the number of generated chars
    IMPORTING
      random_string = lv_string. " Generated string
  APPEND VALUE #( column1 = sy-index
                  column2 = CONV #( lv_string ) ) TO lt_big_table.
ENDDO.
CLEAR lv_string.

" Split
DATA(lo_descr) = CAST cl_abap_tabledescr(
  cl_abap_typedescr=>describe_by_data( lt_big_table )
).
LOOP AT lt_big_table ASSIGNING FIELD-SYMBOL(<ls_line>).
  IF ( sy-tabix - 1 ) MOD lc_split_at_amount = 0.
    " Start a fresh target table every lc_split_at_amount rows
    CREATE DATA lr_small_table TYPE HANDLE lo_descr.
    ASSERT lr_small_table IS BOUND.
    APPEND lr_small_table TO lt_small_tables.
    ASSIGN lr_small_table->* TO <lg_target>.
    ASSERT <lg_target> IS ASSIGNED.
  ENDIF.
  APPEND <ls_line> TO <lg_target>.
ENDLOOP.
UNASSIGN: <lg_target>, <ls_line>.
FREE lr_small_table.
BREAK-POINT. " lt_small_tables contains references to the split tables
‎2017 Jan 04 8:32 PM
I believe that is his question, i.e. can he "detect dynamically", right before an internal table's size would exceed the limit and cause a dump, and thus "spill over" into a new, additional internal table, then continue doing the same on that new internal table as needed (possibly creating more and more "spill over" tables, too).
So in your example: he does not know, and does not want to hard-code, a data limit or table size. But by the way, very nice code there!
‎2017 Jan 04 9:04 PM
Thanks! I doubt it's possible then or at least I don't think there is a (documented) way to get the memory allocation limit for a data object from the runtime environment.
‎2017 Jan 05 3:02 PM
Could the limit be derived from one of the profile parameters (transaction RZ11)?
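(A sketch of reading such a parameter at runtime. On newer releases there is reportedly a class CL_SPFL_PROFILE_PARAMETER for this; the exact method signature and the parameter name abap/heap_area_dia are assumptions here, so please verify them in your own system.)

```abap
" Sketch only: reading a memory-related profile parameter programmatically.
" The GET_VALUE signature may differ by release - check your system first.
DATA: lv_value TYPE spfl_parameter_value,
      lv_rc    TYPE i.

lv_rc = cl_spfl_profile_parameter=>get_value(
          EXPORTING name  = 'abap/heap_area_dia'  " assumed parameter
          IMPORTING value = lv_value ).
IF lv_rc = 0.
  " lv_value now holds the configured limit from the profile
ENDIF.
```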
‎2017 Jan 05 8:49 PM
You could try getting a size from there and comparing against it using something like this, I guess:
DATA(lv_max_size) = ???.
DESCRIBE FIELD: lt_tab LENGTH DATA(lv_size_current) IN BYTE MODE,
                ls_new_line LENGTH DATA(lv_size_additional) IN BYTE MODE.
IF ( lv_size_current + lv_size_additional ) > lv_max_size.
  " On to the next itab...
ENDIF.
But I doubt the behavior is "well defined" (in ABAP); it is probably release-, hardware- and/or workload-dependent. Looking through the documentation, there is even wording like "usually" in place, so I doubt that approach is worth pursuing.
http://help.sap.com/abapdocu_751/en/index.htm?file=abenmemory_consumption.htm
‎2022 Jun 01 11:15 AM
Hi Fabian,
could you please let us know how to read the split tables back out of lt_small_tables, since it now contains references?
Thanks,
Subba
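For what it's worth, a minimal sketch of dereferencing the entries, reusing the declarations from Fabian's snippet above (what you do with each row is up to you; the inner comment is just one assumed option):

```abap
" Dereference each REF TO data in lt_small_tables and loop over its rows
FIELD-SYMBOLS <lt_part> TYPE STANDARD TABLE.

LOOP AT lt_small_tables INTO DATA(lr_part).
  ASSIGN lr_part->* TO <lt_part>.  " <lt_part> now points to one split table
  LOOP AT <lt_part> ASSIGNING FIELD-SYMBOL(<ls_row>).
    " <ls_row> is generic here; use ASSIGN COMPONENT, or move it into a
    " structure of type lty_line, to access column1 / column2
  ENDLOOP.
ENDLOOP.
```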