2024 Sep 13 11:02 AM - edited 2024 Sep 13 2:57 PM
Hi,
I am fetching data with the code below. The select from CDPOS creates a short dump. From my analysis, I understand it is a memory issue: CDHDR has nearly 2.7 million (27 lakh) records for my selection, and that is why the dump occurs.
My question: without using HANA, is there any way to solve the issue in the program itself?
The logic I am thinking of:
1) First count the total number of records in CDHDR.
2) Then fetch CDHDR in batches of, say, 100,000 (1 lakh) records; for each batch complete the remaining processing (fetching from CDPOS and the rest), then process the next batch of CDHDR records, until all records are done.
But I would like to know how to implement this.
SELECT objectclas, objectid, changenr, username, udate, utime, tcode
  FROM cdhdr
  INTO TABLE @DATA(lt_cdhdr)
  WHERE objectclas IN @lr_objectclas
    AND username   IN @s_user
    AND udate      IN @lr_daterange.

IF lt_cdhdr IS NOT INITIAL.
  SELECT objectclas, objectid, changenr, tabname, tabkey, fname, chngind, value_new, value_old
    FROM cdpos
    INTO TABLE @DATA(lt_cdpos)
    FOR ALL ENTRIES IN @lt_cdhdr
    WHERE objectclas = @lt_cdhdr-objectclas
      AND objectid   = @lt_cdhdr-objectid
      AND changenr   = @lt_cdhdr-changenr.
ENDIF.
2024 Sep 13 1:48 PM
Please use the buttons "..." and "</>" to display your code nicely. See how to do it here: https://community.sap.com/t5/questions-about-sap-websites/how-to-post-code-in-sap-community-gt-2024/...
2024 Sep 13 3:19 PM
Hello @AadivaramAbbulu
You can split your processing into chunks using OPEN CURSOR and FETCH statements with the PACKAGE SIZE n addition. Use a join of CDHDR and CDPOS to avoid FOR ALL ENTRIES, which can also cause issues when the resulting OR condition gets too large.
Best regards
Dominik Tylczynski
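For illustration only, a minimal sketch of that approach, reusing the selection ranges from the original post; the structure ty_change and the package size of 100,000 are assumptions, not part of this answer:

" Sketch only: ty_change and the package size are illustrative assumptions.
TYPES: BEGIN OF ty_change,
         objectclas TYPE cdhdr-objectclas,
         objectid   TYPE cdhdr-objectid,
         changenr   TYPE cdhdr-changenr,
         username   TYPE cdhdr-username,
         udate      TYPE cdhdr-udate,
         utime      TYPE cdhdr-utime,
         tcode      TYPE cdhdr-tcode,
         tabname    TYPE cdpos-tabname,
         fname      TYPE cdpos-fname,
         chngind    TYPE cdpos-chngind,
         value_new  TYPE cdpos-value_new,
         value_old  TYPE cdpos-value_old,
       END OF ty_change.

DATA lv_cursor  TYPE cursor.
DATA lt_package TYPE STANDARD TABLE OF ty_change WITH EMPTY KEY.

OPEN CURSOR WITH HOLD @lv_cursor FOR
  SELECT h~objectclas, h~objectid, h~changenr, h~username, h~udate, h~utime, h~tcode,
         p~tabname, p~fname, p~chngind, p~value_new, p~value_old
    FROM cdhdr AS h
    INNER JOIN cdpos AS p
      ON  p~objectclas = h~objectclas
      AND p~objectid   = h~objectid
      AND p~changenr   = h~changenr
    WHERE h~objectclas IN @lr_objectclas
      AND h~username   IN @s_user
      AND h~udate      IN @lr_daterange.

DO.
  " Each FETCH overwrites lt_package with the next chunk of joined rows
  FETCH NEXT CURSOR @lv_cursor INTO CORRESPONDING FIELDS OF TABLE @lt_package
        PACKAGE SIZE 100000.
  IF sy-subrc <> 0.
    EXIT.
  ENDIF.
  " ... process this package, then let the next FETCH replace it ...
ENDDO.

CLOSE CURSOR @lv_cursor.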
2024 Sep 13 4:13 PM
Also SELECT ... PACKAGE SIZE ... is possible.
ABAP documentation: https://help.sap.com/doc/abapdocu_latest_index_htm/latest/en-US/index.htm?file=abapinto_clause.htm#!...
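For example, a sketch of that variant (the package size and table name are only illustrative; the selection ranges are taken from the original post):

DATA lt_cdhdr_pkg TYPE STANDARD TABLE OF cdhdr WITH EMPTY KEY.

" SELECT ... PACKAGE SIZE turns the statement into a loop that must be
" closed with ENDSELECT; lt_cdhdr_pkg is replaced on every iteration.
SELECT objectclas, objectid, changenr, username, udate, utime, tcode
  FROM cdhdr
  WHERE objectclas IN @lr_objectclas
    AND username   IN @s_user
    AND udate      IN @lr_daterange
  INTO CORRESPONDING FIELDS OF TABLE @lt_cdhdr_pkg
  PACKAGE SIZE 100000.

  " ... process this package here ...

ENDSELECT.

Note that inside a SELECT ... ENDSELECT loop, statements that trigger an implicit database commit should be avoided, which is why the OPEN CURSOR WITH HOLD variant is sometimes preferred.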
2024 Sep 16 12:08 PM
Thank you, I implemented the logic. Now there is no dump while fetching the records, i.e. while running the queries. But for the final internal table, rt_output, I am now getting the dump "No more memory available to add rows to an internal table" (TSV_TNEW_PAGE_ALLOC_FAILED). I think the final internal table is not able to hold this huge amount of data. Any advice?
"Using Cursor Technique to improve the performance
" Open cursor for fetching data in packages
OPEN CURSOR WITH HOLD @LV_cursor FOR
SELECT objectclas, objectid, changenr, username, udate, utime, tcode
FROM cdhdr
WHERE objectclas IN @LR_objectclas
AND username IN @s_user
AND udate IN @LR_daterange.
DO.
* " Fetch data in chunks
FETCH NEXT CURSOR lv_cursor INTO CORRESPONDING FIELDS OF TABLE lt_temp_cdhdr PACKAGE SIZE lv_package_size.
" Exit the loop if no more data
IF sy-subrc <> 0.
EXIT.
ENDIF.
SORT lt_temp_cdhdr BY objectclas objectid changenr.
CLEAR lt_dcdhdr.
" Append fetched records to the final table
APPEND LINES OF lt_temp_cdhdr TO lt_dcdhdr.
IF lt_dcdhdr IS NOT INITIAL.
SELECT objectclas, objectid, changenr, tabname, tabkey, fname, chngind, value_new, value_old FROM cdpos INTO TABLE @DATA(lt_dcdpos)
FOR ALL ENTRIES IN @LT_dcdhdr
WHERE objectclas = @LT_dcdhdr-objectclas AND objectid = @LT_dcdhdr-objectid AND changenr = @LT_dcdhdr-changenr.
IF lt_dcdpos IS NOT INITIAL.
SELECT bname, name_first, name_last FROM user_addr INTO TABLE @DATA(lt_duser_addr)
FOR ALL ENTRIES IN @LT_dcdhdr
WHERE bname = @LT_dcdhdr-username.
IF lt_duser_addr IS NOT INITIAL.
SELECT bname, persnumber, addrnumber FROM usr21 INTO TABLE @DATA(lt_dusr21)
FOR ALL ENTRIES IN @LT_duser_addr
WHERE bname = @LT_duser_addr-bname.
IF lt_dusr21 IS NOT INITIAL.
SELECT addrnumber, persnumber, date_from, consnumber,smtp_addr, smtp_srch FROM adr6 INTO TABLE @DATA(lt_dadr6)
FOR ALL ENTRIES IN @LT_dusr21
WHERE persnumber = @LT_dusr21-persnumber.
ENDIF.
ENDIF.
ENDIF.
SORT lt_dcdpos BY changenr.
SORT lt_duser_addr BY bname.
SORT lt_dusr21 BY bname.
SORT lt_dadr6 BY persnumber.
LOOP AT lt_dcdhdr INTO DATA(ls_dcdhdr).
READ TABLE lt_dcdpos WITH KEY changenr = ls_dcdhdr-changenr INTO DATA(ls_dcdpos) BINARY SEARCH.
READ TABLE lt_duser_addr WITH KEY bname = ls_dcdhdr-username INTO DATA(ls_duser_addr) BINARY SEARCH.
READ TABLE lt_dusr21 WITH KEY bname = ls_duser_addr-bname INTO DATA(ls_dusr21) BINARY SEARCH.
READ TABLE lt_dadr6 WITH KEY persnumber = ls_dusr21-persnumber INTO DATA(ls_dadr6) BINARY SEARCH.
APPEND VALUE #( uname = ls_dcdhdr-username
udate = ls_dcdhdr-udate
utime = ls_dcdhdr-utime
tcode = ls_dcdhdr-tcode
Objvale = ls_dcdhdr-objectid
tabname = ls_dcdpos-tabname
fname = ls_dcdpos-fname
valueold = ls_dcdpos-value_old
valuenew = ls_dcdpos-value_new
fstname = ls_duser_addr-name_first
lstname = ls_duser_addr-name_last
email = ls_dadr6-smtp_addr ) TO rt_output.
CLEAR:ls_dcdhdr,ls_dcdpos,ls_duser_addr,Ls_dusr21,ls_dadr6.
ENDLOOP.
ENDIF.
ENDDO.
CLOSE CURSOR lv_cursor.
2024 Sep 16 12:21 PM
You have changed your code to no real advantage, because you still consume exactly the same amount of memory (RT_OUTPUT). The goal of loading by package is to process each package, free its memory, then load the next package, and so on. So change your code accordingly.
Note that the "death trio" of STANDARD TABLE, SORT and BINARY SEARCH has been obsolete for 20 years. Use a SORTED TABLE instead.
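As an illustration only, a sketch of what that means for this code, keeping the names from the posted program:

" Declared as SORTED, the table needs no explicit SORT and no BINARY SEARCH;
" READ TABLE ... WITH TABLE KEY uses the key automatically.
DATA lt_dcdhdr TYPE STANDARD TABLE OF cdhdr WITH EMPTY KEY.
DATA lt_dcdpos TYPE SORTED TABLE OF cdpos
     WITH NON-UNIQUE KEY objectclas objectid changenr.

" ... fill both tables via SELECT INTO CORRESPONDING FIELDS OF TABLE
"     instead of the inline DATA(...) declarations of the posted code ...

LOOP AT lt_dcdhdr INTO DATA(ls_dcdhdr).
  READ TABLE lt_dcdpos INTO DATA(ls_dcdpos)
       WITH TABLE KEY objectclas = ls_dcdhdr-objectclas
                      objectid   = ls_dcdhdr-objectid
                      changenr   = ls_dcdhdr-changenr.
  IF sy-subrc = 0.
    " ... use ls_dcdpos ...
  ENDIF.
ENDLOOP.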
2024 Sep 16 1:05 PM
I made some changes, but no use. Still the same issue.
DO.
  " ... existing package processing ...
  FREE lt_temp_cdhdr.
ENDDO.
2024 Sep 16 1:53 PM - edited 2024 Sep 16 1:55 PM
You don't understand. The issue is about RT_OUTPUT, which contains gigabytes of data. Instead, process by package: fill RT_OUTPUT with only a few megabytes of data, process the data, then load the next package into RT_OUTPUT, process the data, etc.
2024 Sep 16 2:10 PM
Hi Sandra, I have not understood.
RT_OUTPUT is the final internal table. I need to fill it before displaying the data in the report output, and I am already adding data to RT_OUTPUT package by package.
Example:
First package: RT_OUTPUT holds 100 records.
Next package: RT_OUTPUT holds 100 + 120 records, and so on for many more records.
If I fill RT_OUTPUT with just one package, how do I show the output? I need to process the remaining packages until all packages are done, and meanwhile RT_OUTPUT keeps filling up memory. This part is not clear to me:
"fill RT_OUTPUT with only a few megabytes of data, process the data"
2024 Sep 16 3:23 PM
Let's say your program prints a report.
In your example, LV_PACKAGE_SIZE is 100.
You need to refactor your program to print 100 lines to the report at a time: that way RT_OUTPUT will never hold more than 100 lines.
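Sketched against the posted cursor loop (assuming a classic WRITE list for the output; the printed field list is only an example):

DO.
  FETCH NEXT CURSOR @lv_cursor INTO CORRESPONDING FIELDS OF TABLE @lt_temp_cdhdr
        PACKAGE SIZE @lv_package_size.
  IF sy-subrc <> 0.
    EXIT.
  ENDIF.

  " ... enrich this package (CDPOS, USER_ADDR, USR21, ADR6) and fill rt_output
  "     exactly as in the posted code ...

  " Output the package immediately instead of keeping it
  LOOP AT rt_output INTO DATA(ls_output).
    WRITE: / ls_output-uname, ls_output-udate, ls_output-tabname,
             ls_output-fname, ls_output-valueold, ls_output-valuenew.
  ENDLOOP.

  CLEAR rt_output.   " rt_output never holds more than one package
ENDDO.

CLOSE CURSOR @lv_cursor.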
2024 Sep 16 3:25 PM
Agree with term "death trio" : lol
I suggest looking into standard tables with non-unique sorted secondary keys instead of sorted tables. Sorted tables have comparatively poor insert/delete performance, and multiple secondary keys can be defined as well.
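A small sketch of that idea, reusing the table names from the thread (the secondary key name by_change is made up):

" Standard table keeps cheap APPENDs; the non-unique sorted secondary key
" gives keyed reads without SORT / BINARY SEARCH.
DATA lt_dcdpos TYPE STANDARD TABLE OF cdpos WITH EMPTY KEY
     WITH NON-UNIQUE SORTED KEY by_change
          COMPONENTS objectclas objectid changenr.

READ TABLE lt_dcdpos WITH TABLE KEY by_change
     COMPONENTS objectclas = ls_dcdhdr-objectclas
                objectid   = ls_dcdhdr-objectid
                changenr   = ls_dcdhdr-changenr
     INTO DATA(ls_dcdpos).

For standard tables, the secondary key index is built lazily on first use, so the APPENDs stay cheap; the trade-off is extra memory for the key index.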
2024 Sep 16 2:39 PM
Hello @AadivaramAbbulu
I can see two problems with your code:
Best regards
Dominik Tylczynski