2015 Jan 21 5:43 AM
Hi all,
As is well known, it's not recommended to run a SELECT inside a LOOP over an internal table.
But this time my problem is that I have to call a standard SAP function module that reads from the database inside a LOOP.
The internal table is large, so the run causes memory shortages and long runtimes.
Reimplementing the data fetching myself would not be easy, so I am trying to find another way.
Is there a way to keep the database connection open the whole time, avoiding repeated 'open' (like PREPARE) and 'close' operations? Something like:
OPEN DB connection.
LOOP AT it.
SELECT xxx FROM xxx
ENDLOOP.
CLOSE DB connection.
Is it possible to do that?
2015 Jan 21 8:58 AM
Hi Ming,
You can use a database cursor, where you can specify a package size for each fetch.
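A minimal sketch of that cursor approach (the table name and package size below are just placeholder examples):

```abap
DATA: lv_cursor TYPE cursor,
      lt_chunk  TYPE STANDARD TABLE OF spfli.

OPEN CURSOR lv_cursor FOR
  SELECT * FROM spfli.

DO.
  " Fetch the next block of rows; lt_chunk is refilled on each fetch
  FETCH NEXT CURSOR lv_cursor INTO TABLE lt_chunk PACKAGE SIZE 1000.
  IF sy-subrc <> 0.
    EXIT.  " no more data
  ENDIF.
  " Process lt_chunk here - at most 1000 rows are in memory at a time
ENDDO.

CLOSE CURSOR lv_cursor.
```

This way the cursor stays open across all fetches, so the database does not re-execute the query for every block.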
2015 Jan 21 11:54 AM
Hi,
You can move the SELECT outside the LOOP by using FOR ALL ENTRIES with respect to internal table "it", and then READ the result inside the LOOP:
SELECT XXX
  FROM XXX
  INTO TABLE it_one
  FOR ALL ENTRIES IN it
  WHERE XXX = it-XXX.
LOOP AT it.
  READ TABLE it_one INTO wa_one WITH KEY XXX = it-XXX.
ENDLOOP.
(Make sure "it" is not empty before the SELECT: with an empty FOR ALL ENTRIES table the WHERE condition is ignored and the whole database table is selected.)
Regards,
GJ
2015 Jan 21 12:07 PM
Hello,
you could work with a DB cursor to process only 100 entries at a time:
DATA: dbcur     TYPE cursor,
      spfli_tab TYPE TABLE OF spfli.

OPEN CURSOR dbcur FOR
  SELECT *
    FROM spfli
    ORDER BY carrid.
DO.
  FETCH NEXT CURSOR dbcur INTO TABLE spfli_tab PACKAGE SIZE 100.
  IF sy-subrc <> 0.
    EXIT.
  ENDIF.
* Your logic here for 100 entries each
ENDDO.
CLOSE CURSOR dbcur.
2015 Jan 21 12:20 PM
Hi,
You can reduce the data volume with the 'FOR ALL ENTRIES' statement, or you can use a database hint for performance tuning:
%_HINTS ORACLE
'..........'.
Refer to the following link:
http://www.sapfans.com/forums/viewtopic.php?p=883185&sid=37b22c7f1a6421ccb8966c9da421a540
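For illustration, the hint clause goes at the very end of the SELECT statement; the table, WHERE condition, and hint text below are only placeholder assumptions, not values from this thread:

```abap
DATA lt_spfli TYPE STANDARD TABLE OF spfli.

SELECT * FROM spfli
  INTO TABLE lt_spfli
  WHERE carrid = 'LH'
  %_HINTS ORACLE 'FIRST_ROWS(10)'.  " example Oracle hint, assumption
```

Note that %_HINTS is database-specific and bypasses the optimizer's own plan choice, so it should only be used after analyzing the actual execution plan.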
2015 Jan 21 12:46 PM
Try the PACKAGE SIZE option: fetch the data in intervals (packages) of records instead of all at once.
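SELECT itself also supports PACKAGE SIZE without an explicit cursor; it then behaves like a loop and must be closed with ENDSELECT. A minimal sketch (table name and size are placeholder examples):

```abap
DATA lt_part TYPE STANDARD TABLE OF spfli.

SELECT * FROM spfli
  INTO TABLE lt_part
  PACKAGE SIZE 500.
  " lt_part holds at most 500 rows on each pass of this SELECT loop
ENDSELECT.
```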