2005 Jul 10 9:04 AM
Hi,
If I am supposed to fetch, say, 1 million records and the application server's memory is not sufficient, how can I retrieve the records in smaller chunks (packages)?
2005 Jul 10 9:12 AM
You may use the following example.
DATA: itab TYPE STANDARD TABLE OF scarr
           WITH NON-UNIQUE DEFAULT KEY
           INITIAL SIZE 10.
FIELD-SYMBOLS: <fs> TYPE scarr.

" SELECT ... PACKAGE SIZE opens a loop that fills itab with at
" most 20 rows per pass; the loop must be closed with ENDSELECT.
SELECT * INTO TABLE itab PACKAGE SIZE 20 FROM scarr.
  LOOP AT itab ASSIGNING <fs>.
    WRITE: / <fs>-carrid, <fs>-carrname.
  ENDLOOP.
ENDSELECT.
2005 Jul 11 4:27 PM
Hi,
If you are fetching the records with a SELECT statement, you can use the OPEN CURSOR / FETCH NEXT CURSOR statements to read them in packages.
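A minimal sketch of the cursor approach, again using the demo table SCARR; the package size of 20 is just illustrative, in practice you would pick a size your work process memory can hold comfortably:

```abap
DATA: lt_scarr  TYPE STANDARD TABLE OF scarr,
      lv_cursor TYPE cursor.

" Open a database cursor for the full result set.
OPEN CURSOR lv_cursor FOR SELECT * FROM scarr.

DO.
  " Fetch the next package of at most 20 rows into lt_scarr.
  FETCH NEXT CURSOR lv_cursor
        INTO TABLE lt_scarr
        PACKAGE SIZE 20.
  IF sy-subrc <> 0.
    " No more rows to fetch.
    EXIT.
  ENDIF.
  " Process the current package (lt_scarr) here before
  " the next fetch overwrites it.
ENDDO.

CLOSE CURSOR lv_cursor.
```

Unlike SELECT ... ENDSELECT, the cursor variant keeps the database cursor open explicitly, so you control exactly when each package is fetched and released.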
Thanks.
Kshitij
2005 Jul 11 5:15 PM
Hi Nitin,
Generally, the application server's memory can be considered virtually unlimited; it is the memory for the work process that is limited. Even so, in most cases it is capable of holding very large volumes of data without any problem, and it can even acquire additional space from the SAP Extended Memory.
The memory for a work process and the size of the Extended Memory can be configured using some profile parameters.
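For reference, these are some of the profile parameters commonly involved (the exact set and their defaults vary by release and instance profile, so treat this list as a pointer, not a prescription):

```text
ztta/roll_area        " initial roll area per work process
ztta/roll_extension   " max extended memory per user context
em/initial_size_MB    " initial size of the extended memory pool
abap/heap_area_dia    " heap limit for a dialog work process
abap/heap_area_total  " heap limit for all work processes combined
```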
Are you getting any errors or short dumps because of the volume of the data? Is there scope to improve the logic of the program? For example, perhaps not all of the million records are needed; the SELECT statements can be written with a more restrictive WHERE clause, fetching only the required columns, and so on.
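As a minimal sketch of that idea, again using the demo table SCARR (the field list and the WHERE condition are purely illustrative):

```abap
" Select only the columns actually needed, into a matching
" narrow structure, and restrict the rows with a WHERE clause.
TYPES: BEGIN OF ty_carr,
         carrid   TYPE scarr-carrid,
         carrname TYPE scarr-carrname,
       END OF ty_carr.
DATA lt_carr TYPE STANDARD TABLE OF ty_carr.

SELECT carrid carrname
  FROM scarr
  INTO TABLE lt_carr
  WHERE currcode = 'USD'.
```

A narrower field list and a tighter WHERE clause reduce both the database load and the memory footprint of the internal table, often making the chunking question moot.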
Further troubleshooting can be given if the exact nature of your problem is known.
Regards,
Anand Mandalika.