‎2006 May 03 12:17 PM
Dear all,
I want to load 10 million records into an internal table.
Is it possible?
Is there any limit on the size of an internal table?
If there is, what is it?
Is it possible to change the limit via a setting?
Thank you very much.
Seven
‎2006 May 03 12:22 PM
Hello Seven,
Yes, you can load any number of records into an internal table. There is no hard limit on internal tables as such, but when an internal table is loaded with 10 million records, the memory allocated to it may not be sufficient and you may receive the system error 'No roll blocks available'.
It is always better to process the records in smaller batches. Do let me know your requirement.
Thanks,
- PSK
‎2006 May 03 12:26 PM
Dear Pabbisetty,
Is there any way to calculate how much memory it will use?
Actually, I want to list 10 million records.
Thanks.
Seven
‎2006 May 03 12:29 PM
Hi seven,
1. Is there any way to calculate how much memory it will use?
For that we just need to know the length of each field, and hence the total number of bytes per record.
That, multiplied by the number of records, gives the memory in BYTES.
For example, a 100-byte row times 10 million records is roughly 1 GB.
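As a rough sketch of that calculation (T001 here is only a stand-in for your own work area), the row width can also be determined at runtime rather than counted by hand:

```abap
DATA: ls_row   TYPE t001,  " replace with your own structure
      lv_len   TYPE i,
      lv_total TYPE p.

" Determine the row width in bytes at runtime
DESCRIBE FIELD ls_row LENGTH lv_len IN BYTE MODE.

" Multiply by the expected record count (packed type avoids overflow)
lv_total = lv_len * 10000000.
WRITE: / 'Bytes per row:', lv_len,
       / 'Approx. total bytes:', lv_total.
```

Note this only estimates the raw row data; the runtime adds some administrative overhead per table on top of it.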
regards,
amit m.
‎2006 May 03 12:29 PM
1) Go into debugging mode.
2) Settings -> Memory Display On/Off.
3) Click on 'Tables'.
4) Enter the internal table name.
5) You will see the memory used, displayed as:
Internal table ______________
Length of Table Reference (in bytes) _________
                       Bound Memory    Referenced Memory
Memory Allocated       __________      ____________
Memory Actually Used   __________      ________
Mark helpful answers.
‎2006 May 03 12:35 PM
Hello Seven,
As Amit and others have explained how the memory can be calculated, and since you want to list 10 million records (that is huge),
you can use the PACKAGE SIZE option:
select field1 field2 field3
  from tablename
  into table it_tablename
  package size 10000.
* PACKAGE SIZE determines the number of records fetched in each pass of the loop.
  loop at it_tablename.
    write: / it_tablename-field1,
             it_tablename-field2.
  endloop.
  refresh it_tablename.
  free it_tablename.
endselect.
In this way the first 10,000 records are fetched and written to the screen, the memory is freed, and the internal table is ready for the second set of 10,000 records.
Hope this helps.
- PSK
‎2006 May 03 12:22 PM
Hi Jin,
I think there is no limit as such on the size of an internal table.
You can declare one like this:
data: begin of itab occurs 10,
        " ... fields ...
      end of itab.
OCCURS 10 only reserves an initial 10 rows; if more than 10 records are appended, they are still held in the internal table, which grows automatically.
hope this helps,
priya.
‎2006 May 03 12:24 PM
Basically there are no limits.
There is also no limit on the number of fields in an internal table.
If you are preparing a list output, use the LINE-SIZE addition of the REPORT statement and set it up to 1023 (the maximum).
Example:
REPORT ZSAMPLE LINE-SIZE 1023.
‎2006 May 04 11:05 AM
Yes, it is true that there is no such limit for internal tables, but it is better to use a hashed table to handle large amounts of data.
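A minimal sketch of such a declaration (T001 and the key are purely illustrative); the point of a hashed table is that key access stays constant-time no matter how many rows it holds:

```abap
DATA: gt_t001 TYPE HASHED TABLE OF t001
              WITH UNIQUE KEY bukrs,
      ls_t001 TYPE t001.

" Key access is O(1) regardless of table size
READ TABLE gt_t001 INTO ls_t001
     WITH TABLE KEY bukrs = '1000'.
IF sy-subrc = 0.
  WRITE: / ls_t001-butxt.
ENDIF.
```

Note that a hashed table helps with lookup speed, not with total memory consumption, so it complements rather than replaces the package approach.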
Regards
Siddharth
‎2006 May 04 1:23 PM
Dear all,
I have decided to use the package concept.
But I have a doubt.
Imagine there are 10 records in table ZZZ:
A1, A2, A3, A4, A5 ... A10.
I use PACKAGE SIZE 3.
The first time, I will get A1, A2, A3. If I then delete A2 from the table, what happens in the next package?
Will the next package be (A4, A5, A6) or (A5, A6, A7)?
Thank you.
Seven
‎2006 May 04 1:34 PM
Hi,
Which table are you talking about? If you delete A2 from the internal table, the next package would still get you A4, A5, A6.
If it is the database table itself, then firstly you cannot delete from it inside a SELECT ... ENDSELECT block, and the PACKAGE SIZE option requires a SELECT ... ENDSELECT loop.
If you delete the record after the ENDSELECT and run the selection again,
then the first package would be A1, A3, A4.
Regards,
Ravi
‎2006 May 03 12:32 PM
Hi,
You can use field groups instead.
You can see the memory used in debugging mode:
Settings -> Memory Display On.
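Field groups (extracts) are a classic fit here because the extract dataset is automatically paged out to disk, so it can hold far more data than fits in memory. A minimal sketch of the pattern, using T001 purely as an illustration:

```abap
REPORT zfield_group_demo.

TABLES t001.

FIELD-GROUPS: header.

START-OF-SELECTION.
* Declare which fields make up an extract record (must precede the first EXTRACT)
  INSERT t001-bukrs t001-butxt INTO header.

* Fill the extract dataset; it is paged to disk automatically
  SELECT * FROM t001.
    EXTRACT header.
  ENDSELECT.

END-OF-SELECTION.
  SORT.
  LOOP.
    WRITE: / t001-bukrs, t001-butxt.
  ENDLOOP.
```

This is the older extract technique; it is less flexible than internal tables but was designed exactly for data volumes that exceed main memory.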
tanveer
‎2006 May 03 12:33 PM
Hi,
Good.
There is no limitation on the internal table; you can store any number of records in it.
There is no fixed, constant size for an internal table.
The declaration syntax is:
TYPES <t> TYPE|LIKE <tabkind> OF <linetype> [WITH <key>]
  [INITIAL SIZE <n>].
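A concrete declaration following that pattern might look like this (the table and key are illustrative); note that INITIAL SIZE is only a hint for the first memory allocation, not an upper bound:

```abap
TYPES: ty_t001_tab TYPE STANDARD TABLE OF t001
                   WITH NON-UNIQUE KEY bukrs
                   INITIAL SIZE 1000.

DATA gt_t001 TYPE ty_t001_tab.  " grows automatically past 1000 rows
```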
Thanks,
Mrutyun
‎2006 May 03 12:41 PM
hello Jin,
You do not have to specify the memory required for an internal table. With OCCURS n, memory is reserved for the specified number of lines as soon as the first line is written. If more lines are added to the internal table than specified by <n>, the reserved memory expands automatically. If there is not enough space in main memory for an internal table, it is written to a buffer or to disk (the paging area).
Award points if helpful.
thanks,
keerthi.
‎2006 May 03 5:04 PM
Dear all,
Thanks for your reply.
When I try to gather the data from the table into the internal table, I get a short dump. What can I do? They won't change the system settings.
The message is,
<b>What happened?
The current program had to be terminated because of an
error when installing the R/3 System.
The program had already requested 374467792 bytes from the operating
system with 'malloc' when the operating system reported after a
further memory request that there was no more memory space
available.
What can you do?
Make a note of the actions and input which caused the error.
To resolve the problem, contact your SAP system administrator.
You can use transaction ST22 (ABAP Dump Analysis) to view and administer
termination messages, especially those beyond their normal deletion
date.
Set the system profile parameters
- abap/heap_area_dia
- abap/heap_area_nondia
to a maximum of 374467792. Then reduce the value by 10.000.000 to be on the
safe side.
Then restart the SAP System.
abap/heap_area_nondia and abap/heap_area_dia:
Set smaller than the memory achieved for each process
with 'malloc' and smaller than abap/heap_area_total
You should also check whether the available memory (file system
swap and main memory) is sufficient for the started program
and its data.</b>
‎2006 May 04 9:28 AM
Dear all,
Could you please help me on this?
Thank you very much.
Seven
‎2006 May 04 9:33 AM
Hi again,
1. Processing 10 million records in one go does not make sense.
2. Today or tomorrow there might be a memory shortage or problem.
3. For such cases we should use the PACKAGE concept and process the records in bunches (instead of in one shot).
4. To get a taste of how to use it, just copy and paste this into a new program (it will list company codes, 5 at a time):
5.
report abc.
data : itab like table of t001 with header line.
*----------------------------------------------------------------------*
select * from t001
  into table itab
  package size 5.
  write: / '---- new package'.
  loop at itab.
    write: / itab-bukrs.
  endloop.
endselect.
regards,
amit m.
‎2006 May 04 9:51 AM
Hi,
First analyze which fields of the record you really need.
Always avoid SELECT *, because it transfers all fields (including the client, which is usually not needed). Define your internal table with only the fields you need.
If this does not help, use the package concept as described earlier.
I developed a quite dynamic concept for this:
- define a structured type, e.g. an arrears collector:
TYPES:
  BEGIN OF ty_bsid_arrears_dta,
    bukrs TYPE bsid-bukrs,
    kunnr TYPE bsid-kunnr,
    gjahr TYPE bsid-gjahr,
    bldat TYPE bsid-bldat,
    dmbtr TYPE bsid-dmbtr,
    zbd1t TYPE bsid-zbd1t,
  END OF ty_bsid_arrears_dta,
  ty_t_bsid_arrears_dta TYPE SORTED TABLE OF ty_bsid_arrears_dta
                        WITH UNIQUE DEFAULT KEY.
DATA:
  lt_bsid_arrears_dta TYPE ty_t_bsid_arrears_dta.
* just before selecting the data, retrieve the field list dynamically:
PERFORM getfields
  USING lt_bsid_arrears_dta
  CHANGING lt_fields. "TYPE TABLE OF fieldname
Then:
SELECT (lt_fields)
  FROM bsid
  INTO CORRESPONDING FIELDS OF TABLE lt_bsid_arrears_dta
  WHERE ...
Not to forget the form for the fields; it works with any structured or table data type:
*&---------------------------------------------------------------------*
*&      Form  getfields
*&---------------------------------------------------------------------*
*       get fieldnames for table or structure (C) Clemens Li 2006
*----------------------------------------------------------------------*
FORM getfields
USING px_data TYPE any
CHANGING pt_fields TYPE table.
DATA:
lt_comp TYPE abap_compdescr_tab,
lr_dat TYPE REF TO data,
lv_kind TYPE abap_typecategory,
lr_typedescr TYPE REF TO cl_abap_typedescr,
lr_tabledescr TYPE REF TO cl_abap_tabledescr,
lr_structdescr TYPE REF TO cl_abap_structdescr.
FIELD-SYMBOLS:
<fs> TYPE ANY,
<ft> TYPE ANY TABLE,
<comp> TYPE LINE OF abap_compdescr_tab.
lr_typedescr ?= cl_abap_typedescr=>describe_by_data( px_data ).
CASE lr_typedescr->kind.
WHEN 'S'.
lr_structdescr ?= lr_typedescr.
lt_comp = lr_structdescr->components.
WHEN 'T'.
ASSIGN px_data TO <ft>.
CREATE DATA lr_dat LIKE LINE OF <ft>.
ASSIGN lr_dat->* TO <fs>.
lr_structdescr ?= cl_abap_structdescr=>describe_by_data( <fs> ).
lt_comp = lr_structdescr->components.
WHEN OTHERS.
MESSAGE e241(00).
*   Function is invalid in this environment
ENDCASE.
CLEAR pt_fields.
LOOP AT lt_comp ASSIGNING <comp>.
APPEND <comp>-name TO pt_fields.
ENDLOOP." at lt_comp assigning <comp>.
ENDFORM. " getfields
The big advantage is: Just change your structures type and anything else works with no change at all.
Sorry for the bad formatting, I don't know why all SDN list entries get compressed...
regards,
Clemens