Application Development and Automation Discussions
Join the discussions or start your own on all things application development, including tools and APIs, programming models, and keeping your skills sharp.

Memory full for Internal table

Former Member

Hi all,

I have an internal table into which I need to load millions of records. I am getting a short dump due to memory full; the message is 'the maximum number of bytes passed is exceeded'.

To avoid this I tried a hashed table, but still got the same error. If I execute the same program for a shorter period, it runs fine, because fewer records are passed. The problem occurs when selecting many millions of records from the database table into the internal table.

Please advise.

Thanks,

Venu

6 REPLIES

Former Member

Hi,

You can set the initial memory allocation for the internal table with the OCCURS addition in the declaration part.

Eg:

DATA: BEGIN OF itab OCCURS 100,
        " field declarations
      END OF itab.

Regards

Rose


Former Member

Hi Venu,

In this case you could use the PACKAGE SIZE addition in the SELECT statement to process the data in limited portions, thus avoiding the memory overload.

Eg:

SELECT *
  FROM &table&
  INTO TABLE itab
  PACKAGE SIZE <n>.

  " Process the <n> records currently in itab

ENDSELECT.

regards,

Prabhu

Reward if helpful



I tried the cursor method with PACKAGE SIZE, but still got the same message.

In this method we select the data from the database in packages, but we are ultimately appending every package to one internal table, so the table grows just as before and the same dump occurs.
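For the package approach to help, each package has to be processed and released before the next one arrives, instead of being collected into one growing table. A minimal sketch of that pattern, assuming a hypothetical database table ZTAB and leaving the per-package processing as a comment:

```abap
DATA itab TYPE STANDARD TABLE OF ztab.  " ztab is a hypothetical DB table

SELECT * FROM ztab
         INTO TABLE itab          " INTO TABLE replaces the contents;
         PACKAGE SIZE 10000.      " APPENDING TABLE would accumulate them

  " Process this package here (write to a file, aggregate, update, ...)
  " but do NOT append it to another internal table.

  FREE itab.                      " optional: return the memory right away
ENDSELECT.
```

With plain INTO TABLE the table never holds more than one package, so memory use stays bounded regardless of the total number of records selected.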

Thanks,

Venu


Former Member

Hi

I am sure that will not solve the issue, because OCCURS <n> only specifies the initial memory allocation for <n> records; the table still grows dynamically beyond that.

Regards

Aditya


Former Member

An internal table can hold up to 2 GB of data. The limit can be raised to 4 GB, but that is not recommended.


Former Member

Hi,

You can make use of field groups (extract datasets) for large volumes of data.
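A minimal sketch of the field-group / extract technique, assuming the standard material master table MARA and the fields MATNR and MTART; the records go into an extract dataset managed by the runtime rather than into one internal table:

```abap
REPORT z_fieldgroup_sketch.

TABLES mara.                       " material master (standard table)

FIELD-GROUPS: header, items.       " header holds the sort key fields
INSERT mara-matnr INTO header.
INSERT mara-mtart INTO items.

START-OF-SELECTION.
  SELECT matnr mtart FROM mara
         INTO (mara-matnr, mara-mtart).
    EXTRACT items.                 " append current field values to the extract
  ENDSELECT.

  SORT.                            " sort the extract by the header fields
  LOOP.                            " read the extract back, record by record
    WRITE: / mara-matnr, mara-mtart.
  ENDLOOP.
```

Field groups are a classic-ABAP technique; on newer releases a SELECT loop with PACKAGE SIZE, as suggested earlier in this thread, is usually the preferred way to keep memory bounded.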

Reward if helpful.

RSS.