Application Development and Automation Discussions
Join the discussions or start your own on all things application development, including tools and APIs, programming models, and keeping your skills sharp.

Internal table Overflow

Former Member

Hi All,

I have a requirement where I need to fetch data from a standard database table into an internal table, manipulate the fetched data, and finally write it to the application server as a .txt file.

But the number of records to be fetched is more than 40 million (4 crore), and for that I am getting the dump:

"No more storage space available for extending an internal table"

The below possible solutions have been incorporated or ruled out:

· Data is being fetched using a cursor with a package size of 100,000 (1 lakh) records.

· The approach of using a logical database has been ruled out.

· The memory allocated to the internal table can't be increased.

Please suggest a solution to the above problem.

Thanks,

Manish

Moderator message: please use international units only.

Edited by: Thomas Zloch on Mar 22, 2011 9:42 AM

1 ACCEPTED SOLUTION
Former Member

Manish,

It is not good practice to fetch all 40 million records from the DB table at once.

Use a package size of around 100,000 records with parallel processing. With this approach the report might take some time to execute, but it will not run out of memory.

Thanks,

Ramnivas
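A minimal sketch of this package-size pattern, using a cursor-based read that writes each package to the application server and clears the internal table between packages. The table `vbak`, the file path, and the field written out are placeholders for illustration only:

```abap
CONSTANTS lc_file TYPE string VALUE '/tmp/extract.txt'.

DATA: lt_pack   TYPE STANDARD TABLE OF vbak,
      ls_pack   TYPE vbak,
      lv_cursor TYPE cursor.

OPEN DATASET lc_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.

OPEN CURSOR WITH HOLD lv_cursor FOR
  SELECT * FROM vbak.

DO.
  FETCH NEXT CURSOR lv_cursor
    INTO TABLE lt_pack
    PACKAGE SIZE 100000.
  IF sy-subrc <> 0.
    EXIT.                    " no more data
  ENDIF.
  LOOP AT lt_pack INTO ls_pack.
    " manipulate the record here, then write it out
    TRANSFER ls_pack-vbeln TO lc_file.
  ENDLOOP.
  CLEAR lt_pack.             " release memory before the next package
ENDDO.

CLOSE CURSOR lv_cursor.
CLOSE DATASET lc_file.
```

Only one package of 100,000 rows is held in memory at any time, which is why the program no longer hits the internal-table limit.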

13 REPLIES
Former Member

Hi,

Use the PACKAGE SIZE addition in the SELECT query with 10 million (1 crore) records.

If you have already done this and are still getting the dump, ask your Basis team to increase the memory of the application area. Internal tables are dynamic and increase their memory at runtime; if sufficient space is not available on the server, the program dumps. Check out the many other threads as well, as this seems to be a very frequent issue.

Regards,

Umang mehta

Moderator message: please use international units only.

Edited by: Thomas Zloch on Mar 22, 2011 9:45 AM

ThomasZloch
Active Contributor

Your only option is package processing. If you say you have incorporated this already, then it was not done properly. In many installations you have at least 2 GB available before your process dumps, so packages of 100,000 rows should fit well, unless your row length is truly extraordinary. In that case, reduce the package size.

Make sure you are clearing/freeing all internal tables before processing the next package.

Thomas

former_member389775
Discoverer

Hi,

You should use the PACKAGE SIZE addition, but you could improve on it by using online parallel processing; that should boost your performance.

Check this example: http://help.sap.com/saphelp_nw04/helpdata/en/22/0425c6488911d189490000e829fbbd/content.htm

Regards,

Helder Gonçalves

Former Member

Hi Manish,

If it is still not working even after changing the package size, there is a somewhat dirty approach.

Since you are fetching 40 million records from one table, this does not look like a daily report, so a less performant report is acceptable.

Try something like this:

Categorize all the records into groups, for example by order type or document type. First select all those document types, then select the data from the database for each document type, write those records to your file, refresh your internal table, and select again.

Hope this helps.
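A rough sketch of this grouping idea, assuming a hypothetical table `ztab` with a document-type field `doctype`:

```abap
DATA: lt_types TYPE STANDARD TABLE OF ztab-doctype,
      lv_type  TYPE ztab-doctype,
      lt_data  TYPE STANDARD TABLE OF ztab,
      ls_data  TYPE ztab.

" 1. Collect the distinct document types
SELECT DISTINCT doctype FROM ztab INTO TABLE lt_types.

" 2. Process one document type at a time
LOOP AT lt_types INTO lv_type.
  SELECT * FROM ztab INTO TABLE lt_data
    WHERE doctype = lv_type.
  LOOP AT lt_data INTO ls_data.
    " write the record to the application-server file here
  ENDLOOP.
  REFRESH lt_data.   " release memory before the next group
ENDLOOP.
```

Note that this only helps if no single document type accounts for too large a share of the 40 million rows; otherwise the package-size approach is still needed within each group.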

ravi_lanjewar
Contributor

Hi,

I am giving you the solution which I have found.

My question: why are you reading all the data in a single attempt?

Package size alone will not help you at all.

You have to read a chunk of data in a loop, process that many records, append the data to your resultant table, refresh the internal table, and call a commit statement inside the loop to reset the execution time (i.e. the timeout error will not come).

Below is just an idea

If your resultant data has more than 10 million entries, store it in a database table and download it to Excel if required.


Package size alone will not help you at all.

Please elaborate on how you arrived at this wrong conclusion.

call a commit statement inside the loop to reset the execution time (i.e. the timeout error will not come)

A very nasty workaround, misusing the COMMIT WORK statement; and this process might well run as a batch job.

If your resultant data has more than 10 million entries, store it in a database table and download it to Excel if required

10 million records in Excel!?

Thomas


Hello Ravi,

As Thomas already said, please don't give a solution which is a workaround, and don't make wrong statements like:

Package size alone will not help you at all.

Regards

Sandipan


Hi Thomas,


What do you mean by package size?

Defining the package in the SELECT SQL, or reading the data in a loop (i.e. 1000 or 2000, whichever figure proves suitable after testing)?

Recently I faced the same problem. I first tried a package size between 10 and 100, but it didn't work;

then I decided to read the data in a loop, 100 records at a time, and then it worked. I don't know why this happened?


If your resultant data has more than 10 million entries, store it in a database table and download it to Excel if required

10 million records in Excel!?

By mistake I wrote 10 million; it is approximately 1 million.

Ravi

SuhaSaha
Product and Topic Expert

Hello Ravi,

I first tried a package size between 10 and 100, but it didn't work; then I decided to read the data in a loop, 100 records at a time, and then it worked. I don't know why this happened?

If you're not sure, why confuse others? The PACKAGE SIZE we're talking about is the one used with the FETCH statement. Read the F1 documentation for further clarification.

By mistake I wrote 10 million; it is approximately 1 million.

Still incorrect!! If I am not mistaken, the maximum number of records you can fit in an Excel sheet is ~65k.

Cheers,

Suhas


Ravi, in order to clarify, I was referring only to the PACKAGE SIZE option of the SELECT or FETCH statements. In my experience this is the option to choose when the amount of data you are processing can become so large that you run into memory problems (heap, extended memory, roll area, etc.) when it is processed in one chunk.

Ideally the package size is a parameter on the selection screen or a global constant that can be adjusted to a value that provides the best compromise between runtime and memory consumption.
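Such a parameter could be declared as simply as the following (the name and default value are illustrative):

```abap
" Adjustable package size on the selection screen
PARAMETERS p_pksize TYPE i DEFAULT 100000.

" later, in the data retrieval:
" FETCH NEXT CURSOR lv_cursor INTO TABLE lt_pack PACKAGE SIZE p_pksize.
```

This lets you tune the runtime/memory trade-off per run without changing the program.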

Thomas

P.S. Up until Excel 2003 the limit was 65K rows; since then it is about 1 million rows (I believe), but still a pain to scroll through.


Hi All,

Thanks for your replies.

I fetched the records with a package size of 100,000 records.

The problem is solved now.

Thanks,

Manish

Former Member

Hi,

use the command

UPDATE dbtab SET ... WHERE ...

for all of your update cases. Don't forget the COMMIT WORK after that.

To find your update cases, you can start with a SELECT DISTINCT on your DB table!
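A sketch of this set-based approach; the table `ztab`, its fields, and the values are placeholders:

```abap
" One mass UPDATE per update case, instead of changing rows one by one
UPDATE ztab SET   flag   = 'X'
            WHERE status = 'OLD'.

COMMIT WORK.   " persist the change
```

The database applies the change to all matching rows in a single statement, so no 40-million-row internal table is ever built on the application server.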

Regards

Klaus