2011 Mar 22 12:32 AM
Hi All,
I have a requirement in which I need to fetch data from a standard database table into an internal table, manipulate the fetched data, and finally write it to the application server as a .txt file.
But the number of records to be fetched is more than 4 crore (= 40 million), and for that I am getting the short dump
"No more storage space available for extending an internal table"
The below possible solutions have been incorporated or ruled out:
· Data is being fetched using a cursor with a package size of 1 lakh (= 100,000) records.
· The approach of using the logical data base has been ruled out.
· The memory allocated to the internal table can't be increased.
Please suggest a solution to the above problem.
Thanks,
Manish
Moderator message: please use international units only.
Edited by: Thomas Zloch on Mar 22, 2011 9:42 AM
2011 Mar 25 12:41 PM
Manish,
It is not good practice to fetch all 40 million records from the DB table at a time.
Use a package size of around 100,000 records with parallel processing. With this approach the report might take some time to execute, but it will not run out of memory.
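A rough sketch of what that could look like, fetching packages via a cursor and handing each one to a parallel task. Note that `dbtab` is a placeholder for the actual table, and `Z_PROCESS_PACKAGE` is a hypothetical RFC-enabled function module that would write one package to the file:

```abap
* Sketch only: fan packages out to parallel tasks via asynchronous RFC.
* dbtab and Z_PROCESS_PACKAGE are placeholders, not real objects.
DATA: lt_data    TYPE STANDARD TABLE OF dbtab,
      lv_cursor  TYPE cursor,
      lv_task(8) TYPE c,
      lv_seq     TYPE i.

OPEN CURSOR WITH HOLD lv_cursor FOR
  SELECT * FROM dbtab.

DO.
  FETCH NEXT CURSOR lv_cursor
    INTO TABLE lt_data PACKAGE SIZE 100000.
  IF sy-subrc <> 0.
    EXIT.
  ENDIF.
  lv_seq = lv_seq + 1.
  lv_task = lv_seq.
  CALL FUNCTION 'Z_PROCESS_PACKAGE'
    STARTING NEW TASK lv_task
    TABLES
      it_data = lt_data.
  REFRESH lt_data.  " keep only one package in memory at a time
ENDDO.

CLOSE CURSOR lv_cursor.
```

A real implementation would also throttle the number of open tasks (e.g. via a RECEIVE callback and WAIT), since dialog work processes are limited.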
Thanks,
Ramnivas
2011 Mar 22 3:41 AM
Hi,
Use the PACKAGE SIZE addition in the SELECT query, with a size of 1 crore (= 10 million).
If that has already been used and you are still getting the dump, then ask your Basis team to increase the memory of the application area: internal tables are dynamic and grow at runtime, and if sufficient space is not available on the server, the program dumps. Check out the other threads as well, as this looks like a very frequent issue.
Regards,
Umang mehta
Moderator message: please use international units only.
Edited by: Thomas Zloch on Mar 22, 2011 9:45 AM
2011 Mar 22 8:49 AM
Your only option is package processing. When you say you have incorporated this already, then it was not done properly. In many installations you have at least 2 GB available before your process dumps, so packages of 100,000 rows should fit well, unless your row length is truly extraordinary. In that case, reduce the package size.
Make sure you are clearing/freeing all internal tables before processing the next package.
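As a minimal sketch of that pattern (`dbtab`, the field names and the file path are placeholders):

```abap
DATA: lt_buffer TYPE STANDARD TABLE OF dbtab,
      ls_row    TYPE dbtab,
      lv_line   TYPE string.

OPEN DATASET '/tmp/output.txt' FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.

* With PACKAGE SIZE, the SELECT loop runs once per package, so only
* one package of 100,000 rows is ever held in the internal table.
SELECT * FROM dbtab
  INTO TABLE lt_buffer
  PACKAGE SIZE 100000.

  LOOP AT lt_buffer INTO ls_row.
    CONCATENATE ls_row-field1 ls_row-field2 INTO lv_line SEPARATED BY ';'.
    TRANSFER lv_line TO '/tmp/output.txt'.
  ENDLOOP.

  REFRESH lt_buffer.  " clear before the next package is fetched

ENDSELECT.

CLOSE DATASET '/tmp/output.txt'.
```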
Thomas
2011 Mar 22 10:09 AM
Hi,
You should use the PACKAGE SIZE addition, and you can improve on it further by processing the packages in parallel; that should boost your performance.
Check this example: http://help.sap.com/saphelp_nw04/helpdata/en/22/0425c6488911d189490000e829fbbd/content.htm
Regards,
Helder Gonçalves
2011 Mar 22 11:52 AM
Hi Manish,
If this is not working even after changing the package size, there is a somewhat dirty approach.
Since you are fetching 40 million records from one table, this does not seem to be a daily report, so a less performant report is acceptable.
Try something like this:
Categorize all the records into groups, e.g. by order type or document type. First select all those document types, then select the data from the database for each document type, write those records to your file, refresh your internal table, and select again.
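A hedged sketch of that idea, assuming `dbtab` and its `doc_type` field as placeholders:

```abap
* Sketch only: select and write the data one document type at a time.
* dbtab and doc_type are placeholders for the actual table and field.
DATA: lt_types TYPE STANDARD TABLE OF dbtab-doc_type,
      lv_type  TYPE dbtab-doc_type,
      lt_data  TYPE STANDARD TABLE OF dbtab.

SELECT DISTINCT doc_type FROM dbtab INTO TABLE lt_types.

LOOP AT lt_types INTO lv_type.
  SELECT * FROM dbtab INTO TABLE lt_data
    WHERE doc_type = lv_type.
* ... write lt_data to the application server file here ...
  REFRESH lt_data.  " free the memory before the next group
ENDLOOP.
```

Note the caveat: this only avoids the dump if every single group fits into memory; one very large document type would still cause it.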
Hope this helps.
2011 Mar 25 1:08 PM
Hi,
I am giving you a solution which I have found.
My question: why are you reading all the data in a single attempt?
Package size will also not help you at all.
You have to read a chunk of data in a loop, process that many records, append the data to your result table, refresh the internal table, and call the COMMIT statement inside the loop to reset the execution time (i.e. the timeout error will not occur).
Below is just an idea.
If your result set has more than 10 million entries, store it in a database table and download it to Excel if required.
2011 Mar 25 1:17 PM
Package size will also not help you at all.
Please elaborate on how you arrived at this wrong conclusion.
call the COMMIT statement inside the loop to reset the execution time (i.e. the timeout error will not occur)
A very nasty workaround that misuses the COMMIT WORK statement; besides, this process might well run as a batch job.
If your result set has more than 10 million entries, store it in a database table and download it to Excel if required
10 million records in Excel!?
Thomas
2011 Mar 27 10:11 AM
Hello Ravi,
As Thomas already said, please don't give a solution which is a workaround, and also don't make wrong statements like
Package size will also not help you at all.
Regards
Sandipan
2011 Mar 28 6:44 AM
Hi Thomas,
What do you mean by package size?
Defining the package in the SELECT SQL, or reading the data in a loop (i.e. 1,000 or 2,000, whichever figure proves suitable after testing)?
Recently, I faced the same problem. I first tried a package size between 10 and 100, but it didn't work;
then I decided to read the data in a loop of 100 records, and it worked. I don't know why this happened.
If your result set has more than 10 million entries, store it in a database table
and download it to Excel if required
10 million records in Excel!?
By mistake I wrote 10 million. It is approximately 1 million.
Ravi
2011 Mar 28 9:07 AM
Hello Ravi,
I first tried a package size between 10 and 100, but it didn't work,
then I decided to read the data in a loop of 100 records, and it worked. I don't know why this happened?
If you're not sure, why confuse others? The PACKAGE SIZE we're talking about is the one used with the FETCH statement. Read the F1 documentation for further clarification.
By mistake I wrote 10 million. It is approximately 1 million.
Still incorrect!! If I am not mistaken, the maximum number of records you can fill in an Excel sheet is ~65k.
Cheers,
Suhas
2011 Mar 28 9:30 AM
Ravi, in order to clarify, I was referring only to the PACKAGE SIZE option of the SELECT or FETCH statements. In my experience this is the option to choose when the amount of data you are processing can become so large that you will run into memory problems (heap, extended memory, roll area, etc.) when processed in one chunk.
Ideally the package size is a parameter on the selection screen or a global constant that can be adjusted to a value that provides the best compromise between runtime and memory consumption.
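In that spirit, the package size could be exposed as a selection-screen parameter like this (`dbtab` is a placeholder for the actual table):

```abap
PARAMETERS p_pkg TYPE i DEFAULT 100000.

DATA lt_buffer TYPE STANDARD TABLE OF dbtab.

SELECT * FROM dbtab
  INTO TABLE lt_buffer
  PACKAGE SIZE p_pkg.
* ... process and write this package ...
  REFRESH lt_buffer.
ENDSELECT.
```

This way the compromise between runtime and memory consumption can be tuned at run time without changing the program.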
Thomas
P.S. Up until Excel 2003 it was 65K rows; since then it is 1 million rows (I believe), but still a pain to scroll through.
2011 Apr 06 5:16 PM
Hi All,
Thanks for your replies.
I fetched the records with a package size of 100,000.
The problem is solved now.
Thanks,
Manish
2011 Mar 28 6:55 AM
Hi,
Use the command
UPDATE dbtab SET ... WHERE ...
for all of your update cases. Don't forget the COMMIT WORK after that.
To find your update cases, you can start with a SELECT DISTINCT on your DB table!
Regards
Klaus