‎2008 Aug 01 9:56 AM
Hi Experts,
Today we tried to process a GB-sized file into the SAP system through …, and it goes to short dump with the message "Unable to fulfil request for 5310 bytes of memory".
Could anyone give me some idea of what the problem could be,
and what the probable solution might be?
Regards
Rao
‎2008 Aug 01 10:19 AM
Hi,
If you can download that file into an internal table, then try using parallel processing.
regards
vivek
‎2008 Aug 01 10:26 AM
Hi Vivek,
Thanks for the reply. Is there anything that can be done by the Basis team, like increasing memory? We don't want to change the code.
Regards
PT
‎2008 Aug 01 10:30 AM
Hi,
If you are uploading the file to the application server, then check out the options for OPEN DATASET.
You have the option to read a fixed number of records at a time, so that the data is processed in small packages instead of all at once.
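To illustrate that package-wise approach, here is a minimal sketch; the file path, package size, and the PROCESS_PACKAGE routine are placeholders I made up for illustration, not code from this thread:

```abap
* Sketch: read the application-server file in fixed-size packages
* instead of loading everything into one internal table.
CONSTANTS c_package_size TYPE i VALUE 10000.

DATA: gv_file  TYPE string VALUE '/tmp/bigfile.txt', " placeholder path
      gv_line  TYPE string,
      gt_data  TYPE string_table,
      gv_lines TYPE i.

START-OF-SELECTION.
  OPEN DATASET gv_file FOR INPUT IN TEXT MODE ENCODING DEFAULT.
  IF sy-subrc <> 0.
    MESSAGE 'Could not open file' TYPE 'E'.
  ENDIF.

  DO.
    READ DATASET gv_file INTO gv_line.
    IF sy-subrc <> 0.
      EXIT. " end of file reached
    ENDIF.
    APPEND gv_line TO gt_data.
    DESCRIBE TABLE gt_data LINES gv_lines.
    IF gv_lines >= c_package_size.
      PERFORM process_package. " insert this package, then COMMIT WORK
      FREE gt_data.            " release the memory before the next package
    ENDIF.
  ENDDO.

  IF gt_data IS NOT INITIAL.
    PERFORM process_package.   " last, partial package
    FREE gt_data.
  ENDIF.

  CLOSE DATASET gv_file.

FORM process_package.
  " Placeholder: INSERT the records in gt_data into the target tables,
  " then COMMIT WORK so no giant open transaction builds up.
ENDFORM.
```

The key point is the FREE after each package: the program never holds more than c_package_size lines in memory at once, which avoids the memory-allocation dump.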
‎2008 Aug 01 10:32 AM
Hi there,
We are reading the file from the application server, processing it, and inserting into some tables.
‎2008 Aug 06 4:02 AM
Hi,
Probably you can use two internal tables to hold your data: take half of the records into one table and the other half into the other. It might be a problem with internal memory allocation, so check with your Basis team as well; you might get some inputs from them.
If it is not possible from the Basis side, then you will have to change the code. Also analyze the dump to see where exactly it is happening, e.g. while filling the internal table or somewhere else.
Thanks,
Vinod.
‎2008 Aug 06 3:52 AM
Hi..
Use a Z table or SAP memory to store the data in parts, because you cannot hold all the data in one go.
If you want code for parallel processing I can give it to you, but it is a bit complex and will take some time to understand.
I think you should use SAP memory.
If you find a better solution, do let us know.
regards
vivek
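For reference, the parallel processing mentioned above usually means dispatching each package to its own work process via asynchronous RFC. A rough sketch, assuming a custom RFC-enabled function module (here called 'Z_PROCESS_PACKAGE', a placeholder you would have to create) that receives one package and inserts it:

```abap
* Sketch: process each package in a separate work process via aRFC.
DATA: gv_snt TYPE i,   " tasks started
      gv_rcv TYPE i.   " tasks finished

FORM dispatch_package USING pt_data TYPE string_table.
  DATA lv_task(8) TYPE c.
  gv_snt = gv_snt + 1.
  lv_task = gv_snt.                 " unique task name per package
  CONDENSE lv_task NO-GAPS.
  CALL FUNCTION 'Z_PROCESS_PACKAGE' " placeholder RFC-enabled module
    STARTING NEW TASK lv_task
    DESTINATION IN GROUP DEFAULT
    PERFORMING task_done ON END OF TASK
    TABLES
      it_data = pt_data.
ENDFORM.

FORM task_done USING p_taskname.
  " RECEIVE RESULTS FROM FUNCTION 'Z_PROCESS_PACKAGE' would go here
  " if the module exported anything back.
  gv_rcv = gv_rcv + 1.
ENDFORM.

* After dispatching the last package, wait for all tasks to finish:
* WAIT UNTIL gv_rcv >= gv_snt.
```

Each task runs in its own work process with its own memory, so no single session has to hold the whole file; the trade-off is the extra complexity of task bookkeeping that vivek warns about.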