2016 Jan 02 12:17 PM
Hi Experts,
I often face this runtime error when the job is run in the background using a parallel processing technique to handle more than one million records, each with a record length of 1200 characters.
However, the error does not occur when the job is run in the background without the parallel processing technique; in that case the job completes successfully.
Below are the details of the short dump:
Category: Resource shortage
Short text: no more storage space available for extending an internal table.
Below are the details of the profile parameters:
ztta/roll_area (1.000.000 - 15.000.000)
ztta/roll_extension (10.000.000 - 500.000.000)
abap/heap_area_total (100.000.000 - 1.500.000.000)
abap/heap_area_dia: (10.000.000 - 1.000.000.000)
abap/heap_area_nondia: (10.000.000 - 1.000.000.000)
em/initial_size_MB: (35-1200)
Memory location: "Session memory"
Row width: 1162
Number of rows: 0
Allocated rows: 1
Newly requested rows: 1395264 (in 174408 blocks)
The amount of storage space (in bytes) filled at termination time was:
Roll area...................... 6219056
Extended memory (EM)........... 2002751168
Assigned memory (HEAP)......... 2000082784
Short area..................... " "
Paging area.................... 0
Maximum address space.......... " "
SAP Release..... 731
SAP Basis Level. 0010
Operating system..... "Windows NT"
Release.............. "6.1"
Character length.... 8 Bits
Pointer length....... 64 Bits
Work process number.. 0
Database type..... "ORACLE"
Call Type........... "asynchronous with reply and non-transactional (emode 0, imode 0)"
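A quick back-of-the-envelope check with the figures from the dump (plain Python for illustration, not SAP code) shows why the extension fails: the newly requested rows alone amount to roughly 1.5 GiB, on top of about 4 GB already consumed in extended and heap memory.

```python
# Figures copied from the short dump above.
row_width = 1162            # bytes per row
new_rows  = 1395264         # newly requested rows

request = row_width * new_rows
print(f"{request:,} bytes (~{request / 1024**3:.2f} GiB)")
# → 1,621,296,768 bytes (~1.51 GiB)

# Memory already filled at termination time (bytes):
em_used   = 2002751168      # extended memory (EM)
heap_used = 2000082784      # assigned heap memory
print(f"{em_used + heap_used:,} bytes already in EM + heap")
# → 4,002,833,952 bytes already in EM + heap
```

In other words, the job asks for another internal-table extension of about 1.6 GB after session memory is already exhausted, so raising the profile limits alone may only postpone the dump.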
Kindly help me to solve this issue.
Thank you,
Karteek.K
2016 Jan 04 6:18 PM
Hi Karteek,
This is caused by a huge memory allocation!
What happens is that an FM is being called with an internal table (ITAB) containing a huge number of records. This gives a time-out issue. You can ask your BASIS team to increase the time-out limit. That can work for now, but we never know how many records will need to be processed in the future.
The best option would be to split your main file into smaller files and process them accordingly. You can process these smaller files asynchronously if the results are not dependent on each other!
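The splitting idea above can be sketched as follows (plain Python for illustration, not ABAP; the batch size and record list are illustrative assumptions). In ABAP the same pattern is typically achieved with `SELECT ... PACKAGE SIZE` or by slicing the internal table and submitting each slice via `CALL FUNCTION ... STARTING NEW TASK`.

```python
def chunks(records, size):
    """Yield successive slices of at most `size` records each."""
    for start in range(0, len(records), size):
        yield records[start:start + size]

# Stand-in for the real file: one entry per record. With 1,200-byte
# records, holding one million rows at once needs ~1.2 GB of payload,
# but 50,000 rows per batch caps the working set near 60 MB.
records = list(range(1_000_000))
batch_size = 50_000

n_batches = 0
for batch in chunks(records, batch_size):
    # process_batch(batch)  # e.g. dispatch one batch per parallel task
    n_batches += 1

print(n_batches)  # → 20 batches
```

Each batch then allocates only its own slice of memory, and finished batches can be freed before the next one starts.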
Cheers,
Varun
2016 Jan 04 6:26 PM
It's strictly a memory issue. It has nothing to do with a time-out.
Rob