Application Development and Automation Discussions
Join the discussions or start your own on all things application development, including tools and APIs, programming models, and keeping your skills sharp.

memory problem in internal table

Former Member

Hi,

I have nearly 4 crore (40 million) records in a database table, and I need to load this data into an internal table.

I declared the internal table with OCCURS 0, but the program dumps with TSV_TNEW_PAGE_ALLOC_FAILED because there is not enough memory for the internal table.

Can anyone tell me how to resolve this?

9 Replies
Sm1tje
Active Contributor

If you really need to select all records at once, you will have to talk to your Basis administrators about the memory problem. It would be better to limit the number of records selected, or to perform multiple smaller selects, or something similar.
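As a minimal sketch of the "multiple smaller selects" idea (dbtab, keyfield and the 100,000-row limit are illustrative names and values, not from this thread), you could page through the table by key:

```abap
* Illustrative sketch only: dbtab and keyfield are assumed names.
* Reads the table in slices of 100,000 rows instead of all at once.
DATA: lt_slice   TYPE STANDARD TABLE OF dbtab,
      ls_last    TYPE dbtab,
      lv_lines   TYPE i,
      lv_lastkey TYPE dbtab-keyfield.

CLEAR lv_lastkey.
DO.
  SELECT * FROM dbtab
    INTO TABLE lt_slice
    UP TO 100000 ROWS
    WHERE keyfield > lv_lastkey
    ORDER BY keyfield.
  IF sy-subrc <> 0 OR lt_slice IS INITIAL.
    EXIT.                            " no more rows to read
  ENDIF.

  " ... process lt_slice here ...

  " remember the last key read so the next select continues after it
  DESCRIBE TABLE lt_slice LINES lv_lines.
  READ TABLE lt_slice INDEX lv_lines INTO ls_last.
  lv_lastkey = ls_last-keyfield.
ENDDO.
```

Each pass holds only one slice in memory, so the internal table never grows beyond the slice size.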

Former Member

Did you try with the hashed table type?

Former Member

Hi,

Use OCCURS 10 or something similar.

Or better, declare a structure and then declare the internal table with reference to that structure type.

Hope this helps.

Please reward points if helpful.

Former Member

An internal table can store a maximum of 2 GB of data.

If you want to fetch more than this, I suggest doing it in batches, i.e. use the PACKAGE SIZE option of the SELECT statement:

SELECT *
  FROM <table>
  INTO TABLE itab
  PACKAGE SIZE <n>.

  IF sy-subrc EQ 0.
    " Process the n records
  ENDIF.

ENDSELECT.

G@urav.

Former Member

You want to process a large number of records, so use your internal table as a buffer between the source and destination systems.

Fill the buffer (internal table), transfer it to the destination system, refresh the buffer, and repeat in a loop.

If you specify the PACKAGE SIZE addition in a SELECT ... ENDSELECT loop, the lines are inserted into the internal table in packages of n lines.

With the INTO TABLE addition, the internal table is automatically refreshed on each pass of the loop.

Hope this clarifies why to use PACKAGE SIZE.

Former Member

Since available memory is the issue (I'm assuming this is occurring consistently), you may want to consider creating a database cursor (OPEN CURSOR/FETCH) to pull and process records in chunks, clearing the internal table(s) before each fetch.

Former Member

Hi Ram,

Adding to Billy's comment, I would suggest using OPEN CURSOR and FETCH. This is a standard processing technique in BI data extraction, where large numbers of records are extracted from a source system such as R/3.

  
  OPEN CURSOR WITH HOLD gv_cursor FOR
    SELECT *
      FROM dbtab
      WHERE ...              " your selection criteria

  DO.
    FETCH NEXT CURSOR gv_cursor
      INTO TABLE itab
      PACKAGE SIZE p_mxsize.

    IF sy-subrc NE 0.
      CLOSE CURSOR gv_cursor.
      EXIT.
    ELSE.
      " process itab here
    ENDIF.
  ENDDO.

p_mxsize is the number of records that should be processed as a chunk, e.g. 2000, 4000 or 5000.

Hope this helps.

Thanks

Sanjeev

Former Member

The error TSV_TNEW_PAGE_ALLOC_FAILED means that the program needed to expand an internal table and requested more memory from the system, but none was available. When Extended Memory is completely used up, the work process goes into PRIV mode and starts using Heap Memory (on Windows; the order is reversed on Unix). Once it enters PRIV mode, no other user can use that work process. If there is enough memory for the program to finish, you will not see the error.

Note: an internal table can store a maximum of 2 GB of data.

Solution:

1) Process smaller chunks of data (e.g. not all 50,000 records at once).

2) Create a memory snapshot right before the program dumps and analyze where the memory is consumed, in order to understand whether and where you can reduce memory consumption.

Clemenss
Active Contributor

Hi Ram Reddy,

Sorry you have to read so much nonsense.

Note that OCCURS does not change anything; the last time it had any effect was last millennium.

If page allocation fails, memory has been used beyond the limit.

Ask Basis to give you more memory, try running the program as a background task (more memory is available in background), or select less data: strictly avoid SELECT * and think about which fields you really need.
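Combining the two suggestions above, a minimal sketch could select only the needed fields in packages instead of SELECT * into one huge table (dbtab, key1 and field1 are illustrative names, not from this thread):

```abap
* Illustrative sketch only: dbtab, key1 and field1 are assumed names.
* Select just the required fields, in packages of 10,000 rows.
TYPES: BEGIN OF ty_rec,
         key1   TYPE dbtab-key1,
         field1 TYPE dbtab-field1,
       END OF ty_rec.
DATA lt_recs TYPE STANDARD TABLE OF ty_rec.

SELECT key1 field1
  FROM dbtab
  INTO TABLE lt_recs
  PACKAGE SIZE 10000.

  " process lt_recs here; INTO TABLE refreshes it on each package
ENDSELECT.
```

A narrow field list cuts the per-row memory footprint, and PACKAGE SIZE caps how many rows are in memory at once.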

Regards,

Clemens