‎2008 Apr 02 3:38 PM
Now I am responsible for writing a program, including an internal table, which will store 3 million records.
But when I ran this program, an error occurred:
Runtime Error: TSV_TNEW_PAGE_ALLOC_FAILED
Short Text:
No more storage space available for extending an internal table.
What happened?
You attempted to extend an internal table, but the required space was not available.
CALL FUNCTION 'RRX_GRID_CMD_PROCESS'     "execute query
  EXPORTING
    i_handle   = g_handle
    i_cmdid    = 'STRT'
    i_iobjnm   = g_iobjnm
  IMPORTING
    e_max_x    = g_x
    e_max_y    = g_y
  TABLES
    i_t_ranges = g_t_ranges
    e_t_dim    = g_t_dim
    e_t_mem    = g_t_mem
    e_t_cel    = g_t_cel
    c_t_prptys = g_t_prptys
    e_t_atr    = g_t_atr
    e_t_grid   = g_t_grid
    e_t_ranges = g_t_ranges
    e_t_con    = g_t_con
    e_t_fac    = g_t_fac
  EXCEPTIONS
    inherited_error    = 1
    no_record_found    = 2
    terminated_by_user = 3
    no_processing      = 4
    no_change          = 5
    dbcl_nosupport     = 6
    no_authorization   = 7
    x_message          = 8
    screen_canceled    = 9
    launch_url         = 10
    OTHERS             = 11.

IF sy-subrc NE 0.
  CALL FUNCTION 'DEQUEUE_E_BIW_IOBJ'
    EXPORTING
      iobjnm = 'U_SSTMON'.
  RAISE query_run_error.
ENDIF.

IF NOT p_sstmon IS INITIAL.
  DELETE FROM /bic/pu_sstmon.
ENDIF.

CALL FUNCTION 'DEQUEUE_E_BIW_IOBJ'
  EXPORTING
    iobjnm = 'U_SSTMON'.
* Reload the query output into a lean table whose first field is 'y'...
LOOP AT g_t_grid INTO wa_g_t_grid.
  i_grid-y    = wa_g_t_grid-y.
  i_grid-x    = wa_g_t_grid-x.
  i_grid-data = wa_g_t_grid-data.
  IF i_grid-data(1) = m1.          "remove the leading quote character
    SHIFT i_grid-data.
  ENDIF.
  APPEND i_grid.                   "<-- the error occurs here
ENDLOOP.
REFRESH g_t_grid.
SORT i_grid BY y x.
Could anyone give me a suggestion to solve this problem?
Thank you in advance.
‎2008 Apr 02 3:55 PM
This is a Basis setting that needs to be amended, but they probably will not do it.
Try to use the command FREE <itab> at the start of your code. This will release the memory back to the system.
Otherwise you will have to process data in chunks.
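Chunked processing can be sketched as follows; this is only an illustration, and the table name ZMYTABLE is a placeholder for whatever database table the data comes from:

```abap
* Sketch only: read and process the data in packages of 10,000 rows
* so the full result set never sits in memory at once.
* ZMYTABLE is an illustrative table name; replace with your own.
DATA: lt_chunk TYPE STANDARD TABLE OF zmytable,
      ls_row   TYPE zmytable.

SELECT * FROM zmytable
         INTO TABLE lt_chunk
         PACKAGE SIZE 10000.
* lt_chunk now holds at most 10,000 rows; it is refilled on each pass
  LOOP AT lt_chunk INTO ls_row.
*   process one row here
  ENDLOOP.
ENDSELECT.
```

Each pass through the SELECT...ENDSELECT block replaces the contents of lt_chunk, so memory use stays bounded by the package size rather than by the total number of records.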
‎2008 Apr 02 4:25 PM
Free the tables before calling the FM, or before START-OF-SELECTION.
‎2008 Apr 02 5:01 PM
Thanks for your replies.
At the start of my program I have written: FREE i_grid.
But when I run this program, the runtime error still occurs.
‎2008 Apr 03 6:06 AM
Hi Gang,
You will have to extract the data in portions because of the 2 GB limit, as the internal table is holding a lot of data.
You will find it in one of the blogs; just do a search for "package size".
‎2008 Apr 03 11:05 AM
Thank you for your info.
I think that using PACKAGE SIZE is the only way to avoid the huge internal table.
But now I have a question: I must sort this internal table. When I use PACKAGE SIZE, I can only sort one part of the internal table at a time, which will produce incorrect data.
SELECT * FROM zubpgrid INTO TABLE i_grid PACKAGE SIZE 10000.

* Evaluate the grid legend (first row)...
  SORT i_grid BY y x.
  LOOP AT i_grid.
    IF i_grid-y NE 1.                   "end of first row
      EXIT.                             "loop at i_grid
    ENDIF.
*   Empty titles in the RSRT output are missing from the internal table.
*   They are texts belonging to the preceding field and are inserted
*   here for control purposes, named <key field> + '_TXT'.
    l_v_delta_x = i_grid-x - i_gridlegende-x.
    IF l_v_delta_x > 1.                 "something was missing there
      CONCATENATE i_gridlegende-data '_TXT' INTO i_gridlegende-data.
      ADD 1 TO i_gridlegende-x.         "the missing x
      APPEND i_gridlegende.
    ENDIF.
*   now the current field from i_grid...
    i_gridlegende-data = i_grid-data.
    i_gridlegende-x    = i_grid-x.
    APPEND i_gridlegende.
    DELETE i_grid.
  ENDLOOP.
ENDSELECT.
Could anyone help me solve this problem?
Thanks in advance!
‎2008 Apr 05 4:08 PM
Hi Gang,
Can you tell me what exactly you want to sort and how it affects the later part of the code?
If you can tell me more about it, maybe we can work something out.
Please do reward all the useful answers.
‎2010 Apr 13 4:26 PM
> But now I have a question: I must sort this internal table. When I use PACKAGE SIZE, I can only sort one part of the internal
> table at a time, which will produce incorrect data.
> SELECT * FROM zubpgrid INTO TABLE i_grid PACKAGE SIZE 10000.
> SORT i_grid BY y x.
> LOOP AT i_grid.
> ....
> ENDLOOP.
> ENDSELECT
Guys... why do you try to program in ABAP without any knowledge of SQL and/or programming experience?
You read millions of database records into an internal table,
.... sort the table,
.... process each entry one by one.
Are you mad?
What about:
SELECT ... FROM ... ORDER BY y x.
.... <- here you now have each entry of the DB table in the desired order
ENDSELECT.
This saves a large amount of memory and it is much faster....
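The two suggestions in this thread can be combined: let the database do the sorting with ORDER BY while still fetching in packages. A sketch, assuming y and x are fields of zubpgrid as in the earlier posts:

```abap
* Sketch: the database sorts via ORDER BY, and the program still
* fetches in packages so memory use stays bounded. Because the whole
* result set is globally ordered, each package arrives already in
* (y, x) order, so no SORT on the internal table is needed.
DATA: lt_grid TYPE STANDARD TABLE OF zubpgrid,
      ls_grid TYPE zubpgrid.

SELECT * FROM zubpgrid
         INTO TABLE lt_grid
         PACKAGE SIZE 10000
         ORDER BY y x.
  LOOP AT lt_grid INTO ls_grid.
*   process rows here; across all packages they appear in sorted order
  ENDLOOP.
ENDSELECT.
```

This sidesteps the problem described above, where sorting each 10,000-row package separately would produce an incorrect overall order.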
‎2008 Apr 05 7:40 PM
Hallo,
I would like to insert 3 million records into an internal table (ii_grid). Because it is very large, I must divide it into several parts with PACKAGE SIZE.
With SELECT...ENDSELECT and PACKAGE SIZE 10000, only 10,000 records are inserted into this table each time. On the next pass the second 10,000 records are inserted and the old 10,000 are lost. Now I face a problem: I need these 3 million records to be sorted.
Could anyone help me solve this problem?
Thanks!
‎2008 Apr 07 9:11 AM
Can nobody answer this question? Whoever can, I will give full points to them.
Thanks!
‎2010 Apr 13 5:47 PM
I solved this problem just yesterday....
This error appears when we have used all the available memory appending rows; we already know that.
The typical pattern for such a program is something like this:

SELECT field1 field2
       INTO lt_table
       FROM table1
       WHERE ...

LOOP AT lt_table INTO ls_table.
* ...process data...
  APPEND gs_alv TO gt_alv.
ENDLOOP.

If lt_table is too big, then our gt_alv will be big too. If we use a cursor, which is even faster than the code before, we use only one big table instead of two, using only half of the memory:
DATA: c_cursor TYPE cursor.

OPEN CURSOR c_cursor FOR
  SELECT field1 field2
    FROM dbtable
   WHERE ...

DO.
  FETCH NEXT CURSOR c_cursor INTO gs_alv.
  IF sy-subrc NE 0.
    EXIT.
  ENDIF.
* ...process data...
  APPEND gs_alv TO gt_alv.
ENDDO.

CLOSE CURSOR c_cursor.

I hope it helps....
Edited by: Sebastian Bustamante on Apr 13, 2010 6:48 PM
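A variant of the cursor sketch above fetches whole packages per round trip instead of one row at a time, which cuts down on database round trips. The names (field1, field2, dbtable, gt_rows) are again placeholders, not real objects:

```abap
* Sketch: fetch through the cursor in packages. The target table is
* overwritten on each FETCH, so memory stays bounded by the package
* size. Names here are illustrative placeholders.
DATA: lv_cursor TYPE cursor.

OPEN CURSOR lv_cursor FOR
  SELECT field1 field2
    FROM dbtable
   ORDER BY field1.

DO.
  FETCH NEXT CURSOR lv_cursor
        INTO TABLE gt_rows
        PACKAGE SIZE 10000.
  IF sy-subrc <> 0.
    EXIT.
  ENDIF.
* process gt_rows here; the next FETCH replaces its contents
ENDDO.

CLOSE CURSOR lv_cursor.
```

Remember to close the cursor when done, and note that the WHERE/ORDER BY clauses belong to the OPEN CURSOR statement, so the database can stream the sorted result.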
‎2010 Apr 13 6:14 PM
Well, before anyone else tries to help, have a look at the date of the original post.
Rob
‎2010 Apr 13 7:19 PM
Judging by the typos, anything is possible so Gang Qin might return...
> ... which will store 3 million Datensätze.
>
> But when ich ran this program...
Cheers,
Julius