2008 Jun 18 10:23 PM
Hi Experts,
I need to download several database tables from the application server to my local PC as flat files. The main problem is that these tables are huge (PSA tables from BI), and there is not enough memory to download a table in one pass, so I wrote code like this:
START-OF-SELECTION.

  SELECT * FROM (p_dtable)
         INTO CORRESPONDING FIELDS OF TABLE l_it_table
         PACKAGE SIZE p_lines.

*   The call below runs an RFC to the frontend, which triggers an
*   implicit database commit while the SELECT cursor is still open.
    CALL METHOD cl_gui_frontend_services=>gui_download
      EXPORTING
        filename              = p_file
        append                = 'X'
        write_field_separator = 'X'
      IMPORTING
        filelength            = l_i_fsize
      CHANGING
        data_tab              = l_it_table.

  ENDSELECT.
This code doesn't work if the SELECT statement loops more than once; the runtime error (at the end of this message) is produced. The only thing I understand from the error is that I cannot combine SELECT ... ENDSELECT with foreground processing such as GUI_DOWNLOAD.
Any suggestions on how to download a huge table to the frontend PC?
thank you
*******************************************************************************************
The reason for the exception is:
One of the database selections included a database Commit.
The selection was then supposed to continue. Before a
database commit, however, all outstanding database selections
must be concluded.
Possible causes in the application program:
While a read process from a database cursor is taking place
(within a loop SELECT/LOOP/EXEC SQL or before a FETCH command),
one of the following statements is used:
- MESSAGE (apart from MESSAGE S...)
- COMMIT WORK
- ROLLBACK WORK
- BREAK-POINT
- WAIT
- CALL FUNCTION ... DESTINATION (synchronous RFC)
- CALL FUNCTION ... STARTING NEW TASK
- RECEIVE RESULTS
- CALL DIALOG
- CALL SELECTION-SCREEN
- CALL TRANSACTION
- CALL SCREEN, or any other statement that results in the display of a
new screen
2008 Jun 18 10:33 PM
This problem has come up before on this forum. The download causes a commit, which causes the SELECT to break when PACKAGE SIZE is used. If you search the forum, you should be able to find the other references.
Rob
2008 Jun 18 10:36 PM
Hi,
Within SELECT ... ENDSELECT you cannot have a COMMIT WORK, and I believe the GUI_DOWNLOAD call triggers a COMMIT WORK. That is the reason you are getting short dumps.
To overcome that, create another parameter on the selection screen, then enter 1 to download the first set of data, 2 for the second, and so on.
Something like this:
PARAMETERS: p_count TYPE int4.           " number of the package to download
DATA: v_count TYPE int4.

SELECT * FROM (p_dtable)
       INTO CORRESPONDING FIELDS OF TABLE l_it_table
       PACKAGE SIZE p_lines.
  v_count = v_count + 1.
  IF v_count = p_count.                  " keep only the requested package
    EXIT.
  ENDIF.
ENDSELECT.

IF v_count = p_count.                    " the requested package exists
  CALL METHOD cl_gui_frontend_services=>gui_download
    EXPORTING
      filename              = p_file
      append                = 'X'
      write_field_separator = 'X'
    IMPORTING
      filelength            = l_i_fsize
    CHANGING
      data_tab              = l_it_table.
ENDIF.
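With this version the program is run once per package: p_count = 1 appends the first p_lines rows to the file, p_count = 2 the next p_lines rows, and so on, until a run finds nothing left to download.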
Hope I am clear.
Thanks
Naren
2008 Jun 18 11:12 PM
Why don't you select all your data into one ITAB and append to your file X,000 records at a time?
DATA: i_bseg           TYPE TABLE OF bseg,
      i_download_table TYPE TABLE OF bseg,
      wa_bseg          TYPE bseg,
      v_start          TYPE i,
      v_end            TYPE i.

SELECT * FROM bseg INTO TABLE i_bseg.    " let's say you get back 1,000,000 recs

v_start = 1.
v_end   = 50000.

* Break it up into 20 segments of 50,000 records.
DO 20 TIMES.
  LOOP AT i_bseg INTO wa_bseg FROM v_start TO v_end.
    APPEND wa_bseg TO i_download_table.
    CLEAR wa_bseg.
  ENDLOOP.
  ADD 50000 TO v_start.
  ADD 50000 TO v_end.
* Append this segment to the file (p_file as in the original post).
  CALL METHOD cl_gui_frontend_services=>gui_download
    EXPORTING
      filename = p_file
      append   = 'X'
    CHANGING
      data_tab = i_download_table.
  REFRESH i_download_table.
ENDDO.
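If the record count isn't known up front, the hard-coded DO 20 TIMES can be derived from the table size instead. A small sketch of that variation (v_lines and v_times are new helper variables introduced here):

DATA: v_lines TYPE i,
      v_times TYPE i.

* Work out how many 50,000-row segments the table actually holds.
DESCRIBE TABLE i_bseg LINES v_lines.
v_times = v_lines DIV 50000.
IF v_lines MOD 50000 > 0.
  v_times = v_times + 1.                 " partial final segment
ENDIF.

* Then run the segment loop above with "DO v_times TIMES".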
2008 Jun 18 11:19 PM
Hi,
Try this; you don't require a selection-screen parameter.
DATA: v_count TYPE int4,
      v_index TYPE syindex.

DO.                                      " one pass per package
  v_index = sy-index.
  CLEAR v_count.

  SELECT * FROM (p_dtable)
         INTO CORRESPONDING FIELDS OF TABLE l_it_table
         PACKAGE SIZE p_lines.
    v_count = v_count + 1.               " count the packages read
    IF v_count = v_index.                " stop at this pass's package
      EXIT.
    ENDIF.
  ENDSELECT.

* If the SELECT loop ran out of data before reaching package number
* v_index, everything has been downloaded already.
  IF v_count < v_index OR l_it_table[] IS INITIAL.
    EXIT.
  ENDIF.

  CALL METHOD cl_gui_frontend_services=>gui_download
    EXPORTING
      filename              = p_file
      append                = 'X'
      write_field_separator = 'X'
    IMPORTING
      filelength            = l_i_fsize
    CHANGING
      data_tab              = l_it_table.
ENDDO.
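(Note that pass n restarts the SELECT and reads packages 1 through n before keeping the last one, so the total database work grows quadratically with the number of packages, although only one package is held in memory at a time.)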
Thanks
Naren
2008 Jun 19 10:11 PM
Thank you all for the quick responses, but all the approaches you provide use a single SELECT statement, which reads the table in one pass and puts it into memory; then the file is written in chunks.
Actually what I'm looking for is how to read a huge database table in chunks and write each chunk to a file on the frontend as it is read.
Correct me if I didn't get one of your solutions right.
thank you
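For the record, one pattern that reads in packages and downloads each package as it arrives is an explicit database cursor opened WITH HOLD: the ABAP documentation states that a database commit closes all open database cursors except those opened WITH HOLD, so the implicit commit triggered by the GUI_DOWNLOAD round trip to the frontend should not abort the read. A minimal, untested sketch reusing the names from the original post (l_it_table, l_i_fsize, p_dtable, p_lines, and p_file are assumed to be declared as in that program):

DATA: l_cursor TYPE cursor.

OPEN CURSOR WITH HOLD l_cursor FOR
  SELECT * FROM (p_dtable).

DO.
* Fetch the next package of p_lines rows from the held cursor.
  FETCH NEXT CURSOR l_cursor
        INTO CORRESPONDING FIELDS OF TABLE l_it_table
        PACKAGE SIZE p_lines.
  IF sy-subrc <> 0.                      " no more rows
    EXIT.
  ENDIF.

* The frontend RFC commits, but a WITH HOLD cursor stays open.
  CALL METHOD cl_gui_frontend_services=>gui_download
    EXPORTING
      filename              = p_file
      append                = 'X'
      write_field_separator = 'X'
    IMPORTING
      filelength            = l_i_fsize
    CHANGING
      data_tab              = l_it_table.
ENDDO.

CLOSE CURSOR l_cursor.

If WITH HOLD does not behave this way on your release, another common workaround is to write the file on the application server with OPEN DATASET and transfer it to the PC in a separate step.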