‎2019 Nov 21 1:48 PM
Hi,
I want to download data in chunks (1 lakh, i.e. 100,000, records at a time) to my local system. I am using a cursor to do that. However, the cursor is closed implicitly whenever the system asks for user permission to change or create a file, and that results in a dump.
The dump occurs during the second fetch. I tried opening the cursor WITH HOLD to avoid the implicit close, but I keep getting the same error. Is there any way to avoid this?
My Code:
DATA: itab    TYPE TABLE OF lfa1,
      cur     TYPE cursor,
      file_no TYPE i,
      c(4),                          " wide enough for more than 9 files
      path    TYPE string VALUE 'F:\'.

OPEN CURSOR WITH HOLD cur FOR SELECT * FROM lfa1.
DO.
  FETCH NEXT CURSOR cur INTO TABLE itab PACKAGE SIZE 100000.
  IF sy-subrc <> 0.
    EXIT.
  ENDIF.
  path = 'F:\'.
  file_no = file_no + 1.
  c = file_no.
  CONDENSE c.                        " remove leading blanks from the converted number
  CONCATENATE path 'file' c '.csv' INTO path.
  CALL FUNCTION 'GUI_DOWNLOAD'
    EXPORTING
      filename = path
    TABLES
      data_tab = itab.
ENDDO.
CLOSE CURSOR cur.
Error:
Category ABAP programming error
Runtime Errors DBSQL_INVALID_CURSOR
Except. CX_SY_OPEN_SQL_DB
Application Component Not assigned
Short Text: Cursor already closed or not open yet.
‎2019 Nov 21 3:27 PM
As the documentation says, the WITH HOLD option only protects the cursor against commits executed via native SQL. It does not help with GUI interactions, because each round trip to the GUI triggers an implicit database commit that closes the cursor. So you must use some kind of workaround, such as paging through the table by its primary key:
DATA: last_vendor TYPE lfa1-lifnr,
      vendor_data TYPE STANDARD TABLE OF lfa1.

DO.
* Get next chunk
  SELECT * FROM lfa1
    UP TO 100000 ROWS
    INTO TABLE vendor_data
    WHERE lifnr > last_vendor
    ORDER BY PRIMARY KEY.
  IF sy-subrc <> 0.
    EXIT.
  ENDIF.
* Download next chunk
  CALL FUNCTION 'GUI_DOWNLOAD' "...
* Determine key to continue processing
  last_vendor = vendor_data[ lines( vendor_data ) ]-lifnr.
ENDDO.
‎2019 Nov 22 12:28 PM
Hi Gabor Marian,
Thank you so much for your help.
The above code will definitely work if we know the table beforehand. But what if the table is determined at runtime and has a composite key whose first column can have repeated values? I think this procedure will not work then.
I want to download data from any table that can be accessed in the entire database with one program. Please check the new code below.
FIELD-SYMBOLS: <itab> TYPE STANDARD TABLE.
DATA: cur     TYPE cursor,
      file_no TYPE i,
      c(4),
      path    TYPE string VALUE 'F:\'.
PARAMETERS: tab LIKE dd03l-tabname.
DATA: refer TYPE REF TO data.

CREATE DATA refer TYPE TABLE OF (tab).
ASSIGN refer->* TO <itab>.

OPEN CURSOR WITH HOLD cur FOR SELECT * FROM (tab).
DO.
  FETCH NEXT CURSOR cur INTO TABLE <itab> PACKAGE SIZE 1000.
  IF sy-subrc <> 0.
    EXIT.
  ENDIF.
  path = 'F:\'.
  file_no = file_no + 1.
  c = file_no.
  CONDENSE c.
  CONCATENATE path 'file' c '.csv' INTO path.
  CALL FUNCTION 'GUI_DOWNLOAD'
    EXPORTING
      filename = path
    TABLES
      data_tab = <itab>.
ENDDO.
CLOSE CURSOR cur.

Thanks,
Venkatesh.
‎2019 Nov 22 1:13 PM
908venkat
Multiple keys can also be handled, e.g.:
...
WHERE ( key1 > last_keys-key1 OR
key1 = last_keys-key1 AND key2 > last_keys-key2
...)
However, as the table can be generic, this requires determining the key structure and building the WHERE clause at runtime.
A simplified approach is to count the records and skip what you've already fetched:
PARAMETERS: db_table TYPE tabname,
            packsize TYPE i DEFAULT 1000.
DATA: cur                 TYPE cursor,
      downloaded_packages TYPE i,
      package             TYPE REF TO data.
FIELD-SYMBOLS: <package> TYPE STANDARD TABLE.

CREATE DATA package TYPE STANDARD TABLE OF (db_table).
ASSIGN package->* TO <package>.

DO.
  OPEN CURSOR cur FOR SELECT * FROM (db_table) ORDER BY PRIMARY KEY.
  " Skip records already processed
  DO downloaded_packages TIMES.
    FETCH NEXT CURSOR cur INTO TABLE <package> PACKAGE SIZE packsize.
  ENDDO.
  " Fetch current package
  FETCH NEXT CURSOR cur INTO TABLE <package> PACKAGE SIZE packsize.
  CLOSE CURSOR cur.
  " Any records selected?
  IF sy-subrc <> 0.
    EXIT.
  ENDIF.
  " ... do the processing ...
  downloaded_packages = downloaded_packages + 1.
ENDDO.
‎2019 Nov 22 3:35 PM
Hi Gabor Marian,
Thank you so much for your help. The above code will definitely work if we know the table beforehand. But what if the table is determined at runtime and has a composite primary key whose first column can have repeated values even though it is part of the primary key? I think it will not work in that scenario. I want a program that can download data from any table in the database in chunks. I will be using field symbols for the internal tables and work areas.
Is there any way to add GUI_DOWNLOAD as an exception so that the cursor does not close for that one function module?
Thanks,
Venkatesh.
‎2019 Nov 22 3:46 PM
908venkat
I just gave you a generic, working example based on counting the packages (only the download logic is missing). There is no way to avoid the cursor issue.
‎2019 Nov 22 4:36 PM
" list of primary key columns (but ignoring the client column)
SELECT fieldname FROM dd03L
WHERE tabname = table AND keyflag = 'X' AND datatype <> 'CLNT'
ORDER BY position
INTO TABLE @DATA(dd03L_s).
DATA(where) = |lifnr > last_vendor|. " rework that (use the list of primary key columns)
SELECT * FROM (table)
UP TO 100000 ROWS
INTO TABLE <itab>
WHERE (where)
ORDER BY PRIMARY KEY.
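The "rework that" part could be sketched roughly like this (my addition, not from the thread): build the paging condition `k1 > v1 OR ( k1 = v1 AND k2 > v2 ) OR ...` from the key columns in dd03l_s and the values of the last fetched row. The field symbol <last_row> (last row of the previous package) is an assumption for illustration.

DATA(where) = ``.
DATA(equal_part) = ``.
LOOP AT dd03l_s INTO DATA(key_col).
  " Value of this key column in the last fetched row (<last_row> is assumed)
  ASSIGN COMPONENT key_col-fieldname OF STRUCTURE <last_row> TO FIELD-SYMBOL(<value>).
  " Quote/escape the value for use in a dynamic WHERE clause
  DATA(lit) = cl_abap_dyn_prg=>quote( |{ <value> }| ).
  IF where IS INITIAL.
    where = |{ key_col-fieldname } > { lit }|.
  ELSE.
    where = |{ where } OR ( { equal_part } AND { key_col-fieldname } > { lit } )|.
  ENDIF.
  IF equal_part IS INITIAL.
    equal_part = |{ key_col-fieldname } = { lit }|.
  ELSE.
    equal_part = |{ equal_part } AND { key_col-fieldname } = { lit }|.
  ENDIF.
ENDLOOP.

After the loop, `where` can be plugged into the dynamic SELECT above; rebuild it after every package from the new last row.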
‎2019 Nov 25 11:16 AM
Hi Gabor Marian and Sandra Rossi,
Thank you so much for your help.
@Gabor Marian - As I am a new user, I was not able to comment on Friday. Whenever I tried, I only saw an "access denied" message. I tried multiple times, which is why multiple comments are showing.
The solutions provided by both of you would definitely work. But fetching from the cursor multiple times just to skip data would take more time when there are millions of records. Hence, I am using the dynamic WHERE condition.
Once again,
Thank you.
‎2019 Nov 26 6:57 AM
So essentially you want to be able to download any DB table to your PC. Seems like a bit of a crazy requirement to me. What's the purpose? Backing up the SAP database to a PC?
‎2019 Nov 25 6:34 PM
Hi Sandra Rossi,
I meant that I am going to use your suggestion to define the WHERE condition at runtime.
DO.
  OPEN CURSOR cur FOR SELECT * FROM (db_table) ORDER BY PRIMARY KEY.
  DO downloaded_packages TIMES. " skip the processed records
    FETCH NEXT CURSOR cur INTO TABLE <package> PACKAGE SIZE packsize.
  ENDDO.
  FETCH NEXT CURSOR cur INTO TABLE <package> PACKAGE SIZE packsize. " fetch current package
The above code also works the same way when I have less data. With the above code, if I try downloading 10 lakh (1,000,000) records with a package size of 1 lakh, I have to fetch approximately 65 times instead of 10. Unfortunately, I have millions (billions in a very few cases) of records to fetch, and this approach would take more time than yours.
I am just trying to say that your method is more feasible.
Thank you
‎2019 Nov 26 12:18 AM
Thanks. I would just say that it's visually more compact to use SELECT rather than OPEN CURSOR.
OPEN CURSOR with WHERE and UP TO ROWS:
OPEN CURSOR cur FOR SELECT * FROM (db_table)
  WHERE (where) " <=== add it
  ORDER BY PRIMARY KEY
  UP TO 100000 ROWS.
DO.
  FETCH NEXT CURSOR cur INTO TABLE <itab> PACKAGE SIZE packsize.
  IF sy-subrc <> 0.
    EXIT.
  ENDIF.
ENDDO.
CLOSE CURSOR cur.

is technically the same as:

SELECT * FROM (db_table)
  WHERE (where)
  ORDER BY PRIMARY KEY
  INTO TABLE <itab>
  UP TO 100000 ROWS.

There is no noticeable performance difference at the database side between the two, and no difference in features (well, of course there are some, but they are not used in our current case).
In fact, I am not really discussing your exact issue; this is just general information. SELECT is less verbose, and I understand that you chose it for that reason.
‎2019 Nov 26 9:20 AM
You can also use the concept of EXPORT / IMPORT ... TO/FROM DATABASE (data clusters) to buffer the data on the database side.
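The data-cluster idea could look roughly like this (a minimal sketch, not from the thread: each package is written to the standard INDX cluster table with EXPORT TO DATABASE, which survives the implicit commits caused by GUI round trips, and read back later with IMPORT FROM DATABASE; the area 'zz' and the ID 'PKG01' are illustrative assumptions):

DATA: itab    TYPE STANDARD TABLE OF lfa1,
      indx_wa TYPE indx.

" Store a package in the INDX cluster table (persists across implicit commits)
EXPORT tab = itab TO DATABASE indx(zz) FROM indx_wa ID 'PKG01'.

" ... later, possibly in another step or program run ...
IMPORT tab = itab FROM DATABASE indx(zz) TO indx_wa ID 'PKG01'.

For the generic case, the ID would have to encode the table name and package number, and the dynamically typed <itab> would be exported instead of a statically typed table.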