Application Development and Automation Discussions
Join the discussions or start your own on all things application development, including tools and APIs, programming models, and keeping your skills sharp.

Error while downloading data in chunks using open cursor


Hi,

I want to download data in chunks (100,000 records, i.e. 1 lakh, at a time) to the local system. I am using a cursor to do that. However, the cursor closes implicitly whenever the system asks for user permission to change or create a file, which results in a dump.

I get the dump during the second fetch. I tried opening the cursor WITH HOLD to avoid the implicit close, but I keep getting the same error. Is there any way to avoid this?

My Code:

DATA: itab    TYPE TABLE OF lfa1,
      cur     TYPE cursor,
      file_no TYPE i,
      c       TYPE string,  " counter as text (a c(1) field would overflow past 9 files)
      path    TYPE string.

OPEN CURSOR WITH HOLD cur FOR SELECT * FROM lfa1.

DO.
  FETCH NEXT CURSOR cur INTO TABLE itab PACKAGE SIZE 100000.
  IF sy-subrc NE 0.
    EXIT.
  ENDIF.
  file_no = file_no + 1.
  c = file_no.
  CONDENSE c.
  CONCATENATE 'F:\file' c '.csv' INTO path.
  " The GUI round trip here is what implicitly closes the cursor
  CALL FUNCTION 'GUI_DOWNLOAD'
    EXPORTING
      filename = path
    TABLES
      data_tab = itab.
ENDDO.
CLOSE CURSOR cur.

Error:

Category: ABAP programming error
Runtime error: DBSQL_INVALID_CURSOR
Exception: CX_SY_OPEN_SQL_DB
Application component: Not assigned
Short text: Cursor already closed or not open yet.
1 ACCEPTED SOLUTION

Former Member

As the documentation says, the WITH HOLD addition only protects the cursor from commits executed via native SQL. It is not effective for GUI interactions, which usually close the cursor, so you must use some kind of workaround, such as splitting the records by the table's primary key:

DATA: last_vendor TYPE lfa1-lifnr,
      vendor_data TYPE STANDARD TABLE OF lfa1.

DO.
  
* Get next chunk
  SELECT * FROM lfa1 
    UP TO 100000 ROWS 
    INTO TABLE vendor_data
    WHERE lifnr > last_vendor
    ORDER BY PRIMARY KEY.
  
  IF sy-subrc <> 0.
    EXIT.
  ENDIF.

* Download next chunk
  CALL FUNCTION 'GUI_DOWNLOAD' ...

* Remember the last key to continue from
  last_vendor = vendor_data[ lines( vendor_data ) ]-lifnr.

ENDDO.
15 REPLIES

"ORDER BY lifnr" is missing.

Read only

3,574

Thanks sandra.rossi, corrected.


Hi Gabor Marian,

Thank you so much for your help.

The above code will definitely work if we know the table beforehand. But what if the table is determined at runtime and has a composite key whose first columns can contain repeated values? I think this procedure will not work in that case.

I want one program that can download data from any table accessible in the database. Please check the new code below.


FIELD-SYMBOLS: <itab> TYPE STANDARD TABLE.
DATA: cur TYPE cursor, file_no TYPE i, c(1),
      path TYPE string VALUE 'F:\',
      refer TYPE REF TO data.
PARAMETERS: tab LIKE dd03l-tabname.

CREATE DATA refer TYPE TABLE OF (tab).
ASSIGN refer->* TO <itab>.

OPEN CURSOR WITH HOLD cur FOR SELECT * FROM (tab).
DO.
  FETCH NEXT CURSOR cur INTO TABLE <itab> PACKAGE SIZE 1000.
  IF sy-subrc NE 0.
    EXIT.
  ENDIF.
  path = 'F:\'. file_no = file_no + 1. c = file_no.
  CONCATENATE path 'file' c '.csv' INTO path.
  CALL FUNCTION 'GUI_DOWNLOAD'
    EXPORTING
      filename = path
    TABLES
      data_tab = <itab>.
ENDDO.
CLOSE CURSOR cur.

Thanks,

Venkatesh.


908venkat

Composite keys can also be handled, e.g.:

...
  WHERE ( key1 > last_keys-key1 OR 
          key1 = last_keys-key1 AND key2 > last_keys-key2 
...)

However, as the table can be generic, this requires determining the key structure and building the WHERE clause at runtime.

A simplified approach is to count the packages and skip what you've already fetched:

PARAMETERS: db_table  TYPE tabname,
            packsize TYPE i DEFAULT 1000.

DATA: cur                 TYPE cursor,
      downloaded_packages TYPE i,
      package             TYPE REF TO data.

FIELD-SYMBOLS: <package>  TYPE STANDARD TABLE.

CREATE DATA package TYPE STANDARD TABLE OF (db_table).
ASSIGN package->* TO <package>.

DO.

  OPEN CURSOR cur FOR SELECT * FROM (db_table) ORDER BY PRIMARY KEY.

  " Skip records already processed
  DO downloaded_packages TIMES.

    FETCH NEXT CURSOR cur INTO TABLE <package> PACKAGE SIZE packsize.

  ENDDO.


  " Fetch current package
  FETCH NEXT CURSOR cur INTO TABLE <package> PACKAGE SIZE packsize.

  CLOSE CURSOR cur.

  " Any records selected?
  IF sy-subrc <> 0.
    EXIT.
  ENDIF.

  " ... do the processing ... 

  downloaded_packages = downloaded_packages + 1.

ENDDO.

Hi Gabor Marian,

Thank you so much for your help. The above code will definitely work if we know the table beforehand. But what if the table is determined at runtime and has a composite primary key whose first column can contain repeated values even though it is part of the primary key? I think it will not work in that scenario. I want a program that can download data from any table in the database in chunks. I will be using field symbols for the internal tables and work areas.

Is there any way to make an exception for GUI_DOWNLOAD so that the cursor does not close for that one function module?

Thanks,

Venkatesh.


908venkat

I just gave you a generic, working example based on counting the packages (only the download logic is missing). There is no way to avoid the cursor issue.

Venkatesh K You can define any WHERE condition at runtime, and you can determine the key columns of any table at runtime, roughly:

" List of primary key columns (ignoring the client column)
SELECT fieldname FROM dd03l
    WHERE tabname = @table AND keyflag = 'X' AND datatype <> 'CLNT'
    ORDER BY position
    INTO TABLE @DATA(dd03l_s).

DATA(where) = |lifnr > last_vendor|. " rework that (use the list of primary key columns)

SELECT * FROM (table)
    WHERE (where)
    ORDER BY PRIMARY KEY
    INTO TABLE @<itab>
    UP TO 100000 ROWS.
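The "rework that" part could be sketched roughly as below. This is illustrative only: it assumes the key columns were selected into dd03l_s as above, that the last row of the previous chunk is available in a field symbol <last_row>, and that the key columns are character-like (numeric keys would need different literal handling).

```abap
" Build a keyset condition from the ordered key columns, e.g. for keys k1, k2:
"   ( k1 > 'v1' ) OR ( k1 = 'v1' AND k2 > 'v2' )
DATA(where)  = ``.
DATA(prefix) = ``.
LOOP AT dd03l_s INTO DATA(col).
  ASSIGN COMPONENT col-fieldname OF STRUCTURE <last_row> TO FIELD-SYMBOL(<val>).
  " Quote the value safely for use in a dynamic WHERE clause
  DATA(lit) = cl_abap_dyn_prg=>quote( CONV string( <val> ) ).
  IF where IS NOT INITIAL.
    where = where && ` OR `.
  ENDIF.
  where  = where && `( ` && prefix && col-fieldname && ` > ` && lit && ` )`.
  prefix = prefix && col-fieldname && ` = ` && lit && ` AND `.
ENDLOOP.
```

The resulting string can then be passed to WHERE (where) in the dynamic SELECT.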

Hi Gabor Marian and Sandra Rossi,

Thank you so much for your help.

@Gabor Marian - As I am a new user, I was not able to comment on Friday; whenever I tried, I only saw an access-denied message. I tried multiple times, which is why multiple comments are showing.

The solutions provided by both of you would definitely work, but fetching from the cursor multiple times to skip data takes more time when there are millions of records. Hence, I am using the dynamic WHERE condition.

Once again,

Thank you.

Venkatesh K I'm not sure I understand your last comment ("But, fetching the cursor multiple times to skip the data would take more time when there are millions of records. Hence, i am using the dynamic where condition."). More time than what? (Do you mean a cursor without a filter will take more time than a cursor with a filter?) A WHERE clause can be used on both SELECT and OPEN CURSOR, whether the condition is expressed dynamically or statically. Either way, and for either statement, the overall performance will be the same.
matt
Active Contributor

So essentially you want to be able to download any DB table to your PC. Seems like a bit of a crazy requirement to me. What's the purpose? Backing up the SAP database to a PC?


Latest Code:

FIELD-SYMBOLS: <itab> TYPE STANDARD TABLE.
DATA: cur TYPE cursor, file_no TYPE i, c(1),
      path TYPE string VALUE 'F:\',
      refer TYPE REF TO data.
PARAMETERS: tab LIKE dd03l-tabname.

CREATE DATA refer TYPE TABLE OF (tab).
ASSIGN refer->* TO <itab>.

OPEN CURSOR WITH HOLD cur FOR SELECT * FROM (tab).
DO.
  FETCH NEXT CURSOR cur INTO TABLE <itab> PACKAGE SIZE 1000.
  IF sy-subrc NE 0.
    EXIT.
  ENDIF.
  path = 'F:\'. file_no = file_no + 1. c = file_no.
  CONCATENATE path 'file' c '.csv' INTO path.
  CALL FUNCTION 'GUI_DOWNLOAD'
    EXPORTING
      filename = path
    TABLES
      data_tab = <itab>.
ENDDO.
CLOSE CURSOR cur.

Hi Sandra Rossi,

I meant that I am going to use your suggestion to define the WHERE condition at runtime.

DO.
  OPEN CURSOR cur FOR SELECT * FROM (db_table) ORDER BY PRIMARY KEY.
  DO downloaded_packages TIMES. " skip the records already processed
    FETCH NEXT CURSOR cur INTO TABLE <package> PACKAGE SIZE packsize.
  ENDDO.
  FETCH NEXT CURSOR cur INTO TABLE <package> PACKAGE SIZE packsize. " fetch current package

The above code works the same way when I have less data. But with it, if I try downloading 1 million records (10 lakh) with a package size of 100,000 (1 lakh), I have to fetch approximately 65 times instead of 10. Unfortunately, I have millions (billions in a very few cases) of records to fetch, and this approach would take more time than yours.

I am just trying to say that your method is more feasible.

Thank you


Thanks. I would just say that SELECT is visually more compact than OPEN CURSOR.

OPEN CURSOR with WHERE and UP TO ROWS:

OPEN CURSOR cur FOR SELECT * FROM (db_table) 
    WHERE (where)        " <=== add it 
    ORDER BY PRIMARY KEY
    UP TO 100000 ROWS.
DO.
  FETCH NEXT CURSOR cur INTO TABLE <itab> PACKAGE SIZE packsize.
  IF sy-subrc <> 0.
    EXIT.
  ENDIF.
ENDDO.
CLOSE CURSOR cur.

is technically the same as:

SELECT * FROM (db_table) 
    WHERE (where)
    ORDER BY PRIMARY KEY
    INTO TABLE <itab>
    UP TO 100000 ROWS.

There is no noticeable performance difference on the database side between the two, and no difference in the features used here (of course some features do differ, but they are not needed in our case).

In fact I am not really discussing your exact issue; this is just general information.

SELECT is less verbose, and I understand that is why you chose it.


You can also use the EXPORT / IMPORT to database concept:

  • Fetch the data and keep it in INDX-type cluster tables. When you save the data in clusters, name the clusters TABLE1, TABLE2, TABLE3, and so on, so they are easy to identify at download time. Once the data is persisted, download it with a LOOP or DO.
  • Download the data from the clusters: simply fetch all the clusters, loop over them, and download the data. Delete the data from the clusters after the download.
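A rough sketch of this two-step idea, reusing itab and file_no from the earlier snippets. The cluster area 'ZV' and the TABLE1, TABLE2, ... cluster IDs are illustrative choices, and the standard INDX table is used only as an example of an INDX-type cluster table:

```abap
DATA cluster_id TYPE indx-srtfd.

" Step 1: while fetching, persist each chunk server-side.
" No GUI round trip happens here, so the open cursor is not disturbed.
cluster_id = |TABLE{ file_no }|.
EXPORT data_tab = itab TO DATABASE indx(zv) ID cluster_id.

" Step 2: after the cursor is closed, read each cluster back,
" download it with GUI_DOWNLOAD, and clean up.
IMPORT data_tab = itab FROM DATABASE indx(zv) ID cluster_id.
DELETE FROM DATABASE indx(zv) ID cluster_id.
```

This decouples the database reads from the GUI downloads, so the cursor only has to survive EXPORT statements, not GUI interactions.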