Application Development Discussions
Join the discussions or start your own on all things application development, including tools and APIs, programming models, and keeping your skills sharp.

FETCH NEXT CURSOR / invalid database interruption

Former Member

Hi,

I need to select documents from the BKPF table along with the corresponding documents from the BSEG table, process them, and write them to a text file. For a particular company code there are millions of records in BKPF and millions in BSEG.

An internal table cannot store millions of documents, so I am selecting in packets using OPEN CURSOR and FETCH NEXT CURSOR.

My issue: after I select 1,000 records, process them, and write them to the file, the fetch for the next 1,000 records terminates with "Invalid interruption of database selection".

E.g. for a particular selection there are 50,000 records in total to be downloaded to the text file.

Code:

OPEN CURSOR gc_bkpf FOR
  SELECT bukrs belnr gjahr blart budat monat cpudt
         cputm usnam xblnr bktxt waers
    FROM bkpf
    WHERE bukrs EQ p_bukrs
      AND bstat IN (' ', 'A', 'B')
      AND budat IN sbudat
      AND gjahr IN sgjahr
      AND monat EQ i_monat-monat.

DO.
  REFRESH out_tab.
  CLEAR out_tab.
  REFRESH t_bkpf.

  FETCH NEXT CURSOR gc_bkpf
    INTO TABLE t_bkpf
    PACKAGE SIZE 1000.

  IF sy-subrc NE 0.
    CLOSE CURSOR gc_bkpf.
    EXIT.
  ELSEIF sy-subrc EQ 0.
*   ( processing logic )
    CONCATENATE f_itab-belnr zdelim
                f_itab-blart zdelim
                f_itab-bktxt zdelim
                f_itab-xblnr zdelim
                f_itab-hkont zdelim
                f_itab-ktoks zdelim
                f_itab-txt20 zdelim
                f_itab-usnam zdelim
                f_itab-bukrs zdelim
                f_itab-monat zdelim
                f_itab-cpudt zdelim
                f_itab-budat zdelim
                f_itab-waers zdelim
                wrbtr1 zdelim
                dmbe2 zdelim
                kzkrs zdelim INTO out_tab.
    APPEND out_tab.
  ENDIF.

* Download records
  PERFORM download_new. " in this PERFORM I am using FM GUI_DOWNLOAD or OPEN DATASET
ENDDO.

After downloading 1,000 records, the fetch for the next 1,000 terminates. But I noticed that if I remove PERFORM download_new from the loop and place it after the ENDDO, it does not terminate and all 50,000 records are downloaded.

I know that will work for a smaller number of documents; I am concerned about a huge number of records.

Please let me know how to solve the issue.

Thanks

20 REPLIES

Former Member

I think there is an implicit COMMIT when you do the PERFORM.

Rather than use PACKAGE SIZE, it would be easier to manipulate your select-options so that you are only selecting and downloading smaller chunks of data at once.
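In outline, that chunking idea might look like the sketch below. This is hypothetical, not the original code: the split by posting period, the s_monat select-option, and the download_chunk routine are all assumptions.

```abap
* Hypothetical sketch: select and download one posting period at a
* time, so each SELECT has fully completed before the download
* (and its implicit commit) happens.
DATA lt_bkpf TYPE STANDARD TABLE OF bkpf.

LOOP AT s_monat.                          " assumed select-option rows
  SELECT * FROM bkpf
    INTO TABLE lt_bkpf
    WHERE bukrs = p_bukrs
      AND monat = s_monat-low.
* the database selection is finished here - there is no open
* cursor left for GUI_DOWNLOAD's implicit commit to interrupt
  PERFORM download_chunk TABLES lt_bkpf.  " hypothetical routine
ENDLOOP.
```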

Rob


Yes,

I noticed that, but I can't manipulate the select-options. Is there any other solution for this kind of issue?

Thanks


Did you try using EXTRACT command?
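For reference, EXTRACT writes records into an extract dataset, which the ABAP runtime pages out to disk and which can therefore hold far more rows than an ordinary internal table. A minimal sketch, with illustrative field-group contents:

```abap
* Minimal EXTRACT sketch - the chosen fields are just examples.
FIELD-GROUPS: header.
INSERT bkpf-belnr bkpf-gjahr INTO header.  " fields kept per record

SELECT * FROM bkpf WHERE bukrs = p_bukrs.
  EXTRACT header.                          " append one record to the extract
ENDSELECT.

LOOP.          " reads the extract back, record by record
* build the output line and TRANSFER / download it here
ENDLOOP.
```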


I think you have to extract all of your data first. Then, if you want to do multiple downloads, break the files apart at that point.

Rob

Former Member

Hi,

In such a case, go for EXEC SQL ... ENDEXEC and call your open cursor inside this.

This will never time out and dump.

Regards

Subramanian


Hi,

EXEC SQL.
  OPEN CURSOR gc_bkpf FOR
    SELECT bukrs belnr gjahr blart budat monat cpudt
           cputm usnam xblnr bktxt waers
      FROM bkpf
      WHERE bukrs EQ p_bukrs
        AND bstat IN (' ', 'A', 'B')
        AND budat IN sbudat
        AND gjahr IN sgjahr
        AND monat EQ i_monat-monat.
ENDEXEC.

DO.
  REFRESH out_tab.
  CLEAR out_tab.
  REFRESH t_bkpf.

  FETCH NEXT CURSOR gc_bkpf
    INTO TABLE t_bkpf
    PACKAGE SIZE 1000.

  IF sy-subrc NE 0.
    CLOSE CURSOR gc_bkpf.
    EXIT.
  ELSEIF sy-subrc EQ 0.
*   ...

It is still not working: "Invalid command passed to database".

Please let me know if I have done something wrong.

Thanks


Why don't you use open SQL?

SELECT bukrs belnr gjahr blart budat monat cpudt
cputm usnam xblnr bktxt waers
INTO TABLE t_bkpf 
PACKAGE SIZE 1000
FROM bkpf
WHERE bukrs EQ p_bukrs
and bstat IN (' ','A','B')
and budat in sbudat
AND gjahr in sgjahr
AND monat EQ i_monat-monat.
  loop at t_bkpf.
    clear out_tab.
    concatenate .... into out_tab.
    transfer out_tab.... "to server file.
  endloop.
ENDSELECT.

As far as I know, none of the above statements triggers a COMMIT WORK - which is what interrupts the database access.

Regards,

Christian


Hi Christian,

I tried that way, but the problem still persists. Any other solution?

Thanks


Hi,

Which exact statement causes the problem (at which line does the dump occur)?

Even the simplest examples from the documentation work the same way:

DATA: file TYPE string VALUE `flights.dat`, 
      wa   TYPE spfli. 

FIELD-SYMBOLS <hex_container> TYPE x. 

OPEN DATASET file FOR OUTPUT IN BINARY MODE. 

SELECT * 
       FROM spfli 
       INTO wa. 
  ASSIGN wa TO <hex_container> CASTING. 
  TRANSFER <hex_container> TO file. 
ENDSELECT. 

CLOSE DATASET file. 

-> TRANSFER inside of SELECT ... ENDSELECT is possible. The PACKAGE SIZE example was pretty much copied from an example as well; either way, this does not change the situation of an open SQL loop.

GUI_DOWNLOAD might be something different - here I would rather expect implicit commits, which are not allowed while a SELECT ... ENDSELECT is open.

E.g. opening a popup (for the filename?!) triggers an implicit commit. Either you ask for the filenames beforehand, or you won't be able to split the SELECT with PACKAGE SIZE. (About the rest of GUI_DOWNLOAD I'm not so sure - likely it isn't possible either, because the connection to the GUI (for saving the file) will trigger a commit, too.)

Regards,

Christian


Hi Christian,

Thanks for explaining....

If I transfer the file to the application server, it works fine with no issues, but when I select the presentation server it terminates. The reason, as you said, is likely GUI_DOWNLOAD performing a commit.

Users mostly download files onto their desktops, so it is mandatory for me to include that option. How do I make the FM not commit - any solution?

Thanks


Check my earlier reply. You will have to select all of your data into an internal table. Then you can divide the table into portions and download the portions.

Rob


Hi Rob,

If I select all the data into an internal table first, it still terminates, because the internal table cannot hold 2 million or more records. And this is just for one period - if the user gives a range, the data will be even larger.

I am not able to figure out a solution.

Thanks


OK - you should be able to transfer the data all at one time to the application server, as Christian has suggested. Then write a separate program that reads this file (or portions of it) and does the download(s).

But will the user realistically be able to deal with millions of records in a download? Remember that Excel has a limitation of around 65,000 rows.

Rob
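In outline, the two-step approach Rob describes could look like the sketch below. The file path and the loop body are assumptions; the point is that TRANSFER to an application-server file does not trigger a commit, so it is safe inside the FETCH loop, and GUI_DOWNLOAD runs afterwards on the finished file.

```abap
DATA lv_file TYPE string VALUE '/tmp/bkpf_export.txt'. " assumed path

OPEN DATASET lv_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
DO.
  FETCH NEXT CURSOR gc_bkpf INTO TABLE t_bkpf PACKAGE SIZE 1000.
  IF sy-subrc <> 0.
    CLOSE CURSOR gc_bkpf.
    EXIT.
  ENDIF.
  LOOP AT t_bkpf.
*   build out_tab from t_bkpf here
    TRANSFER out_tab TO lv_file.           " no commit triggered
  ENDLOOP.
ENDDO.
CLOSE DATASET lv_file.
* a separate program (or a later step) can then read the file back
* with OPEN DATASET ... FOR INPUT and hand portions to GUI_DOWNLOAD
```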


OK, thanks Rob.

The data is downloaded as a text file; that is also limited, but it works fine for a million records.


I came across this in a different thread. Keep your original code, but change your OPEN CURSOR statement to OPEN CURSOR WITH HOLD.

The documentation says it will not be interrupted by a commit.

Rob


Rob,

I tried that statement, but it didn't work - the commit work was still happening.

Thanks


Yes, the commit will still happen, but it shouldn't interrupt the SELECT.

Rob


But it did - on the second FETCH it terminated:

OPEN CURSOR WITH HOLD gc_bkpf FOR
  SELECT bukrs belnr gjahr blart budat monat cpudt
         cputm usnam xblnr bktxt waers
    FROM bkpf
    WHERE bukrs EQ p_bukrs
      AND bstat IN (' ', 'A', 'B')
      AND budat IN sbudat
      AND gjahr IN sgjahr
      AND monat EQ i_monat-monat.

DO.
  REFRESH t_bkpf.

  FETCH NEXT CURSOR gc_bkpf
    INTO TABLE t_bkpf
    PACKAGE SIZE 1000.

  IF sy-subrc NE 0.
    CLOSE CURSOR gc_bkpf.
    EXIT.
  ELSEIF sy-subrc EQ 0.
*   ( processing logic )
  ENDIF.

  PERFORM download_new.
ENDDO.

Please let me know if there is anything wrong in the code.

Thanks


Are you still looking for an answer here?

Sorry - I noticed that you marked it as solved, so I guess you are going with one of the other ways.

Rob



Hi Rob,

quite some time has passed since you researched this problem, but I can't find any other thread that mentions how to get around the interrupted database operation of an OPEN CURSOR WITH HOLD / FETCH loop. Did you ever find out what's causing this, or did you perhaps find a workaround you would like to share?

In my particular scenario, I am using OPEN CURSOR WITH HOLD and SELECT INTO an itab in small packets. In the FETCH loop I am calling a remote client using CALL FUNCTION 'ZOP', which works well. Only when the action in 'ZOP' takes longer than 8 to 10 seconds do I get the message that a database operation was interrupted (in German: "Unzulässige Unterbrechung einer Datenbankselektion", i.e. invalid interruption of a database selection).

Best regards,

Chris