Application Development Discussions
Join the discussions or start your own on all things application development, including tools and APIs, programming models, and keeping your skills sharp.

Bypassing Open Cursor WITH HOLD / Fetch with a better Alternative

mritiz
Explorer

Hi,

To pull a huge amount of data, we are using OPEN CURSOR WITH HOLD / FETCH to move the data from the database to the application server in small chunks; otherwise the program throws a runtime error due to lack of memory.
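For reference, the chunked read described above can be sketched as follows; this is a minimal sketch, where the table name (vbak) and the package size of 10,000 are illustrative assumptions, not details from the actual program:

```abap
DATA lt_data   TYPE STANDARD TABLE OF vbak. " vbak is an illustrative table
DATA lv_cursor TYPE cursor.

" WITH HOLD keeps the cursor open across the implicit database commits
" that parallel (aRFC) processing triggers; without it, a commit would
" close the cursor.
OPEN CURSOR WITH HOLD @lv_cursor FOR
  SELECT * FROM vbak.

DO.
  " Each FETCH is one round trip to the database, returning one packet.
  FETCH NEXT CURSOR @lv_cursor
    INTO TABLE @lt_data
    PACKAGE SIZE 10000.
  IF sy-subrc <> 0.
    EXIT. " no more data
  ENDIF.
  " ... hand lt_data over to the parallel workers here ...
ENDDO.

CLOSE CURSOR @lv_cursor.
```

The memory footprint stays bounded because only one packet of rows is held in lt_data at a time, at the cost of one database round trip per packet.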

WITH HOLD is used so the cursor position survives the implicit database commits triggered by the parallel processing, i.e. we always know up to which record the data has been processed.

This works correctly, but we are now trying to improve the performance of the program, and the packet size is an issue because the database is hit once per packet.

Is there any method or technique through which we can pull all the data at once from the database to the application server without causing a memory error?

Thanks & Regards,

Ritiz Mitra

6 REPLIES

matt
Active Contributor

Parallel processing?

mritiz
Explorer

Hi matthew.billingham,

That's already in place.

MKreitlein
Active Contributor

Just an idea... do you use the maxsize parameter?

Maybe you can optimize here, to get the best number of lines out of each FETCH?

See, e.g. https://answers.sap.com/questions/2885346/extractor-parameter-maxsize--and-abap-parameter-ss.html
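A sketch of that idea: expose the packet size as a tunable parameter so the rows-per-FETCH can be adjusted until the trade-off between round trips and memory is acceptable. This is a fragment of the fetch loop, assuming lv_cursor and lt_data as in the original program; the parameter name p_pksize and its default are hypothetical:

```abap
PARAMETERS p_pksize TYPE i DEFAULT 50000. " hypothetical tuning parameter

" Larger packets mean fewer database round trips,
" but more memory consumed per packet.
FETCH NEXT CURSOR @lv_cursor
  INTO TABLE @lt_data
  PACKAGE SIZE @p_pksize.
```

Measuring the runtime for a few packet sizes (e.g. 10,000 vs. 100,000) on representative data would show where the round-trip overhead stops dominating.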

BR, Martin

Sandra_Rossi
Active Contributor

Your question is very open ("is there anything else to improve"), but you explained very little.

Also, if you did anything else to improve the performance besides parallel processing, please don't forget to mention it, and explain exactly what you did.

Maybe you should first explain why OPEN CURSOR causes a bottleneck, just to be sure you aren't missing something...

mritiz
Explorer

Hi sandra.rossi,

Sorry my bad, I do understand your point here.

My objective: is there a technique to pull the huge amount of data at once, so that a database hit is not required for each packet-size iteration?

Sandra_Rossi
Active Contributor

It really depends on the context. There is no single easy solution that works well for everything.