Application Development and Automation Discussions
Join the discussions or start your own on all things application development, including tools and APIs, programming models, and keeping your skills sharp.

EXPORT/IMPORT using DATA BUFFER

prabhu_s2
Active Contributor

Hi

In the current process, around 500K records are exported to and imported from a data buffer. While this works most of the time, the job that implements the export/import occasionally fails: it goes into HOLD status with reason SLEEP, which I believe indicates a resource bottleneck. I can't rule out that the error is linked to the way EXPORT/IMPORT is used.

Keeping the existing export/import-to-data-buffer process unchanged, how do I clear the data buffer?

Below is the code I have for import:

IMPORT p1 = lt_matl FROM DATA BUFFER iv_buff.

After the import, how can I reset or clear the buffer iv_buff? FREE MEMORY ID results in a syntax error, since its argument must be a character-type field.

  1. Any suggestions on how to clear the data buffer?
  2. Are there any limitations on using EXPORT/IMPORT with DATA BUFFER for large data volumes?
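A minimal sketch of the pattern being discussed, assuming lt_matl is an internal table of material records and iv_buff is the raw-string buffer from the post (table type and names are illustrative):

```abap
DATA: iv_buff TYPE xstring,                " raw buffer holding the compressed data
      lt_matl TYPE STANDARD TABLE OF mara. " illustrative table type

" EXPORT serializes and compresses the table into the variable iv_buff
EXPORT p1 = lt_matl TO DATA BUFFER iv_buff.

" ... hand iv_buff over to the processing step ...

" IMPORT decompresses the buffer back into an internal table
IMPORT p1 = lt_matl FROM DATA BUFFER iv_buff.
```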
6 REPLIES

matt
Active Contributor

If FREE MEMORY ID iv_buff fails because iv_buff is not a character field, that would indicate to me that you need to make it a character field. What is its type?


prabhu_s2
Active Contributor

It is a RAWSTRING.


matt
Active Contributor

Ah, I see it now:

Spot the difference.

  1. IMPORT p1 = lt_matl FROM DATA BUFFER iv_buff.
  2. FREE MEMORY ID iv_buff.

1 reads data from a data buffer.

2 frees a memory ID.

Quite different things. Why would you expect it to work?

Try: CLEAR iv_buff.
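In other words (a sketch, assuming iv_buff is the RAWSTRING from the original post):

```abap
IMPORT p1 = lt_matl FROM DATA BUFFER iv_buff.
CLEAR iv_buff.   " empties the variable itself; no memory ID is involved
```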


Sandra_Rossi
Active Contributor

If it's really IMPORT that makes the work process hang forever, then that's abnormal (it should simply fail), and you should check whether there are patches that correct it, or contact SAP support if there are none.


prabhu_s2
Active Contributor

Thanks Sandra. What I missed including in the post is that the import is performed within a background job started via SUBMIT. Currently I can see at least 140 such jobs created.


Sandra_Rossi
Active Contributor

Ouch. 500 thousand lines * 140 jobs, that's a huge amount of memory! First there is the IV_BUFF variable, which contains the compressed data, and then you decompress it into LT_MATL. PS: in that context, "DATA BUFFER" just means that the compressed data lives inside the variable (IV_BUFF). Clearing the variable clears the "data buffer".
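If memory pressure across the jobs is the concern, one option (a sketch, not code from this thread) is to release each copy as soon as it has served its purpose:

```abap
IMPORT p1 = lt_matl FROM DATA BUFFER iv_buff.
CLEAR iv_buff.   " drop the compressed copy right after the import
" ... process lt_matl ...
FREE lt_matl.    " release the memory of the uncompressed table
```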