Application Development and Automation Discussions
Join the discussions or start your own on all things application development, including tools and APIs, programming models, and keeping your skills sharp.

Parallel processing with huge files from AL11

rajani_muvva
Explorer

Hello All,

I have to read multiple files from AL11 and load the data into SAP, performing several functions such as material change and assigning classes and characteristics. I tried parallel processing by creating an RFC function module wrapper that calls all these BAPIs with COMMIT WORK inside it. The program dumps with "Open SQL array insert produces duplicate records in the database".

Please let me know how to handle the RFC when there are multiple updates against a single object in one task.

Thanks.

11 REPLIES

rajani_muvva
Explorer

Hello All,

Can anyone help me with the maximum number of records (on average) that should be loaded per work process? When I load huge volumes of data, I get an express document "failed with DBSQL_DUPLICATE_KEY_ERROR" while updating articles. Can this error be handled at the program level? Thanks!

Sandra_Rossi
Active Contributor

I don't understand your question, you are mixing too many things (AL11, RFC, BAPI, insert).

There's nothing special with RFC concerning INSERT. You handle INSERT the same way as in a normal program: you must not INSERT a line if one already exists with the same unique key value.

For that, make sure each program is assigned specific lines to update, and handle the exceptions to avoid the short dumps.
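For example, the duplicate key situation can be avoided or caught roughly like this (a minimal sketch; the table and variable names are illustrative):

" Variant 1: single-record semantics - rows with existing keys are skipped,
" no short dump; sy-subrc = 4 signals at least one row already existed:
INSERT ztable FROM TABLE lt_rows ACCEPTING DUPLICATE KEYS.

" Variant 2: catch the exception around the array insert:
TRY.
    INSERT ztable FROM TABLE lt_rows.
  CATCH cx_sy_open_sql_db INTO DATA(lx_db).
    " Log lx_db->get_text( ) and decide whether to retry row by row.
ENDTRY.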

PS: my answer is very generic because your question is very generic. If you want a precise answer, ask a precise question = provide your exact logic, table definition, ABAP code and so on.

rajani_muvva
Explorer

Hi Sandra,

My requirement is to update the article and upload characteristic values in a single task. I created a wrapper RFC that updates the article (BAPI material maintain) and updates the characteristic data (BAPI object change); after every BAPI call, if the return is successful, I use COMMIT WORK. The wrapper RFC is called from a report for parallel processing. Please suggest whether this approach is correct.

For small packet sizes, I am able to load data successfully, but for huge data, I am seeing duplicate key errors. Thanks!

Sandra_Rossi
Active Contributor

You are still being too generic, so I will keep answering generically.

The problem is both your logic to distribute the materials and characteristics, and the way you (don't) handle the insert duplicate key exceptions.

If you want to continue the discussion, you will have to tell (much) more about your logic, the way you handle the exceptions, the exact root cause of the exception, etc.

This is an ABAP forum, but apart from the words "RFC" and "BAPI", alas, there is nothing else concrete in your question.

rajani_muvva
Explorer

Hello Sandra,

My requirement is to update the articles and to assign classes and characteristics to the same article.

Because the program has performance issues due to the huge data volumes in the files, I am using parallel processing, where I need to have all of the above-mentioned updates against an article in one task.

I created a wrapper FM. Inside the wrapper FM:

CALL FUNCTION 'BAPI_MATERIAL_MAINTAINDATA_RT' ...  " FM1 - update article
IF sy-subrc = 0.
  COMMIT WORK.
ELSE.
  ROLLBACK WORK.
ENDIF.

CALL FUNCTION 'BAPI_OBJCL_CHANGE' ...  " FM2 - assign class and characteristics
IF sy-subrc = 0.
  COMMIT WORK.
ELSE.
  ROLLBACK WORK.
ENDIF.
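(As a side note, BAPIs normally do not set sy-subrc; success is usually derived from the RETURN messages. A sketch of that check, assuming lt_return is the BAPIRET2 table filled by the call:)

READ TABLE lt_return TRANSPORTING NO FIELDS
  WITH KEY type = 'E'.             " any error message returned?
IF sy-subrc <> 0.                  " no error found
  CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
    EXPORTING
      wait = 'X'.                  " wait until the update task finishes
ELSE.
  CALL FUNCTION 'BAPI_TRANSACTION_ROLLBACK'.
ENDIF.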

In my report, I call this wrapper FM:

CALL FUNCTION 'wrapper FM'
  STARTING NEW TASK lv_task
  DESTINATION IN GROUP system
  PERFORMING get_data ON END OF TASK
  EXPORTING
    itab1 = ltab1
    itab2 = ltab2
    itab3 = ltab3.
WAIT UP TO 5 SECONDS.
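(As an aside, a fixed WAIT UP TO 5 SECONDS is not a real synchronization. A common pattern is to count the open tasks, receive the results in the callback, and wait on that counter; a sketch with illustrative names:)

gv_open = gv_open + 1.
CALL FUNCTION 'Z_WRAPPER'            " illustrative name
  STARTING NEW TASK lv_task
  DESTINATION IN GROUP gv_group
  PERFORMING get_data ON END OF TASK
  EXPORTING
    itab1 = ltab1
  EXCEPTIONS
    system_failure        = 1
    communication_failure = 2
    resource_failure      = 3.       " no free work process available
IF sy-subrc <> 0.
  gv_open = gv_open - 1.             " task not started - retry or log
ENDIF.

" After dispatching all packets, wait for all callbacks instead of a fixed delay:
WAIT UNTIL gv_open = 0.

FORM get_data USING p_task TYPE clike.
  RECEIVE RESULTS FROM FUNCTION 'Z_WRAPPER'
    EXCEPTIONS
      system_failure        = 1
      communication_failure = 2.
  gv_open = gv_open - 1.
ENDFORM.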

The problem here is that I am able to update the data, but I am seeing DBSQL_DUPLICATE_KEY_ERROR for FM2. How can I handle this exception in parallel processing? The error is that the insert into CLF_HDR failed because a record with the same key already exists. (Before calling the FM, I am also sorting the data.)

I tried running FM1 alone, only updating the article, and the program was successful; but with FM2 the system dumps with a duplicate key error. I would like to handle this exception, and to know exactly where it needs to be handled: in the wrapper FM or in the custom report?


Thanks.

Sandra_Rossi
Active Contributor

Thanks for the effort, but the issue is not about this part. It's not about calling via RFC.

The problem is that you update the same characteristic in two parallel RFCs (or whatever), so the real issue is your logic for distributing the materials and characteristics to the RFC tasks.

You must change your distribution logic.
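For example, one way is to build the packets so that all records belonging to one article always land in the same task, cutting packets only at article boundaries. A sketch, with illustrative names:

DATA ls_rec LIKE LINE OF lt_input.
SORT lt_input BY matnr.
LOOP AT lt_input INTO ls_rec.
  APPEND ls_rec TO lt_packet.
  AT END OF matnr.                   " article boundary - never split one article
    IF lines( lt_packet ) >= gv_packet_size.
      PERFORM dispatch_packet USING lt_packet.  " starts one RFC task
      CLEAR lt_packet.
    ENDIF.
  ENDAT.
ENDLOOP.
IF lt_packet IS NOT INITIAL.
  PERFORM dispatch_packet USING lt_packet.
ENDIF.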

PS: also, the short dump is missing.

rajani_muvva
Explorer

Hi Sandra,

Can you please elaborate on distributing the materials and characteristics to the RFC tasks? I also tried calling only FM2, and I still see the duplicate key dump. Here is the error message:

An exception has occurred in class "CX_SY_OPEN_SQL_DB". This exception was caught in procedure "IF_EX_CACL_CLASSFICATN_UPDATE~AFTER_UPDATE" "(METHOD)" or propagated by a RAISING clause. Since the caller of the procedure could not have anticipated this exception, the current program was terminated.

The reason for the exception is: when an Open SQL array insert is performed, trying to insert a record into the database table "CLF_HDR" causes the function to terminate if a record with the same key already exists. (When an Open SQL individual record insert is performed, this situation does not cause the function to terminate; SY-SUBRC is set to 4 instead.)

Thanks!

Sandra_Rossi
Active Contributor

If you had posted all this information from the beginning, you would have obtained support from people around here much earlier... (just one week)

Sandra_Rossi
Active Contributor

Although it seems that there is no patch for the SAP-internal implementations of the BAdI CACL_CLASSIFICATION_UPDATE, maybe you can contact SAP Support for help.

Sandra_Rossi
Active Contributor

About the "distribution" of data: I mean, what is your logic to avoid updating the same materials and the same characteristics in parallel?

rajani_muvva
Explorer

I am skipping the article upload based on a custom validation, and the errors are not 1-to-1. When I load 10K records with 4 work processes, everything goes well; when I increase the count, I see duplicates.

I am also unable to debug the issue to identify exactly which data triggers this dump.