Application Development Discussions
Join the discussions or start your own on all things application development, including tools and APIs, programming models, and keeping your skills sharp.

Parallel processing in background using Job scheduling...

Former Member
0 Kudos

(Note: please read my question fully before redirecting me to the parallel processing links on SDN. I have gone through most of them.)

Hi ABAP Gurus,

I have read a bit about parallel processing so far, but I have a doubt.

I am working on a data transfer of around 5 million accounting records from a legacy system to R/3 using batch input recording.

If all these records reside in one flat file and I then process that file in my batch input program, I guess it will take days. So my boss suggested using parallel processing in SAP.

Now, from the SDN threads, it seems that we have to create a remote-enabled function module for it and so on.

But I have a different idea: divide these 5 million records into 10 flat files instead of just one, and then run 10 instances of the custom BDC program, each processing one flat file in the background using job scheduling.

Can this also be called parallel processing?

Please let me know if this sounds wise to you guys...

Regards,

Tushar.
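For illustration, the file-splitting step itself can be done outside SAP. Here is a minimal Python sketch of dividing one large flat file into N chunk files, one per background job; the function name and file pattern are hypothetical, not from this thread:

```python
# Split one large flat file into n_chunks roughly equal chunk files,
# so each chunk can be fed to a separate background BDC job.
# "chunk_{:02d}.txt" is just an example naming scheme.

def split_file(src, n_chunks, dst_pattern="chunk_{:02d}.txt"):
    with open(src) as f:
        lines = f.readlines()
    # Distribute records as evenly as possible across the chunks.
    size, rest = divmod(len(lines), n_chunks)
    out_names = []
    start = 0
    for i in range(n_chunks):
        end = start + size + (1 if i < rest else 0)
        name = dst_pattern.format(i)
        with open(name, "w") as out:
            out.writelines(lines[start:end])
        out_names.append(name)
        start = end
    return out_names
```

Each chunk file can then be passed as the `dataset` parameter of one scheduled instance of the BDC program.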

1 ACCEPTED SOLUTION

Former Member
0 Kudos

Hi...

It probably will work OK...

Try with 3-5 running at the same time to see if LOCKS and/or concurrent processing are issues...

I have successfully used this concurrent processing technique for BDCs...

Dave...

8 REPLIES 8

Former Member
0 Kudos

Hi,

This would not be parallel processing, because there might be lock conflicts and other issues when running the same program concurrently.

So you will not be able to do parallel processing with the same custom BDC program; you will get a lot of errors due to locks.

0 Kudos

Thanks for your reply...

So what do you suggest? How can I use parallel processing to transfer the 5 million records that sit in one flat file using a custom BDC?

I am posting my custom BDC code below (this code is for creation of material master records using BDC):

report ZMMI_MATERIAL_MASTER_TEST
       no standard page heading line-size 255.

include bdcrecx1.

parameters: dataset(132) lower case default '/tmp/testmatfile.txt'.

*** DO NOT CHANGE - the generated data section - DO NOT CHANGE ***
*
*   If it is necessary to change the data section use the rules:
*   1.) Each definition of a field exists of two lines
*   2.) The first line shows exactly the comment
*       '* data element: ' followed with the data element
*       which describes the field.
*       If you don't have a data element use the
*       comment without a data element name
*   3.) The second line shows the fieldname of the
*       structure, the fieldname must consist of
*       a fieldname and optional the character '_' and
*       three numbers and the field length in brackets
*   4.) Each field must be type C.
*
*** Generated data section with specific formatting - DO NOT CHANGE ***
data: begin of record,
* data element: MATNR
        MATNR_001(018),
* data element: MBRSH
        MBRSH_002(001),
* data element: MTART
        MTART_003(004),
* data element: XFELD
        KZSEL_01_004(001),
* data element: MAKTX
        MAKTX_005(040),
* data element: MEINS
        MEINS_006(003),
* data element: MATKL
        MATKL_007(009),
* data element: BISMT
        BISMT_008(018),
* data element: EXTWG
        EXTWG_009(018),
* data element: SPART
        SPART_010(002),
* data element: PRODH_D
        PRDHA_011(018),
* data element: MTPOS_MARA
        MTPOS_MARA_012(004),
      end of record.

data: lw_record(200).
*** End generated data section ***

data: begin of t_data occurs 0,
        matnr(18),
        mbrsh(1),
        mtart(4),
        maktx(40),
        meins(3),
        matkl(9),
        bismt(18),
        extwg(18),
        spart(2),
        prdha(18),
        mtpos_mara(4),
      end of t_data.

start-of-selection.

  perform open_dataset using dataset.
  perform open_group.

* Read the comma-separated flat file into the internal table.
  do.
    read dataset dataset into lw_record.
    if sy-subrc eq 0.
      clear t_data.
      split lw_record at ','
            into t_data-matnr
                 t_data-mbrsh
                 t_data-mtart
                 t_data-maktx
                 t_data-meins
                 t_data-matkl
                 t_data-bismt
                 t_data-extwg
                 t_data-spart
                 t_data-prdha
                 t_data-mtpos_mara.
      append t_data.
    else.
      exit.
    endif.
  enddo.

* Build and post one BDC transaction (MM01) per record.
  loop at t_data.
    perform bdc_dynpro      using 'SAPLMGMM' '0060'.
    perform bdc_field       using 'BDC_CURSOR'  'RMMG1-MATNR'.
    perform bdc_field       using 'BDC_OKCODE'  '=AUSW'.
    perform bdc_field       using 'RMMG1-MATNR' t_data-matnr.
    perform bdc_field       using 'RMMG1-MBRSH' t_data-mbrsh.
    perform bdc_field       using 'RMMG1-MTART' t_data-mtart.
    perform bdc_dynpro      using 'SAPLMGMM' '0070'.
    perform bdc_field       using 'BDC_CURSOR'  'MSICHTAUSW-DYTXT(01)'.
    perform bdc_field       using 'BDC_OKCODE'  '=ENTR'.
    perform bdc_field       using 'MSICHTAUSW-KZSEL(01)' 'X'.
    perform bdc_dynpro      using 'SAPLMGMM' '4004'.
    perform bdc_field       using 'BDC_OKCODE'  '/00'.
    perform bdc_field       using 'MAKT-MAKTX'  t_data-maktx.
    perform bdc_field       using 'BDC_CURSOR'  'MARA-PRDHA'.
    perform bdc_field       using 'MARA-MEINS'  t_data-meins.
    perform bdc_field       using 'MARA-MATKL'  t_data-matkl.
    perform bdc_field       using 'MARA-BISMT'  t_data-bismt.
    perform bdc_field       using 'MARA-EXTWG'  t_data-extwg.
    perform bdc_field       using 'MARA-SPART'  t_data-spart.
    perform bdc_field       using 'MARA-PRDHA'  t_data-prdha.
    perform bdc_field       using 'MARA-MTPOS_MARA' t_data-mtpos_mara.
    perform bdc_dynpro      using 'SAPLSPO1' '0300'.
    perform bdc_field       using 'BDC_OKCODE'  '=YES'.
    perform bdc_transaction using 'MM01'.
  endloop.

  perform close_group.
  perform close_dataset using dataset.

0 Kudos

Assuming that concurrent processing (i.e. batch scheduling) of the 5 million records, split into 10 flat files processed by 10 instances of the same custom BDC program, is not possible due to lock errors, can someone suggest how to modify my BDC code above to transfer the 5 million records using parallel processing?


Former Member
0 Kudos

I'm not sure whether your approach (splitting the file into multiple chunks) is better than true parallel processing. But if you want to try parallel processing, here's what I would suggest:

Create an RFC-enabled function module Z_CALL_MM01. In the FM, you would have just the CALL TRANSACTION 'MM01' USING itab...

In your program, you would have:

While EOF is not true:
  Read input
  Populate BDC itab
  Call function Z_CALL_MM01 starting new task xxxx
    destination in group yyyy
    performing ... on end of task
    exporting ...
  Clear BDC itab (for the next task)
Endwhile.
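The fan-out pattern sketched above (dispatch each batch of records to a worker task, handle each completion as it arrives) can be illustrated outside ABAP with Python's concurrent.futures. This is only an analogy, a sketch under assumptions; `post_batch` is a stand-in for the RFC call, not real SAP code:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

# Stand-in for the RFC-enabled FM: in SAP this would run
# CALL TRANSACTION 'MM01' for one batch of records.
def post_batch(batch):
    return len(batch)  # pretend every record posts successfully

def process_in_parallel(records, batch_size=1000, workers=10):
    # Populate one "BDC itab" (batch) at a time, then hand it off.
    batches = [records[i:i + batch_size]
               for i in range(0, len(records), batch_size)]
    posted = 0
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(post_batch, b) for b in batches]
        # Analogue of PERFORMING ... ON END OF TASK:
        # react to each batch as it finishes, in completion order.
        for fut in as_completed(futures):
            posted += fut.result()
    return posted
```

In the ABAP version, the `workers` limit corresponds to the RFC server group (`DESTINATION IN GROUP`), which caps how many dialog work processes the parallel tasks may occupy.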

christian_wohlfahrt
Active Contributor
0 Kudos

Hi Tushar!

Your approach can be called parallel processing (because you use several sessions), and you won't have lock problems with this technique.

MM01 is a creation transaction, so only some number ranges might be locked - but that's the same with RFC.

I'm more surprised that MM01 is BDC-compatible.

If you want to be faster, use BAPI_STANDARDMATERIAL_CREATE or IDOC MATMAS0x (x depending on your release). Especially with IDOC you can have a massive parallel execution by a standard report.

Regards,

Christian

former_member378318
Contributor
0 Kudos

I agree with Christian. With such a large volume of data you are starting off on the wrong foot if you choose BDC. Go the BAPI or IDOC route both of which offer better performance and parallel processing opportunities.

Former Member
0 Kudos

Could you tell me why you are using batch input recording for your data transfer from legacy instead of the Data Transfer Workbench? I believe SAP uses direct input rather than batch input in the Data Transfer Workbench for material master.

I would check the following in the SAP online help:

LO Material Master Data: Data Transfer Workbench (or check the program RMDATIND). As an option, you can check the LSMW tool (Legacy System Migration Workbench), which can do some data conversion during transfer.

There was a good article about LSMW in SAP Professional Journal back in 2003.

I believe the Data Transfer Workbench allows you to skip creation of change documents, which can help with runtime.