LSMW or BAPI

MangeshP
Active Contributor

Dear Friends

I want to upload documents (10 lakh records, where each record has on average one original file of about 10 MB).

Please guide me whether I should use BAPI or LSMW for this.

I don't want to use any third-party upload software, if that can be avoided.

With Warm Regards

Mangesh Pande

Accepted Solutions (0)

Answers (2)

Former Member

Hi,

I can give you a quick matrix:

High Data quality and High Complexity = LSMW

Low Data quality and High Complexity = LSMW

High Data quality and Low Complexity = CATT, CT

Low Data quality and Low Complexity = BI

High Data volume and High Complexity = LSMW

Low Data volume and High Complexity = LSMW

High Data volume and Low Complexity = LSMW + DX

Low Data volume and Low Complexity = CATT, BI, CT

High Reusability and High Complexity = LSMW

Low Reusability and High Complexity = LSMW

High Reusability and Low Complexity = CATT

Low Reusability and Low Complexity = BI, CT

High Data volume and High Data Quality = LSMW + DX

Low Data volume and High Data Quality = CATT, BI, CT

High Data volume and Low Data Quality = LSMW

Low Data volumn and Low Data Quality = BI

High Reusability and High Data Quality = CATT

Low Reusability and High Data Quality = CT

High Reusability and Low Data Quality = LSMW

Low Reusability and Low Data Quality = BI

High Reusability and High Data Volume = LSMW

Low Reusability and High Data Volume = LSMW

High Reusability and Low Data Volume = CATT

Low Reusability and Low Data Volume = BI, CT

Key:

BI: Batch input recording

LSMW: Legacy System Migration Workbench (with std. batch input recording, direct input, BAPI, IDoc)

CT: Call transaction

DX: Data Transfer Workbench

Depending on which attribute matters most to you (target system data quality, reusability of the data migration tools, etc.), you can choose the appropriate migration technique. Note that this is at best a guidance matrix and not a hard-and-fast rule. For example, in your case you've indicated that your data volume is very high, so LSMW is your best bet unless you have a good reason not to go for it.
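As a rough illustration of how to apply the matrix, it can be encoded as a simple lookup table. The sketch below is mine, not part of the guidance above; the attribute labels are arbitrary and the recommendations simply mirror the list.

```python
# Illustrative only: the guidance matrix above as a lookup table.
# Keys are unordered pairs of attributes; values are the suggested techniques.
RECOMMENDATION = {
    frozenset({"high quality", "high complexity"}): ["LSMW"],
    frozenset({"low quality", "high complexity"}): ["LSMW"],
    frozenset({"high quality", "low complexity"}): ["CATT", "CT"],
    frozenset({"low quality", "low complexity"}): ["BI"],
    frozenset({"high volume", "high complexity"}): ["LSMW"],
    frozenset({"low volume", "high complexity"}): ["LSMW"],
    frozenset({"high volume", "low complexity"}): ["LSMW + DX"],
    frozenset({"low volume", "low complexity"}): ["CATT", "BI", "CT"],
    # ...the remaining pairs from the list above can be added the same way.
}

def suggest(attr_a: str, attr_b: str) -> list[str]:
    """Return the suggested technique(s) for a pair of attributes."""
    return RECOMMENDATION.get(frozenset({attr_a, attr_b}), ["no suggestion"])

# Mangesh's case: very high volume and high complexity
print(suggest("high volume", "high complexity"))  # ['LSMW']
```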

If nothing helps, you might have to resort to custom programs and direct table updates, but for obvious reasons, that approach should be your last resort.

Here is how I define the terms:

- Data volume:

o HIGH is >= (or nearing) 100,000 records of a given object type

o LOW is <= (or towards) 10,000 records of a given object type

o Obviously, there's a big gap between these, but that's because they are guidelines rather than strict definitions. Use your discretion, based on which procedure you would prefer in a given scenario.

- Complexity: this denotes how far the legacy dataset deviates from the standard SAP input format. If the migration is from one SAP system to another, complexity might inherently be classified as low, while migration from a non-SAP system to an SAP system might, in some cases, qualify as complex. In other words, the complexity of the data migration exercise is the amount of effort needed to get the data into a format that SAP can accommodate.

- Quality: this denotes the level of errors that you expect in the legacy system data. Take an example: say you have to migrate the material master, which has more than 100 fields that can potentially be migrated. Further assume that the legacy system is non-SAP, so all these fields have to be collated from various sources / tables (which, incidentally, might already indicate a high level of complexity!). The checks, balances and validations that the legacy system applies to these fields might not be as tight as those SAP enforces and expects. So, for a field such as profit center, the legacy system might hold a correct value in 80 out of 100 records on average, while the remaining 20 contain typos or long-expired profit centers that no one has bothered to clean up. That denotes quite a low level of quality, which in turn implies that your data migration technique must be able to catch all these errors (consider that methods like CATT, or BI with error handling switched off, won't detect as many errors as, say, LSMW or a BAPI).
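For illustration only: one way to catch that kind of profit-center error up front is a small pre-validation pass over the legacy extract before the load. In the sketch below, the file name, field name, and list of valid profit centers are placeholders.

```python
import csv

# Placeholder inputs: a legacy extract and the profit centers that are
# valid (and not expired) in the target SAP system.
VALID_PROFIT_CENTERS = {"PC1000", "PC2000", "PC3000"}

def split_by_quality(extract_path: str):
    """Separate records with a valid profit center from those needing cleanup."""
    good, bad = [], []
    with open(extract_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if row.get("PROFIT_CENTER", "").strip() in VALID_PROFIT_CENTERS:
                good.append(row)
            else:
                bad.append(row)  # typo or expired value: fix before migration
    return good, bad

good, bad = split_by_quality("legacy_material_master.csv")
print(f"{len(good)} records ready to load, {len(bad)} need cleansing")
```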

Regards,

Gaurav

MangeshP
Active Contributor

Dear Gaurav

Thanks for the detailed reply. That was very enlightening.

However, there is complexity in the data: each document type to be uploaded has different characteristics.

So how should I go about it?

With Regards

Mangesh Pande

iklovski
Active Contributor

Hi,

Use LSMW, but with the BAPI import method (rather than a batch-input recording).

Regards,

Eli
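To illustrate what the BAPI route looks like at its core (whether driven from LSMW or from an external loader), here is a minimal sketch that calls the standard DMS BAPI BAPI_DOCUMENT_CREATE2 through SAP's pyrfc connector. The connection details, document type, status, storage category, and file path are placeholders, and the exact field names of the DOCUMENTDATA and DOCUMENTFILES structures should be verified in SE37 for your release.

```python
from pyrfc import Connection  # SAP NetWeaver RFC connector (pip install pyrfc)

# Placeholder connection parameters -- replace with your system's values.
conn = Connection(ashost="sap-host", sysnr="00", client="100",
                  user="UPLOAD_USER", passwd="secret")

# One document info record with one original file; field names follow the
# BAPI_DOCUMENT_CREATE2 interface, but check them in SE37 for your release.
result = conn.call(
    "BAPI_DOCUMENT_CREATE2",
    DOCUMENTDATA={
        "DOCUMENTTYPE": "DRW",             # placeholder document type
        "DESCRIPTION": "Migrated drawing",
        "STATUSEXTERN": "WR",              # placeholder initial status
    },
    DOCUMENTFILES=[{
        "WSAPPLICATION": "PDF",            # workstation application
        "DOCFILE": r"\\fileshare\migration\doc_0000001.pdf",
        "STORAGECATEGORY": "DMS_C1_ST",    # placeholder storage category
    }],
)

messages = result["RETURN"]
# RETURN may come back as a structure or a table; handle both shapes.
errors = [m for m in (messages if isinstance(messages, list) else [messages])
          if m.get("TYPE") in ("E", "A")]

if errors:
    print("Document not created:", errors)
else:
    conn.call("BAPI_TRANSACTION_COMMIT", WAIT="X")
    print("Created document", result["DOCUMENTNUMBER"])
```

For a volume of 10 lakh documents, the same call would sit inside a loop over the prepared (and pre-validated) record list, with periodic commits and a log of the RETURN messages for every record.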