Technology Blogs by Members

Extraction: extracting data from R/3, or from external databases using DB Connect.

Flat file extraction: once we create the InfoSource, the system automatically creates the DataSource.

Extract structure: the group of fields that defines what is extracted from the source.

Transfer structure: out of the extracted fields, the subset that is actually transferred from the source to BW.

Types of Extractors from R/3:

There are many application areas: SD, MM, FI, etc.

Business content extractors: content delivered by SAP (delivered version, D), which we must install to the active version (A).

Customer-generated extractors: we don't really create the DS directly; it is generated based on some other object (the extractor).


Generic extraction: we explicitly create our own DS based on a table, view, FM, domain, or InfoSet.

LIS is outdated.

InfoSet query: we can build a query on multiple objects: tables, views, FM structures, domains, reports.

  1. T.code: SQ02.

DB query: we can build a query joining multiple tables.

SQ01: Run Infoset Query

SQ02: Creation of Infoset Query

SQ03: Assign user group

Generic extraction: different ways to generate a generic DS, using a table, view, FM, domain, or InfoSet.

How to create the DS from a database table and extract the data.

Ex: from VBAK, the fields VBELN, ERDAT, NETWR, and WAERK have to be sent to BW (the client field MANDT is never extracted to BW).

First step: go to RSO2. (RSO2 offers no flexibility to create hierarchy DataSources.)

  1. How do we choose between master data attributes, transaction data, and master data texts? It depends on the target: transaction data for a cube, attributes for attribute tables, texts for text tables.
  2. Whenever we create a DS, we assign it to an application component.
  3. The domain option is enabled only when creating text DataSources; otherwise extraction is enabled only from a view/table, query, or FM.

When we save the DS, it is defined together with its extract structure (the MANDT field is never extracted from the source).

When we save the DS, the system automatically takes us to the field selection screen.

A) Selection                                                        B) Hide field

A) Selection: enables the field on the BI side, in the data selection table (InfoPackage).

B) Hide field: removes the field from the transfer structure (i.e. it is not available in the transfer structure in R/3 or BI).

  1. Next step → replicate the DS in BW.

The extract structure exists in R/3.

After replicating the DS, the transfer structure is available in BW.

When we activate the transfer rules for the DS, the transfer structure is created in R/3.

When the DS is based on an FM, we explicitly define the extract structure and the FM delivers the data. If we use a plain SELECT statement in the FM, it reads all the records at once.

If we want to read the records packet by packet...

Ex: 1000 records with a packet size of 100 → 100*10 = 1000, i.e. 10 packets.

  1. Open cursor: on the first call the FM opens a cursor and reads the first 100 records (packet 1); each subsequent call fetches the next packet through the cursor.
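The packet logic can be sketched in Python (illustrative only; in a real extractor this is ABAP with OPEN CURSOR / FETCH NEXT CURSOR):

```python
def open_cursor(records, packet_size):
    """Simulate OPEN CURSOR / FETCH NEXT CURSOR: yield one packet per call."""
    for start in range(0, len(records), packet_size):
        yield records[start:start + packet_size]

# 1000 records with a packet size of 100 -> 10 packets of 100 each
records = list(range(1000))
packets = list(open_cursor(records, 100))
assert len(packets) == 10
assert all(len(p) == 100 for p in packets)
```

Each call of the generator corresponds to one data packet sent to BW, so memory holds only one packet at a time instead of the full result set.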

LO Extraction (Mostly we go for LO)

To extract logistics data from R/3 to BW we have 1) LO, 2) CO-PA, 3) Generic.

Logistics: any product-based business; we buy raw material and sell to customers; all these transactions are called logistics.

  1. Raw material → Inventory → PP → … → QM → delivered to cash: all of these are covered by the LO extractors.

Shop floor: machinery calculation, i.e. how much output comes from each machine.

LO: we have DS generated by SAP (Delivered Version)

Generic:  we have to create DS.

Every LO DataSource comes with an extract structure; initially only the extract structure exists. When we replicate, the transfer structure is created in BW; when we activate the transfer rules, the transfer structure is created in R/3.


Every LO starts with 2LIS

11 indicates the application component number (11: sales order). We also have 02: purchasing, 12: delivery, 13: billing, 03: inventory, and so on.

VAHDR (VA: event of the application, HDR: header-level data).

MC11VA0HDR: (Extract Structure)

The 0 (zero) is the root node of the application component hierarchy.


  1. Enquiry → Quotation → Sales Order → Delivery → PGI (Post Goods Issue, i.e. goods are literally handed over to the customer)

How do we extract Master data:

We can extract using Generic or Business Content Extractor (DS)

Ex: how do we find which business content extractor to use?

All DataSources are stored in the table ROOSOURCE; there we can find the type of each DS.

Transaction DS: used to extract transaction data from R/3 (logistics).

ROOSOURCE (table): execute, note the delta process field, copy it, and look it up in RODELTAM (delta process details), e.g. ABR, AIE.

How do we decide (DSO) or Infocube

It depends on the delta behavior of the DS: whenever a document changes, if the DS delivers both a before image and an after image we can load to the cube directly; if it delivers only the after image we must load to a DSO. This is called the delta process of the DS, and based on it we decide IC or DSO.
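As a rough illustration of why the delta type matters (a hypothetical Python sketch, not SAP code): a delta with before and after images can simply be added into an additive cube, while an after-image-only delta needs key-based overwrite, i.e. a DSO.

```python
# Hypothetical delta records for document D1, changed from 1000 to 1500.
abr_delta = [{"doc": "D1", "amount": -1000},  # before image (sign reversed)
             {"doc": "D1", "amount": 1500}]   # after image
aie_delta = [{"doc": "D1", "amount": 1500}]   # after image only

# InfoCube: purely additive, so the before/after pair nets out correctly.
cube = {"D1": 1000}
for rec in abr_delta:
    cube[rec["doc"]] += rec["amount"]
assert cube["D1"] == 1500

# DSO: key-based overwrite, so the after image alone is enough.
dso = {"D1": 1000}
for rec in aie_delta:
    dso[rec["doc"]] = rec["amount"]
assert dso["D1"] == 1500
```

Loading the after-image-only delta additively into the cube would give 2500 instead of 1500, which is why such DataSources go through a DSO first.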

Note: DSO is used for detail-level reporting and detail-level staging.

The architecture depends on the company; the key concerns are scalability and a layered architecture.

First level is Staging (same as R/3 data)

Second level is transformation

Third level is Reporting

How to enable or Customize field property of Extract structure

  1. RSA6 → choose any DS and try to change it → field selection of the extract structure (Selection / Hide).

ROOS: the extraction settings (which fields have selection/hide enabled and which do not) are maintained in the table ROOS; the hide and selection settings can be changed there.

  1. How to generate the LO DS → if the DS is already available (delivered), why do we need to generate it?

            2LIS_11_VAHDR (Sales Order Header Data)

Step 1: identify the DS; it is in the delivered version, so we have to install it to the active version.

To activate:      A) Navigate through SBIW, or go to RSA5 directly, and install the DS.

                        B) If the DS is active, deactivate it before going for maintenance.

                        C) We will almost never be 100% satisfied with the extract structure; to add fields, go to

                                    → Maintain Extract Structure, and select fields from the communication structure.

            It allows maintenance with respect to the SO header only: 2LIS_11_VAHDR can be maintained only from the communication structures related to the sales order header.

Selection: the field will be available in the InfoPackage for data selection.

Hide: when we do not need a field we hide it; it will then not be available in the transfer structure in R/3 or BW.

Maintaining the extract structure means: when we are not satisfied with the fields in the extract structure, we add more fields from the communication structure to the extract structure.

Communication structures: MCVBAK, MCVBAP, MCVBUP → fields from these can be added to the extract structure.

Fields on the right side are the communication structure.

Fields on the left side are the extract structure.

3) Inversion: enabled for key figures only. Whenever a document changes, the delta brings 2 records: a before image (1000) and an after image (1500).

Delta: only the changed records come to BW. Without inversion the cube would show 1000 + 1000 + 1500 = 3500; with inversion the before image is subtracted, so the cube shows 1000 - 1000 + 1500 = 1500.
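The inversion arithmetic can be sketched with the numbers from these notes (a minimal Python illustration, not SAP code):

```python
cube_value = 1000    # value already loaded in the cube
before_image = 1000  # record's value before the change
after_image = 1500   # record's value after the change

# Without inversion, both delta records are simply added:
wrong_total = cube_value + before_image + after_image
assert wrong_total == 3500  # double-counted

# With inversion, the sign of the before image is flipped,
# so only the net change (+500) lands in the cube:
correct_total = cube_value - before_image + after_image
assert correct_total == 1500
```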

4) Field only: for enhancement purposes only.

5) Update Modes:

            Serialized V3 is outdated.

The update mode specified in LBWE defines how delta records are processed; the data is brought over sorted by date and time.

1)      Direct delta: set at application level, not per DataSource; we cannot specify the update mode for one particular DS.

After this, replicate the DS in BW. Replication reads the complete metadata of the DS and also maintains a timestamp of when it was replicated.

All historical data is migrated to BW with init, and from then onwards loading happens with delta.

VA01, VA02, VA03 → a new record is created when the user creates a new SO (VA01); VA02 changes it.

SETUP TABLES: the setup table is implemented as a cluster table. A cluster table lives at the DB layer; at the application layer it can be viewed as multiple tables related by primary/foreign keys.

In the setup table, every level of the DS (header, item, schedule line) is stored in one table in the DB; at the application layer they appear as different levels, i.e. different DataSources.

When the InfoPackage runs with full or init, the data comes from the setup tables; when the InfoPackage runs with delta, the data comes from RSA7 (the delta queue).

Before we run delta we must run init, and before we run init we must fill the setup tables by doing a statistical setup, which fills the data from the database through the communication structures (interface-like).

While the statistical setup (SST) is running and filling the setup tables from the database, end users may enter new records; those records would be missed, and R/3 and BW would mismatch.

How do we do?

We have to speak with the end users: the init runs in off hours (e.g. Saturday/Sunday, when the Basis team can bring the server down for users), and from Monday the delta runs as usual.

Before we run the SST, the setup tables must be cleaned (data deleted); then we run the SST.

After init: whenever an end user creates a new S.O., the system creates one LUW (Logical Unit of Work); if an existing record is modified, the LUW has 2 records (old and new).

X = before image

Blank = after image

V2= Asynchronous Update (Default)

V1= Synchronous Update


            DS                                            SETUP TABLE

1) 2LIS_11_VAHDR                            MC11VA0HDR

To confirm whether the setup table data has been cleaned (deleted) or not, display the setup table in SE11.

T.Code: OLI*BW (here OLI7BW)

1)      We can restrict the data by document number/date restrictions.

2)      Control parameters of the setup run:

Name of the Run:

Termination date:

Termination time:

o   Redetermine update group

o   Update documents

o   Block all orders (orders are locked during the run)

o   No. of tolerated faulty documents (e.g. 100)

For Sales: Queued Delta

Inventory: Unserialized V3

Direct Delta:

The extracted data is transferred directly to the delta queue with the document posting. Thus the sequence of document transfer agrees with the chronological order of data creation.


LO cockpit setting: Update mode as “Direct Delta” for the application (11) in LBWE transaction.

The exercise starts after the successful completion of nth Delta.

When we run a full load (historical data), there are 2 ways:

                  Full                  Initialize with data transfer

  1. Delete the setup tables (LBWG).
  2. Run OLI*BW to fill the setup tables (pre-calculated).
  3. When delta runs → data is loaded into BW from the delta queue (RSA7).

Whenever a delta fails, we run a repeat delta (present records & previously delivered records).

Direct delta is suitable only when the posting frequency is low.

SMQ1: To delete the data in Queue

Unserialized V3 (only for new postings) [no changed records]

V3 is asynchronous, running in the background.

After the init runs, delta is enabled. The delta queue maintains 2 sets of data:

  1. Delta: the present delta records.
  2. Delta repeat: present + previous delta records (the present set stays until the next delta run).

Whenever a delta load fails, we do a repeat delta.

The repeat delta contains the previous delta records + the present delta records.
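The delta / repeat-delta behaviour described above can be sketched like this (a simplified Python model of the idea, not the actual RSA7 implementation):

```python
class DeltaQueue:
    """Minimal sketch of delta vs. repeat delta in a delta queue."""
    def __init__(self):
        self.pending = []    # new LUWs waiting for the next delta
        self.delivered = []  # last delivered delta, kept for a possible repeat

    def post(self, record):
        self.pending.append(record)

    def delta(self):
        # A normal delta delivers the pending records and keeps them for repeat.
        self.delivered = self.pending
        self.pending = []
        return self.delivered

    def repeat_delta(self):
        # A repeat delivers the previous delta plus anything posted since.
        self.delivered = self.delivered + self.pending
        self.pending = []
        return self.delivered

q = DeltaQueue()
q.post("SO-1"); q.post("SO-2")
assert q.delta() == ["SO-1", "SO-2"]

# The load fails; meanwhile a new order is posted.
q.post("SO-3")
assert q.repeat_delta() == ["SO-1", "SO-2", "SO-3"]
```

Keeping the delivered set until the next successful delta is what makes the repeat safe: a failed load can be requested again without losing records posted in between.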

Steps: 1) Set the QM status to not OK (even if the request is already red, we have to set the QM status to not OK).

2) Delete the bad request from the data target; once deleted, schedule the delta IP.

LIS: limitations of LIS

  1. In LO we have one DS for the header, one for items, and one for schedule lines.

That is not the case in LIS:

  1. In LIS we had a single DS to extract header, item, and schedule line together, so a huge volume comes with LIS.
  2. LIS has information structures (IS); these are like setup tables.

The IS acts as a transparent table; delta records are stored both in the IS and in the delta queue (RSA7), so the IS stores them unnecessarily.

3) Performance degrades (sorting causes many problems).

4) It cannot handle different languages at the same time:

5) LUW1 - English - one segment - frequent data failures

     LUW2 - German - one segment - frequent data failures (sequence + segment). Initially LO only had the V3 job.

In inventory there is no need for serialization of postings (goods in and out).

For this we go for unserialized V3 in LO. Whenever we extract with unserialized V3, do not load to a DSO in overwrite mode: the records are not serialized, so overwriting can apply changes in the wrong order.

When the transaction volume is low, we go for direct delta.

When the transaction volume is huge and serialization is required, we go for queued delta.

When the transaction volume is huge and serialization is not required, we go for unserialized V3.
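The three rules above amount to a small decision table; a hypothetical sketch:

```python
def choose_update_mode(volume_high: bool, needs_serialization: bool) -> str:
    """Rule of thumb from the notes above (simplified sketch)."""
    if not volume_high:
        return "direct delta"        # low posting frequency
    if needs_serialization:
        return "queued delta"        # e.g. sales documents
    return "unserialized V3"         # e.g. inventory movements

assert choose_update_mode(False, True) == "direct delta"
assert choose_update_mode(True, True) == "queued delta"
assert choose_update_mode(True, False) == "unserialized V3"
```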

How to generate LO DS

Ex: 2LIS_11_VAHDR (sales order header)

Install the B.C. DataSource using RSA5 in R/3.

Before activating, search for the structure:

  1. SE11 → data type search → structure... it will not be available yet.
  2. LO_Extract_04.3 → when we activate the transfer rules, the transfer structure is created in R/3.
  3. Flow: R/3 source system → transfer structure → PSA → transfer rules → IS → cube. Now we can start extraction into the cube → create IP → run → load.

Generating DS  (IMP)

How do we migrate historical data and enable the delta mechanism using direct delta?

Migration means extracting to BW the sales orders created in R/3 by end users for the business. R/3 is already live while BW is in development (implementation).

All historical transactions are processed through init, and whatever is new or changed since the last update comes through delta.

  1. We have the applications VA01, VA02, VA03: whenever a user creates an SO in VA01, a new record is created in the tables; VA02 changes an existing record.

All SO information is stored in VBAK, VBUK, etc. The concept: whatever historical data exists as of today must run through init; then delta is enabled.


