Technology Blogs by Members
Below is step-by-step guidance for transferring data from one database to another.

To begin with Data Services, we need to create the following hierarchy: Project --> Jobs --> Data Flows.

SAP BODS 4.1 provides several techniques for migrating data from MS SQL Server to SAP HANA. The Query transform is used to pull data from one layer to another. The process to transfer data from one database to another is shown below:

Create Project

Projects can be created in two ways:

1. Click on the Create Project icon (screenshot attached below).

2. Go to the Object Library in the left pane, right-click in the area, and choose New --> Project.



Create Batch Jobs

Jobs can be created by right-clicking in the Project Area pane and selecting New Batch Job, or by right-clicking in the Local Object Library and selecting New --> Batch Job. A batch job can contain the following objects: data flows, workflows, scripts, conditionals, try/catch blocks, and while loops. I have created a job named DS_JOB (screenshot attached above).


Create Data Flows

Data flows define the process by which data is moved between systems. Within a data flow we place the source, the transform, and the target, using the tool palette icons inside the data flow (screenshot attached below).

The highlighted icon in the figure is used to add the source, transform, and target. Each data flow within the job can be executed on an available job server. A data flow is a reusable object that specifies the requirements for extracting, transforming, and loading data from source to target.
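The source --> transform --> target idea behind a data flow can be sketched in plain Python. This is only an illustration of the concept; in BODS these are graphical objects, and the row values and the derived column below are hypothetical.

```python
# Minimal sketch of a data flow: source -> transform -> target.
# Names and sample rows are made up for illustration.

def read_source():
    # Stands in for the source table object (MS SQL Server side).
    yield {"FNAME": "Andrew", "LNAME": "Fuller"}
    yield {"FNAME": "Nancy", "LNAME": "Davolio"}

def query_transform(rows):
    # Stands in for a transform step, e.g. deriving a full-name column.
    for row in rows:
        yield {**row, "FULL_NAME": f"{row['FNAME']} {row['LNAME']}"}

def load_target(rows):
    # Stands in for the template table on the target (HANA) side.
    return list(rows)

loaded = load_target(query_transform(read_source()))
print(len(loaded))  # 2
```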

(Attached is the tool palette for the reference)

Data Store Creation

Data can be extracted from SAP systems as well as non-SAP systems. For example, I have pulled the data from an MS SQL Server 2008 database. Datastores for the databases are created with the credentials shown (screenshot attached below for reference). To establish connectivity to MS SQL Server 2008 and SAP HANA, the following prerequisites are completed.

             FIGURE 1



I have already created the tables in MS SQL Server. To bring those tables into my repository I used a datastore, which acts as the interface to pull data from one database to another. Each user can set up this datastore with their own credentials; it is simply an intermediary for moving the data. I have created a second connection, to the SAP HANA database, which is used on the template (target) side. These datastores are used for extracting data from the source to the target. Create another datastore pointing to the SAP HANA database; its details are listed below -
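Conceptually, each datastore resolves to a set of connection parameters. The sketch below shows what the two definitions might look like; the hosts, database name, user, and password are placeholders, not the values from the screenshots.

```python
# Hedged sketch: the two datastores expressed as connection parameters.
# All hosts, users, and passwords below are placeholders.

mssql_datastore = {
    "database_type": "Microsoft SQL Server",
    "server": "MSSQL_HOST",        # placeholder host
    "database": "SOURCE_DB",       # hypothetical database name
    "user": "ds_user",
    "password": "********",
}

hana_datastore = {
    "database_type": "SAP HANA",
    "server": "HANA_HOST:30015",   # placeholder host:port
    "user": "ds_user",
    "password": "********",
}

def odbc_connection_string(ds):
    # Illustrative ODBC-style string a datastore resolves to at run time.
    return f"SERVER={ds['server']};UID={ds['user']};PWD={ds['password']}"

print(odbc_connection_string(mssql_datastore))
```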

            FIGURE 2



Extracting data from MS SQL Server 2008 to SAP HANA


To extract data from SQL Server to HANA, we use the Query transform. Place the tables available in SQL Server into the work area as the source, and place template tables in the work area as the target, to move the tables from SQL Server to HANA.
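What the Query transform does here can be sketched as a column projection: only the fields mapped into its output schema reach the template table. The sample rows below are made up; the four fields match those selected later in this walkthrough.

```python
# Sketch of the Query transform as a projection over source rows.
# Sample rows are hypothetical.

source_rows = [
    {"EMP_ID": 1, "LNAME": "Fuller", "FNAME": "Andrew", "REGION": "WA", "TITLE": "Manager"},
    {"EMP_ID": 2, "LNAME": "Davolio", "FNAME": "Nancy", "REGION": "WA", "TITLE": "Rep"},
]

SELECTED = ("EMP_ID", "LNAME", "FNAME", "REGION")

def query_transform(rows, fields=SELECTED):
    # Keep only the mapped output columns, like the transform's schema mapping.
    return [{f: row[f] for f in fields} for row in rows]

template_table = query_transform(source_rows)
print(template_table[0])
# {'EMP_ID': 1, 'LNAME': 'Fuller', 'FNAME': 'Andrew', 'REGION': 'WA'}
```

Any column not mapped in the Query transform (TITLE above) is simply dropped before the load.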

Since the source is a non-SAP system, data can be extracted easily using the database datastore. To extract data from an SAP system, we use interfaces such as IDoc, BAPI, and LSMW. The IDoc interface is deployed when extracting data from ECC to a BW system or vice versa; for this, we create a logical system, an RFC destination and port, and partner profiles.


Click on the Template Table icon in the tool palette to place it at the target end.



Viewing data in HANA and comparing it with MS SQL

Open SAP HANA Studio to verify the data available in the system. I have loaded the table into my schema and populated the data from SQL Server to HANA. In the Query transform, I have selected just four fields: EMP_ID, LNAME, FNAME, REGION. Screenshot attached below for reference.
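The verification step amounts to comparing the two sides, at minimum by row count and key values. The sketch below simulates that check with made-up rows; in practice you would run the equivalent of SELECT COUNT(*) against both databases.

```python
# Sketch of post-load verification: compare row counts and EMP_ID keys
# between source (SQL Server) and target (HANA). Rows are simulated.

sql_server_rows = [(1, "Fuller"), (2, "Davolio"), (3, "Leverling")]
hana_rows = [(1, "Fuller"), (2, "Davolio"), (3, "Leverling")]

assert len(sql_server_rows) == len(hana_rows), "row counts differ"
assert {r[0] for r in sql_server_rows} == {r[0] for r in hana_rows}, "EMP_ID sets differ"
print("verified:", len(hana_rows), "rows match")
```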



Now the data is available in the HANA system under the respective schema.







