Enterprise Resource Planning Blogs by SAP
This blog post is the second part of a series and explains step by step how to load data into the staging tables of the SAP S/4HANA migration cockpit (on-premise) using SAP Data Services. In this example, the data is loaded from a MySQL database, but many other sources are possible, including other databases and SAP ERP systems. The example presented here is deliberately simple, but it can serve as a reference for building more complex data flows.

SAP Data Services is a data integration and transformation application. It lets users develop and execute workflows that read data from predefined sources called datastores (applications, web services, flat files, databases, and so on), combine, transform, and refine that data, and then write the results back to the same datastore or to a different one.
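The extract-transform-load pattern that Data Services implements can be sketched in a few lines of Python. This is only an illustration: SQLite stands in for the real MySQL source and SAP HANA target, and all table and column names below are invented for the example.

```python
import sqlite3

# Stand-ins for the source (MySQL) and target (SAP HANA) datastores.
source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")

# Extract: a source table with bank master data (invented example schema).
source.execute("CREATE TABLE banks (bank_key TEXT, bank_name TEXT)")
source.executemany("INSERT INTO banks VALUES (?, ?)",
                   [("10010010", "Test Bank A"), ("20020020", "Test Bank B")])

target.execute("CREATE TABLE staging_banks (BANKK TEXT, BANKA TEXT)")

# Transform + load: read each source row, refine it, write it to the target.
for bank_key, bank_name in source.execute("SELECT bank_key, bank_name FROM banks"):
    target.execute("INSERT INTO staging_banks VALUES (?, ?)",
                   (bank_key, bank_name.upper()))  # example transformation
target.commit()

rows = list(target.execute("SELECT * FROM staging_banks ORDER BY BANKK"))
```

In Data Services, the same three roles are played by the source table object, the Query transform, and the target table object that we will wire together in the dataflow below.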

For more information about SAP Data Services, see:



Before you start:

The first step is to create the required datastores. For this example, we use a MySQL database as the source (first datastore to be created) and SAP HANA as the target (second datastore to be created). In your case the source system may differ, while the target will always be the SAP HANA database in which the staging tables have been generated.

Create MySQL datastore

  • Start the SAP Data Services Designer.

  • Logon to the Data Services repository.

  • Navigate to the Datastores tab in the bottom left corner.

  • Create a new datastore for the MySQL database. The following screenshots show the properties for our example.

  • Create a new ODBC connection directly from this screen by using the “ODBC Admin…” button. Use the “System DSN” tab so that the connection is also visible when Data Services executes the job to load data in the background (this may run under a different Windows user, not your own).

  • You will need to select the correct driver to connect to the database.

  • In the next screen, you specify the connection details.

  • You can test the connection directly from this screen to be sure that it works before confirming the entry.

Create SAP HANA datastore

  • Create a new datastore. In our example, we name it “DS_HANA”. Create a new ODBC connection from the resulting screen.

  • On the “System DSN” tab, create the new ODBC.

  • Select the proper driver.

  • Fill in the required information for your SAP HANA database. The following screenshot is just an example based on our landscape.

  • Test the connection.

  • Back on the previous screen, confirm and save the properties to create the datastore.

  • The required datastores have now been created.

Filling the staging tables

  • Start the SAP Data Services Designer.

  • Navigate to the Projects tab.

  • Open the folder in which you want to create your new job.

  • In the Project Area, right-click and create a new batch job.

  • Name the new job (for example, mySQL_to_Staging).

  • Add a new DataFlow.

  • Double-click the DataFlow.

  • Add a Query operation.


  • Navigate to Datastores.

  • Select your MySQL datastore.

  • Select the option “Import by Name” from the menu.

  • Enter a table name.


  • Drag the selected table to the dataflow (in our example the table is called “banks”).

  • Define it as source by right-clicking the table name and then selecting “Make Source”.


  • Select your SAP HANA datastore.

  • Import the table by right-clicking “Tables” and then selecting the option “Import by Name”.


  • Enter the name of your staging table (for example “/1LT/DXX12000123”, without the double quotes), or alternatively the name of the synonym if you want to make the flow reusable in the subsequent environments in your landscape (for example, quality and production).
    (To learn more about the usage of synonyms, see “Creating and using synonyms for renaming staging tables” in the blog from sommerudo.)

  • Specify the owner, for example STAGING_MASTER (this is the user used by the database connection).

  • Click Import.

  • Drag the newly added table to the dataflow.

Note: You can find the names of the staging tables for your migration project on the Migration Object Details screen → Staging Tables. The table names are shown in the column called “Staging Table”.

For detailed information, see:



  • Define the new table as a target by right-clicking the table symbol and then selecting “Make Target” from the menu.

  • Link all elements.

  • Double-click the Query operation.

  • Complete the mapping from source to destination.

A mapping must be specified for every field in the target structure. Each mapping is either based on a source field or, if no corresponding source field exists, set to an empty value (specified as two single quotes: ''). For example, the target field “/1LT/PROCESSED”, which is used technically to determine the processing status of each record, should be given the fixed mapping value '' (two single quotes).
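As an illustration of this rule, the following Python sketch maps a source record into a target structure, filling any target field without a source counterpart with an empty string. The field names BANKK and BANKA are invented for the example; only “/1LT/PROCESSED” comes from the staging-table layout discussed above.

```python
# Target structure: every field must receive a mapping.
TARGET_FIELDS = ["BANKK", "BANKA", "/1LT/PROCESSED"]

# Source-field-to-target-field mapping (invented example names).
FIELD_MAP = {"BANKK": "bank_key", "BANKA": "bank_name"}

def map_record(source_row):
    """Build a target row; target fields with no source field get ''."""
    return {field: source_row.get(FIELD_MAP[field], "") if field in FIELD_MAP else ""
            for field in TARGET_FIELDS}

row = map_record({"bank_key": "10010010", "bank_name": "Test Bank"})
```

In the Query transform, the equivalent is entering the source column name (or the literal '') in the mapping tab of each target column.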

  • Save and execute the job.

  • After the job has executed successfully, switch back to the tab containing the dataflow you just created; using the magnifier button on the source and target tables, you can preview their content to check the result.
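The same check can also be done with a query against the staging table: records that have not yet been picked up by the migration cockpit carry an empty value in /1LT/PROCESSED. A minimal sketch, again using SQLite in place of the real SAP HANA database (the staging table name is the example one from above):

```python
import sqlite3

# SQLite stand-in for the SAP HANA schema holding the staging table.
con = sqlite3.connect(":memory:")
con.execute('CREATE TABLE "/1LT/DXX12000123" (BANKK TEXT, "/1LT/PROCESSED" TEXT)')
con.executemany('INSERT INTO "/1LT/DXX12000123" VALUES (?, ?)',
                [("10010010", ""), ("20020020", "")])

# Count the records still waiting to be processed by the migration cockpit.
(unprocessed,) = con.execute(
    "SELECT COUNT(*) FROM \"/1LT/DXX12000123\" WHERE \"/1LT/PROCESSED\" = ''"
).fetchone()
```

If the count matches the number of rows in your source table, the load was complete.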

After completing this last step, we have successfully loaded data into the staging table using SAP Data Services.

Now the next step will be to access the Migration Cockpit and start the data transfer so that the records available in the staging tables can be selected for the data load into SAP S/4HANA.


See the full list of this four-part blog series and additional options for filling staging tables:

Part 1:  Migrating data using staging tables and methods for populating the staging tables

(Part 2 is this blog post)

Part 3: Using SAP HANA Smart Data Integration (SDI) to load data to the staging tables

Part 4: SAP HANA Studio – Data from File option