There are many reasons why content migration is needed. To name just a few: you need it when transporting changes between development and production HANA databases, when merging or splitting HANA databases, or when changing the database design.
In the last two blogs I showed an example of how a HANA database can be designed in terms of data architecture and security concept:
SAP HANA - Security Concept and Data Architecture (part 1)
SAP HANA - Security Concept and Data Architecture (part 2)
If you like this concept and would like to apply it to existing content, you need to use content migration to relocate objects without breaking the dependencies between them.
Before starting any activity, create a detailed plan of what you want to migrate and where. Be sure to document the source and planned target location for each object in the database.
Below are examples of rules you should follow when creating this plan:
Note: Manual adjustments to information models might be required if objects from one schema are relocated into multiple target schemas.
Here is an example of a migration plan:
According to this plan, you need to create and prepare the target area (schemas and packages) where you intend to move the current content. This can be done in the same HANA database or in a brand-new one. As a last step, you will need to create other related objects such as analytic privileges, security roles, and users for modelers and applications.
Do not forget to prepare a plan for external applications connecting to the HANA database (for example, BO Data Services) so you understand how they will deal with the new location of objects.
In order to successfully perform content migration you need the following privileges.
To export tables, you need read access (the SELECT privilege) on the object or its schema, plus the EXPORT system privilege.
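As a sketch, the grants could look like this — MIGRATION_USER is an illustrative user name, and TECHED2011 is one of the source schemas used later in this blog:

GRANT SELECT ON SCHEMA "TECHED2011" TO MIGRATION_USER;
GRANT EXPORT TO MIGRATION_USER;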
For information model export, you first need the EXECUTE privilege on the stored procedure REPOSITORY_REST to be able to retrieve the list of defined packages. You also need the REPO.READ privilege on the package from which you want to export.
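Assuming the same illustrative MIGRATION_USER and the package techedcopa that appears in the export example below, the corresponding grants would be roughly:

GRANT EXECUTE ON SYS.REPOSITORY_REST TO MIGRATION_USER;
GRANT REPO.READ ON "techedcopa" TO MIGRATION_USER;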
For table import, you need the IMPORT system privilege and write access to the target schema:
Note: It might look like an error, but you do not need any other privileges (there is no need to grant the SELECT, INSERT, or DELETE privilege).
If the target schema does not exist and you wish to create it as part of the import process, grant the user the system privileges IMPORT and CREATE SCHEMA (no other privileges are required).
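For example, to let the hypothetical MIGRATION_USER import tables and create the target schema on the fly, the grants would be:

GRANT IMPORT TO MIGRATION_USER;
GRANT CREATE SCHEMA TO MIGRATION_USER;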
Note: From an ownership perspective, it does not matter who started the import. The object owner is always user SYSTEM.
For information model import, the main prerequisite is again the EXECUTE privilege on the stored procedure REPOSITORY_REST, to enable listing of existing packages. To be able to create models, the user needs the REPO.READ and REPO.EDIT_NATIVE_OBJECTS privileges on the target package.
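Continuing the sketch with the illustrative MIGRATION_USER and a target package techedcopa, the grants would look like:

GRANT EXECUTE ON SYS.REPOSITORY_REST TO MIGRATION_USER;
GRANT REPO.READ ON "techedcopa" TO MIGRATION_USER;
GRANT REPO.EDIT_NATIVE_OBJECTS ON "techedcopa" TO MIGRATION_USER;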
Note: Further privileges are required to activate objects. (See my security blogs for more details.)
Migration of schema objects is relatively straightforward. Switch to the Information Modeler perspective (menu Window / Open Perspective / Information Modeler).
Choose the Export option in the Content section of the menu.
Select the option Tables in the SAP HANA Studio folder and click Next.
On the next screen, choose the correct connection.
Select all tables that should be exported and add them to the list.
Select the BINARY format for export and ensure that the option “Catalog + Data” is selected. Click Finish.
The following directory structure will be created in the selected location:
/usr/sap/HDB/HDB00/work
+--- index
+--- IDES
| +--- ZC
| +--- ZCOPAACTUAL2
| | +--- attributes
| +--- ZCOPAFORECAST2
| +--- attributes
+--- TECHED2011
+--- KN
| +--- KNA1
| +--- attributes
+--- MA
| +--- MARA
| +--- attributes
+--- SC
| +--- SCAL1
| +--- attributes
+--- T0
+--- T001W
+--- attributes
If you need to export a large number of tables, it is more convenient to use the SQL interface. You can use the SQL command EXPORT with the following syntax:
EXPORT "<schema1>"."<table1>"[,"<schema2>"."<table2>",...] AS BINARY INTO '<target directory>'
[WITH REPLACE | WITH (REPLACE) CATALOG ONLY | WITH (REPLACE) DATA ONLY]
In our case we will use this command:
EXPORT "TECHED2011"."KNA1","TECHED2011"."MARA","TECHED2011"."SCAL1","TECHED2011"."T001W", "IDES"."ZCOPAACTUAL2","IDES"."ZCOPAFORECAST2" AS BINARY INTO '/usr/sap/HDB/HDB00/work'
In the Information Modeler perspective, again choose the Export option in the Content section.
Select the option Information Models in the Information Modeler folder and click Next.
On the next screen, choose the correct connection.
Select the package you wish to export and click Add; all objects in the package will be added to the list. Select the target folder on your host. Click Finish.
Wait for all jobs to end successfully.
The following directory structure will be created in the selected location:
C:\Data\HANA\export
+--- HDB
+--- techedcopa
+--- analyticviews
+--- attributeviews
+--- calculationviews
The second part of this blog can be found here:
SAP HANA - Modeling Content Migration - part 2: Import, Post-migration Activities