There are several reasons for migrating a Datasphere (DSP) tenant, for example: a change of data center, a change of SKU, or integrating a DSP tenant into a freshly created SAP Business Data Cloud (BDC) instance. As of February 2024, there is no out-of-the-box migration option for DSP.
Recognizing this challenge, we have developed a specialized service that simplifies and streamlines the migration of a DSP tenant to a new environment.
Sadly, some limitations prevent a migration via "button click". These include the DP-Agent installation and re-entering credentials in migrated connections; these remain manual tasks done by the customer. Nevertheless, our service can automate the migration of about 90% of a tenant's content. This includes:
Some topics are currently out of scope for automated migration, either because of technical limits or because the manual migration effort is so low that automating it is not worthwhile. These include:
*Even though this data cannot be migrated from the old tenant, we can automate the re-initialization so that no manual effort is needed.
**Custom roles in DSP are by definition global roles and cannot be created by our service. Hence, they need to be created manually after tenant provisioning to ensure the smooth creation of any scoped roles that depend on them.
Update: We can now also migrate tables from Open SQL Schemas, which was not possible before.
A full technical documentation of the migration service would fill about 20 blog posts, so here I would like to show one example and point out the technology used. For the migration, we use a script that we run locally to download the entire tenant into a folder structure. This folder structure contains everything needed to re-upload the content into the new tenant.
Our script is written in JavaScript and uses DSP's command-line interface (CLI) wherever possible. For special scenarios that are (currently) not possible via the CLI (e.g. creating connections, schedules, or data migration), we use internal API calls.
As an example, we would like to explain how our data migration works. Here, we want to migrate all data from local tables that are not targets of replication flows. For this we need seven steps.
So if a space has 20 local tables, we create 20 data flows. One could argue that transferring the tables in a single replication flow would be easier than in 20 data flows, but to use a local table in a replication flow, it needs to have a key column. To overcome this limitation and ensure that local tables without key columns can also be migrated, we use data flows.
For this scenario we use a combination of the CLI (step 4: data flow creation, and step 6: data flow deletion) and internal APIs (steps 1, 2, 3, 5 and 7).
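The planning part of the steps above can be sketched as a small pure function: every local table that is not already the target of a replication flow gets its own data flow. The table shape and the `DF_MIGRATE_` naming scheme are hypothetical, chosen here only for illustration.

```javascript
// Sketch of the per-table planning logic: one data flow per local table
// that is not a replication-flow target. Names and shapes are assumptions.
function planDataFlows(localTables, replicationFlowTargets) {
  const targets = new Set(replicationFlowTargets);
  return localTables
    .filter((table) => !targets.has(table.name))
    .map((table) => ({
      name: `DF_MIGRATE_${table.name}`, // hypothetical naming scheme
      source: table.name,               // table in the old tenant
      target: table.name,               // same table in the new tenant
    }));
}

// Example: three local tables, one already covered by a replication flow.
const flows = planDataFlows(
  [{ name: "SALES" }, { name: "COSTS" }, { name: "STAFF" }],
  ["COSTS"]
);
// flows now holds one data-flow definition each for SALES and STAFF.
```

Because each data flow copies a single table, this works regardless of whether the table has a key column, which is exactly the limitation that rules out replication flows here.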
We have been working on this service for several months to ensure a resilient and smooth transition to the new environment, and this is also the feedback we have received from customers migrating with our service.
Furthermore, we are ensuring that when provisioning SAP Business Data Cloud, we can also help reuse an existing DSP tenant in this context.
If you are planning to migrate your DSP tenant (or you have a customer who wants to migrate), we look forward to aligning expectations and next steps with you!
Feel free to reach out: thore.bedey@sap.com, nils.henk@sap.com