Technology Blogs by SAP
Learn how to extend and personalize SAP applications. Follow the SAP technology blog for insights into SAP BTP, ABAP, SAP Analytics Cloud, SAP HANA, and more.


Migration path

Now we can look at the preparation for the migration itself. During the migration we move the integration flow packages, security artefacts, value mappings, number ranges, access policies, and any other peripheral configuration required to make the integrations work in the new multi-cloud environment.


If you are only migrating a few integration flows (fewer than 20) that are mostly self-contained and do not use number ranges, value mappings, or access policies, you can use the export-import functionality to export the packages from the source tenant and import them into the target tenant. For all other cases, we recommend using the Migration Accelerator:
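If you go the export-import route, the package download can also be scripted rather than clicked through the Web UI. Below is a minimal sketch in Python, assuming basic authentication and the standard /api/v1 OData endpoint for downloading a package as a zip; the host name and package ID are placeholders, so verify the exact API path and authentication method against your tenant's API documentation:

```python
import base64
import urllib.request

def package_value_url(tmn_host: str, package_id: str) -> str:
    """Build the OData URL that returns a package as a zip ($value)."""
    return f"{tmn_host}/api/v1/IntegrationPackages('{package_id}')/$value"

def download_package(tmn_host: str, package_id: str,
                     user: str, password: str) -> bytes:
    """Download one integration package as raw zip bytes."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req = urllib.request.Request(
        package_value_url(tmn_host, package_id),
        headers={"Authorization": f"Basic {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()

# Example (placeholder host and package ID):
# data = download_package("https://my-tmn.example.com", "MyPackage", "user", "pw")
# with open("MyPackage.zip", "wb") as f:
#     f.write(data)
```

This only automates the export half; the import into the target tenant still happens through the Web UI or a corresponding upload call.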

Migration Accelerator

As mentioned previously, we will focus on the migration for Cloud Integration, but a similar process is required for API Management. The Migration Accelerator for Cloud Integration consists of a set of Postman collections. Postman is an API client that makes it easy to send requests over HTTP(S) to a server application.

I used version 8.6.1; the migration guide lists the minimum version required, so I suggest you check the guide.

The steps to use the Migration Accelerator are simple:

  • Set up the user credentials

  • Run the readiness check and, if all is clear,

  • Run the migration

The Migration Accelerator and associated guide can be found here. It uses the API of the Cloud Integration tenants to copy content from a source CPI tenant to a target Cloud Integration tenant. The tool will copy:

  • Standard packaged integration content, preserving the configuration of the source system

  • Custom content that has been versioned (not in draft mode; only the latest version is taken)

  • Public certificates (if they do not already exist in the target system)

  • Value mappings

  • Number range objects

  • Access policies


It will not cover:

  • Message monitoring data

  • JMS queue data

  • Tenant sizing information

  • Data and messages stored in the database

  • Local and global variables

  • Any content stored in the Partner Directory
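Because the accelerator works through the Cloud Integration OData API, you can use the same API to inventory the packages on the source and target tenants and compare the lists after a run. A minimal sketch, assuming basic authentication and the usual OData v2 d/results JSON envelope; hosts and credentials are placeholders, so check the response shape against your tenant:

```python
import base64
import json
import urllib.request

def package_names(odata_json: str) -> list[str]:
    """Extract package names from an OData v2 JSON response body."""
    data = json.loads(odata_json)
    return sorted(p["Name"] for p in data["d"]["results"])

def list_packages(tmn_host: str, user: str, password: str) -> list[str]:
    """GET /api/v1/IntegrationPackages and return the package names."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req = urllib.request.Request(
        f"{tmn_host}/api/v1/IntegrationPackages",
        headers={"Authorization": f"Basic {token}",
                 "Accept": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return package_names(resp.read().decode())

# Diff the two tenants (placeholder hosts and credentials):
# missing = set(list_packages(SOURCE, u, p)) - set(list_packages(TARGET, u, p))
```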


This also means that there will be some steps to complete once the migration tool has been used. To start with, we recommend reviewing the security artefacts, as any user credentials, private certificates, and custom certificates will need to be recreated in the new multi-cloud tenant.
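Recreating many user credentials by hand is tedious; the tenant's OData API also exposes a UserCredentials entity that can be scripted. The sketch below only builds the request payload; the field names are assumptions based on the public API documentation, so verify them (and the CSRF-token handling a real POST needs) against your tenant before relying on this:

```python
import json

def user_credential_payload(name: str, user: str, password: str,
                            description: str = "") -> str:
    """Build a JSON body for POST /api/v1/UserCredentials.
    Field names are assumptions -- check the API documentation."""
    return json.dumps({
        "Name": name,
        "Kind": "default",
        "Description": description,
        "User": user,
        "Password": password,
    })

# A real POST would first fetch a CSRF token (GET with header
# "X-CSRF-Token: Fetch") and send it back along with this payload.
```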

Post-migration steps

The final part of the migration phase is to prepare for testing. This is done in the development sub-account or test sub-account in the case of a 2-tier landscape.


For on-premise systems, you will need to reconfigure your SAP Cloud Connector to connect to the new sub-account and recreate on it any destinations you needed in the NEO account. You will also need to set up principal propagation in the new account so that you can verify in the test phase that requests reach your back-end systems. If you have not already done this in the preparation phase, you will want to configure users and authorizations in the new sub-account for your developers, testers, and system administrators. You may also want to revise their role assignments or create custom roles and role collections.


Your security requirements may also demand that you set up a custom domain on your new sub-account, so this is the point at which to do that step as well (if you have not already done it in the preparation phase).


Next, you will want to review the dependencies of the migrated integration flows. Any dependencies on environment variables (usually found in Groovy script flow steps) will have to be checked, as the multi-cloud environment relies on a different set of variables. Also make sure you define any local and global variables your scenarios rely on in the new environment.
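One way to find those dependencies is to scan the exported integration content for Groovy scripts that read environment variables. A small helper sketch; the System.getenv pattern below is just one common case, and scripts may read the environment in other ways too:

```python
import re
from pathlib import Path

# Matches System.getenv("NAME") or System.getenv('NAME') calls.
ENV_CALL = re.compile(r"System\.getenv\(\s*['\"]([^'\"]+)['\"]\s*\)")

def find_env_vars(script_text: str) -> list[str]:
    """Return the environment variable names a script reads."""
    return sorted(set(ENV_CALL.findall(script_text)))

def scan_groovy_scripts(folder: str) -> dict[str, list[str]]:
    """Scan a folder of unzipped integration content for env-var reads."""
    hits = {}
    for path in Path(folder).rglob("*.groovy"):
        names = find_env_vars(path.read_text(errors="ignore"))
        if names:
            hits[str(path)] = names
    return hits
```

Running this over the unzipped package exports gives you a checklist of variables to verify against the multi-cloud environment.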


Finally, you will want to switch over your client applications to use the new multi-cloud sub-account endpoints. This might also require revising your firewall configuration, as the IP ranges may have changed if you are using a proxy to connect on-premise systems to SAP BTP.


The last step we will look at in detail is testing. Hopefully, you have already defined unit and integration tests for your scenarios. You will want to set up your new test tenant to run these tests, making sure the client applications are connected to the new environment, cloud connectors are adjusted, and security credentials are created accordingly.
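A lightweight way to start is a smoke test that pings each migrated endpoint on the new tenant before running the full integration tests. A sketch with placeholder paths; note that a 401 response without credentials still proves the endpoint is reachable, which is often all you want from a first pass:

```python
import urllib.error
import urllib.request

def endpoint_urls(base: str, paths: list[str]) -> list[str]:
    """Join the new tenant's runtime base URL with each iflow path."""
    base = base.rstrip("/")
    return [f"{base}/{p.lstrip('/')}" for p in paths]

def reachable(url: str, timeout: float = 10.0) -> bool:
    """True if the endpoint answers at all (any HTTP status counts)."""
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        return True   # e.g. 401 without credentials: the server is there
    except (urllib.error.URLError, OSError):
        return False

# Placeholder runtime host and iflow paths:
# for url in endpoint_urls("https://my-runtime.example.com", ["/http/ping"]):
#     print(url, reachable(url))
```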

Scenarios (integration packages) usually take about a week of testing, but if you are organized and have a team, many of these tests can be run concurrently. Here is a sample estimation of the time required for a simple migration:

Transportation to the productive tenant should follow the normal path your organization uses. You may migrate collateral artefacts like number ranges and value mappings from the NEO instance, but integration flows should be transported from your test instance to your production instance using a transport management system.



I will not go into detail on the final step, go-live. The key steps are:

  • Launch and monitor your integration scenarios from the multi-cloud environment

  • Decommission the older tenants after the cool-off period

Here, the cool-off period refers to making sure any monitored messages in the NEO environment have been delivered and their respective processes have completed. It also means making sure that, if you used JMS queues, any messages that were in those queues at the beginning of the cut-over have been delivered and are no longer needed.


This brings us to the end of this blog (series). Thank you for reading and I hope it was helpful and informative. If you have any questions about moving from Cloud Integration on NEO to the new Integration Suite on the multi-cloud platform, please send an email to