CRM and CX Blogs by Members


This blog will help you, as an architect, speed up the migration of your SAP Commerce installation from On-Prem or CCv1 (SAP Commerce Cloud on SAP Infrastructure) to CCv2 (SAP Commerce Cloud on Microsoft Azure). We will first look at the differences between CCv1, On-Prem, and CCv2, and then dive into detailed solutions for data compatibility and data migration, which are always major concerns for architects and project management and are the focus of this blog.

Back to Basics


CCv1 (SAP Commerce Cloud on SAP Infrastructure)

CCv1 is also known as SAP Commerce Cloud on SAP Infra, since the infrastructure and data centers on which the environments run belong to SAP. Customers used to receive a detailed architectural diagram of the server configuration for each environment. Any deployment to STG or PROD required a service request ticket to SAP. The database used here was SAP HANA. In many cases, Splunk was used to check the server logs from QA/STG and PROD. Deploying a fix or troubleshooting an issue was complicated and had to go through a very slow process.


On-Prem

An On-Prem installation is hosted entirely at the customer's location and managed by the customer alone, without much involvement from SAP. The customer is free to select any infrastructure or cloud provider (such as GCP or AWS). The only support SAP usually provides relates to the Commerce framework/solution itself. The customer can also choose their own database, such as HANA, Oracle, or MySQL.


CCv2 (SAP Commerce Cloud on Microsoft Azure)

CCv2 runs on the Microsoft Azure public cloud. The database in CCv2 is Microsoft Azure SQL Database, provided by Azure. Here we use Kibana to check server logs and Dynatrace for monitoring.


Data Migration Architecture




Data Model Compatibility for CCv2

Before you migrate, it's very important to check that your data model has been declared with Microsoft Azure SQL Database-specific (sqlserver) type mappings. The table below shows how the platform maps Java and hybris types to column types on each supported database:

| Data Type | hsqldb | mysql | sap | sqlserver |
|---|---|---|---|---|
| java.lang.String | VARCHAR(255) | varchar(255) | nvarchar(255) | nvarchar(255) |
| String | VARCHAR(255) | varchar(255) | nvarchar(255) | nvarchar(255) |
| java.lang.Float | float | float(20,5) | decimal(20,5) | float |
| java.lang.Double | double | double | decimal(30,8) | float |
| java.lang.Byte | smallint | smallint | smallint | integer |
| java.lang.Character | smallint | smallint | smallint | char(4) |
| java.lang.Short | smallint | integer | integer | integer |
| java.lang.Boolean | tinyint | tinyint(1) | decimal(1,0) | tinyint |
| java.lang.Long | bigint | bigint | bigint | bigint |
| java.lang.Integer | int | integer | bigint | integer |
| float | float DEFAULT 0 | float(20,5) DEFAULT 0 | decimal(20,5) DEFAULT 0 | float DEFAULT 0 |
| double | double DEFAULT 0 | double DEFAULT 0 | decimal(30,8) DEFAULT 0 | float DEFAULT 0 |
| byte | smallint DEFAULT 0 | smallint DEFAULT 0 | integer DEFAULT 0 | integer DEFAULT 0 |
| char | smallint DEFAULT 0 | smallint DEFAULT 0 | integer DEFAULT '' | char(4) DEFAULT '' |
| short | smallint DEFAULT 0 | integer DEFAULT 0 | integer DEFAULT 0 | integer DEFAULT 0 |
| boolean | tinyint DEFAULT 0 | tinyint(1) DEFAULT 0 | decimal(1,0) DEFAULT 0 | tinyint DEFAULT 0 |
| long | bigint DEFAULT 0 | bigint DEFAULT 0 | bigint DEFAULT 0 | integer DEFAULT 0 |
| int | int DEFAULT 0 | integer DEFAULT 0 | bigint DEFAULT 0 | integer DEFAULT 0 |
| java.util.Date | timestamp | datetime | timestamp | datetime2 |
| java.math.BigDecimal | decimal(30,8) | decimal(30,8) | decimal(30,8) | decimal(30,8) |
| | longvarbinary | longblob | blob | image |
| HYBRIS.LONG_STRING | longvarchar | text | nvarchar(5000) | nvarchar(max) |
| HYBRIS.JSON | longvarchar | longtext | nclob | nvarchar(max) |
| HYBRIS.COMMA_SEPARATED_PKS | longvarchar | text | nvarchar(5000) | nvarchar(max) |
| HYBRIS.PK | BIGINT | bigint | bigint | bigint |
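If an attribute needs a column type other than the defaults above, you can declare a database-specific mapping directly in its persistence definition. The sketch below uses the standard items.xml `columntype` mechanism; the item type, qualifier, and typecode are made-up examples, not part of any real extension:

```xml
<itemtype code="NewsArticle" extends="GenericItem" autocreate="true" generate="true">
    <!-- Hypothetical deployment; pick a typecode from your project's reserved range -->
    <deployment table="NewsArticles" typecode="20115"/>
    <attributes>
        <attribute qualifier="body" type="java.lang.String">
            <persistence type="property">
                <!-- Override the default nvarchar(255) mapping on Azure SQL (sqlserver) -->
                <columntype database="sqlserver">
                    <value>nvarchar(max)</value>
                </columntype>
                <columntype database="mysql">
                    <value>text</value>
                </columntype>
                <!-- Fallback for all other databases -->
                <columntype>
                    <value>varchar(4000)</value>
                </columntype>
            </persistence>
        </attribute>
    </attributes>
</itemtype>
```

Attributes that rely on such overrides for one database but have no `sqlserver` mapping are exactly the ones to hunt down before the migration.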

It's also recommended that you check the help page Specifying a Deployment for Commerce Platform Types, in particular the 'Advanced Deployment' section.

Data Maintenance before DB migration

Data maintenance is an essential part of every enterprise application; one must consider the impact of growing data volumes and avoid migrating unwanted data. I will cover the data maintenance topic in detail in another blog, where we will look at answering the following questions:

  • What data is live?

  • What data can be archived?

  • What data needs to be cleaned up?

The SAP Commerce Architect must consider a step for Data CleanUp as part of the CCv2 migration project.

Before migrating from your current database to Azure SQL Database, it's important to check the following points about your data:

  • The uniqueness of each record

  • No leading or trailing whitespace, and no codes that differ only by case (Azure SQL Database's default collation is case-insensitive, so such codes can collide)
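The checks above are easy to express as SQL against the source database. A minimal, self-contained sketch using SQLite (the table and column names `products`/`p_code` are hypothetical stand-ins for your real schema; the same queries work on most databases):

```python
import sqlite3

# Build a tiny in-memory sample with the two classic problems:
# a case-only duplicate and a code with trailing whitespace.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE products (pk INTEGER PRIMARY KEY, p_code TEXT)")
cur.executemany(
    "INSERT INTO products (p_code) VALUES (?)",
    [("SHIRT-001",), ("shirt-001",), ("PANTS-002 ",), ("HAT-003",)],
)

# 1) Codes differing only by case: under Azure SQL's case-insensitive
#    default collation these would violate a unique index after migration.
cur.execute("""
    SELECT LOWER(p_code), COUNT(*) FROM products
    GROUP BY LOWER(p_code) HAVING COUNT(*) > 1
""")
collisions = cur.fetchall()

# 2) Codes with leading/trailing whitespace.
cur.execute("SELECT p_code FROM products WHERE p_code <> TRIM(p_code)")
whitespace = cur.fetchall()

print(collisions)   # [('shirt-001', 2)]
print(whitespace)   # [('PANTS-002 ',)]
```

Running the same style of query per business key (product codes, user IDs, category codes, etc.) gives you a concrete cleanup worklist before the actual migration run.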

Data Migration

Once you are ready to start migrating data, you can follow the guidelines already available:

NOTE: Don't forget the VPN requirement to allow the Azure system to access your On-Prem database, and make sure you have a secure way to transfer the data from the source (older DB) to the target (Microsoft Azure SQL Database).


Data Refresh and Backup on CCv2

Across multiple CCv2 projects, I have seen that clients are always concerned about backing up PROD data and about data refresh activities on the lower environments, mainly the QA/STG systems.

With CCv2 it's easy to take a backup of your data, which preserves the environment's database instance and media storage structure. Applying this backup to any selected environment is very flexible; it's a self-service feature available in the Cloud Portal.

NOTE: Hourly backup is usually done on PROD


Refreshing a lower environment (QA/STG) with PROD-like data should be done with anonymized PROD data. The following documentation helps you achieve this requirement:


Thank you for reading this blog post; I hope it helps you migrate your On-Prem data to CCv2 with a clean and faster approach.

Please feel free to share your feedback or thoughts, or ask questions via the Q&A link below.

Q&A Link

Stay tuned for more solutions. 🙂