CRM and CX Blogs by Members
Hello everyone,
In this blog post, I will explain the basics of the Cloud Hot Folder and the steps to implement it.


Cloud Hot Folder

A Cloud Hot Folder is a file-based, asynchronous integration mechanism for importing data. SAP Commerce Cloud implements cloud hot folders on top of Microsoft Azure Blob Storage, which removes the need for local or shared directories.

There are three ways to configure channel mappings:

  1. Default mapping channel

  2. Unzip channel

  3. Unmapped channel

Here we will use the default mapping channel.

Azure Blob Storage

Azure Blob Storage is Microsoft's object storage solution for the cloud. Blob storage is optimized for storing massive amounts of unstructured data, i.e. data that doesn't adhere to a particular data model or definition, such as text or binary data.


Before we start, please make sure these three extensions are part of your setup:

  1. cloudcommons

  2. cloudhotfolder

  3. azurecloudhotfolder


So let's start with the steps to implement a cloud hot folder:


Step 1: Configure Cloud Hot Folder Properties

In local.properties (or your extension's project.properties), add the following:

# Azure Blob connection string and account name
# Name of the Blob container in Azure
# Automatically create the Blob container when the platform starts up
# Path of the hot folder inside the Blob container; tenantId defaults to "master" in local development: ${tenantId}/hotfolder
# Properties required so that the node processes hot folder files
# cloudhotfolder defaults that map store, catalog, price configuration and default file names onto hot folder processes
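The actual property keys are not listed above. As a rough guide, the stock cloudhotfolder/azurecloudhotfolder extensions use keys along these lines; the key names and values below are assumptions, so verify them against the project.properties of your version before copying:

```properties
# Assumed key names -- check your azurecloudhotfolder version's project.properties.

# Azure Blob connection string; "UseDevelopmentStorage=true" targets a local emulator
azure.hotfolder.storage.account.connection-string=DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>

# Name of the Blob container in Azure
azure.hotfolder.storage.container.name=hybris-cloud-hot-folder

# Path of the hot folder inside the container; tenantId defaults to "master" locally
azure.hotfolder.storage.container.hotfolder=${tenantId}/hotfolder

# Required so that this cluster node picks up and processes hot folder files
cluster.node.groups=integration,yHotfolderCandidate
```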


Now we need to create a new Spring XML file that will hold all of the configuration for importing the data.

Step 2: Schema Definition

<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:int="http://www.springframework.org/schema/integration"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
                           http://www.springframework.org/schema/integration http://www.springframework.org/schema/integration/spring-integration.xsd">

All of the beans in the following steps go inside this root element; close the file with </beans>.


Step 3: Target Configuration

Here we configure which files this process will accept for import, by mapping a regular expression onto a channel:

<bean class="org.springframework.beans.factory.config.MethodInvokingFactoryBean">
    <property name="targetObject" ref="hotfolderInboundFileChannelMappings"/>
    <property name="targetMethod" value="put"/>
    <property name="arguments">
        <list>
            <bean class="java.util.regex.Pattern" factory-method="compile">
                <constructor-arg value="(?i)^(Product).*\.csv$"/>
            </bean>
            <ref bean="ProductBatchFilesCloudHotFolderProc"/>
        </list>
    </property>
</bean>

<int:channel id="ProductBatchFilesCloudHotFolderProc"/>
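Note that the dot before the extension must be escaped, otherwise it matches any character. A quick standalone check of a dot-escaped variant of the pattern (the file names are invented):

```java
import java.util.regex.Pattern;

public class HotfolderPatternCheck {
    // Dot-escaped variant of the mapping pattern: case-insensitive,
    // file name must start with "product" and end with ".csv".
    static final Pattern PRODUCT_FILES = Pattern.compile("(?i)^(product).*\\.csv$");

    public static void main(String[] args) {
        System.out.println(PRODUCT_FILES.matcher("Product-en-2024.csv").matches()); // true
        System.out.println(PRODUCT_FILES.matcher("PRODUCT_stock.CSV").matches());   // true
        System.out.println(PRODUCT_FILES.matcher("price-en.csv").matches());        // false
    }
}
```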


Step 4: Service Activator

This is the entry point of the processing pipeline, where we wire the channel to a header-setup task that sets the ImpEx header attributes you want:

<!-- Service Activator -->
<int:service-activator input-channel="ProductBatchFilesCloudHotFolderProc"
                       output-channel="batchFilesHeaderInit"
                       ref="ProductCloudHotFolderHeaderSetupTask"
                       method="execute"/>

<!-- Header Setup -->
<bean id="ProductCloudHotFolderHeaderSetupTask"
      class="de.hybris.platform.acceleratorservices.dataimport.batch.task.HeaderSetupTask">
    <property name="net" value="false"/>
</bean>

Step 5: Custom Mapping

Here we define two beans:

  1. Converter mapping: defines the name of the converter and the mapping, i.e. the file name prefix of the .csv files this mapping will be used for (here we expect files starting with product, e.g. product-*.csv):

<bean id="ProductConverterMapping"
      class="de.hybris.platform.acceleratorservices.dataimport.batch.converter.mapping.impl.DefaultConverterMapping">
    <property name="mapping" value="Product"/>
    <property name="converter" ref="ProductConverter"/>
</bean>

  2. Converter: here we write the ImpEx INSERT_UPDATE header and define the rows (number of inputs, mandatory fields); a "+" marks a mandatory field. The type property carries the name of the item type:

<bean id="ProductConverter"
      class="de.hybris.platform.acceleratorservices.dataimport.batch.converter.impl.DefaultImpexConverter">
    <property name="header">
        <value>INSERT_UPDATE Product;code[unique=true];name[lang=$lang];$catalogVersion;$approved</value>
    </property>
    <property name="impexRow">
        <value>;{+0};{1}</value> <!-- example row: column 0 mandatory, column 1 optional -->
    </property>
    <property name="type" value="Product"/>
</bean>
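To make the impexRow idea concrete, here is a rough, self-contained sketch of how a row template such as ;{+0};{1} gets filled from the columns of a CSV line. This is not the actual DefaultImpexConverter code; the template and values are invented:

```java
public class ImpexRowSketch {

    // Fills placeholders {0}, {1}, ... from CSV columns; {+i} marks column i
    // as mandatory, so the row is dropped (null) when that column is empty.
    static String fill(String template, String[] columns) {
        String out = template;
        for (int i = 0; i < columns.length; i++) {
            if (template.contains("{+" + i + "}") && columns[i].isEmpty()) {
                return null; // mandatory column missing: skip this row
            }
            out = out.replace("{+" + i + "}", columns[i])
                     .replace("{" + i + "}", columns[i]);
        }
        return out;
    }

    public static void main(String[] args) {
        System.out.println(fill(";{+0};{1}", new String[] {"P1001", "Ergo Mouse"})); // ;P1001;Ergo Mouse
        System.out.println(fill(";{+0};{1}", new String[] {"", "No code"}));         // null
    }
}
```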


Step 5: Define Header

If you don't want to provide a header using headerSetupTask, you can create a bean and define that bean in Converter just like we defined above (productImpexHeader)
<bean id="productImpexHeader" class="java.lang.String">
<value># ImpEx for importing data into $CATALOG$
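For reference, the ImpEx generated for a CSV line such as P1001,Ergo Mouse would look roughly like this; the catalog name, language, and macro expansions below are invented for illustration:

```text
# ImpEx for importing data into productCatalog
INSERT_UPDATE Product;code[unique=true];name[lang=en];catalogVersion(catalog(id),version)[default=productCatalog:Staged];approved[default='approved']
;P1001;Ergo Mouse
```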


Step 7: Configure Inbound Channel Adapter

Here we configure the inbound side: how often the remote container is polled, and which synchronization factory wraps each poll in a transaction. The channel and ref values below are assumptions; they must point at the inbound channel and message source used in your setup:

<int:inbound-channel-adapter id="ProductAzureInboundChannelAdapter"
        channel="cloudHotfolderInboundFileHeaderEnricherChannel"
        ref="ProductAzureBlobSynchronizingMessageSource">
    <int:poller fixed-rate="${...}">
        <int:transactional synchronization-factory="ProductAzureSynchronizationFactory"
                           transaction-manager="azurePsuedoTxManager"/>
    </int:poller>
</int:inbound-channel-adapter>

Step 8: Configure Inbound Channel Synchronizer

Here we configure how files are synchronized from the remote container while importing data (the class name below is an assumption based on the synchronizer shipped with the azurecloudhotfolder extension):

<bean id="ProductAzureBlobInboundSynchronizer"
      class="de.hybris.platform.cloud.azure.hotfolder.remote.inbound.AzureBlobInboundSynchronizer">
    <constructor-arg name="sessionFactory" ref="azureBlobSessionFactory"/>
    <property name="remoteDirectory" value="#{azureHotfolderRemotePath}"/>
    <property name="moveToRemoteDirectory" value="#{azureHotfolderRemotePath}/Product/processing"/>
    <property name="deleteRemoteFiles" value="${...}"/>
    <property name="preserveTimestamp" value="true"/>
    <property name="filter" ref="azureHotfolderFileFilter"/>
    <property name="comparator" ref="azureHotFolderFileComparator"/>
</bean>



Step 9: Configure Outbound Channel Adapter

Here we configure what happens after a file has been processed: on success it is moved to the archive directory, on failure to the error directory.

<int:transaction-synchronization-factory id="ProductAzureSynchronizationFactory">
    <int:after-commit channel="ProductAzureArchiveOutboundChannelAdapter"/>
    <int:after-rollback channel="ProductAzureErrorOutboundChannelAdapter"/>
</int:transaction-synchronization-factory>

<int:outbound-channel-adapter id="ProductAzureArchiveOutboundChannelAdapter" ref="ProductAzureArchiveMessageHandler"/>
<int:outbound-channel-adapter id="ProductAzureErrorOutboundChannelAdapter" ref="ProductAzureErrorMessageHandler"/>

<bean id="ProductAzureArchiveMessageHandler" parent="abstractAzureMoveMessageHandler">
    <property name="remoteDirectory" value="#{azureHotfolderRemotePath}/Product/archive"/>
</bean>

<bean id="ProductAzureErrorMessageHandler" parent="abstractAzureMoveMessageHandler">
    <property name="remoteDirectory" value="#{azureHotfolderRemotePath}/Product/error"/>
</bean>


Step 10: Other Basic Configurations

These supporting beans ship with the cloudhotfolder/azurecloudhotfolder extensions; we can override them and adjust them to our requirements. Class names marked as assumed, and "..." placeholders for values, should be checked against your version of the extensions:

<bean id="ProductAzureBlobSynchronizingMessageSource"
      class="de.hybris.platform.cloud.azure.hotfolder.remote.inbound.AzureBlobSynchronizingMessageSource"> <!-- class name assumed -->
    <constructor-arg name="synchronizer" ref="ProductAzureBlobInboundSynchronizer"/>
    <property name="autoCreateLocalDirectory" value="true"/>
    <property name="localDirectory" value="#{azureHotfolderLocalDirectoryBase}/"/>
    <property name="maxFetchSize" value="${...}"/>
</bean>

<bean id="batchTransformerTask"
      class="de.hybris.platform.acceleratorservices.dataimport.batch.task.ImpexTransformerTask">
    <property name="fieldSeparator" value=","/>
    <property name="encoding" value="UTF-8"/>
    <property name="linesToSkip" value="0"/>
    <property name="cleanupHelper">
        <null/><!-- not needed for tests -->
    </property>
</bean>

<bean id="azureHotfolderLocalDirectoryBase" class="java.lang.String">
    <constructor-arg name="value" value="..."/> <!-- local working directory -->
</bean>

<bean id="azureHotFolderBlobSession" class="..."> <!-- session class from azurecloudhotfolder -->
    <constructor-arg name="client" ref="..."/>
    <constructor-arg name="containerName" value="..."/>
    <constructor-arg name="createContainerIfNotExists" value="..."/>
</bean>

<bean id="azurePsuedoTxManager"
      class="org.springframework.integration.transaction.PseudoTransactionManager"/>

Step 11: Import the above XML configuration in extension-spring.xml

 <import resource="extension/custom/cloudHotFolder-ProductReceiving-spring.xml"/>

Step 12: ImpEx Runner Task

Sometimes the requirement is to take a custom action when an ImpEx import fails. In that case, we can define a bean:

<!-- ImpEx Runner Task -->
<bean id="batchRunnerTask" class="com.melco.integration.service.customService.CustomImpexRunnerTaskService">
    <property name="sessionService" ref="sessionService"/>
    <property name="importService" ref="aopMonitoringImportService"/>
    <lookup-method name="getImportConfig" bean="importConfig"/>
</bean>

The class mentioned above should extend AbstractImpexRunnerTask.

Write your custom code inside the processFile method, where you can define what actions to take if there is an error during the import:

if (importResult.isError() && importResult.hasUnresolvedLines()) {
    // perform some action
}


What happens to the placed file?

  1. The cloud hot folder moves any file placed in the Blob directory to a temporary processing directory.

  2. The cloud hot folder downloads the file from the processing directory to a standard hot folder in SAP Commerce Cloud.

  3. The standard hot folder decompresses the file and converts it into ImpEx format using the ImpexConverter for import.

  4. When the hot folder finishes processing the file, the cloud hot folder moves it out of the temporary processing directory: to the archive directory on success, or to the error directory on failure.
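The file lifecycle above can be sketched with plain file moves. This is illustrative only; the directory names follow the snippets in this post, not the actual extension code:

```java
import java.nio.file.Files;
import java.nio.file.Path;

public class HotfolderLifecycleSketch {

    // Moves the file into processing/, then into archive/ on success
    // or error/ on failure, mirroring the cloud hot folder flow.
    static Path process(Path hotfolder, Path file, boolean importSucceeded) throws Exception {
        Path processing = Files.createDirectories(hotfolder.resolve("processing"));
        Path inProcessing = Files.move(file, processing.resolve(file.getFileName()));
        // ... download, ImpEx conversion and import happen here ...
        Path target = Files.createDirectories(
                hotfolder.resolve(importSucceeded ? "archive" : "error"));
        return Files.move(inProcessing, target.resolve(inProcessing.getFileName()));
    }

    public static void main(String[] args) throws Exception {
        Path hot = Files.createTempDirectory("hotfolder");
        Path file = Files.createFile(hot.resolve("product-test.csv"));
        System.out.println(process(hot, file, true)); // ends with archive/product-test.csv
    }
}
```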


I hope this tutorial gave you a small overview of how the Cloud Hot Folder works and the steps needed to configure it.
If you have any doubts, please feel free to drop a comment.

Thanks & Regards,
Sagar Bhambhani