Technology Blog Posts by Members
pulkit_bhatnagar

Integration scenarios require frequent data communication and exchange between different applications and environments. Cloud data platforms such as Snowflake provide efficient data storage and analytics. The Snowflake Adapter for SAP Integration Suite addresses these integration needs, offering a reliable and scalable capability for data-related integrations.

Snowflake Adapter in SAP CI

The Snowflake Adapter can be used to implement various data solutions; a few examples are as follows:

  • Transaction Analysis for Retail Business: Seamlessly load high-volume transactional data into Snowflake to analyze buying behavior, sales trends, and inventory performance, enabling smarter business decisions.
  • Data Archival & Historical Analysis: Move large datasets into Snowflake for secure storage and long-term trend analysis without impacting operational systems.
  • Supply Chain Data: Integrate logistics, ERP, and vendor data into Snowflake to monitor supply chain efficiency.

In the following sections, we will explore the key functionalities and features of the adapter in detail.

Overview

The Snowflake adapter is designed to be employed as a receiver adapter. In such a scenario, SAP Cloud Integration acts as the initiator of the calls. The adapter uses database authentication and JDBC as the transport protocol.

You can perform various CRUD operations on your Snowflake application and choose from a list of different staging storage options, such as Amazon S3, Google Cloud Storage, and Microsoft Azure.  

Below is a high-level overview of how the adapter interacts with the Snowflake application and SAP Cloud Integration.

pulkit_bhatnagar_0-1759233529958.png

 

Key Features

The Snowflake Adapter provides the following key features:

  • Bulk Export and Import: Supports reading files in multiple input formats such as CSV, JSON, and Avro; export data to your desired format using the unload operation.
  • Snowflake Operations Support: Offers support for standard Snowflake operations (CRUD operations).
  • Multiple Options for Staging Storage: Provides various options to load data into staging storage, such as Amazon S3, Google Cloud Storage, or Microsoft Azure.
  • Secure Authentication: Provides secure authentication with Database Account and Key-Pair options.


Snowflake Adapter Integration: Practical Use Case Walkthrough

Imagine a business scenario where 1 million records from an ERP system need to be loaded into Snowflake. If we take the straightforward route of simply fetching the data from ERP and pushing it into Snowflake, the process could drag on for several minutes. But what if we could speed things up dramatically?

By smartly combining a set of Integration Suite palette components with the Snowflake adapter, I have built a custom batch processing mechanism with which the same operation can be completed in a matter of seconds, delivering both speed and efficiency.

In this blog, I’ll walk you through how to handle such high-volume data insertion with the Snowflake adapter, and we’ll also take a look at how the execution unfolds inside Snowflake itself.

Before we begin, let us assume that a table is available in Snowflake with the required schema.

CI iflow Design

pulkit_bhatnagar_2-1759233617842.png

Mapping Component:

Once the data is received in SAP Integration Suite, the payload is mapped to the required column headers defined in the target table in Snowflake. We employ an Iterating Splitter to split the records into groups (defined in the Grouping field) and then insert the records into Snowflake.

Splitter Component:

Using the Iterating Splitter, you can enable the Streaming checkbox to improve efficiency by splitting a large composite message into smaller chunks for processing. Additionally, you can enable Parallel Processing of split messages, specifying a value for Number of Concurrent Processes.

pulkit_bhatnagar_4-1759233662441.png

Note: You can modify the grouping, utilize parallel processing, and adjust the number of concurrent processes based on your requirements.
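The batching performed by the splitter can be sketched in plain Python. The group size of 10,000 and the record count below are illustrative assumptions for this walkthrough, not recommended settings:

```python
# Hypothetical illustration of how the Iterating Splitter's Grouping value
# turns one large message into batches of records.

def split_into_groups(records, group_size):
    """Split a flat record list into batches, as the Iterating Splitter does."""
    return [records[i:i + group_size] for i in range(0, len(records), group_size)]

records = list(range(1_000_000))          # stand-in for 1M ERP rows
batches = split_into_groups(records, 10_000)

print(len(batches))      # 100 split messages instead of 1M single inserts
print(len(batches[0]))   # each split message carries 10,000 rows
```

With parallel processing enabled, several of these batches are inserted concurrently, which is where most of the speedup comes from.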

XML Modifier Component:

The XML Modifier is used to remove the XML declaration from the XML body, making the XML compatible with Snowflake.

pulkit_bhatnagar_5-1759233691016.png
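The effect of this step can be sketched in a few lines of Python; the helper name and the sample payload here are purely illustrative, not part of the adapter:

```python
import re

def strip_xml_declaration(xml_text: str) -> str:
    """Remove a leading <?xml ...?> declaration, as the XML Modifier step does."""
    return re.sub(r'^\s*<\?xml[^>]*\?>\s*', '', xml_text)

payload = '<?xml version="1.0" encoding="UTF-8"?>\n<root><row/></root>'
print(strip_xml_declaration(payload))  # <root><row/></root>
```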

Content Modifier Component - Payload for Snowflake Adapter Insert Operation:

Content Modifier is used to store the payload to be inserted in the table. The following instructions are handy while creating the body for the payload. 

  1. Specify the table structure including datatype and columns in the <metadata> tag. 
  2. Use the <row> tag for the values to be inserted. Below is a sample XML payload for the Insert operation:
<root> 
	<metadata> 
		<fieldname datatype="VARCHAR">CREATION_DATETIME</fieldname> 
		<fieldname datatype="VARCHAR">RECORDTYPE</fieldname> 
		<fieldname datatype="VARCHAR">DOC_NO</fieldname> 
		<fieldname datatype="VARCHAR">ITEM</fieldname> 
		<fieldname datatype="VARCHAR">CREATED_ON</fieldname> 
		<fieldname datatype="VARCHAR">REF_ITEM</fieldname> 
		<fieldname datatype="VARCHAR">CREATED_BY</fieldname> 
	</metadata> 
	<row> 
		<CREATION_DATETIME>23012025</CREATION_DATETIME> 
		<RECORDTYPE>010101</RECORDTYPE> 
		<DOC_NO>2020202</DOC_NO> 
		<ITEM>12345</ITEM> 
		<CREATED_ON>01012025</CREATED_ON> 
		<REF_ITEM>011</REF_ITEM> 
		<CREATED_BY>AUSR</CREATED_BY> 
	</row> 
</root>

Note: metadata/row tags and datatype attributes are case-sensitive and must be in lowercase. 
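When the upstream mapping is done in a script step rather than graphically, a payload of this shape can also be generated programmatically. The sketch below is a hypothetical helper, assuming a two-column subset of the sample schema above:

```python
import xml.etree.ElementTree as ET

def build_insert_payload(columns, rows):
    """Build the adapter's insert payload: a <metadata> block describing the
    columns, followed by one <row> element per record."""
    root = ET.Element('root')
    metadata = ET.SubElement(root, 'metadata')
    for name, datatype in columns:
        field = ET.SubElement(metadata, 'fieldname', {'datatype': datatype})
        field.text = name
    for record in rows:
        row = ET.SubElement(root, 'row')
        for name, _ in columns:
            ET.SubElement(row, name).text = str(record[name])
    return ET.tostring(root, encoding='unicode')

columns = [('DOC_NO', 'VARCHAR'), ('ITEM', 'VARCHAR')]
xml_payload = build_insert_payload(columns, [{'DOC_NO': '2020202', 'ITEM': '12345'}])
print(xml_payload)
```

Note that `ET.tostring(..., encoding='unicode')` emits no XML declaration, which also sidesteps the issue the XML Modifier step addresses.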

Insert Using Snowflake Adapter

Switch to the Processing tab to perform the Insert.

pulkit_bhatnagar_7-1759233814493.png

Now that the iflow design and configuration are complete, the iflow is ready to accept data.

-------------------------------------------------------------------------------------------------------------------------------

Demo:

Data has been triggered from the ERP system and inserted into Snowflake.


pulkit_bhatnagar_8-1759233843826.png

Navigate to Monitoring, where you can observe that the records are inserted in Snowflake.

pulkit_bhatnagar_9-1759233871340.png

pulkit_bhatnagar_10-1759233887961.png

You can verify the same by logging into the Snowflake application. In the Monitoring section, SQL queries are created for individual records.

pulkit_bhatnagar_11-1759233918072.png

Earlier, using a direct insertion approach, the data load into Snowflake typically required approximately 10–12 minutes. With the optimized iflow design outlined above, the same volume of records is processed and inserted into Snowflake within 30–35 seconds, a significant improvement in performance and efficiency.
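As a rough back-of-the-envelope check on those figures (using midpoints of the quoted ranges as an assumption):

```python
# Throughput comparison based on the numbers quoted above:
# ~1M records; 10-12 minutes direct vs 30-35 seconds batched.
records = 1_000_000
direct_seconds = 11 * 60      # midpoint of 10-12 minutes
batched_seconds = 32.5        # midpoint of 30-35 seconds

speedup = direct_seconds / batched_seconds
print(f"~{speedup:.0f}x faster, "
      f"~{records / batched_seconds:,.0f} records/sec with batching")
```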

Final Thoughts

In summary, the Snowflake adapter enables smooth, secure, and scalable integration between cloud applications and Snowflake. The successful insertion of records highlights its efficiency and reliability, positioning it as a key enabler for enterprises looking to modernize their data landscape and leverage Snowflake’s analytics capabilities.

References: https://api.sap.com/package/SnowflakeAdapterforSAPIntegrationSuite/overview

Note: The Snowflake adapter is available as part of your SAP Integration Suite license.
