Integration scenarios often require frequent data exchange between different applications and environments. Cloud data platforms such as Snowflake provide efficient data storage and analytics, and the Snowflake Adapter for SAP Integration Suite offers a reliable, scalable capability for data-related integrations.
Snowflake Adapter in SAP CI
The Snowflake adapter can be used to implement a variety of data solutions. In the following sections, we will explore the key functionalities and features of the adapter in detail.
Overview
The Snowflake adapter is designed to be employed as a receiver adapter. In such a scenario, SAP Cloud Integration acts as the initiator of the calls. The adapter uses database authentication and JDBC as the transport protocol.
You can perform various CRUD operations on your Snowflake application and choose from different staging storage options, such as Amazon S3, Google Cloud Storage, and Microsoft Azure Blob Storage.
Below is a high-level overview of how the adapter interacts with the Snowflake application and SAP Cloud Integration.
Key Features
The Snowflake Adapter provides the following key features:
Snowflake Adapter Integration: A Practical Use Case Walkthrough
Imagine a business scenario where 1 million records from an ERP system need to be loaded into Snowflake. If we take the straightforward route of simply fetching the data from ERP and pushing it into Snowflake, the process could drag on for several minutes. But what if we could speed things up dramatically?
By smartly combining a set of Integration Suite palette components with the Snowflake adapter, I have built a custom batch-processing mechanism with which the same operation can be completed in a matter of seconds, delivering both speed and efficiency.
In this blog, I’ll walk you through how to handle such high-volume data insertion with the Snowflake adapter, and we’ll also take a look at how the execution unfolds inside Snowflake itself.
Before we begin, let us assume that a table with the required schema is already available in Snowflake.
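For reference, here is a minimal sketch of what such a table could look like, assuming a hypothetical table name ERP_DOCUMENTS and the column names from the payload shown later in this blog (all columns typed as VARCHAR to match the datatype attributes in the example):

    -- Hypothetical target table; adjust the name, columns, and types to your actual schema
    CREATE TABLE ERP_DOCUMENTS (
        CREATION_DATETIME VARCHAR,
        RECORDTYPE        VARCHAR,
        DOC_NO            VARCHAR,
        ITEM              VARCHAR,
        CREATED_ON        VARCHAR,
        REF_ITEM          VARCHAR,
        CREATED_BY        VARCHAR
    );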
CI iFlow Design
Mapping Component:
Once the data is received in SAP Integration Suite, the payload is mapped to the required column headers, which are defined in the target table in Snowflake. We then employ an Iterating Splitter to split the records into groups (defined in the Grouping field) and insert the records into Snowflake.
Splitter Component:
In the Iterating Splitter, you can enable the Streaming checkbox to improve efficiency by splitting a large composite message into smaller chunks for processing. You can also enable Parallel Processing of the split messages and control the degree of parallelism by providing a value for Number of Concurrent Processes.
Note: You can modify the grouping, enable parallel processing, and adjust the number of concurrent processes based on your requirements. For example, a Grouping value of 1,000 would turn one million records into 1,000 batched messages instead of one million individual calls.
XML Modifier Component:
The XML Modifier is used to remove the XML declaration from the message body, making the XML compatible with the Snowflake adapter.
Content Modifier Component - Payload for Snowflake Adapter Insert Operation:
The Content Modifier is used to store the payload to be inserted into the table. Use the following structure when creating the payload body:
<root>
    <metadata>
        <fieldname datatype="VARCHAR">CREATION_DATETIME</fieldname>
        <fieldname datatype="VARCHAR">RECORDTYPE</fieldname>
        <fieldname datatype="VARCHAR">DOC_NO</fieldname>
        <fieldname datatype="VARCHAR">ITEM</fieldname>
        <fieldname datatype="VARCHAR">CREATED_ON</fieldname>
        <fieldname datatype="VARCHAR">REF_ITEM</fieldname>
        <fieldname datatype="VARCHAR">CREATED_BY</fieldname>
    </metadata>
    <row>
        <CREATION_DATETIME>23012025</CREATION_DATETIME>
        <RECORDTYPE>010101</RECORDTYPE>
        <DOC_NO>2020202</DOC_NO>
        <ITEM>12345</ITEM>
        <CREATED_ON>01012025</CREATED_ON>
        <REF_ITEM>011</REF_ITEM>
        <CREATED_BY>AUSR</CREATED_BY>
    </row>
</root>

Note: The metadata/row tags and the datatype attribute are case-sensitive and must be in lowercase.
Insert Using Snowflake Adapter
In the Snowflake adapter, switch to the Processing tab to configure the Insert operation.
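Under the hood, the Insert operation translates each row element of the payload into an INSERT against the target table. As an illustration (a sketch, not the adapter's exact generated SQL), an insert for the sample record above, assuming the hypothetical ERP_DOCUMENTS table from earlier, would look like this:

    -- Illustrative INSERT derived from the sample payload; the adapter generates the actual SQL
    INSERT INTO ERP_DOCUMENTS
        (CREATION_DATETIME, RECORDTYPE, DOC_NO, ITEM, CREATED_ON, REF_ITEM, CREATED_BY)
    VALUES
        ('23012025', '010101', '2020202', '12345', '01012025', '011', 'AUSR');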
Now that the iFlow design and configuration are complete, the iFlow is ready to accept data.
-------------------------------------------------------------------------------------------------------------------------------
Demo:
Data has been triggered from the ERP system and inserted into Snowflake.
Navigate to the Monitoring section, where you can observe that the records have been inserted into Snowflake.
You can verify the same by logging in to the Snowflake application; in its Monitoring section, you can see the SQL queries that were generated for the individual records.
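You can also double-check the load with a quick query from a Snowflake worksheet (again assuming the hypothetical ERP_DOCUMENTS table):

    -- Confirm the number of records loaded and inspect a few rows
    SELECT COUNT(*) FROM ERP_DOCUMENTS;
    SELECT * FROM ERP_DOCUMENTS LIMIT 10;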
With the earlier direct-insertion approach, the data load into Snowflake typically required approximately 10–12 minutes. With the optimized iFlow design outlined above, the same volume of records is processed and inserted into Snowflake within 30–35 seconds, a significant improvement in performance and efficiency.
Final Thoughts
In summary, the Snowflake adapter enables smooth, secure, and scalable integration between cloud applications and Snowflake. The successful insertion of records highlights its efficiency and reliability, positioning it as a key enabler for enterprises looking to modernize their data landscape and leverage Snowflake’s analytics capabilities.
References: https://api.sap.com/package/SnowflakeAdapterforSAPIntegrationSuite/overview
Note: The Snowflake adapter is available as part of your SAP Integration Suite license.