
Data connection between Azure Data Lake Storage Gen2 and SAP IBP

peter_casper
Contributor

Accepted Solutions (1)

matthias_fitz
Explorer

Hello @peter_casper 
we've implemented an import from an Azure Data Lake. As the two colleagues have already mentioned, you have to use two configurations. To help you figure this out faster than we did, I'll provide some screenshots of our configuration. If your requirement is to send data there, you can use the identical configuration but switch the source and target system within the data flow.


First you need to set up the general connection that links CI-DS to the Azure Data Lake. You need to provide the Data Lake Store name, the access credentials, the local directory (on the CI-DS data agent), the remote path prefix within the data lake, and the container.

[Screenshot: matthias_fitz_0-1734601949179.png]
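By the way, if you want to double-check that the Data Lake Store name, credentials, container and remote path prefix actually work before entering them in CI-DS, a quick sketch like the following can help. It uses the azure-storage-file-datalake Python package; the account name, key, container and prefix are made-up placeholders, not our actual values:

from azure.storage.filedatalake import DataLakeServiceClient

ACCOUNT_NAME = "mydatalakestore"       # Data Lake Store name (placeholder)
ACCOUNT_KEY = "<storage-account-key>"  # access credentials (placeholder)
CONTAINER = "ibp-exchange"             # container (placeholder)
REMOTE_PREFIX = "inbound/ibp"          # remote path prefix (placeholder)

service = DataLakeServiceClient(
    account_url=f"https://{ACCOUNT_NAME}.dfs.core.windows.net",
    credential=ACCOUNT_KEY,
)
file_system = service.get_file_system_client(CONTAINER)

# List whatever already exists under the remote path prefix.
for entry in file_system.get_paths(path=REMOTE_PREFIX):
    print(entry.name)

If this listing works with the values you plan to use, the CI-DS datastore configuration should at least not fail on connectivity or authorization.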

Next, you need to create the file format. In the configuration section you link it to the connection created beforehand.

[Screenshot: matthias_fitz_1-1734602218539.png]
Once you have saved it, you have to define the file structure that you are either reading from or writing to the data lake.

[Screenshot: matthias_fitz_2-1734602322216.png]
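If it helps for testing the read direction, a small sketch like this can drop a sample file under the remote path prefix so the data flow has something to pick up. The columns, delimiter and file name below are invented examples, not our actual file format:

from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    account_url="https://mydatalakestore.dfs.core.windows.net",  # placeholder account
    credential="<storage-account-key>",
)
file_system = service.get_file_system_client("ibp-exchange")      # placeholder container

# Invented three-column, semicolon-delimited sample matching a simple file structure.
sample = "PRDID;LOCID;QUANTITY\nP-100;DC-01;250\nP-200;DC-01;80\n"

# Upload it under the remote path prefix used in the file format configuration.
file_client = file_system.get_file_client("inbound/ibp/sample_keyfigures.csv")
file_client.upload_data(sample, overwrite=True)

The delimiter and column order in the sample must of course match what you define in the file structure, otherwise the data flow will read garbage.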

That is all the configuration; next you only need to create a data flow for it and exchange data. Feel free to ask further questions if needed :).

Best regards,
Matthias

 

peter_casper
Contributor

Hi Matthias,

Thanks for the illustrated step by step guide. This will help a lot.
I just have a question to improve my understanding of the data agent. So far I am only aware of installing agents at OS level on on-premise SAP systems, which handle the communication to the SAP IBP cloud. Where is the agent you are referring to running when setting up the Azure Data Lake datastore in CI-DS?

Cheers, Peter

matthias_fitz
Explorer

Hi @peter_casper,


that is a great question that I cannot answer for sure. We established that connection quite a while ago, so I had to go through the communication we documented. First, we are using the same data agent that is used for the connection to ERP, so you do not need another one. But it is necessary to provide a directory on the data agent that can be used as temporary data storage. In my screenshot you can see that in the "Azure_Cloud_Connection" setup: on the right-hand side, in the File System section, we assigned "Local Directory" = "/temp". I cannot give you further information on the setup that was necessary on the data agent itself, because the written communication stops there; at that point I met in person with my colleague from the SAP Basis team who is responsible for the data agent.
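If you want to verify on the data agent host that the configured local directory exists and is writable, a quick check like the following works when run as the agent's OS user (the "/temp" path is the one from our screenshot; yours may differ):

import os
import tempfile

LOCAL_DIR = "/temp"  # the "Local Directory" value from the datastore configuration

if not os.path.isdir(LOCAL_DIR):
    raise SystemExit(f"{LOCAL_DIR} does not exist on this host")

# Create and remove a scratch file, similar to what the agent does when staging transfer data.
with tempfile.NamedTemporaryFile(dir=LOCAL_DIR) as scratch:
    scratch.write(b"ci-ds staging test")
print(f"{LOCAL_DIR} is writable")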

Apart from that, we had a small issue with the authorization provided in Azure, but that was some trial and error until it was fixed.

Answers (2)

JoyD
Product and Topic Expert
Hi, in CI-DS you can create a File Location of type Azure Data Lake Storage Gen2. Then associate a File Format Group with this ADLS Gen2 file location, and use the FFG and IBP as the data flow source/target. Please refer to https://help.sap.com/docs/cloud-integration-for-data-services/help-center-for-sap-cloud-integration-... Best regards, Joy
April_Miao
Product and Topic Expert
You can create a File Location with Azure Data Lake Storage. See the details in the document Create a File Location Object (https://help.sap.com/docs/cloud-integration-for-data-services/help-center-for-sap-cloud-integration-...). Then create a task that reads from the IBP datastore and loads into a file that is uploaded to Azure.