Arsheen_Jahan

Introduction:


SAP BusinessObjects (BO) Data Services is an ETL tool used for data integration, data quality, data profiling, and data processing. It allows you to integrate and transform trusted data into a data warehouse system for analytical reporting. BO Data Services consists of a UI development interface, a metadata repository, data connectivity to source and target systems, and a management console for scheduling jobs.

Azure Data Lake includes all the capabilities required to make it easy for developers, data scientists and analysts to store data of any size, shape, and speed, and do all types of processing and analytics across platforms and languages. It removes the complexities of ingesting and storing all your data while making it faster to get up and running with batch, streaming and interactive analytics.

In this blog, our focus is on ingesting data from SAP into Azure Data Lake using SAP DS.

Requirement:


We have a requirement to load SAP data to Azure Data Lake.

Solution:


SAP Data Services is an ETL tool from SAP that improves the quality of data across the enterprise. As part of the information management layer of SAP's Business Technology Platform, it delivers trusted, relevant, and timely information to drive better business outcomes.

Keeping in mind its strong compatibility with SAP BW extractors and its robust ETL functionality, we decided to use SAP Data Services to extract data from SAP to Azure.

Creating a datastore configuration to connect DS to SAP:

For configuring an SAP application datastore in DS, please refer to the link below:

SAP application datastore

Importing the source into DS


Please go through the link below for importing an SAP datasource into DS:

Importing SAP datasource in DS

Development

For development, we are using the standard approach; please refer to the blog below:

DS development approach

Let's discuss how to connect DS to Azure Data Lake by creating a file location object.

File Location objects can then be used in the file configurations within a Dataflow to define the locations of the files that need to be processed or created.
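
Conceptually, an Azure file location boils down to three pieces of information: a storage endpoint, a container, and a credential. As an analogy (not how DS is implemented internally), the short Python sketch below uses Microsoft's azure-storage-blob SDK to show the same three pieces; the account and container names are hypothetical placeholders.

```python
from azure.storage.blob import BlobServiceClient

# The same three pieces of information a DS file location configuration captures.
ACCOUNT_URL = "https://mystorageacct.blob.core.windows.net"  # storage endpoint (hypothetical account)
CONTAINER = "sap-landing"                                    # target container (hypothetical)
CREDENTIAL = "<shared-key-or-sas-token>"                     # one of the three authorization types

# Connect to the storage account and check that the container is reachable.
service = BlobServiceClient(account_url=ACCOUNT_URL, credential=CREDENTIAL)
container_client = service.get_container_client(CONTAINER)
print(container_client.exists())  # True if the container exists and the credential is valid
```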

As our target system is Azure Data Lake, we need to configure a file location object to connect DS to Azure.

Right-click the File Locations folder in the Formats tab of the object library and select New.


Figure 1: Creating a new file location


Enter a Name for the new object in the Create New File Location dialog.


Figure 2: Naming the file location object


Select a protocol from the Protocol drop-down list. The default is Local.

To load the target file into Azure Data Lake, we need to select "Azure Cloud Storage" from the drop-down list:


Figure 3: Protocol options


Then click Edit to change the remaining properties.

We have three authorization types to choose from (a sketch showing how the corresponding Azure credentials are produced follows Figure 4):

- Shared Key

- File (blob) Shared Access Signature

- Container Shared Access Signature



Figure 4: Authorization Types
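
The difference between the two SAS options is scope: a file (blob) SAS is valid for one specific blob, while a container SAS is valid for every blob in one container; the shared key grants full access to the entire storage account. For readers who also manage the Azure side, the sketch below shows how the two SAS variants are typically generated with the azure-storage-blob SDK; the account, container, and file names are hypothetical.

```python
from datetime import datetime, timedelta
from azure.storage.blob import (
    BlobSasPermissions,
    ContainerSasPermissions,
    generate_blob_sas,
    generate_container_sas,
)

ACCOUNT = "mystorageacct"     # hypothetical storage account
KEY = "<account-shared-key>"  # the "Shared Key" authorization type uses this value directly

# File (blob) SAS: grants access to one blob only.
blob_sas = generate_blob_sas(
    account_name=ACCOUNT,
    container_name="sap-landing",
    blob_name="sap_extract.csv",
    account_key=KEY,
    permission=BlobSasPermissions(read=True, create=True, write=True),
    expiry=datetime.utcnow() + timedelta(hours=8),
)

# Container SAS: grants access to every blob in one container.
container_sas = generate_container_sas(
    account_name=ACCOUNT,
    container_name="sap-landing",
    account_key=KEY,
    permission=ContainerSasPermissions(read=True, create=True, write=True, list=True),
    expiry=datetime.utcnow() + timedelta(hours=8),
)
```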



File configuration with authorization type Shared Key:


When configuring with a shared key, we need to provide the highlighted details below:


Figure 5: File Configuration with Shared key


If we want to keep it as the default configuration, set Default Configuration to Yes, then click Apply and OK.
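
For reference, a write with shared-key authorization amounts to the following; this is a minimal Python sketch with the azure-storage-blob SDK using hypothetical account, container, and file names, and DS performs the equivalent internally when the job runs.

```python
from azure.storage.blob import BlobServiceClient

# Hypothetical values mirroring a shared-key file location configuration.
service = BlobServiceClient(
    account_url="https://mystorageacct.blob.core.windows.net",
    credential="<account-shared-key>",  # Shared Key authorization
)

# Upload the target file, as the DS dataflow does when the job executes.
blob_client = service.get_blob_client(container="sap-landing", blob="sap_extract.csv")
with open("sap_extract.csv", "rb") as data:
    blob_client.upload_blob(data, overwrite=True)
```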

We can define multiple configurations in a single file location object.

Next, we will add another configuration with the authorization type File (blob) Shared Access Signature in the same file location.

Formats tab -> File Locations -> SAP_AZR_BLOB -> right-click and select Edit -> click Edit:


Figure 6: Edit

Once the edit window opens, click the topmost icon on the left to create a new configuration:


Figure 7: Adding new configuration


It will ask for a configuration name; enter one that is easily identifiable and click OK:


Figure 8: New configuration name


When configuring with a File (blob) Shared Access Signature, we need to provide the highlighted details below:


Figure 9: Configuration with File (blob) Shared Access Signature
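
The SAS-based configuration results in the same kind of write, only with the token as the credential instead of the account key. A minimal sketch, again with hypothetical names:

```python
from azure.storage.blob import BlobClient

# A blob-scoped SAS token (see the earlier SAS generation sketch); values are hypothetical.
blob_client = BlobClient(
    account_url="https://mystorageacct.blob.core.windows.net",
    container_name="sap-landing",
    blob_name="sap_extract.csv",
    credential="<file-blob-sas-token>",  # File (blob) Shared Access Signature
)

with open("sap_extract.csv", "rb") as data:
    blob_client.upload_blob(data, overwrite=True)
```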


You can find more details at the link below:

Azure Data Lake Store protocol options

Execution:


After execution, the DS job fetched the data from SAP and loaded it into a file in the Azure blob container.


Figure 10: File availability in Azure blob
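
If you want to verify the load outside the Azure portal, a short listing script does the job as well; the names below are hypothetical, and any of the three authorization types works as the credential.

```python
from azure.storage.blob import ContainerClient

# Connect to the landing container (hypothetical names).
container = ContainerClient(
    account_url="https://mystorageacct.blob.core.windows.net",
    container_name="sap-landing",
    credential="<shared-key-or-sas-token>",
)

# List the files the DS job produced.
for blob in container.list_blobs(name_starts_with="sap_"):
    print(blob.name, blob.size, blob.last_modified)
```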



Summary:


Data Services integrates with many other products, including SAP Data Quality Management, SAP Information Steward, SAP BW, SAP Master Data Management, and non-SAP systems, all for different purposes and using different methods. I hope this blog helps show how SAP BusinessObjects Data Services can be used to extract data from SAP datasources (extractors) and load it easily into Azure.

We will follow up with more blogs on the issues we faced while loading data from SAP to Azure Data Lake using DS, so please stay tuned.

Please feel free to share feedback and your thoughts in the comments.

Please refer to the links below for related topics:

SAP application datastore

Importing SAP datasource in DS

DS development approach

Azure Data Lake Store protocol options