How to ingest data from SAP BW to Azure SQL DB Table using SAP DI Graphs (Pipeline)

joy_guha
Explorer

Hello SAP DI Experts,

We have a requirement to send BW ADSO data (8 columns) from a BW on HANA system to a new table in Azure SQL DB that has the same structure as the BW ADSO (8 columns).


Steps we followed:

1. A new table was created in Azure SQL DB with the same structure as the BW ADSO.

2. An Azure SQL DB connection was created under SAP DI Connections with the AZURE_SQL_DB connection type. The connection check is OK.

3. We set up the BW on HANA system connection under SAP DI Connections with the BW connection type. The connection check is OK.


Challenges we are facing:

1. We created a new graph with the “Data Transfer” operator to transfer data from the BW ADSO; however, the Connection IDs available for Target Mapping do not include the Azure SQL DB connection. Hence, the “Data Transfer” operator could not be used to transfer data from the BW ADSO to the new table in Azure SQL DB.

2. We tried the “SQL Consumer” operator; however, this operator can only read from Azure SQL DB and produce a structured output. It offers no option to read data from the BW ADSO and insert it into the new table in Azure SQL DB, so it could not be used for our requirement.

3. We tried the “Flowagent SQL Executor” operator; however, it is not clear which database the SQL statement is executed against, so we could not work out how to read data from the BW ADSO and ingest it into the new table in Azure SQL DB with this operator.


Our Question to this Forum:
Since we are stuck on how to transfer data from a BW ADSO to a table in Azure SQL DB, we would like to know which SAP DI operator, SAP DI graph template, or combination of SAP DI operators we should use to meet this kind of requirement.

If anyone has experience handling such a requirement, or is aware of any blogs covering a similar one, please share your valuable input.

Many Thanks.

Best Regards, Joy.

Accepted Solutions (0)

Answers (5)

amish1980_95
Participant

Hello Joy,

I hope your issue gets resolved quickly; I will pitch in with my two cents if I find something useful.

I am also trying to connect to an MS Azure cloud SQL DB, and I am facing an issue while connecting to it from SAP Data Intelligence: Error while connecting to service: ESOCKETTIMEDOUT while connecting to Azure Cloud SQL DB | SAP Comm...

The DB port is open, as the DB is accessible from other applications (e.g. BODS).

Is there any other configuration to be maintained to fix the connection?

Thanks,

Indu.

joy_guha
Explorer

Hi Indu,

To set up an AZURE_SQL_DB connection in SAP DI, we need the following information:

1. Host/server name of the Azure SQL DB

2. Port number

3. SQL database name and table name

4. User ID and password with which SAP DI and Azure SQL DB will communicate

All of the above information was provided to us by the SQL DB administrator, and we were able to connect successfully with it.
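For what it's worth, the same parameters can be sanity-checked outside DI with a few lines of Python. This is only a sketch, assuming pyodbc and the Microsoft ODBC driver are installed; server, database, and credentials are placeholders:

# Minimal connectivity check against Azure SQL DB using the same parameters
# (host, port, database, user, password); all values are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=<server>.database.windows.net,1433;"  # host + port
    "DATABASE=<database>;UID=<user>;PWD=<password>;"
    "Encrypt=yes;"  # Azure SQL requires encrypted connections
)
print(conn.cursor().execute("SELECT 1").fetchone())  # expect: (1, )
conn.close()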

We have not faced the error you mentioned, so we suggest you have it checked by the SQL DB administrator; maybe they will have some pointers.

Please share your outcome.

Best regards,

Joy.

shaktik
Explorer

Dear Joy,
I would suggest: if you can read data from BW, you will receive that data in the message. Try storing it in a variable and converting it into CSV / a DataFrame using Python logic (see the sketch below). Once you are able to do this, you can manipulate the data with custom logic and push it to Azure as well.

I would prefer not to stage the data in DI and to handle it at runtime instead, unless you have a business requirement to stage it. Please note this does not necessarily require a custom operator, but it definitely requires custom logic.
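For illustration, here is a rough sketch of what the script body of such a Python3 Operator could look like. The port names and the assumption that the message body arrives as CSV text are mine, not a confirmed contract:

# Sketch of a SAP DI Python3 Operator script: the `api` object is injected
# by the operator runtime; port names "input"/"output" and the CSV-text
# message body are assumptions for illustration.
import io

import pandas as pd

def on_input(msg):
    body = msg.body.decode() if isinstance(msg.body, bytes) else msg.body
    df = pd.read_csv(io.StringIO(body))   # message payload -> DataFrame
    # ... apply any custom transformation logic on df here ...
    api.send("output", df.to_csv(index=False))  # pass the result downstream

api.set_port_callback("input", on_input)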

Thanks
Shakti

joy_guha
Explorer

Hello SAP DI Experts,

We raised an incident with SAP regarding this and received the response that Azure SQL DB is not a supported target in SAP DI, nor does SAP have any plans to add this feature soon.

Hence, as Michal mentioned, we need to try this with a custom operator.

Thank you.

Best Regards,

Joy.

HWO
Explorer

Hi Joy,

maybe it is easier to transfer the data from the ADSO via ODP to Vora, and from there to Azure 😉

Best regards,

Tobi

joy_guha
Explorer

Thanks, Tobias, for responding.
Please note that the SAP Vora connection type is deprecated and will be removed in future releases, as per SAP Note 3036489.

Hence, we are not considering this option at present.

Thank you.

Best Regards,

Joy.

michal_majer
Active Participant

Hello Joy,

You could split this operation into two parts.
First, use the Data Transfer operator to save the data to a CSV file, then read that file and insert the data into Azure using the SQL Consumer.

I know it's not perfect, but I couldn't find any other working solution.

Let me know what you decide 🙂

Best,

Michal

joy_guha
Explorer

Hello Michał,

Thanks for your response.

I tried this approach as well; however, we could not succeed, as there seems to be a file permission issue when reading the CSV file from the Data Lake via the SQL command.

What we did:

1. Used the SAP DI ABAP operator to transfer the BW ADSO data and persisted it as a CSV file in the Data Lake.

2. Used the SAP DI Flowagent SQL Executor operator to read the CSV file (in the Data Lake) and tried the ‘BULK INSERT’ SQL command to insert the data into the new table in Azure SQL DB.

The SQL code is as follows:

-- Bulk-load the staged CSV from the Data Lake path into the target table
BULK INSERT dbo.xxxxx
FROM 'DI_DATA_LAKE/external/xyz.csv'
WITH
(
    FIRSTROW = 2,            -- skip the CSV header row
    FIELDTERMINATOR = ',',   -- columns are comma-separated
    ROWTERMINATOR = '\n',    -- rows end with a newline
    TABLOCK                  -- table lock for bulk-load performance
);

On execution of the graph, it failed with the message below:

Messages: Graph failure: flowagentsqlexecutor1 failed with the following error:

DBS-070401: |Session Job_fc8ea3e4aab94e50b0987d7cf13436c0 ODBC data source <cf-analytics.database.windows.net,1433> error message for operation <SQLExecute>: <[SAP BODS][ODBC SQL Server Wire Protocol driver][Microsoft SQL Server]Cannot bulk load because the file "DI_DATA_LAKE/external/xyz.csv" could not be opened. Operating system error code (null).>. RUN-050304: |Session Job_fc8ea3e4aab94e50b0987d7cf13436c0 Function call <sql ( Datastore_996e5b5046c5444fabca50a7580cbeee, BULK INSERT dbo.xxxxx FROM 'DI_DATA_LAKE/external/xyz.csv' WITH ( FIRSTROW = 2, FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', TABLOCK ) ) > failed, due to error <70401>

Most likely, the persisted CSV file in the Data Lake does not have the required permissions to be read by the SQL command.

Hence, we would like to know whether there is a better approach that can push the BW ADSO data to the new table in Azure SQL DB without persisting it in the Data Lake.

OR

Is it mandatory to persist the BW ADSO data as a CSV file (in the Data Lake) before moving it to Azure SQL DB? If so, how do we grant permission to read that CSV file (in the Data Lake) so that it can be opened during execution of the SQL command?

Many Thanks.

Best Regards, Joy.

michal_majer
Active Participant

I guess it's not possible to point to the file path the way you did; the graph probably couldn't find the file. The SQL statement runs directly on the database, so the file must be reachable from the database's point of view, not from DI's (see the sketch below for what that implies on Azure).
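As far as I know, Azure SQL Database can only BULK INSERT from Azure Blob Storage registered as an external data source, not from an arbitrary file path. Here is a hedged sketch of that workaround, driven from Python via pyodbc; all names, URLs, and credentials are placeholders, and a DATABASE SCOPED CREDENTIAL (e.g. a SAS token) would additionally be needed for a private container:

# Sketch: make the exported CSV visible to Azure SQL DB by registering the
# blob container as an external data source, then BULK INSERT from it.
# Server, database, credentials, container URL, and table are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=<server>.database.windows.net,1433;"
    "DATABASE=<database>;UID=<user>;PWD=<password>;Encrypt=yes;"
)
cur = conn.cursor()

# One-time setup: register the blob container holding the exported CSV.
cur.execute("""
CREATE EXTERNAL DATA SOURCE CsvBlobStore
WITH (TYPE = BLOB_STORAGE,
      LOCATION = 'https://<account>.blob.core.windows.net/<container>')
""")

# BULK INSERT now resolves the file path relative to that data source.
cur.execute(r"""
BULK INSERT dbo.xxxxx
FROM 'xyz.csv'
WITH (DATA_SOURCE = 'CsvBlobStore',
      FIRSTROW = 2, FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')
""")
conn.commit()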

Now I also see that the Table Producer operator does not support Azure SQL DB, which is a shame 😕

I couldn't find any native operator that allows inserting data directly into Azure SQL DB. For now, you could try to create a custom operator using JS / Node.js / Python / Go to implement that on your own; a rough Python sketch follows.
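To make that concrete, here is a minimal, untested sketch of such a custom Python3 Operator. It assumes pyodbc and the Microsoft ODBC driver are available in the operator's Docker image, an inbound port named "input", CSV text in the message body, and a placeholder eight-column target table:

# Sketch of a custom SAP DI Python3 Operator that inserts incoming CSV rows
# into Azure SQL DB. Assumptions: pyodbc + the Microsoft ODBC driver are in
# the operator's Docker image; the inbound port is named "input"; server,
# database, credentials, and the eight-column table are placeholders.
import csv
import io

import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=<server>.database.windows.net,1433;"
    "DATABASE=<database>;UID=<user>;PWD=<password>;Encrypt=yes;"
)

def on_input(msg):
    body = msg.body.decode() if isinstance(msg.body, bytes) else msg.body
    rows = list(csv.reader(io.StringIO(body)))[1:]  # drop the header row
    cur = conn.cursor()
    cur.fast_executemany = True  # send the parameterized inserts in batches
    cur.executemany(
        "INSERT INTO dbo.xxxxx VALUES (?, ?, ?, ?, ?, ?, ?, ?)",
        [tuple(r) for r in rows],
    )
    conn.commit()

api.set_port_callback("input", on_input)  # `api` is injected by the runtime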