Snowflake as a target for SAP Datasphere

mateusz_mikulski
Participant

Hi,

Is it possible to store Datasphere view results in Snowflake, ideally without any additional software in between?

The idea I would like to test is:

  • Expose a BW4 query as an InfoProvider;
  • Read the query as a remote table in a Datasphere view;
  • Consume the view in Snowflake (and store the data there).

I was able to find some material on how to achieve this with SAP Data Intelligence (linked below), but I would prefer to use SAP Datasphere instead.

Any ideas how to do that?

Link to SAP DI - Snowflake blog post:

https://blogs.sap.com/2023/01/19/loading-data-into-snowflake-database-through-sap-di-custom-operator...

Accepted Solutions (0)

Answers (1)


JulianJuraske
Participant

Hello,

So from my understanding, you are trying to build this pipeline:
BW (query) -> DSP -> Snowflake.

First and foremost, it depends on your BW system version, but this link should give you an idea of how to release the queries to DSP: https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/a3d4a2f91bea4810ba8839ff73577dac.html?locale=en-US

Within DSP, all you have to do is import the queries by following the Model Transfer wizard.

https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/a3d4a2f91bea4810ba8839ff73...

The wizard will then create all the corresponding objects, like views, fact models, dimensions, etc.
If needed, you have to expose the objects for consumption (this makes them available for SAC/3rd-party tools).

For consuming the data in a 3rd-party tool, you have two options:

1. OData services
https://help.sap.com/docs/SAP_DATASPHERE/43509d67b8b84e66a30851e832f66911/add771abf6f54c9d8de4c7e470...
However, you can only use views as a source, not perspectives (these get created during the model transfer). See the sketch right after this list for a rough example.
2. Open SQL schema
https://help.sap.com/docs/SAP_DATASPHERE/43509d67b8b84e66a30851e832f66911/4db6f5a329af44509ae422ad70...
This also allows you to consume the created perspectives.
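
Here is a minimal sketch of option 1 from a small Python script. The OAuth endpoint, the consumption URL pattern, the space (SALES) and the view (V_BW_QUERY) are placeholders/assumptions, so check the OData documentation for your tenant for the exact values:

```python
# Minimal sketch (assumptions: OAuth2 client-credentials auth and the relational
# consumption URL pattern; all hosts, IDs and names below are placeholders).
import requests

# 1. Fetch an OAuth token for the Datasphere OData API.
token_resp = requests.post(
    "https://<your-tenant>.authentication.<region>.hana.ondemand.com/oauth/token",
    data={"grant_type": "client_credentials"},
    auth=("<client_id>", "<client_secret>"),
)
token = token_resp.json()["access_token"]

# 2. Read the exposed view (space SALES, view V_BW_QUERY -- placeholder names).
odata_url = (
    "https://<your-tenant>.<region>.hcs.cloud.sap"
    "/api/v1/dwc/consumption/relational/SALES/V_BW_QUERY/V_BW_QUERY"
)
resp = requests.get(odata_url, headers={"Authorization": f"Bearer {token}"})
resp.raise_for_status()
rows = resp.json()["value"]  # OData v4 wraps the records in "value"
print(f"Fetched {len(rows)} rows from the Datasphere view")
```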

Sidenote:
At the moment, Analytical Models can only be consumed by SAC.
So you may have to rebuild the Analytical Models created during the model transfer using the former object type.



mateusz_mikulski
Participant

I don't need Model Transfer; I can also use the query (as an InfoProvider) and read it in Datasphere as a remote table. This works fine. The query/remote table can then be used in a DSP view and consumed from there. This works with some tools, for example Power BI (with an OData data source, see: https://developers.sap.com/tutorials/data-warehouse-cloud-bi7-connect-powerbi.html).

I am wondering if the same can be achieved with Snowflake, or whether I need an additional ETL tool between Datasphere and Snowflake.
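
For reference, a rough sketch of what I mean by skipping a dedicated ETL tool: a small script that pulls the exposed DSP view over OData and writes it into a Snowflake table with the official Snowflake Python connector. All hosts, names and credentials are placeholders, and the OData URL pattern is an assumption rather than a verified setup:

```python
# Sketch only: Datasphere (OData) -> pandas -> Snowflake. Hosts, names and
# credentials are placeholders; OAuth token acquisition is omitted for brevity.
import requests
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

# 1. Pull the exposed Datasphere view over OData (assumed URL pattern).
odata_url = (
    "https://<tenant>.<region>.hcs.cloud.sap"
    "/api/v1/dwc/consumption/relational/SALES/V_BW_QUERY/V_BW_QUERY"
)
rows = requests.get(
    odata_url, headers={"Authorization": "Bearer <token>"}
).json()["value"]
df = pd.DataFrame(rows)

# 2. Write the DataFrame into a Snowflake table.
conn = snowflake.connector.connect(
    account="<account_identifier>",
    user="<user>",
    password="<password>",
    warehouse="<warehouse>",
    database="<database>",
    schema="<schema>",
)
write_pandas(conn, df, table_name="BW_QUERY_SNAPSHOT", auto_create_table=True)
conn.close()
```

Run on a schedule (e.g. cron), something like this would avoid a separate ETL product, at the cost of maintaining the script.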

JulianJuraske
Participant

Look at the last two links for consuming exposed data: OData and Open SQL schema + ODBC.
This is all the information available from the SAP side.

If these links don't answer your question, maybe you should ask it in a Snowflake forum :).
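
For the Open SQL schema route, here is a minimal sketch with the SAP HANA Python client (hdbcli), which goes through the same SQL interface as the ODBC driver. Host, port, user and the schema/view names are placeholders from an assumed setup, not a verified tenant:

```python
# Sketch only: reading an exposed Datasphere view through the Open SQL schema
# with hdbcli (pip install hdbcli). All connection values are placeholders.
from hdbcli import dbapi

conn = dbapi.connect(
    address="<tenant-hana-host>",    # host shown with the Open SQL schema / database user details
    port=443,                        # HANA Cloud exposes SQL over TLS on port 443
    user="<OPEN_SQL_SCHEMA_USER>",   # database user created for the Open SQL schema
    password="<password>",
    encrypt="true",
)
cur = conn.cursor()
# The exposed view lives in the space schema, e.g. "SALES"."V_BW_QUERY" (placeholders).
cur.execute('SELECT * FROM "SALES"."V_BW_QUERY" LIMIT 100')
for row in cur.fetchall():
    print(row)
cur.close()
conn.close()
```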

mateusz_mikulski
Participant

I'm familiar with the links you shared, and while they can be helpful, they're quite general. What I was hoping to find here was someone who has actually established a connection between DSP and Snowflake and who can provide some insights.

JulianJuraske
Participant

I assume Snowflake can import via ODBC?
Here is the link to the SAP developer tutorials on how to connect other BI tools to SAP Datasphere:

https://developers.sap.com/group.data-warehouse-cloud-bi-connect.html
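
And a minimal sketch of the ODBC route itself, assuming the SAP HANA ODBC driver (HDBODBC) is installed on the client and pointed at the Open SQL schema; the connection-string values are placeholders from an assumed setup:

```python
# Sketch only: querying the Datasphere Open SQL schema through the SAP HANA
# ODBC driver (HDBODBC) with pyodbc. Connection values are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={HDBODBC};"
    "SERVERNODE=<tenant-hana-host>:443;"   # host:port of the underlying HANA Cloud
    "UID=<OPEN_SQL_SCHEMA_USER>;"
    "PWD=<password>;"
    "ENCRYPT=TRUE;"
)
cur = conn.cursor()
cur.execute('SELECT TOP 10 * FROM "SALES"."V_BW_QUERY"')  # placeholder schema/view
for row in cur.fetchall():
    print(row)
conn.close()
```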