Technology Blogs by SAP
Learn how to extend and personalize SAP applications. Follow the SAP technology blog for insights into SAP BTP, ABAP, SAP Analytics Cloud, SAP HANA, and more.
Ian_Henry
Product and Topic Expert
In this blog post I will describe how we can push datasets from SAP Data Intelligence to SAP Analytics Cloud (SAC).

In Data Intelligence, we have two pipeline operators, SAC Formatter and SAC Producer.
There is a great example graph Push to SAP Analytics Cloud, to help get you started.

To achieve this, we use the following capabilities:

  • SAP Data Intelligence Modeler

    • Decode Table

    • SAP Analytics Cloud Formatter

    • SAP Analytics Cloud Producer



  • SAP Analytics Cloud

    • App Integration - Data Set API

    • Dataset consumption



  • SAC Dataset Limitations

    • 2 billion rows

    • 1000 columns

    • 100 MB per HTTP request
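Of these limits, the 100 MB per HTTP request is the one most likely to matter for larger datasets, since the rows then have to be split across multiple requests. As a rough illustration only (the `chunk_rows` helper and the JSON row encoding are my assumptions, not the SAC Producer's internal logic), batching against that limit might be sketched like this:

```python
import json

# SAC Dataset API limit noted above: each HTTP request body must stay under 100 MB.
MAX_REQUEST_BYTES = 100 * 1024 * 1024

def chunk_rows(rows, max_bytes=MAX_REQUEST_BYTES):
    """Split rows into batches whose JSON-serialized size stays under max_bytes.

    Hypothetical helper for illustration -- it only shows the kind of
    splitting the 100 MB per-request limit implies.
    """
    batches, current, current_size = [], [], 2  # 2 bytes for the surrounding "[]"
    for row in rows:
        row_size = len(json.dumps(row)) + 1  # +1 for the separating comma
        if current and current_size + row_size > max_bytes:
            batches.append(current)
            current, current_size = [], 2
        current.append(row)
        current_size += row_size
    if current:
        batches.append(current)
    return batches
```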




Previously I shared How to Automate Web Data Acquisition with SAP Data Intelligence; now we can extend that by pushing the resulting dataset to SAC.

The previous graph, shown below, runs in Data Intelligence.



Before we can push the data to SAC, we need to convert it into the message.table format. For this we use the Decode Table operator, with the input format set to CSV.
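As a mental model of what Decode Table does here, the CSV-to-table step can be sketched in plain Python. This is a minimal illustration only; `decode_csv` and the dict shape are my assumptions, not the operator's actual message.table implementation:

```python
import csv
import io

def decode_csv(csv_text, column_names):
    """Illustrative stand-in for the Decode Table operator: parse CSV text
    into a column-list plus row-list table, roughly the shape that a
    message.table payload carries between operators (assumed structure)."""
    reader = csv.reader(io.StringIO(csv_text))
    rows = [row for row in reader if row]  # drop any blank trailing lines
    return {"columns": column_names, "rows": rows}
```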



Using the SAP Analytics Cloud Formatter, we specify how SAC should create the dataset.


In the exchange rate data downloaded from the web, we do not have usable column headings, so these need to be provided in the Output Schema.



Switching to SAP Analytics Cloud.

SAC has API access that we need to enable. We navigate to System Administration - App Integration, where we see the parameters used to authenticate. We enable this access by adding a new OAuth Client.



Complete the OAuth Client request as below, replacing <SAP_Data_Intelligence_Hostname> with your Data Intelligence host name.

For CF SAC tenants, use this format for the redirect URI:
https://<SAP_Data_Intelligence_Hostname>/**

For Neo tenants, use:
https://<SAP_Data_Intelligence_Hostname>/app/pipeline-modeler/service/v1/runtime/internal/redirect


 SAC CF OAuth Client Information


Press the Add button, and a Client ID and Secret will be generated for use in our Data Intelligence operator.


SAC CF OAuth Client Config


 

If you have a Neo tenant, the New OAuth Client screen is slightly different: you are required to enter the OAuth Client ID and Secret yourself, and we will need these in Data Intelligence.


SAC Neo OAuth Client Configuration


 

Copy and paste the OAuth URLs, Client ID, and Secret from SAC into the SAP Analytics Cloud Producer.


SAP Analytics Cloud Producer Node Config


Our completed pipeline looks like this:



Run the pipeline and check the Wiretap connected to the SAP Analytics Cloud Producer.
It shows an openapi.status_code 401 Unauthorized message.



To resolve the openapi.status_code 401 Unauthorized error, we must authorize the API token access. This is done through the Open UI option of the SAP Analytics Cloud Producer.



Opening the UI brings you to the screen below; click the link to grant authorization.



Granting permission shows you the Access Token.
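Under the hood, this is standard OAuth 2.0: the Client ID and Secret authenticate the token request, and the resulting access token is then sent as a Bearer header on each API call. A minimal sketch of the two header shapes (the helper names are mine; the header formats are the standard OAuth 2.0 / RFC 6750 ones, not anything SAC-specific):

```python
import base64

def basic_auth_header(client_id, client_secret):
    """Standard HTTP Basic header, as used when a client authenticates
    to an OAuth token endpoint with its ID and Secret."""
    raw = f"{client_id}:{client_secret}".encode("utf-8")
    return {"Authorization": "Basic " + base64.b64encode(raw).decode("ascii")}

def bearer_header(access_token):
    """Standard OAuth 2.0 Bearer header sent on each subsequent API request."""
    return {"Authorization": f"Bearer {access_token}"}
```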



Stop and re-run the pipeline, then check the Wiretap once more.

If you see other messages, such as openapi.status_code 403 Forbidden, then you will likely need to have the SAC Dataset API enabled on your SAC tenant. This can be done by logging a support ticket, or a Jira if you are an SAP employee.
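The troubleshooting above can be condensed into a small lookup. This is a hypothetical helper of my own, simply mapping the openapi.status_code values seen in the Wiretap to the fixes described in this post:

```python
# Map the openapi.status_code attribute seen in the Wiretap to the
# remediation described above (illustrative helper, not a DI operator).
REMEDIATION = {
    401: "Unauthorized: open the SAC Producer UI and grant the API token authorization.",
    403: "Forbidden: the SAC Dataset API is likely not enabled on your tenant; log a support ticket.",
}

def diagnose(status_code):
    if 200 <= status_code < 300:
        return "OK: request accepted by SAC."
    return REMEDIATION.get(status_code, f"Unexpected status {status_code}: check the Wiretap output.")
```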



If the Dataset API is enabled, the output should be similar to the Wiretap below, showing that we received two API responses.

At 11:07:57 we see the existing SAC datasets and their IDs.

At 11:08:01 we see the creation of our new dataset, DH_WEBDATA_EUR_GBP.
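If you wanted to pick the new dataset's ID out of such a response programmatically, a hedged sketch could look like the following. The response shape here ([{"id": ..., "name": ...}]) is purely illustrative, not the documented SAC payload:

```python
import json

def find_dataset_id(response_body, dataset_name):
    """Find a dataset's ID by name in a JSON list of datasets.
    Assumes an illustrative [{"id": ..., "name": ...}] response shape."""
    for entry in json.loads(response_body):
        if entry.get("name") == dataset_name:
            return entry.get("id")
    return None
```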



Switching to SAC, we can verify that the dataset has been created under "My Files".



Opening the dataset shows our data safely in SAC.



From here we can quickly build some cool visualisations and even use the data with Smart Predict.


Conclusion


We can now seamlessly push data into SAC from SAP Data Intelligence, whether cloud or on-premise. This could be simple data movement, or a way to integrate the output of machine learning models from Data Intelligence into SAP Analytics Cloud. I hope this blog post has helped you better understand another integration option.