Introduction

Many companies use SAP Analytics Cloud (SAC) for planning alongside Power BI for reporting, and users often ask to use SAC planning data directly in Power BI. This integration can be challenging due to differences in data handling and capabilities. If you are facing similar demands, this blog post provides a practical solution.

In this blog post, we will guide you through creating an export service for SAC that integrates seamlessly with Power BI using Azure Data Factory (ADF). This straightforward setup combines the SAC Data Export Service API, ADF, and a Multi-Action API step to keep compute usage and cost to a minimum.

To demonstrate how this setup works in practice, we've created a demo video. Watch the video below to see how updating data in an SAC story seamlessly updates the data in Power BI. This will give you a clear understanding of the real-time integration between SAC and Power BI using ADF.

ADF_DEMO_GIF.gif

Prerequisites

Before we begin, ensure you meet the following prerequisites:

  • SAC Tenant, including Planning Model and related story
  • Azure Data Factory Account
  • Azure Storage Account with Azure Data Lake Storage (ADLS) Gen2 enabled
  • Power BI

Step-by-Step Guide

  1. Creating the Data Export Pipeline in ADF
  2. Triggering ADF Pipeline through Multi-Action API Step
  3. Connecting Power BI with Azure Data Lake Storage
  4. Running the entire process

Creating the Data Export Pipeline

The pipeline consists of four main parts:

  1. REST Service: A REST linked service pointing at the Data Export Service of the desired model.
  2. Copy Table Names: Identifies all table endpoints for the specified model.
  3. Lookup Tables: Outputs the table names obtained from "Copy Table Names".
  4. ForEach Loop: Uses the SAC REST API to retrieve data from each endpoint and copies it to ADLS.

ADF_PIPELINE.png

To help you visualize the above-mentioned steps, we've created a short GIF that quickly demonstrates the process.

pipeline_GIF.gif

Setting up the REST Service

  1. Create a REST Service:
    1. Set up a REST Service with base URL:
      {your SAC tenant}/api/v1/dataexport/providers/sac/{model ID}

    2. Choose "OAuth 2.0 Client Credential" for the authentications type. We will name this service "REST_EXPORT_API_DEMO".

  2. Generate OAuth Credentials:
    1. In your SAC tenant, navigate to 'System > Administration > App Integration' to find your token endpoint and create a client to get the client ID and client secret.
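
If you want to sanity-check the OAuth client and the Data Export Service URL before configuring the linked service, you can do so outside ADF with a few lines of Python. This is only an optional sketch using the requests library; the token URL, tenant URL, client ID, client secret and model ID below are placeholders to be replaced with the values from your own App Integration settings.

import requests

# Placeholder values (replace with your own tenant details and the client
# created under System > Administration > App Integration).
TOKEN_URL = "https://<your-tenant-token-endpoint>/oauth/token"
CLIENT_ID = "<client id>"
CLIENT_SECRET = "<client secret>"
SAC_TENANT = "https://<your SAC tenant>"
MODEL_ID = "<model ID>"

# OAuth 2.0 client credentials grant -- the same flow the ADF linked
# service "REST_EXPORT_API_DEMO" is configured to use.
token_response = requests.post(
    TOKEN_URL,
    data={"grant_type": "client_credentials"},
    auth=(CLIENT_ID, CLIENT_SECRET),
)
token_response.raise_for_status()
access_token = token_response.json()["access_token"]

# Call the Data Export Service root for the model. The response lists the
# entity sets (tables) that the pipeline will copy in the next steps.
base_url = f"{SAC_TENANT}/api/v1/dataexport/providers/sac/{MODEL_ID}"
response = requests.get(base_url, headers={"Authorization": f"Bearer {access_token}"})
response.raise_for_status()
print(response.json())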

 Setting up "Copy Table Names"

  1. Create a REST Dataset:
    1. Set up a REST dataset using the linked service "REST_EXPORT_API_DEMO".
      Name this table "Tables".

  2. Data Source Configuration:
    1. In "Copy Table Names" select the REST dataset you just created.
      Screenshot 2024-06-18 075716.png

       

    2. Create a CSV file to store the table names. The file path should be:
      {Storage account}/{model ID}/Table_names.csv
      This will be used as the sink for "Copy Table Names".
      Screenshot 2024-06-18 075333.png

       

    3. Use the newly created CSV file as the sink dataset.
      Screenshot 2024-06-18 075809.png

       

    4. Ensure that the mapping includes only the table names.
      Screenshot 2024-06-18 075621.png
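
To make the "Copy Table Names" step more tangible, the following optional sketch replicates it in Python: it reads the service root and writes the table names to a local Table_names.csv. It assumes the root document exposes an OData-style "value" array with a "name" field per table (the same single column kept in the mapping above); base_url and access_token are obtained as in the earlier sketch.

import csv
import requests

# Obtained as in the earlier sketch under "Setting up the REST Service".
base_url = "https://<your SAC tenant>/api/v1/dataexport/providers/sac/<model ID>"
access_token = "<bearer token>"

response = requests.get(base_url, headers={"Authorization": f"Bearer {access_token}"})
response.raise_for_status()

# Assumption: the service root returns an OData-style document whose "value"
# entries carry a "name" field -- the single column kept in the mapping.
table_names = [entry["name"] for entry in response.json().get("value", [])]

# The pipeline writes this file to {Storage account}/{model ID}/Table_names.csv;
# here we simply write it locally for illustration.
with open("Table_names.csv", "w", newline="") as csv_file:
    writer = csv.writer(csv_file)
    writer.writerow(["name"])
    writer.writerows([name] for name in table_names)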

       

Setting up "Lookup Tables"

Use the sink dataset from the "Copy Table Names" step as the source for "Lookup Tables".

Screenshot 2024-06-18 080448.png

Configuring the "ForEach" Loop

Under settings, set the "Items" to: @activity('Lookup tables').output.value
Screenshot 2024-06-18 080524.png

This configuration ensures the loop iterates over all table names, using them in the subsequent copy data activities within the loop.
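
As a mental model, the short snippet below shows, with made-up values, the shape of the Lookup output that the ForEach receives and how the ADF expressions map onto it. The table names here are purely illustrative.

# Illustrative only: the Lookup activity returns something of this shape.
# The "name" field matches the header of Table_names.csv.
lookup_output = {
    "count": 2,
    "value": [
        {"name": "FactData"},
        {"name": "MasterData"},
    ],
}

# "@activity('Lookup tables').output.value" corresponds to lookup_output["value"],
# and "@item().name" corresponds to row["name"] on each iteration.
for row in lookup_output["value"]:
    print(row["name"])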

Configuring "Copy Data" within the "ForEach" loop

  1. Set up a new REST Dataset: This will be called "REST_API_ForEach". We will use "REST_EXPORT_API_DEMO" as the linked service, but will now add a dynamic relative URL.
    Screenshot 2024-06-18 080632.png

    Screenshot 2024-06-18 080936.png

     

  2. Setting up Source: We will use the dataset above as the source, together with the "Items" setting configured under "Configuring the "ForEach" Loop". This way we access the endpoint for each table in the given model.
    Screenshot 2024-06-18 081014.png

     

  3. Creating a dynamic JSON sink dataset: The JSON sink dataset is created such that we will have one unique JSON file for each table in the model, each inheriting its name from the corresponding table.
    Screenshot 2024-06-18 081048.png

    Screenshot 2024-06-18 081150.png

  4. Setting the sink in "Copy Data": We now simply use the dataset created above along with the "Items" setting configured under "Configuring the "ForEach" Loop".
Screenshot 2024-06-18 081238.png

 

The ADF pipeline is now fully configured and should export the model from SAC into Azure Blob Storage, under the path defined in the first step of "Creating a dynamic JSON sink dataset".
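
For reference, the sketch below mirrors what the ForEach/Copy Data combination does end to end: one request per table endpoint, one JSON file per table. It uses the requests library and writes the files locally; in the pipeline the files land in ADLS instead, and the URL, token and table names are placeholders.

import json
import requests

# Placeholders -- in the pipeline these come from the linked service,
# the Lookup output and the ForEach item.
base_url = "https://<your SAC tenant>/api/v1/dataexport/providers/sac/<model ID>"
access_token = "<bearer token>"
table_names = ["FactData"]  # normally read from Table_names.csv

headers = {"Authorization": f"Bearer {access_token}"}

for table in table_names:
    # Equivalent of the dynamic relative URL in "REST_API_ForEach":
    # one request per table endpoint of the model.
    response = requests.get(f"{base_url}/{table}", headers=headers)
    response.raise_for_status()

    # Equivalent of the dynamic JSON sink: one file per table, named after
    # the table (written to ADLS by the pipeline, locally here).
    with open(f"{table}.json", "w") as json_file:
        json.dump(response.json(), json_file)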

Triggering ADF Pipeline through Multi-Action

Register an App in Azure

  1. Go to 'Home > App Registrations > Endpoints' and copy the Token URL:
    "OAuth 2.0 token endpoint (v2)".
  2. Go to 'Home > App Registrations' and click '+ New registration'.
  3. Name your app and select 'Accounts in this organizational directory only (Single tenant)'.
  4. Copy the 'Application (client) ID'; this is your "OAuth Client ID".

Configure OAuth Credentials

  1. Go to 'Certificates & Secrets' and create a new client secret. Copy the 'Value' as your client secret.
    oauth cred config 1.png

     

  2. Ensure that the created App, "SAC", has the permission 'user_impersonation'.
    oauth cred config 2.png

     

  3. Assign the app the role of 'Data Factory Contributor':
    'Azure Portal > Data factories > Your Data Factory > Access Control (IAM) > Add Role Assignment'.
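
Before creating the connection in SAC, you can optionally verify that the app registration works by requesting a token yourself. The sketch below uses the OAuth 2.0 client credentials grant against the v2 token endpoint; the tenant ID, client ID and secret are placeholders for the values copied above, and the scope targets the Azure management API that the pipeline trigger will call.

import requests

# Placeholders for the values copied from the app registration.
TENANT_ID = "<directory (tenant) ID>"
CLIENT_ID = "<application (client) ID>"
CLIENT_SECRET = "<client secret value>"

# The "OAuth 2.0 token endpoint (v2)" copied from App Registrations > Endpoints.
token_url = f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token"

response = requests.post(
    token_url,
    data={
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        # Scope for the Azure management API used by the createRun call later on.
        "scope": "https://management.azure.com/.default",
    },
)
response.raise_for_status()
access_token = response.json()["access_token"]
print("Token acquired, expires in", response.json().get("expires_in"), "seconds")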

Create the Connection in SAC

Create a connection using the OAuth client ID, client secret, and OAuth 2.0 token endpoint (v2) retrieved in the previous steps.
create connection in SAC.png

The OAuth client ID is a unique, autogenerated ID that identifies the registered client application to the Azure platform and verifies its authorization to access the requested resources. The specific OAuth client ID used above is the 'Application (client) ID' retrieved in the last step of "Register an App in Azure".

The client secret is a confidential password associated with the client application; SAC uses it to prove its identity to the Azure platform. It is the 'Value' retrieved in the first step of "Configure OAuth Credentials".

The "Token URL" is the endpoint that SAC will use to request an access token. The access token is a temporary authorization that is used by the client application to access the requested data or services.

Creating the Multi-Action

  • Add a data action: this publishes changes made to the story before exporting.
    add a data action.png

  • Configure the export API step: choose the connection made earlier and use the API URL given below.
config export api step.png

The URL is given by:
https://management.azure.com/subscriptions/{Subscription ID}/resourceGroups/{Resource Group Name}/providers/Microsoft.DataFactory/factories/{Data Factory ID}/pipelines/{Pipeline ID}/createRun?api-version=2018-06-01

Then choose the request method "Synchronous Return".
sync return.png

  • Add the Multi-Action to a story that uses the same model as the one exported by the ADF pipeline.
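
For context, the API step is essentially issuing the request sketched below: an authenticated POST to the createRun endpoint of the pipeline. The subscription ID, resource group, data factory and pipeline names are placeholders, and the bearer token is the one from the client credentials sketch above.

import requests

# Placeholders -- use your own Azure identifiers; the URL pattern is the
# same one configured in the Multi-Action API step.
SUBSCRIPTION_ID = "<subscription ID>"
RESOURCE_GROUP = "<resource group name>"
DATA_FACTORY = "<data factory name>"
PIPELINE = "<pipeline name>"
access_token = "<bearer token from the client credentials sketch>"

run_url = (
    f"https://management.azure.com/subscriptions/{SUBSCRIPTION_ID}"
    f"/resourceGroups/{RESOURCE_GROUP}/providers/Microsoft.DataFactory"
    f"/factories/{DATA_FACTORY}/pipelines/{PIPELINE}/createRun"
    "?api-version=2018-06-01"
)

# Trigger a pipeline run -- this is what happens when the Multi-Action is
# executed from the story.
response = requests.post(run_url, headers={"Authorization": f"Bearer {access_token}"}, json={})
response.raise_for_status()
print(response.json())  # contains the runId of the new pipeline run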

Connecting Power BI with ADLS

Retrieve the Endpoint of the ADLS

  • Go to your storage account in Azure where the tables are saved and search for endpoints.
  • Copy the Data Lake Storage endpoint (not the resource ID).

Configure Power BI

  • In Power BI, click 'Get Data' and select 'Azure Data Lake Storage Gen 2'.
  • Paste the endpoint and add the path to the folder:
    https://{storageaccount}.dfs.core.windows.net/{Blob container}/{model_id}
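
If you want to confirm that the exported files are reachable under this path before wiring up Power BI, a short listing with the azure-storage-file-datalake package can help. This is an optional sketch; the account URL, account key, container and model ID are placeholders.

from azure.storage.filedatalake import DataLakeServiceClient

# Placeholders -- the account URL is the Data Lake Storage endpoint copied
# above; container and model ID make up the folder path Power BI will read.
ACCOUNT_URL = "https://<storageaccount>.dfs.core.windows.net"
ACCOUNT_KEY = "<storage account access key>"
CONTAINER = "<blob container>"
MODEL_ID = "<model_id>"

service = DataLakeServiceClient(account_url=ACCOUNT_URL, credential=ACCOUNT_KEY)
file_system = service.get_file_system_client(CONTAINER)

# List the per-table JSON files exported by the pipeline -- these are the
# files that Power BI will unpack in the next step.
for path in file_system.get_paths(path=MODEL_ID):
    print(path.name)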

Unpack Data

For each table, unpack the data as shown in the screenshots below, and repeat the process for every table in the model.


  1. unpack 1.png

  2. unpack 2.png

  3. unpack 3.png

  4. unpack 4.png

Running the entire process

  1. Modify the Story in SAC: Make changes and trigger the multi-action.
  2. Monitor ADF: Go to ADF and ensure the pipeline is triggered and completed successfully.
  3. Refresh Power BI Data: Refresh the data in Power BI to reflect the changes.

By following these steps, you can efficiently export data from SAC to Power BI using ADF, ensuring a seamless and cost-effective integration.
