Technology Blogs by SAP
Learn how to extend and personalize SAP applications. Follow the SAP technology blog for insights into SAP BTP, ABAP, SAP Analytics Cloud, SAP HANA, and more.
hoangvu
Product and Topic Expert

Intro

With our August 2024 release, we are happy to share the general availability of our OData API to extract analytics data from SAP Signavio Process Intelligence.

SAP Signavio’s OData API is a scalable, out-of-the-box interface. It aims to make process mining data available to a broader audience and accessible in third-party tools beyond SAP Signavio, such as SAP Analytics Cloud and Microsoft Excel.

Users can integrate valuable process insights in their existing enterprise dashboards, democratizing the value of SAP Signavio Process Intelligence overall. 

Additionally, SAP’s data integration capabilities like SAP Cloud Integration within SAP Business Technology Platform can be leveraged to further ease the integration of the OData API. 

This will play a crucial part for our customers in streamlining their process mining insights into central analytics and reporting tools, such as SAP Analytics Cloud or related BI solutions.

In this blog post, we would like to give you an overview of this new capability.

We will guide you through creating an API token, setting up an OData view that defines which data can be extracted by the API, and finally consuming the API in Microsoft Excel and SAP Analytics Cloud.

Create API Token

Let's start by selecting the process.

hoangvu_0-1723619243042.png

In the top right corner, select the process settings option.

hoangvu_1-1723619243042.png

When you now enter your process settings within a process in SAP Signavio Process Intelligence, you will find a new tab called "OData Views". Here you can define an API token, which is required for technical authentication against the API, as well as an OData view that selects the right data from your event log.

hoangvu_2-1723619243165.png

Let's start by creating a new API token. Here you can define a name for the token and set its validity duration. By default, the duration is set to 90 days.

hoangvu_3-1723619243151.png

Once you are done, you can simply create your token, which is then displayed exactly once. Hence, make sure to copy it, as you will not be able to retrieve the token again once you leave the page.

hoangvu_4-1723619243054.png
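Because the token is shown only once, it is worth putting it somewhere safe right away. As a small illustrative pattern (the variable name SIGNAVIO_ODATA_TOKEN is our own choice for this sketch, not part of the product), a script could read it from an environment variable instead of hardcoding it:

```python
import os

def get_signavio_token() -> str:
    """Read the SAP Signavio OData API token from an environment variable.

    Raises an error if the variable is missing, so a misconfigured job
    fails fast instead of sending an empty password to the API.
    """
    token = os.environ.get("SIGNAVIO_ODATA_TOKEN")
    if not token:
        raise RuntimeError("SIGNAVIO_ODATA_TOKEN is not set")
    return token
```

Any secret store your organization already uses (key vault, CI secrets, etc.) works just as well; the point is simply to keep the token out of source code.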

Create OData View

Now we want to define the data that can be extracted with this OData API. For that, we require the same SIGNAL query we use on our dashboards to query the data in our event log. In this example, I used a table widget on a dashboard and viewed it in SIGNAL mode, where I can easily copy the SIGNAL query to my clipboard.

hoangvu_5-1723619243237.png

Going back to the OData Views tab where I created my API token earlier, I can now create an OData view.

hoangvu_6-1723619243056.png

Here I can give the OData view a name, add a description of the data that can be extracted, and paste in the SIGNAL query that I copied from the table widget on my dashboard.

hoangvu_7-1723619243121.png

Note that column names in your SIGNAL query can't contain blank spaces, so we recommend replacing them with an underscore or a letter.
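As a small illustration of that naming rule, a helper like the one below could clean up proposed column aliases before you paste the query (the function is our own sketch, not part of SIGNAL or the product):

```python
import re

def to_odata_column_name(name: str) -> str:
    """Turn a human-readable column label into an OData-friendly alias.

    Replaces runs of blank spaces and other non-alphanumeric characters
    with a single underscore, e.g. "Case ID" -> "Case_ID".
    """
    return re.sub(r"[^0-9A-Za-z]+", "_", name.strip()).strip("_")
```

You would then use the cleaned name as the alias in your SIGNAL `SELECT` clause.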

Once you are done, you can click Create, and you are navigated back to the OData view page, where your newly created OData view is ready to be used.

hoangvu_8-1723619243226.png

Load data into Microsoft Excel

To perform a quick test against the API before connecting it to proper enterprise applications, you can use tools such as Postman or Microsoft Excel. In this example, as it is an OData API, you can simply open a new Excel worksheet and get data via the Power Query option.

hoangvu_9-1723619243184.png

Here you select the option OData.

hoangvu_10-1723619243193.png

Here you need to define the URL and insert the API token as password under basic authentication. Note that the URL is built up in this format:

https://<baseUrl>/pi/signal/odata/v1/<resource>

The base URL depends on the API endpoint where your Signavio workspace is hosted, and the resource is the name of your OData view; in my case, it is "NPS_Case_Status". The username is not important, as the API token is implicitly linked to your Signavio user. In this case, I used "unknown", but you can use any username, as it will not be applied. More details on this can be found in our official documentation.

hoangvu_11-1723619243060.png
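Beyond Excel, the same call can be scripted. The sketch below (Python, standard library only) builds the documented URL format and authenticates with the token as the basic-auth password. The base URL is a placeholder you must replace, and the paging via `@odata.nextLink` follows the standard OData v4 convention; whether and how this particular service paginates is an assumption on our side.

```python
import base64
import json
import urllib.request

def odata_service_url(base_url: str, resource: str) -> str:
    """Build the service URL in the documented format:
    https://<baseUrl>/pi/signal/odata/v1/<resource>"""
    return f"{base_url}/pi/signal/odata/v1/{resource}"

def basic_auth_header(token: str, username: str = "unknown") -> str:
    """Basic-auth header value. The username is not applied by the API;
    the token (used as the password) identifies your Signavio user."""
    creds = base64.b64encode(f"{username}:{token}".encode()).decode()
    return f"Basic {creds}"

def fetch_rows(base_url: str, resource: str, token: str) -> list:
    """Fetch all rows of an OData view, following standard OData v4
    pagination via @odata.nextLink when the service returns it."""
    rows, url = [], odata_service_url(base_url, resource)
    while url:
        req = urllib.request.Request(
            url, headers={"Authorization": basic_auth_header(token)}
        )
        with urllib.request.urlopen(req, timeout=30) as resp:
            payload = json.load(resp)
        rows.extend(payload.get("value", []))     # rows of the current page
        url = payload.get("@odata.nextLink")      # absent on the last page
    return rows
```

For example, `fetch_rows("https://<baseUrl>", "NPS_Case_Status", token)` would return the same dataset you see in Excel, as a list of dictionaries.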

Once the API call is successful, you should see the dataset returned by your OData API.

hoangvu_12-1723619243148.png

Once you are happy with the data query, you can close and load the data into your Excel worksheet.

hoangvu_13-1723619243118.png

Connect API to SAP Analytics Cloud

Now we want to use the API to extract data into SAP Analytics Cloud. For that, we create a connection.

hoangvu_14-1723619243075.png

Here we select the OData Services option.

hoangvu_15-1723619243214.png

Here we can give the OData connection a name, the URL, and the API token, similarly to the setup in the Excel worksheet. The username is mandatory here as well, but it is not technically required for authentication against the API; I used "not_required", but you can use any word, as it does not get used during authentication. In our example, we use a Signavio workspace in the EU region, which is reflected in the respective URL (the region-specific URLs can be found in the documentation).

More details on this can be found in our official documentation.

hoangvu_16-1723619243198.png

Once the connection is successfully saved, we can use it in a model.

hoangvu_17-1723619243160.png

You can use your newly created connection as a baseline for your model.

hoangvu_18-1723619243181.png

Then the system displays all created OData views that your user has access to. In my example, I have access to the four OData views available in my process. We can select the "NPS_Case_Status" OData view and give the model a similar name.

hoangvu_19-1723619243085.png

After that, we can select which columns we want to display in SAP Analytics Cloud. In our case, we would like to see all the data exposed by the OData API.

hoangvu_20-1723619243078.png

Once the data is successfully imported, you should be able to see the newly extracted data under the data foundation. Here you can use the extensive analytical capabilities of SAP Analytics Cloud to merge this model with your existing analytics artifacts.

hoangvu_21-1723619243220.png

And that is it! Let me know what you think of this new API and what you use it for. In case of questions, feel free to reach out to us.

Additional resources

SAP Help

SAP Business Accelerator Hub

August 2024 release summary

Postman Collection

Enable Now

Analytics API in SAP Signavio Process Governance

5 Comments
thomasbodenmuellerdodek
Product and Topic Expert

Awesome @hoangvu - that's a great enhancement!

LeandroRibeiro
Participant

very nice blog!

thank you.

niels1515
Participant

@hoangvu Thanks for the good blog article, sounds very exciting and meaningful.

Do you have a few examples where the connection to the SAC makes sense? 

Regards,

Niels

hoangvu
Product and Topic Expert

Hi @niels1515

One use case is that you want to centralise all your process analytics efforts into one reporting and analytics layer like SAC, so that your business analysts do not have to go into multiple solutions to find their process insights. With this API, you can extend your current analysis of a process in SAC with additional process mining findings. Let me know if this makes sense to you.

Best regards,

Hoang

JD_WongLoera
Active Participant

Thanks for communicating this @hoangvu, very handy.

hi @niels1515  

I agree with the feedback provided by @hoangvu.

In addition, I want to give a tangible example where this would be relevant.

In retail organizations for consumer packaged goods, it is well known that seasonal products and allocations are usually a nightmare to deal with if there are no good analysis processes and dashboards that can be sliced and diced from different angles.

For example, many unique seasonal SKUs/products are developed and produced for the Halloween season. These products are developed and planned according to marketing analysis and their potential financial benefit to the organization (sales revenue), because they usually carry a premium price. Marketing also tries to balance with the production team and the demand planning team for production time on the factory's lines to produce these unique seasonal SKUs alongside the standard core products, and to justify why it makes sense to produce them, sometimes at the cost of core SKUs with regular volume but no premium price.

The allocation process commonly takes place around Halloween, and it is often a complex process to observe and mine.

Many companies do not even like to "model" an allocation process, because that way there is less accountability for who made the decisions and why. Many times, customers without an initial allocation ended up getting product shipped, while good customers who provided a forecast ended up with nothing (I have seen it happen!) or with less volume than they forecasted and than the demand planner had committed to them per the agreed allocation business rules.

In such cases, SAC would already have dashboards for marketing, sales and finance, often focusing on revenue at a high level, perhaps making it look like some of these SKUs were a sales success, because indeed all volume was sold.

But if we can also pull process mining data related to production and allocation processes, it could show that many allocation business rules were broken along the way, which damages customer relationships and their trust in demand planning and supply network planning, and reveals inefficiencies in internal production and supply network planning processes.

In this example, SAP Signavio Process Intelligence would support process mining by quantifying the cases (case_id) that show up as variants which, in a well-run allocation process, would never have happened, thereby demonstrating the degree of adherence to the business rules typical of allocation best practices.

Then we could have SAC dashboards that show widgets based on process efficiency metrics side by side with the typical sales and marketing dashboards.

This would often paint a very different picture of the success of a seasonal product launch and its final sales revenue, compared to the increased costs and customer service issues in the examples above, which I have actually seen in many organizations.

This is why process execution intelligence is, from my point of view, very important. We now have the chance to mine information that paints a more complete picture of effectiveness and alignment to business models.