Technology Blogs by Members
👉🏿back to blog series or jump to GitHub repos🧑🏽‍💻

<< part 1 | part 3 >>

Hello and welcome back to your ABAP Cloud with Microsoft integration journey. Part 1 of this series covered basic SAP RAP business object consumption via OData using Microsoft Excel.

Today we will build on top of that and uplevel the OData API authentication method and governance to enterprise grade🤘.

I know, I know, it is rather annoying given how many authentication blog posts are already out there in this community, but recent customer conversations made me do it anyway. Cross-vendor authentication setup remains a challenge to this day while being a top priority.

Therefore we will raise the client authentication method of the SAP BTP ABAP environment used in part 1 from Basic Auth (username/password) to client certificates.

Since there is no enterprise-readiness without governance features like quota usage limitation and throttling, we will add Azure API Management to the solution. It also lifts the burden from the client apps of maintaining a certificate of their own. Consumers only need to care about API authorization, handled via Microsoft Entra ID (formerly Azure AD); the complexity of mTLS with the ABAP Cloud OData API is abstracted away by Azure API Management.
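As a taste of those governance features, here is a minimal sketch of what throttling and quota enforcement could look like in an APIM inbound policy section. The numbers are placeholders for illustration, not recommendations:

```xml
<inbound>
    <base />
    <!-- throttle to 100 calls per 60 seconds per subscription -->
    <rate-limit calls="100" renewal-period="60" />
    <!-- cap total usage at 10000 calls per week (604800 seconds) -->
    <quota calls="10000" renewal-period="604800" />
</inbound>
```

Both policies ship with APIM out of the box; pick limits that match your backend sizing on BTP.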


Currently SAP supports OAuth2 only on SAP S/4HANA Cloud – ABAP Environment (aka. Embedded Steampunk). See my docs entry to learn about the configuration of SAP Principal Propagation with OAuth2SAMLBearer for Excel (Power Query).

See Thomas Wiegand’s insightful blog post for the available OAuth2 flows that require human interaction during the request, using service keys of your BTP ABAP environment. They are a good fit for testing. For production and service-to-service communication, X.509 is recommended as outlined in this blog until OIDC is fully released.

You may extrapolate from the guidance given in this post to SAP API Management where needed.

If you are looking for BTP on Azure alternatives, see SAP API Management and SAP Credential Store on the Discovery Center.

Fig.1 Overview of login flow with transition from OAuth2 flow to certificate based authentication

Adele is one of the demo users provided with any free M365 sandbox, which includes Microsoft Entra ID. This way you can reproduce the setup without bugging any of your internal Azure admins, who don’t want to come out and play🤸🏽🕹️.

🛈Note on the side: In case you don’t want a single certificate applied for the BTP API, you could map the user id (via a JWT attribute or unique claim) to the desired client certificate known to SAP via the Azure APIM policy, supporting principal propagation. However, the SAP communication arrangement on the ABAP environment supports only one technical user.
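A hypothetical sketch of such a mapping in an APIM inbound policy: the `oid` claim, the GUID, and the certificate ids below are made-up placeholders, and the claim you key on will depend on your Entra ID setup:

```xml
<!-- hypothetical: select the client certificate by a unique claim of the caller -->
<choose>
    <when condition="@(context.Request.Headers.GetValueOrDefault("Authorization","").Split(' ').Last().AsJwt()?.Claims.GetValueOrDefault("oid", "") == "11111111-1111-1111-1111-111111111111")">
        <authentication-certificate certificate-id="client-cert-adele" />
    </when>
    <otherwise>
        <authentication-certificate certificate-id="apim-client-cert-default" />
    </otherwise>
</choose>
```

Keep in mind that each mapped certificate still needs a matching communication user on the BTP side.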

Ready to go?

Obtain a client certificate from SAP supported certificate authorities

SAP maintains a list of which CAs BTP accepts here (SAP note #2801396). Find the Microsoft trusted root CA list here. Ideally choose a certificate issuer present on both lists for maximum convenience. Any CA that is not trusted by SAP will not work, so self-signed certificates are not an option.

There are multiple paths to a functional client cert setup. I will describe just the one that was most straightforward to me.

For this blog, I chose DigiCert, which has a built-in integration experience with Azure Key Vault and is trusted by Azure API Management out of the box.

Get ahold of the PFX file, which contains the certificate and the private key. We will need that for Azure API Management.

Download the client certificate (PEM file) that contains the public key from DigiCert. We will need that for BTP.
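If you prefer scripting over a portal download, the public certificate can also be extracted from the PFX bundle yourself. Below is a minimal sketch using the Python `cryptography` package; file names and the passphrase are placeholders for your own values:

```python
from typing import Optional

from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.serialization import pkcs12


def public_pem_from_pfx(pfx_bytes: bytes, passphrase: Optional[bytes]) -> bytes:
    """Extract only the public certificate (PEM) from a PFX/PKCS#12 bundle.

    The private key is deliberately left out, so the resulting PEM is safe
    to upload to the BTP communication user.
    """
    _key, cert, _chain = pkcs12.load_key_and_certificates(pfx_bytes, passphrase)
    return cert.public_bytes(serialization.Encoding.PEM)
```

For example: `open("btp-upload.pem", "wb").write(public_pem_from_pfx(open("client.pfx", "rb").read(), b"your-passphrase"))`.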

Enable X.509 on your inbound communication scenario on ADT

Navigate to your communication scenario that you created in part 1 as part of the SAP developer tutorial. Make sure X.509 is checked to enable client certificate authentication.

Fig.2 Screenshot of x509 authentication method activation

Move over to BTP to adjust your existing user or create a new one.

Create the Communication User on BTP

Navigate to the “Maintain Communication User” Fiori app, upload the certificate containing the public key (in my case my PEM file). Add this user to your Communication Arrangement.

Fig.3 Screenshot of certificate setup in communication user

Fig.4 Screenshot of communication arrangement with client cert setup

Download the metadata file for OData v2 (v4 has challenges on Excel currently) for the upload into Azure API Management in the next step.

Add the OData v2 booking API to Azure API Management

You have a choice of creating your SAP API layer using OpenAPI or OData natively.

For OpenAPI, have a look here and at this blog on how to convert from OData EDMX where needed. Also check the configuration requirements for OpenAPI for OData clients here. For instance, pay attention to the rewrite-uri policy; otherwise you will see redirects from Steampunk (HTTP 307).
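To illustrate that pitfall: if APIM exposes the API under its own path, an inbound `rewrite-uri` along these lines maps requests back to the backend path so Steampunk does not answer with a redirect. The service path below is a placeholder, not an actual service from this series:

```xml
<inbound>
    <base />
    <!-- placeholder: replace with your actual SAP OData service path -->
    <rewrite-uri template="/sap/opu/odata/sap/YOUR_SERVICE_BINDING/" copy-unmatched-params="true" />
</inbound>
```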

For native OData proceed with the Azure portal directly.

Fig.5 Screenshot of OData create experience

Add the pfx file to Azure API Management

Check the official Microsoft docs entry for more details and context beyond this blog post.

Navigate to Azure Key Vault -> Objects -> Certificates -> Generate/Import and follow the wizard.

Fig.6 Screenshot of cert upload to Azure Key Vault

It is not mandatory to use Azure Key Vault as the certificate store. Azure API Management has its own trust store. But I highly recommend Key Vault for built-in expiry mechanisms to rotate certificates and handling their lifecycle.

Fig.7 Screenshot of Azure Key Vault linked certificate config in Azure APIM

In case your chosen CA is not “well-known”, you also need to upload the intermediate and root certificates to APIM. See the first tab “CA certificates” in the screenshot above.

Check this Microsoft blog for more details about the various flavours and permutations of the CAs.

Configure M365 Organizational Account authentication with Azure APIM

Now that your steampunk API is fronted by Azure APIM, let’s configure the remaining parts of the flow outlined in fig.1.

🛈Be aware

Excel (Power Query) OData login via Organizational Account requires a custom domain setup for your APIM instance. Have a look here for the details. Furthermore, I configured trust between Power Query and Azure API Management. See the standard app id of Power Query listed under authorized applications. This way no additional consent from any user like Adele will be required.

To configure the authentication flow open the policy code view from “All Operations” or “API Policies“ tab.


Automate the API + policy creation using Azure Developer CLI. See this repos using SAP Cloud SDK for OData for details.

Fig.8 Screenshot how to navigate to APIM policy code view for OpenAPI APIs

Paste the provided policy and configure your custom API Management domain in line 37 as well as your certificate ID in line 55. Check the settings for the exposed host and path on the OData response transmitted through APIM in line 68 to match your setup.
<policies>
    <inbound>
        <base />
        <choose>
            <!-- if the Bearer token is empty, assume a Power Query sign-in request as described here: -->
            <when condition="@(context.Request.Headers.GetValueOrDefault("Authorization","").Trim().Equals("Bearer"))">
                <set-status code="401" reason="Unauthorized" />
                <set-header name="WWW-Authenticate" exists-action="override">
                    <value>Bearer authorization_uri={{AADTenantId}}/oauth2/v2.0/authorize?response_type=code%26client_id=a672d62c-fc7b-4e81-a576-e60dc46e951d</value>
                </set-header>
            </when>
        </choose>
        <validate-jwt header-name="Authorization" failed-validation-httpcode="401" require-scheme="Bearer">
            <openid-config url="{{AADTenantId}}/.well-known/openid-configuration" />
            <required-claims>
                <claim name="scp" match="all" separator=" " />
            </required-claims>
        </validate-jwt>
        <!-- after successful JWT validation drop the Authorization header and add the client certificate for the SAP BTP ABAP environment -->
        <set-header name="Authorization" exists-action="delete" />
        <!-- drop unsupported response encoding "br" if present -->
        <set-header name="Accept-Encoding" exists-action="override">
            <value>gzip, deflate</value>
        </set-header>
        <!-- select the client cert from Azure Key Vault or the Azure APIM certificate store by id -->
        <authentication-certificate certificate-id="apim-client-cert-steampunk" />
    </inbound>
    <backend>
        <base />
    </backend>
    <outbound>
        <base />
        <!-- ensure downstream API metadata matches the APIM caller domain in Power Query.
             Drop *.Path if SAP paths are mapped 1:1 from APIM. Find other URI components if needed here: -->
        <find-and-replace from="@(context.Api.ServiceUrl.Host + context.Api.ServiceUrl.Path)" to="@(context.Request.OriginalUrl.Host + context.Api.Path)" />
    </outbound>
</policies>

See this docs entry for more details about the client certificate authentication policy.
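To make the intent of the outbound `find-and-replace` tangible, here is a plain Python sketch of the same rewrite (the host and path values are made up): it swaps the backend SAP host in the OData payload for the APIM domain the caller actually used, so Power Query keeps routing follow-up requests through APIM instead of hitting the backend directly.

```python
def align_odata_hosts(payload: str, backend_base: str, apim_base: str) -> str:
    """Replace the backend host/path with the APIM host/path in an OData
    payload, mirroring APIM's outbound find-and-replace policy."""
    return payload.replace(backend_base, apim_base)
```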

The only thing left to do is an integration test 😎

Open Excel Desktop and pull the SAP bookings using Adele’s user. Navigate to the Data ribbon -> Get Data -> From Other Sources -> From OData Feed -> supply your Azure API Management API endpoint fronting the Steampunk OData API. Choose Organizational Account and hit Sign-in.

Alternatively, sign in to Excel with Adele from the start. In my case below there is a split: I am logged in to Excel with my corporate user Martin and only use Adele for the OData connection. Any combination is possible.

Fig.9 Screenshot of Microsoft Entra ID login with user Adele on Excel Desktop

During the next stage you may influence the OData request as usual.

Fig.10 Screenshot of OData entity selection on Excel after successful login via AAD and BTP client cert

And finally marvel at the successful OData pull in Excel. Remember the authentication flow in Fig.1. The request from Excel (Power Query) went through Microsoft Entra ID and used client certificate authentication for the BTP ABAP environment.

Fig.11 Screenshot of successful OData load on Excel Desktop with live query connection

What else?

Before going all in with Azure API Management, you might want to perform a first integration test against the BTP ABAP environment using the client certificate manually. There are multiple ways to do this as well. See below my approach using Postman’s certificate store as a reference.

Navigate to the settings wheel on the top right, choose the Certificates tab, and supply the PFX file including the passphrase. Make sure to configure “No Auth” on the Authorization tab as shown below.
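If you would rather script the smoke test, a minimal sketch with Python `requests` is shown below. Note that `requests` expects the certificate and private key as separate PEM files, so convert your PFX first; the file names here are placeholders:

```python
import requests


def mtls_session(cert_pem: str, key_pem: str) -> requests.Session:
    """Build a session that authenticates with a client certificate only,
    matching Postman's "No Auth" + certificate store setup."""
    s = requests.Session()
    s.cert = (cert_pem, key_pem)  # PEM pair exported from the PFX bundle
    s.auth = None                 # no Basic Auth -- the certificate is the credential
    s.headers["Accept"] = "application/json"
    return s
```

Usage would then be something like `mtls_session("client.pem", "client.key").get("https://<your-abap-host>/sap/opu/odata/...")` with your own host and service path.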

Fig.12 Screenshot of manual client certificate authentication test from Postman

Verify the certificate has been applied from the Console.

Fig.13 Screenshot of Postman console with attached client certificate on successful OData response from BTP ABAP environment

Learn more about the setup for the presented use case utilizing SAP Principal Propagation (OAuth2SAMLBearer flow) using my Microsoft docs entry:

Enable SAP Principal Propagation for live OData feeds with Power Query | Microsoft Learn

Learn more about mTLS setup consideration with Azure API Management here: Secure APIs using client certificate authentication in API Management - Azure API Management | Micro...

All this setup with wizards and guidance for a single API is nice for deeper understanding and first “get your hands dirty” style of learning things. But what about operationalizing the approach at scale?

See our APIOps guidance to automate the approach and employ governance. I can also recommend our SAP Cloud SDK repos using Azure Developer CLI that handles the whole SAP OData API lifecycle with Azure API Management in a programmatic way.

Final Words

That’s a wrap 🌯 Today you saw how to implement production-ready client certificate authentication for your OData APIs on the BTP ABAP environment using Azure API Management. As the cherry on top, we combined it with Microsoft Entra ID authentication from Microsoft Excel for an integrated end-to-end experience for the business user.

#Kudos to 278832, niklasalbers, and thwiegan for showing me the ins and outs of BTP ABAP Environment (Steampunk) authentication 🫀. And finally, bobbiromsft, who always accelerates me big time on certificates in prototyping endeavors.

Find all the resources to replicate this setup on this GitHub repos. Stay tuned for the remaining parts of the steampunk series with Microsoft Integration Scenarios from my overview post.

