Hello and welcome back to your ABAP Cloud with Microsoft integration journey. Part 3 of this series covered defining a modern GraphQL API on top of your ABAP Cloud RAP APIs, exposing a single API endpoint that can consume many different OData, OpenAPI, or REST endpoints at the same time.
Today will be different. Sparked by an SAP community conversation with dpanzer and lars.hvam, including a community question by rammel on working with files in ABAP Cloud, I got inspired to propose a solution for the question below:
Before we dive into my proposal, here is a list of alternative options that I came across, as food for thought for your own research.
| Option | Integration approach |
| --- | --- |
| Mount a file system to a Cloud Foundry app | Create a custom API hosted by your CF app and call it via http client from ABAP Cloud |
| Connect to an SFTP server via SAP Cloud Integration | Design an iFlow and call it via http client from ABAP Cloud |
| Integrate with SAP Document Management Service | Call the SAP BTP REST APIs from ABAP Cloud directly |
| Integrate with SAP BTP Object Store, exposing hyperscaler storage services using SDKs | Create a custom API hosted by your CF or Kyma app and call it via http client from ABAP Cloud |
| Serve directly from ABAP code via XCO | Base64-encode your file content, wrap it into ABAP code, and serve it as an XCO class. Lars likes it at least 😜. There were sarcastic smiles involved and some more “oh please”, so take it not too seriously. |
| Raise an influencing request at SAP to release something like the former NetWeaver MIME repository | Live the dream |
A common theme among all the options is the need to interact with them from ABAP Cloud via the built-in http client. On the downside, some options require an additional app on CF or Kyma to orchestrate the storage interactions.
Ideally ABAP Cloud integrates directly with the storage account to reduce complexity and maintenance.
You guessed right: my own proposal focuses on direct integration with Azure Blob Storage.
To get started with this sample I ran through the SAP developer tutorial “Create Your First ABAP Cloud Console Application” and steps 1-6 of “Call an External API and Parse the Response in SAP BTP ABAP Environment”. This way you can easily reproduce everything from an official reference.
Got your hello world running in Eclipse? Great, onwards and upwards in the stack we go then 🪜. Or down to the engine room, depending on your perspective.
All the blob storage providers offer various options to authenticate with the service. See the current coverage for Azure here.
Fig.1 Screenshot of supported authentication methods for Azure Storage
The Microsoft Entra ID option offers superior security capabilities compared to access keys – which can be leaked or lost for example – and is therefore recommended by Microsoft.
For developer ease, I left the code using the simpler-to-configure “Shared Access Signature (SAS) tokens” commented out in the shared GitHub repos. SAS tokens can be created from the Azure portal with two clicks.
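For illustration, using such a portal-generated token from ABAP boils down to appending it to the blob URL as a query string. A minimal sketch with placeholder account, container, and a truncated token:

" With SAS, the authorization travels inside the URL - no extra token flow needed.
TRY.
    DATA(lo_sas_dest) = cl_http_destination_provider=>create_by_url(
      |https://<your-account>.blob.core.windows.net/my-container/booking-request.json| &&
      |?sv=2022-11-02&ss=b&srt=o&sp=rw&sig=<signature>| ).
  CATCH cx_http_dest_provider_error INTO DATA(lx_sas_error).
    " Handle configuration errors here.
ENDTRY.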
The shared key approach requires a bit of hashing and marshaling in ABAP. Use the ABAP SDK for Azure to accelerate that part of your implementation. Check its “get_sas_token” method for reference.
Anonymous read access would only be OK for less sensitive content like static image files, because anyone who has the URL can access them. For an enterprise-grade solution, however, you will need a more secure protocol like OAuth2 with Microsoft Entra ID.
Technically you could do the OAuth2 token fetching with plain http client requests from ABAP Cloud. See this blog by jacek.wozniczak for instance. However, it is recommended to use the Steampunk “Communication Management” to abstract the configuration away from your code. Think “external configuration store”. It also reduces the complexity of your ABAP code, because Communication Management handles the OAuth2 flow for you.
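For illustration, here is a minimal sketch of that manual flow (tenant id, client id, and secret are placeholders). It shows nicely why you want Communication Management to do this for you:

" Manual OAuth2 client credentials token fetch against Entra ID.
TRY.
    DATA(lo_token_dest) = cl_http_destination_provider=>create_by_url(
      'https://login.microsoftonline.com/<your-tenantId>/oauth2/v2.0/token' ).
    DATA(lo_token_client)  = cl_web_http_client_manager=>create_by_http_destination( lo_token_dest ).
    DATA(lo_token_request) = lo_token_client->get_http_request( ).
    lo_token_request->set_header_field(
      i_name  = 'Content-Type'
      i_value = 'application/x-www-form-urlencoded' ).
    lo_token_request->set_text(
      |grant_type=client_credentials| &&
      |&client_id=<your-client-id>| &&
      |&client_secret=<your-client-secret>| &&
      |&scope=https://storage.azure.com/.default| ).
    " The response JSON carries the bearer token in its "access_token" field and would
    " need to be parsed and attached to every single storage request afterwards.
    DATA(lv_token_json) = lo_token_client->execute( if_web_http_client=>post )->get_text( ).
  CATCH cx_http_dest_provider_error cx_web_http_client_error INTO DATA(lx_token_error).
    " Handle connectivity errors here.
ENDTRY.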
📢 Note: SAP will release the needed capability to maintain OAuth2 scopes in communication arrangements, so they are sent as part of your ABAP Cloud requests, with the upcoming SAP BTP, ABAP environment 2402 release.
So, until then you will need to use the BTP Destination service. Target destinations living on subaccount level by calling them like this (omitting i_service_instance_name, thank you thwiegan for calling that out here):
destination = cl_http_destination_provider=>create_by_cloud_destination(
  i_name       = |azure-blob|
  i_authn_mode = if_a4c_cp_service=>service_specific
).
Or call destinations living in Cloud Foundry spaces like this:
destination = cl_http_destination_provider=>create_by_cloud_destination(
  i_name                  = |azure-blob|
  i_service_instance_name = |SAP_BTP_DESTINATION|
  i_authn_mode            = if_a4c_cp_service=>service_specific
).
For the above Cloud Foundry variation you need to deploy the “standard” communication scenario SAP_COM_0276. My generated arrangement ID in this case was “SAP_BTP_DESTINATION”.
Be aware that SAP marked the approach with BTP destinations as deprecated for BTP ABAP. And we can now see why: it will be much nicer to work from the single initial communication arrangement only, rather than carrying the overhead of additional services and arrangements. Looking forward to that in February 😎
Not everything is “bad” about using BTP destinations with ABAP Cloud though. They have management APIs, which communication arrangements don’t have yet. Also, re-using APIs across your BTP estate beyond the boundary of your ABAP environment tenant is useful.
A fully automated solution deployment with the BTP and Azure Terraform providers is only possible with the destination service approach as of today.
See this TechEd 2023 session and watch this new sample repos (still in development) for reference.
The application flow is quite simple once the authentication part is figured out.
Access your communication management config from your ABAP web UI: https://your-steampunk-domain.abap-web.eu20.hana.ondemand.com/ui#Shell-home
Steampunk supports the typical set of authentication flows over http for outbound communication users that you are used to from BTP. I chose the OAuth2 Client Credentials grant because it is the most widely referenced in the BTP world and reasonably secure.
Fig.2 ABAP Cloud API flow including OAuth2 token request from Microsoft Entra ID
Since I am integrating with an Azure Storage account, I need to authenticate via Microsoft Entra ID (formerly known as Azure Active Directory). Yes, Microsoft likes renaming stuff from time to time, too 😉.
Using the Azure Storage REST API I can create, update, delete, and list files as I please.
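To make that tangible, here is a minimal sketch of the “Put Blob” operation, reusing the destination variable from above. Container and blob names are made up, and the destination is assumed to point at https://<your-account>.blob.core.windows.net:

" Upload a small text blob via the Azure Storage REST API ("Put Blob" operation).
TRY.
    DATA(lo_client)  = cl_web_http_client_manager=>create_by_http_destination( destination ).
    DATA(lo_request) = lo_client->get_http_request( ).
    lo_request->set_uri_path( i_uri_path = '/my-container/booking-request.json' ).
    lo_request->set_header_field( i_name = 'x-ms-blob-type' i_value = 'BlockBlob' ).
    lo_request->set_header_field( i_name = 'x-ms-version'   i_value = '2021-08-06' ).
    lo_request->set_header_field( i_name = 'Content-Type'   i_value = 'application/json' ).
    lo_request->set_text( |\{ "booking": "demo" \}| ).
    DATA(ls_status) = lo_client->execute( if_web_http_client=>put )->get_status( ).
    " 201 Created signals a successful upload.
  CATCH cx_web_http_client_error INTO DATA(lx_error).
    " Handle connectivity errors here.
ENDTRY.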
The Entra ID setup takes a couple of clicks: create a new app registration from the Microsoft Entra ID service on your Azure portal and generate a new secret. Beware of the expiry date!
The preferred option below will start working once SAP adds the scope parameter for the OAuth2 Client Credentials grant, as described before.
Fig.3 Screenshot of attribute and secret mapping for ABAP Cloud Outbound user
For now, let’s have a look at a destination on subaccount level instead. Be aware that the scope parameter needs to be “https://storage.azure.com/.default” (see fig.4 below, additional properties section called “scope” on the bottom right). That is also the setting we are missing for the preferred approach mentioned above.
The standard login URLs for OAuth token endpoints on Microsoft Entra ID are the following:
https://login.microsoftonline.com/your-tenantId/oauth2/v2.0/token
https://login.microsoftonline.com/your-tenantId/oauth2/v2.0/authorize
Fig.4 Screenshot of attribute mapping from Entra ID to SAP BTP Destination
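For orientation, the destination from fig.4 boils down to a handful of settings (values are placeholders from my setup):

URL: https://<your-account>.blob.core.windows.net
Authentication: OAuth2ClientCredentials
Client ID / Client Secret: taken from the Entra ID app registration created above
Token Service URL: https://login.microsoftonline.com/your-tenantId/oauth2/v2.0/token
Additional property “scope”: https://storage.azure.com/.default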
So far so good. Let’s roll the integration test from our ABAP console application on Eclipse (ADT).
Fig.5 Screenshot of file interaction from ABAP Cloud and data container view on Azure
Excellent, there is our booking request: safe and sound, stored as an Azure blob, posted from ABAP, and read back seamlessly.
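For completeness, reading the file back is the same pattern with a plain GET; a minimal sketch with a fresh client on the same destination:

" Download the blob again ("Get Blob" operation).
TRY.
    DATA(lo_get_client)  = cl_web_http_client_manager=>create_by_http_destination( destination ).
    DATA(lo_get_request) = lo_get_client->get_http_request( ).
    lo_get_request->set_uri_path( i_uri_path = '/my-container/booking-request.json' ).
    lo_get_request->set_header_field( i_name = 'x-ms-version' i_value = '2021-08-06' ).
    DATA(lv_content) = lo_get_client->execute( if_web_http_client=>get )->get_text( ).
  CATCH cx_web_http_client_error INTO DATA(lx_get_error).
    " Handle connectivity errors here.
ENDTRY.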
See the shared Postman collection to help with your integration testing.
Thoughts on production readiness
The biggest caveat is the regularly required OAuth2 client credential secret rotation. Unfortunately, credential-free options with Azure Managed Identities are not possible, because BTP is hyperscaler-agnostic and does not expose the underlying Azure components to you.
Some of you might say next: let’s use client certificates with “veeery long validity time frames like 2038” to push the problem so far out that someone else will have to deal with it. Well, certificate lifetimes keep getting shorter (TLS certs, for instance, have had a maximum of 13 months at DigiCert since 2020) and you will have to rotate them eventually, too 😉. As a side benefit, shorter certificate lifetimes bring more secure hashing algorithms into effect much quicker.
I will dedicate a separate post to client certificates (mTLS) with Steampunk to consume Azure services.
What about federated identities? You could configure trust between your SAP Cloud Identity Services (or Steampunk auth service) and Microsoft Entra ID to allow requests from ABAP Cloud to authorize against Azure services. However, that would be a more complex configuration with implications for your overall setup, causing larger integration test needs. And we embarked on this journey to discover a simple solution not too far away from AL11 and the likes, right? 😅
See a working implementation of federated identities with SAP Cloud Identity Services consuming Microsoft Graph, published by my colleague mraepple in his blog series here.
OK then, let’s compromise and see how we can automatically rotate secrets. Azure Key Vault exposes events for secrets, keys, and certificates to inform downstream services about upcoming expiry. With that, a small low-code app can perform the secret update. See the sample below, which went the extra mile and asked the admins via Microsoft Teams whether they wanted to perform the change or not:
Fig.6 Architecture of secret rotation with Azure Key Vault and secret refresh approval
A new secret for the app registration on Entra can be generated with the Microsoft Graph API, as sketched below. See this post for details on the Azure Key Vault aspects of the mix.
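For illustration, the Graph call has roughly the following shape. I sketch it in ABAP for consistency with the rest of this post, even though the low-code rotation app would typically be the caller; app object id and Graph token are placeholders:

" Generate a new client secret on the Entra ID app registration via Microsoft Graph.
TRY.
    DATA(lo_graph_dest) = cl_http_destination_provider=>create_by_url(
      'https://graph.microsoft.com/v1.0/applications/<app-object-id>/addPassword' ).
    DATA(lo_graph_client)  = cl_web_http_client_manager=>create_by_http_destination( lo_graph_dest ).
    DATA(lo_graph_request) = lo_graph_client->get_http_request( ).
    lo_graph_request->set_header_field( i_name = 'Authorization' i_value = |Bearer <graph-token>| ).
    lo_graph_request->set_header_field( i_name = 'Content-Type'  i_value = 'application/json' ).
    lo_graph_request->set_text( |\{ "passwordCredential": \{ "displayName": "rotated-by-automation" \} \}| ).
    " The response JSON contains the new secret in its "secretText" field - that is
    " the value to propagate to the BTP destination.
    DATA(lv_new_secret_json) = lo_graph_client->execute( if_web_http_client=>post )->get_text( ).
  CATCH cx_http_dest_provider_error cx_web_http_client_error INTO DATA(lx_graph_error).
    " Handle connectivity errors here.
ENDTRY.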
To apply that flow and propagate the new secret to Steampunk, we need to call BTP APIs to save the new secret. See the BTP REST API for Destinations here to learn about the secret update method.
Have a look at my earlier blog post for specifics on how to do the same with certificates.
The estimated cost for such a secret rotation solution is around 2$ per month for 1000 rotations per month. With simpler configurations and fewer rotations, it can even be covered by free tiers.
Once you have applied the means of automation discussed above, you may incorporate this into your DevOps process and live happily ever after with no manual secret handling 😊.
Final Words
That’s a wrap 🌯. You saw today how, in the absence of an application server file system and the NetWeaver MIME repository (good old days), you can use an Azure Storage account as your external data store from the BTP ABAP Environment (Steampunk) using ABAP Cloud. In addition, you gained insights into the proper setup for authentication and which flavors Steampunk supports today. You got a glimpse into automated deployment of the solution with the BTP and Azure Terraform providers. To top it off, you learnt what else is needed to operationalize the approach at scale with regular secret and certificate rotation.
Check SAP’s docs for external APIs with Steampunk for further official materials.
What do you think, dpanzer and lars.hvam? Not too bad, is it? 😉
Find all the resources to replicate this setup on this GitHub repos. Stay tuned for the remaining parts of the Steampunk series with Microsoft integration scenarios from my overview post.
Cheers
Martin