Spend Management Blogs by SAP
Stay current on SAP Ariba for direct and indirect spend, SAP Fieldglass for workforce management, and SAP Concur for travel and expense with blog posts by SAP.

For many of you familiar with the Ariba Integration Toolkit (ITK), this will likely feel like the end of an era. With the Ariba ITK reaching end of support, it's decision time: to be pro-cloud or not to be, and, more importantly, whether you are willing to learn from experience and chart a path towards becoming an Intelligent Enterprise.

In this blog post I'm going to talk about:

  1. ITK on BTP

  2. How this offering aligns with SAP's holistic Integration Strategy

  3. Making ITK on BTP smart-er


1. ITK on BTP: The new cool kid in SAP town

With BTP gaining more traction by the day and becoming the client favourite for anything related to SAP extensibility, the choice of a worthy alternative to the legacy Ariba ITK is probably a no-brainer. The new cloud cousin of the ITK, built on Integration Suite, mimics most if not all of the capabilities of the legacy ITK. Here's an excellent blog post by my colleague gabrielmendes where he explains how the ITK on BTP works, with references to the community content available to help you accelerate the deployment of this solution.

Of late, I've been helping my clients adopt this community content - essentially the standard integration flows on Integration Suite. Some aspects I love about this solution:

  • Modularity - The modular nature of Cloud Integration, coupled with the straightforward way these IFlows have been built, rules out any language-specific considerations, making it easy for clients to get up to speed with how they work.

  • Not a black box (anymore) - There's no denying that the legacy ITK was immensely useful, but at the end of the day, being deeply rooted in technicalities, it was a black box.

    While I'm at it, a personal rant:
    I've spent a lot of time understanding how a couple of functionalities worked in the legacy ITK, and it was always overwhelming, to say the least. The lack of documentation on the legacy ITK - documentation which is actually correct - only adds to the cognitive load.

  • Community Support - The ITK on BTP, with its open, community-built nature, intends to put an end to that black-box feel and shine some light on how easily this cloud offering can be adopted. Being hosted on Integration Suite opens it up to the wider community of citizen developers, garnering much broader support when it comes to enhancing functionality or squashing bugs.

2. How this offering aligns with SAP's holistic Integration Strategy

Intelligent Enterprises are Integrated Enterprises

Throughout 2023, I've had various conversations - with clients, partners, colleagues - and it is apparent that SAP is truly moving towards transforming businesses into Intelligent Enterprises.

The Suite Qualities, such as the Aligned Domain Model, take the integration capabilities one step closer to the vision of the Intelligent Enterprise. The ITK on BTP piggy-backs on this vision and utilises customers' existing Integration Suite landscapes to become a one-stop shop for yet another integration scenario. This not only helps customers directly but also drives adoption - cloud adoption - and we couldn't be more excited about that.

How, you ask?

  • One less place to worry about Integration Monitoring

    Leverage Cloud Integration's out-of-the-box monitoring capabilities - that's it, you no longer have to depend on log files stored on your file system. Customers utilising Integration Suite stand to gain a lot here, because their integrations are no longer scattered across multiple places; the 'Monitoring' tab has you covered. And even if you like running multiple integration platforms, SAP has yet again got you covered: unleash the power of Cloud Integration's OData APIs and find out how. Performing cross-product analysis couldn't be easier.

  • Safeguard Cloud Transformation 

    Customers using Cloud ALM (CALM) can start utilising the integration between SAP Cloud Integration and CALM to safeguard their cloud transformation, accelerating time to value and ensuring business continuity, in yet another attempt to harmonise the UX.
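To illustrate the cross-product analysis idea, here is a minimal Python sketch of how a query against Cloud Integration's MessageProcessingLogs OData API might be assembled. The tenant URL is a placeholder, and authentication (OAuth or basic credentials) is omitted - this is a sketch, not a definitive implementation.

```python
import urllib.parse


def mpl_query_url(base_url: str, status: str = "FAILED", top: int = 50) -> str:
    """Build a query against Cloud Integration's MessageProcessingLogs
    OData API, filtering message processing logs by status."""
    params = {
        "$filter": f"Status eq '{status}'",
        "$top": str(top),
        "$format": "json",
    }
    return f"{base_url}/api/v1/MessageProcessingLogs?{urllib.parse.urlencode(params)}"


# Placeholder tenant URL; in a real setup you would GET this URL with your
# tenant's credentials and feed the JSON into a cross-product dashboard.
url = mpl_query_url("https://my-tenant.example.hana.ondemand.com")
```

From here, pulling logs from several tenants into one report is a matter of looping over base URLs - which is exactly the cross-product angle mentioned above.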


3. Making ITK on BTP Smart-er

Since it is hosted on SAP Cloud Integration, you have, as you can imagine, all the ingredients ready to take the standard community content up many notches 😉. While the standard content doesn't offer these enhancements out of the box, you can certainly pivot it into a more nuanced and sophisticated integration using the plethora of features in Cloud Integration and the extensibility opportunities of SAP Ariba.

1. Support Advanced Scheduler configuration for Data Upload IFlows

Support for the Advanced Scheduler in the Timer event has been a blessing ever since it was rolled out this year. You are no longer constrained by the rigid schedules the Timer previously offered. Instead, you can schedule your integration flows in a much more flexible yet precise manner, right down to the second.

Unfortunately, the standard Data Upload integration flows make use of the inbuilt scheduler that the SFTP adapter provides, and this inbuilt scheduler is not as smart, i.e. it lacks the advanced configuration options.

To get past this limitation, you can instead update the integration flow to use a Poll Enrich step to fetch files from the SFTP server. Scheduling can then be controlled by the Timer Start event 👌.

Poll Enrich + Timer Start 🙌


This approach has a 🍒 on top. You gain much more control over how the SFTP server is polled, because the Poll Enrich step lets you decide what happens when there is no file available in the SFTP directory to import. So if you want to throw an exception - say, in a scenario where you want granular monitoring - you can do that as well.

File Availability Check


2. Prevent Data Import Failures if Import Batch Data is already running

If you've used the legacy ITK, you will already know that if you perform two subsequent imports using the Import Batch Data event (or any other event, for that matter), the second import fails because the Import Batch Data task is already running in Ariba. The standard ITK on BTP has a similar shortcoming. You are prone to this problem if you have multiple imports utilising the same task (e.g. the Import Batch Data task), or if the integration flow is scheduled to run very frequently, causing a new Import Batch Data task to overlap with a batch import already running in Ariba. Since Ariba offers no request queuing for integration events, we end up in this hot mess.

The good:

On such a conflict, the standard IFlow will fail if the task is still running. However, it will try to re-process the old file on the next run.

The bad:

You need to wait until the next run, which might cause operational inconsistencies in the solution.

Let's make the Integration Flow smarter

There is a way to handle such a situation gracefully. SAP Ariba provides an Integration Monitoring API for Procurement as well as for Strategic Sourcing. These APIs expose the status of the Import Batch Data integration event (or any other ITK event), so you can find out in real time whether the task is currently running. Since you are already in the Cloud Integration landscape, adding a call to this API is very easy.

Utilising the status of the import task in Ariba, your integration flow can now decide whether to pause execution for a while (if the import task in Ariba is still In Process) and retry after a specific time period, or whether it is good to go ahead and send the files to SAP Ariba.

There are several ways to make your master data import integration flows conflict-averse; this is just one way I implemented this conflict resolution mechanism.

Subprocess for graceful handling of conflicts
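As a rough illustration of that conflict resolution mechanism, here is a Python sketch of the pause-and-retry decision. The `get_task_status` callable is a hypothetical stand-in for the actual call to the Ariba Integration Monitoring API; the status string 'In Process' and the retry parameters are assumptions for the sake of the example.

```python
import time


def wait_until_idle(get_task_status, max_attempts: int = 5,
                    delay_seconds: float = 60, sleep=time.sleep) -> bool:
    """Poll the (hypothetical) monitoring call until the Import Batch
    Data task is no longer running, then allow the upload to proceed.

    `get_task_status` abstracts the API call and should return a status
    string such as 'In Process' or 'Completed'.
    """
    for _attempt in range(max_attempts):
        if get_task_status() != "In Process":
            return True          # safe to send files to Ariba
        sleep(delay_seconds)     # pause the flow, then retry
    return False                 # give up; let the IFlow raise and alert
```

In an IFlow this logic would typically live in a looping subprocess with a wait step, but the decision itself is the same: check the task status before sending, not after failing.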


3. Using an On-Premise scheduler

Other customers may have invested in building their own on-premise schedulers, which they don't want to let go of. It can be tricky to make these on-prem schedulers work with the standard ITK on BTP IFlows, since those IFlows use the inbuilt schedulers.

The solution to this problem is fairly straightforward - convert these IFlows so that they can be triggered via HTTPS calls from the outside world. You can then add logic to trigger them from your on-prem schedulers - the best of both worlds, isn't it?
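A minimal sketch of what the on-prem scheduler's trigger could look like, assuming the IFlow is exposed over an HTTPS sender channel with basic authentication - the endpoint and credentials here are placeholders, and a real tenant would more likely use OAuth client credentials:

```python
import base64
import urllib.request


def build_iflow_trigger(endpoint: str, user: str, password: str,
                        payload: bytes = b"") -> urllib.request.Request:
    """Build the HTTPS POST an on-prem scheduler would send to kick off
    an IFlow exposed via an HTTPS sender channel (basic auth assumed)."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return urllib.request.Request(
        endpoint,
        data=payload,
        method="POST",
        headers={"Authorization": f"Basic {token}"},
    )


# The scheduler would then fire the request at the desired time, e.g.:
#   urllib.request.urlopen(build_iflow_trigger(url, user, password))
```

The scheduling logic stays entirely on-premise; the IFlow just becomes another HTTPS endpoint the scheduler knows how to call.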

4. SMARTER -  But just these limited enhancements?

Of course not. These are just scenarios I came across recently and thought were worth sharing. The most interesting part of working as a consultant is the unique ways in which disparate customers utilise our solutions. There can be any number of enhancements that customers might require, or deem necessary, on top of what the standard ITK on BTP provides, to adapt it seamlessly to their business processes or to enhance those processes on the path to becoming an Intelligent Enterprise. There's certainly no 'one size fits all' in the enterprise software world. However, building solutions on technology stacks that can support technological evolution is what will prove advantageous in a disruptive tech landscape. Embracing the ITK on BTP is a step in that direction. Period.

5. Parting Notes

With the end-of-support date for the legacy ITK fast approaching, there has been a lot of traction on this topic. I would love to hear how you and your customers have been adopting the ITK on BTP and what pitfalls you have come across with this solution. Share your journey, or ask questions about what I've covered in this blog post, in the comments below. As always, you can also reach out to the wider SAP Community to leverage their expertise.