Scope:

The business requirement is to integrate reporting data for goods procured over a time period (say, weekly) from the Ariba application into an on-premise SAP application (here, BW). The data received in the BW system is then used for further business processes as per customer needs.

This blog post focuses on the key concepts involved in integrating SAP Ariba with SAP CPI:

  • Pagination: Ariba sends data in pages, and each page carries a reference to the next page in the field pageToken. If this field is present and holds data, there is another page to be fetched.

  • Refresh token: A common concept in API design, used to replace the initial authorization token with a new access token once it has expired. Token expiry is checked against the expires_in field returned by Ariba.

  • Rate limit: Synchronous calls to the Ariba APIs are rate-limited per minute, meaning that within any given minute the application accepts only three to four calls from our CPI tenant. So if the number of hits to the Ariba application per minute is exhausted while more page data remains to be fetched, this must be detected and processing resumed after a while.


All of the above points must be kept intact while designing your integration flow. I hope this blog post gives readers an idea of how to design similar business requirements.
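
Before diving into the implementation, here is a purely illustrative Groovy sketch (not CPI iFlow code) of how the three concepts interact; every helper in it is a hypothetical stub standing in for the CPI steps described in the rest of this post.

def rateLimitRemaining = { 4 }                    // stub: calls left in the current minute
def tokenExpired       = { false }                // stub: expiry check against expires_in
def refreshAccessToken = { println "refreshing" } // stub: call to the token API
def fetchPage          = { token -> [data: "...", pageToken: null] } // stub: Ariba API call

def pageToken = ""                                // blank on the very first call
while (true) {
    if (rateLimitRemaining() == 0) break          // rate limit hit: retry on the next schedule
    if (tokenExpired()) refreshAccessToken()      // renew the access token when expired
    def page = fetchPage(pageToken)
    println page.data                             // stand-in for mapping + sending to BW
    pageToken = page.pageToken                    // next-page reference sent by Ariba
    if (!pageToken) break                         // no pageToken means the last page
}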

Implementation:

An iFlow can be designed in numerous ways, depending on one's logic handling and thinking; the explanation below is one among many. Here the design is split into two flows: the first fetches the access token in an initial call, and the second carries the main processing logic covering all of the concepts above.

Iflow1:

The initial iFlow is scheduled to run once every week. It calls Ariba's token API to get the authentication token details and stores them in a data store with global visibility, which we then use in the main iFlow.


Details of the steps are below:

1 Request-Reply: Call the Ariba token API to get the access_token, refresh_token, and expires_in fields as a JSON message

2 Content Modifier: Set some initial property variables with constant values, such as PageToken to a blank constant (" ") and ExecutionTime to "0". These variables are updated as values change in the second flow (the main message processing). The 'FromDate' property is a date value stored in a global variable by the main iFlow (iFlow2) as its last step after a successful execution (for the first run, 'FromDate' takes a default value)

3 Script: Append these custom property variables to the TokenPayload so we can use them in the main flow. ToDate is calculated as 'FromDate + 7'; the value '7' added to the date is configurable and can be adjusted. This ensures every run fetches data from Ariba within a 7-day range (a sketch of this calculation follows the step list)

4 JSON to XML: Convert the above JSON message to XML, so it can be read easily with XPath in the main flow

5 Write: Store the XML payload in the global data store
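
As a hedged sketch of the date handling in step 3 (the property names FromDate and dateRangeDays are assumptions, and the timestamp format matches the sample XML below; appending the values to the TokenPayload is omitted), the calculation could look like:

import com.sap.gateway.ip.core.customdev.util.Message;
import java.text.SimpleDateFormat;

def Message processData(Message message) {
    def props = message.getProperties();
    def fmt = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss'Z'");
    fmt.setTimeZone(TimeZone.getTimeZone("UTC"));
    Date from = fmt.parse(props.get("FromDate") as String);
    int rangeDays = (props.get("dateRangeDays") ?: "7") as Integer; // the configurable '7'
    Date to = new Date(from.time + rangeDays * 24L * 60 * 60 * 1000); // FromDate + 7 days
    message.setProperty("dateFrom", fmt.format(from));
    message.setProperty("dateTo", fmt.format(to));
    return message;
}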

Below is a sample of the XML message that we form and store in the global data store (for reference):

<?xml version='1.0' encoding='UTF-8'?>
<request>
    <realm>skf-1-T</realm>
    <TemplateName>ZV0_ERPOrder</TemplateName>
    <dateFrom>2020-06-26T00:00:00Z</dateFrom>
    <dateTo>2020-07-03T00:00:00Z</dateTo>
    <PageToken> </PageToken>
    <exeSec>0</exeSec>
    <access_token>111c4ac3-af83-4c80-91c2-28ec6aa9ef6f</access_token>
    <refresh_token>e0b075dd-db55-4403-902a-844c3eb9abcd</refresh_token>
    <token_type>bearer</token_type>
    <scope/>
    <expires_in>1440</expires_in>
</request>

 

Iflow2 (Main Flow):

I am not going to explain every step of this main flow, but I will give the key design inputs below. This flow logic is just for reference, and everyone might take a different view on handling this requirement. Before going through each step in detail, here are a few quick points to keep in mind alongside the explanation.

  • The main flow is scheduled every 5-10 minutes for an hour, immediately after the iFlow1 run. This happens once per week

  • The from-date and to-date span a week and are used as query parameters when calling the Ariba API

  • Update the token payload in the data store each time you receive new access token or page token values


Here are the modular processes within the iFlow.

  • Initial Integration process.



1 Content Modifier: Set initial property variables, including startTime. This start time is the current time (${date:now:HH:mm:ss}), which we use to calculate the ExecutionTime and compare against expires_in from the XML saved in the data store by iFlow1

2 Select data store: Fetch the access token payload stored by iFlow1

3 Router: Check for data existence

4 Filter: Filter the token payload out of the '/messages/message' tags returned by the data store operation

5 Content Modifier: Store the XML payload in a property variable; this holds the access token payload prepared in iFlow1. It is updated whenever any of its values change in later logic steps (one possible XPath mapping is shown after this step list)

6 Local Process Call

7 Write Variable: The current date is stored in a global variable. If this main iFlow fails, the end date is not saved and thus not used as 'FromDate' in iFlow1 for the next run, so no data within the week's range is missed because of technical errors.
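
As a hedged example, the Content Modifier in step 5 and later steps might extract individual values from the stored XML into properties via XPath, for instance (based on the sample XML from iFlow1; mapping viewName to TemplateName is an assumption):

realm          ← XPath /request/realm
viewName       ← XPath /request/TemplateName
dateFrom       ← XPath /request/dateFrom
dateTo         ← XPath /request/dateTo
pageToken      ← XPath /request/PageToken
refresh_token  ← XPath /request/refresh_token
token_expiry   ← XPath /request/expires_in
exeSeconds     ← XPath /request/exeSec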

 

  • Local Sub Process call from step 6 above (Integration Process)



Loop over the sub-process, which holds the main logic steps for message processing and transformation. Check the looping conditions in the picture above: the first check is whether pages still exist in Ariba, so the message should be processed further; the second terminates the loop, hand in hand with the per-minute rate-limit check.
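
As a hedged example, the condition of the Looping Process Call could look something like:

${property.pageToken} != '' and ${property.rateLimitHit} != 'X'

Here rateLimitHit is a hypothetical flag raised when the per-minute limit is reached; the exact expression depends on how you model the two checks in your own flow.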

  • Local Sub Process (Looping call from above process)



This is the main process, where the actual message processing happens. The check on the number of hits to the Ariba API per minute is handled using a standard header variable. Secondly, we store the Page_Token each time it is present and use it in the query of subsequent calls. Thirdly, we check for token expiry (this logic is separated into another local sub-process, explained last).

Detailed steps:

1 Router: Check the standard header parameter (X-RateLimit-Remaining-minute) in the condition. If ${header.X-RateLimit-Remaining-minute} = '0', store the access token payload to the data store and end the flow; the iFlow starts again in another 5-10 minutes as per the schedule configuration

2 Request-Reply: If rate-limit capacity for hits from CPI to the Ariba application still remains, we fetch the actual procurement data (a JSON message in this scenario). The property variables used in the call URL come from the XML payload fetched from the data store in the first step.

Example url formation to call the API:

Address: https://<Aribahost>/api/<apiname>/views/${property.viewName}

Query: realm=${property.realm}&filters={"createdDateFrom":"${property.dateFrom}","createdDateTo":"${property.dateTo}"}&pageToken=${property.pageToken}

The initial pageToken value coming from iFlow1 (the XML message) is blank
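
For illustration, with the sample XML values stored by iFlow1 the first call would resolve roughly to:

https://<Aribahost>/api/<apiname>/views/ZV0_ERPOrder?realm=skf-1-T&filters={"createdDateFrom":"2020-06-26T00:00:00Z","createdDateTo":"2020-07-03T00:00:00Z"}&pageToken=

(the filters value would be URL-encoded on the wire).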

3 JSON to XML: Convert the JSON response to XML and feed it to the mapping step

4 Content Modifier: Store the Page_Token data in the property variable 'pageToken'
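
For example, assuming the converted response carries the token in a PageToken element, this Content Modifier could fill the property via an XPath such as //PageToken; when that element is missing or empty, the loop condition shown earlier ends the pagination.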

The next steps send the mapped message to BW and store the updated XML file, with the pageToken and other values, back to the same global data store entity. Then comes the token expiry check: if the token has expired, we need to call another Ariba API to get a new access token, refresh token, expires_in, etc., and store them in the same data store again.

Below are the two local integration processes that go hand in hand to check for expiry and update the new values.


Groovy code to check the token expiry time:

import com.sap.gateway.ip.core.customdev.util.Message;

def Message processData(Message message) {
    def map = message.getProperties();

    String strtTime = map.get("startTime");  // initial start time of the iFlow
    String edTime = map.get("endTime");      // current time, set just before this Groovy step
    String expiry = map.get("token_expiry"); // value from the Ariba API call in iFlow1
    String exeSec = map.get("exeSeconds");   // initial value '0' seconds

    // elapsed seconds between the two HH:mm:ss timestamps
    int seconds = ((Date.parse('HH:mm:ss', edTime).time - Date.parse('HH:mm:ss', strtTime).time) / 1000) as Integer;
    message.setProperty("execsec", seconds);

    int exe = exeSec as Integer;
    int exp = expiry as Integer;

    exe = exe + seconds;
    exe = exe + 60; // keeping some buffer seconds helps

    message.setProperty("exeSeconds", exe); // the accumulated execution seconds

    if (exe > exp) {
        message.setProperty("refresh", "X"); // flag value used in the next Router step
    }
    return message;
}

A call to the 'Refresh Token' process goes to the sub-process below.


This process executes the sequential steps to fetch a new access token and update our XML file in the global data store for the next use in our logic.

Query used when calling this API – "grant_type=refresh_token&refresh_token=${property.refresh_token}" (the property value is updated in step 4 of the callPageNation local process)
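
One way to merge the refresh-token response back into the stored XML payload is sketched below; it assumes the token response JSON is the current message body and the previous XML is kept in a 'tokenPayload' property (both names are assumptions, not the post's exact design).

import com.sap.gateway.ip.core.customdev.util.Message;
import groovy.json.JsonSlurper;
import groovy.xml.XmlUtil;

def Message processData(Message message) {
    def json = new JsonSlurper().parseText(message.getBody(String) as String);
    def xml = new XmlParser().parseText(message.getProperty("tokenPayload") as String);
    xml.access_token[0].value = json.access_token;     // new access token
    xml.refresh_token[0].value = json.refresh_token;   // new refresh token
    xml.expires_in[0].value = json.expires_in as String;
    message.setBody(XmlUtil.serialize(xml));           // ready for the Write step
    return message;
}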

For counting the number of hits to Ariba, I initially used custom script code and extra logic steps; later I discovered that SAP provides a very useful header variable (X-RateLimit-Remaining-minute) that returns the number of remaining hits to an API within the current minute. This helped a lot in reducing custom script code and a few additional steps.
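
If you prefer a script over a Router for this check, a minimal sketch could read the same header and raise the hypothetical rateLimitHit flag used in the loop condition example earlier:

import com.sap.gateway.ip.core.customdev.util.Message;

def Message processData(Message message) {
    def remaining = (message.getHeaders().get("X-RateLimit-Remaining-minute") ?: "0") as Integer;
    if (remaining == 0) {
        message.setProperty("rateLimitHit", "X"); // hypothetical flag for the loop condition
    }
    return message;
}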

I hope this blog post helps in handling Ariba's pagination and remaining-rate-limit-per-minute concepts.

Happy Reading!!