Technology Blogs by SAP
lalitmohan
Product and Topic Expert


Introduction

This blog outlines how King Abdullah University of Science and Technology (KAUST), as an early adopter, successfully integrated SAP Ariba with SAP Datasphere to build advanced analytics stories focused on procurement data. This integration was achieved as part of a proof-of-concept (PoC) project with the SAP Datasphere Adoption Program, showcasing the capabilities of SAP Ariba data in fulfilling specific procurement requirements within SAP Datasphere.

About KAUST

Established in 2009, King Abdullah University of Science and Technology (KAUST) is a graduate research university in Saudi Arabia focused on tackling scientific and technological challenges in fields such as food and health, water, energy, environment, and the digital domain. Through interdisciplinary research and innovation, KAUST aims to address both global and regional issues.

Recently, our team at SAP collaborated with KAUST on a short-term PoC project. The goal was to help KAUST leverage SAP Datasphere and SAP Ariba Foundation Model for actionable procurement insights by building analytics stories tailored to KAUST's requirements. This blog provides an overview of the project, highlighting the steps taken to meet KAUST's unique procurement needs.

Business Challenge

KAUST needed a robust solution for analyzing procurement data in real-time. They required:

  • Improved Spend Visibility: Consolidating procurement data across multiple sources for a comprehensive spend analysis.
  • Real-Time Reporting: Transforming end-of-day reports into near real-time analytics for faster decision-making.
  • Cost Optimization: Reducing manual processes and improving operational efficiency.

Solution

Utilizing SAP Datasphere’s integration capabilities, KAUST can access SAP Ariba data from both strategic and operational perspectives through SAP Ariba standard APIs. This integration allows KAUST to leverage the foundational SAP Ariba models and pre-built stories in SAP Analytics Cloud to extract valuable insights and perform detailed analyses of their procurement processes.
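
To give a sense of what that API access involves, here is a minimal Python sketch of pulling purchase order data from an SAP Ariba open API with the requests library. The endpoint path, realm parameter, and response fields shown are illustrative assumptions rather than the exact calls from the PoC, where the BTP Data Extractor application handles this step.

```python
import os
import requests

# Illustrative sketch only: the endpoint path, realm parameter, and payload
# fields are assumptions; consult the SAP Ariba Developer Portal for the exact
# API (e.g. operational reporting) enabled for your realm.
ARIBA_API_BASE = "https://openapi.ariba.com"          # API gateway host
API_PATH = "/api/purchase-orders/v1/prod/orders"      # hypothetical path
REALM = os.environ["ARIBA_REALM"]                     # your Ariba realm ID
API_KEY = os.environ["ARIBA_API_KEY"]                 # application key from the developer portal
ACCESS_TOKEN = os.environ["ARIBA_ACCESS_TOKEN"]       # OAuth token obtained beforehand

def fetch_purchase_orders(page_token: str | None = None) -> dict:
    """Fetch one page of purchase orders from the (hypothetical) Ariba endpoint."""
    headers = {
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "apiKey": API_KEY,
        "Accept": "application/json",
    }
    params = {"realm": REALM}
    if page_token:
        params["pageToken"] = page_token              # pagination field is an assumption
    resp = requests.get(ARIBA_API_BASE + API_PATH, headers=headers, params=params, timeout=60)
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    page = fetch_purchase_orders()
    # "content" as the record list key is also an assumption for this sketch
    print(f"Retrieved {len(page.get('content', []))} purchase order records")
```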

Through the SAP Discovery Center mission Leverage SAP Ariba Foundation model in SAP Datasphere (cloud.sap), KAUST followed step-by-step guidelines to set up this integration, leveraging the BTP Data Extractor application. This open-source tool covered KAUST’s data extraction requirements and enabled a seamless PoC execution within the SAP ecosystem.

Technical Architecture

The technical architecture was designed to simplify complex SAP Ariba APIs, allowing KAUST to utilize pre-built SAP Analytics Cloud content integrated with SAP Datasphere's Ariba models. This approach streamlined the extraction, transformation, and loading (ETL) processes, supporting KAUST’s objective to gain insights into procurement efficiency.

(Detailed technical architecture diagram)
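
For illustration, the following sketch shows what the transformation step of such an ETL flow might look like: flattening nested records returned by an Ariba API into tabular rows suitable for a staging table. All field names here are hypothetical; the PoC itself relied on the Data Extractor application and the pre-built Ariba models rather than custom code like this.

```python
from typing import Any

def flatten_order(record: dict[str, Any]) -> dict[str, Any]:
    """Flatten one nested purchase order record into a single staging-table row.

    The input field names below are hypothetical; map them to the fields your
    Ariba API actually returns.
    """
    supplier = record.get("supplier") or {}
    total = record.get("totalCost") or {}
    return {
        "order_id": record.get("documentNumber"),
        "order_date": record.get("creationDate"),
        "supplier_id": supplier.get("id"),
        "supplier_name": supplier.get("name"),
        "total_amount": total.get("amount"),
        "currency": total.get("currency"),
        "status": record.get("status"),
    }

def flatten_page(page: dict[str, Any]) -> list[dict[str, Any]]:
    """Apply the flattening to every record on one API result page."""
    return [flatten_order(rec) for rec in page.get("content", [])]
```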

Process

The SAP team provided hands-on support throughout a four-week engagement, covering architectural guidance, technical setup, and customizations to the SAP data extraction samples. Weekly touchpoints ensured that KAUST could address challenges promptly, ultimately enabling them to complete the PoC successfully with guidance from the Discovery Center mission.

Metrics

The following metrics demonstrate the scale and impact of this PoC project:

  • Replication Volume: 22 Models, 104 Views across 51 tables.
  • POC Data Volume: 1 GB of procurement data.
  • Ariba Data Availability: 100% Achieved.

Success Measurements

KAUST observed significant improvements following the PoC, including:

  • Real-Time Analytics: By utilizing the standard SAP Analytics Cloud dashboards, the KAUST team was able to quickly deploy the analytics assets, which are updated in near real-time, enhancing KAUST’s ability to make prompt decisions.
  • Reduced Manual Processes: The manual ETL process, which previously required constant oversight, is now fully automated with SAP Datasphere replication flows.

Next Steps Towards an Operational Solution

The next steps involve commencing a structured implementation plan to transition KAUST’s PoC to a fully operational solution. This phase will begin with a comprehensive requirements review and finalization of the architectural design based on PoC learnings.

Acknowledgments

This project’s success was due to the combined efforts of the SAP and KAUST teams. Special thanks to SAP colleagues Cameron Khorsandi, Uma Anbazhagan, and Harish Kintali for their guidance and support, and to KAUST team members Sohail Ibjee, Vikram Bagade, and Venkata Rajasekhar Gottipati for their valuable collaboration.

4 Comments
Maha-Dev
Discoverer

Hi @lalitmohan,

Could you please provide detailed insights regarding the integration? I would like to know whether the Ariba data is stored within an HDI container that is subsequently added to the SAP Datasphere space, and whether these tables are then consumed in DF/TF/Views in Datasphere?

Kalyan

KunalBansal
SAP Champion

Informative and Insightful Blog, @lalitmohan 👏🏻

lalitmohan
Product and Topic Expert

Hello @Maha-Dev,

Yes, we are utilizing the Data Extractor App for Procurement to pull data from SAP Ariba Cloud Services via standard APIs and store the relevant information in a database. To automate data extractions on a set schedule, we have configured the SAP BTP Job Scheduler service to trigger extraction jobs in the Data Extractor application through the available endpoints.
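
As a rough sketch of that scheduling setup, the snippet below registers a recurring job with the SAP BTP Job Scheduler REST API that calls the Data Extractor's extraction endpoint. The scheduler host, the job definition fields, and the extraction endpoint path are assumptions taken for illustration; the actual values come from your Job Scheduler service key and your Data Extractor deployment.

```python
import os
import requests

# Values below come from the Job Scheduler service key and your Data Extractor
# deployment; the concrete URLs and the endpoint path are assumptions.
SCHEDULER_URL = os.environ["JOBSCHEDULER_URL"]         # Job Scheduler REST host from the service key
OAUTH_TOKEN = os.environ["JOBSCHEDULER_TOKEN"]         # OAuth token for the Job Scheduler instance
EXTRACTOR_ENDPOINT = os.environ["EXTRACTOR_ENDPOINT"]  # e.g. https://<data-extractor-app>/extract (hypothetical)

job_definition = {
    "name": "ariba-procurement-extraction",
    "description": "Nightly extraction of SAP Ariba procurement data",
    "action": EXTRACTOR_ENDPOINT,   # HTTP endpoint the scheduler will call
    "httpMethod": "POST",
    "active": True,
    "schedules": [
        # Assumed cron field order (year month day dayOfWeek hour minute second);
        # verify the schedule format against the Job Scheduler documentation.
        {"cron": "* * * * 2 0 0", "active": True, "description": "daily at 02:00"},
    ],
}

resp = requests.post(
    f"{SCHEDULER_URL}/scheduler/jobs",
    json=job_definition,
    headers={"Authorization": f"Bearer {OAUTH_TOKEN}"},
    timeout=30,
)
resp.raise_for_status()
print("Created job:", resp.json().get("name"))
```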

In the SAP Datasphere cockpit, we have leveraged the pre-packaged business content via Semantic Onboarding.

For more detailed technical insights on the integration, please refer to the link below, as this blog focuses primarily on the Customer Proof of Concept.
SAP Discovery Center Mission - Leverage SAP Ariba Foundation model in SAP Datasphere (cloud.sap)  

Maha-Dev
Discoverer

I appreciate the clarification. Thank you @lalitmohan