Application Development Blog Posts
Learn and share on deeper, cross technology development topics such as integration and connectivity, automation, cloud extensibility, developing at scale, and security.

Key Objectives:

  1. Enhanced customer experience, by bringing down the
    overall installation processing time
  2. Integration of data held in segregated layers such as
    AMR, GIS, and SAP
  3. Reduction in the manual effort spent by NEG teams to
    extract and analyse data for asset utilization
  4. Laying the foundation for proactively augmenting the
    network and assets, even before applicants request it

End User Situation:

  1. When NEG receives a case from CMG to check the
    technical feasibility against the load applied for by an applicant, they
    manually extract data from various sources (AMR, GIS, SAP) and derive the
    network and asset utilization by hand.
  2. This manual work often leads to errors, and thereby
    delays the processing of the applicant's request for a new connection.


SAP HANA is used for big data processing (drawing on
various sources such as AMR and GIS), and a unique integration was developed to
transport the processed data from HANA to SAP.


A unique data flow mechanism was developed to transport data
from HANA to SAP ISU, as opposed to the standard and usual scenario of data
flowing from SAP to the HANA database. In this scenario, HANA became the
'destination' as well as a 'source'.

The data from the different sources was sent to HANA, and all
the complex business logic was developed in HANA instead of SAP in order to
utilize HANA's processing capabilities; the processed data was then transferred
from HANA to its final destination in SAP.


The aim was to create a system-based provision that brings the
network/asset (DT) utilization data from the various sources to a single point,
so that system-based algorithms can derive the technical feasibility of the
available network against the additional load growth applied for by the applicant.
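As an illustration, the feasibility decision described above could be sketched in ABAP as below. This is a minimal sketch only: the structure, the field names, and the 80% loading threshold are all assumptions for illustration, not the actual TPDDL rules or implementation.

```abap
* Illustrative sketch only - the structure, fields and the 80%
* loading threshold are assumed, not the actual TPDDL logic.
TYPES: BEGIN OF ty_dt_util,
         dt_id         TYPE char18,                   " distribution transformer ID
         capacity_kva  TYPE p LENGTH 10 DECIMALS 2,   " rated DT capacity
         peak_load_kva TYPE p LENGTH 10 DECIMALS 2,   " derived peak load (from AMR data)
       END OF ty_dt_util.

DATA: ls_util     TYPE ty_dt_util,
      lv_applied  TYPE p LENGTH 10 DECIMALS 2,   " load applied for (kVA)
      lv_headroom TYPE p LENGTH 10 DECIMALS 2.

* Assumed rule: the new connection is feasible if the existing peak
* load plus the applied load stays within 80% of the DT's capacity.
lv_headroom = ls_util-capacity_kva * '0.8' - ls_util-peak_load_kva.

IF lv_applied <= lv_headroom.
  WRITE: / 'Technically feasible on the existing DT'.
ELSE.
  WRITE: / 'Not feasible - network augmentation required'.
ENDIF.
```

In a real scenario the utilization figures would come from the HANA calculation view rather than being held in a local structure; the point here is only the shape of the decision rule.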

How to develop?

  1. First, model the process for the DT peak.
  2. Design the SQL calculation view in HANA Studio.
  3. Develop a VirtualProvider based on this calculation view.
  4. Pull the data from this VirtualProvider into the ECC ISU system
    using the RFC function module RSDRI_INFOPROV_READ_RFC.
  5. Develop a module pool entry screen for the end user.
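Step 4 above could be sketched roughly as below. Note that the RFC destination name, the VirtualProvider name, and the result parameter name are placeholders I have assumed for illustration; the exact parameter interface of RSDRI_INFOPROV_READ_RFC varies by release and should be verified in transaction SE37 before use.

```abap
* Rough sketch of reading a VirtualProvider from ECC ISU via RFC.
* 'BW_HANA_DEST', 'ZDT_UTIL' and the E_T_DATA parameter name are
* placeholders; verify the actual RSDRI_INFOPROV_READ_RFC
* signature in SE37 on your release.
DATA: lt_data  TYPE STANDARD TABLE OF string,
      lv_error TYPE abap_bool.

CALL FUNCTION 'RSDRI_INFOPROV_READ_RFC'
  DESTINATION 'BW_HANA_DEST'        " RFC destination of the BW/HANA system
  EXPORTING
    i_infoprov = 'ZDT_UTIL'         " VirtualProvider on the calculation view
  IMPORTING
    e_t_data   = lt_data            " result rows (parameter name assumed)
  EXCEPTIONS
    system_failure        = 1
    communication_failure = 2
    OTHERS                = 3.

IF sy-subrc <> 0.
  lv_error = abap_true.             " handle RFC/read errors here
ENDIF.
```

The returned rows would then be mapped to the module pool screen fields developed in step 5.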

Project Closure:

A new and unique landscape was worked out during brainstorming.

As the requirement involved huge volumes of data from several different
landscapes, efficient and optimized runtime processing was a major challenge.


Key Benefits:

  1. Availability of data from various sources in one location,
    for ease of working by the involved groups
  2. Enhanced data integrity, as the system-based algorithms
    help highlight any data discrepancy among the different sources
  3. An advanced step towards 'Ease of Doing Business' and an
    enhanced customer experience (through an overall reduction in process cycle time)
  4. Enhanced IT user experience (ease of working through IT
    system-based automation)
  5. Utilization of advanced SAP HANA technology for this
    process (the first time such landscape modeling has been used in a TPDDL scenario)