Technology Blogs by SAP
Learn how to extend and personalize SAP applications. Follow the SAP technology blog for insights into SAP BTP, ABAP, SAP Analytics Cloud, SAP HANA, and more.
SAP Data Intelligence, cloud edition DI:2113 will soon be available.

In this blog post you will find updates on the latest enhancements in DI:2113. We describe the new functions and features of SAP Data Intelligence for the Q1 2022 release.

If you would like to review what was made available in the previous release, please have a look at this blog post.


This section gives you a quick preview of the main developments in each topic area. The details are described in the following sections for each individual topic area.

SAP Data Intelligence 2113



Metadata & Governance

In this topic area you will find all features dealing with discovering metadata, working with it, and data preparation functionality. You may occasionally find similar information about newly supported systems in more than one topic area. This is intentional: readers who only look into one area should not miss relevant information, and each area may add details specific to its topic.

Mass Update of Custom Attribute Values


  • Ability to easily update multiple custom attributes

    • Update attributes in glossary terms

    • Update attributes linked to a rule


  • Allow users to select a custom attribute’s value and replace it with a new value or blank.



View Dependencies and Rule Usage


  • Ability to view dependencies for a given rule

  • Ability to view dependencies for a dataset:

    • View what rules are bound to the dataset

    • View columns referenced by rule parameters



  • A central location for users to view rule binding information.

  • View where a rule is being used to understand the impact a rule change may have on other datasets.



Extract Lineage from Structured Data Operators


  • Ability to extract object-level lineage from SAP Data Intelligence Modeler Structured Data Operators for the following operators:

    • Structured File Producer and Consumer

    • SAP Application Producer and Consumer

    • Table Producer and Consumer

    • SQL Consumer



  • Allow users to gain additional insights by capturing and viewing lineage of flowgraphs containing structured data operators.



Add Rules – Add Lineage – Add Publishing… within Metadata Explorer


  • Expanded functionality support for sources *

* A complete list of all supported sources and capabilities can be found in SAP Data Intelligence’s Data Governance User Guide.



Connectivity & Integration

This topic area focuses mainly on all kinds of connection and integration capabilities which are used across the product – for example: in the Metadata Explorer or on operator level in the Pipeline Modeler.

Mass Data Replication via Replication Flows
High-level overview & concept of Replication Flows


Enhancement of Data Replication Use Cases in Data Intelligence

  • Model Data Replication from a selected source to a selected target

  • Initial Focus on 1:1 replication with simple projections and filters

  • Dedicated User Interface for modeling Mass Data Replication

  • Lower total cost of ownership (TCO) and total cost of development (TCD) for Data Replication

  • Support Initial Load as well as Delta Load capabilities


Mass Data Replication via Replication Flows
Overview of Replication Flows source and target connectivity


Mass Data Replication via Replication Flows
Creation of Replication Flows and Tasks


Creating a Replication Flow includes configuring the following properties:

  • a name for your Replication Flow

  • a selected source system

  • a selected target system

  • an optional description

  • the location where the objects are stored (the container)



For each replication flow, you can add one or several tasks, where each task consists of:

  • a source data set (table, CDS view, etc.)

  • a target data set (table)

  • a load type (initial or initial with delta)

  • optional filters and mappings / projections
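The flow-and-task structure above can be sketched as a simple data model. This is a hypothetical illustration only (all class and field names are invented); the actual configuration is done in the Replication Flow user interface.

```python
from dataclasses import dataclass, field
from typing import Optional

# Hypothetical data model mirroring the Replication Flow properties
# described above; the real configuration happens in the DI UI.

@dataclass
class ReplicationTask:
    source_dataset: str              # e.g. a table or CDS view name
    target_dataset: str              # target table name
    load_type: str                   # "initial" or "initial_with_delta"
    filters: Optional[dict] = None   # optional column filters
    mappings: Optional[dict] = None  # optional projections / mappings

@dataclass
class ReplicationFlow:
    name: str
    source_system: str
    target_system: str
    container: str                   # location where the objects are stored
    description: str = ""
    tasks: list = field(default_factory=list)

    def add_task(self, task: ReplicationTask) -> None:
        # One flow can hold one or several tasks.
        if task.load_type not in ("initial", "initial_with_delta"):
            raise ValueError("unsupported load type")
        self.tasks.append(task)

flow = ReplicationFlow(
    name="s4_to_hana",
    source_system="SAP S/4HANA Cloud",
    target_system="SAP HANA Cloud",
    container="/replication/demo",
)
flow.add_task(ReplicationTask("I_SalesOrder", "SALES_ORDER", "initial_with_delta"))
print(len(flow.tasks))  # 1
```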


Mass Data Replication via Replication Flows
Creation of Replication Flows and Tasks


Select your desired source system, e.g.

  • SAP S/4HANA Cloud (CDS Views)

  • SAP S/4HANA on-premise (CDS Views)

  • SAP ECC / SLT (Tables)

  • MS Azure SQL (Tables)

Select your desired target system:

  • SAP HANA Cloud (Tables)

  • AWS S3 (CSV and Parquet)

  • ADL V2 (CSV and Parquet)


Using Cloud Object Store as Target:


Select Object Store specific configurations:

  • Include Header Information

  • File Format (CSV and Parquet)

  • Compression (for Parquet)

  • Group Delta By


Mass Data Replication via Replication Flows
Projections and Mappings of Tasks


Define filtering on single or multiple columns using pre-defined operators (e.g. >, <, >=, <=, between) in a simplified user interface.



Perform projections on the target data set to adjust the default data mapping coming from the data source, including adding, editing, and removing columns at the dataset level.
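Conceptually, filters narrow the replicated records and projections reshape them. A minimal sketch of the idea in plain Python follows; the operator symbols mirror the options named above, while the function names and data are hypothetical, not the DI implementation.

```python
# Toy illustration of filter operators and column projections as offered
# in the Replication Flow UI; hypothetical, not DI's actual code.

OPERATORS = {
    ">":  lambda v, a: v > a,
    "<":  lambda v, a: v < a,
    ">=": lambda v, a: v >= a,
    "<=": lambda v, a: v <= a,
    "between": lambda v, a: a[0] <= v <= a[1],
}

def apply_filters(rows, filters):
    """Keep rows matching all (column, operator, argument) conditions."""
    return [
        r for r in rows
        if all(OPERATORS[op](r[col], arg) for col, op, arg in filters)
    ]

def project(rows, mapping):
    """Rename/select columns: mapping of target_column -> source_column."""
    return [{tgt: r[src] for tgt, src in mapping.items()} for r in rows]

rows = [
    {"id": 1, "amount": 50},
    {"id": 2, "amount": 150},
    {"id": 3, "amount": 250},
]
filtered = apply_filters(rows, [("amount", "between", (100, 200))])
print(project(filtered, {"ORDER_ID": "id"}))  # [{'ORDER_ID': 2}]
```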




  • Ability to activate Secure Network Communication (SNC) for Remote Function Call (RFC) when using ABAP_LEGACY connections.



  • Allow users to create a connection to legacy ABAP on-premise/ECC systems in a secure, encrypted way.


Support Active Warehouse for Snowflake 


  • Ability to configure a Snowflake connection with an active warehouse.




  • Users do not need a default warehouse assigned to their credentials.


Driver Bundling for DB2 and MySQL

Drivers for MySQL and DB2 are already available in DI Cloud:

  • Connection can be created in Connection Manager as before.

  • No upload of drivers to User Workspace (/content/files/flowagent/) needed.


Support SNC Encryption for ABAP_LEGACY Connection

  • When using a VPN or DI on-premise, SNC is configured within SAP DI Connection Management.

  • When SAP Cloud Connector (SCC) is used, SNC is configured in the SCC configuration (SAP standard).



Pipeline Modelling

This topic area covers new operators and enhancements of existing operators, as well as improvements and new functionality in the Pipeline Modeler and the development of pipelines.


Snapshot and Enhanced Data Type Support for Kafka Operators

  • Updated Kafka Consumer and Producer operators (Generation 2)

    • Automated snapshots

      • No dedicated commit handling needed

    • New data type API

      • Easy data-to-message transformation

      • Streaming support

      • Standardized error ports

  • Operators work seamlessly with other Generation 2 operators

    • Consume Kafka data and ingest into selected data targets

    • Produce Kafka messages from consumed sources
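The automated-snapshot behavior above means pipeline authors no longer write their own commit handling: the consumer position is captured in periodic snapshots and processing resumes from the last snapshot after a failure. The toy sketch below illustrates that recovery idea with at-least-once semantics; it is a hypothetical model, not the DI Kafka operator code.

```python
# Toy model of snapshot-based recovery: the consumer position is captured
# in periodic snapshots, so no per-message commit handling is needed.
# Hypothetical illustration only; not the DI Kafka operator implementation.

class SnapshotConsumer:
    def __init__(self, messages, snapshot_every=2):
        self.messages = messages
        self.snapshot_every = snapshot_every
        self.offset = 0          # in-memory position
        self.snapshot = 0        # last durable (snapshotted) position

    def run(self, fail_at=None):
        processed = []
        while self.offset < len(self.messages):
            if fail_at is not None and self.offset == fail_at:
                # Simulate a crash: lose in-memory state, keep the snapshot.
                self.offset = self.snapshot
                fail_at = None
                continue
            processed.append(self.messages[self.offset])
            self.offset += 1
            if self.offset % self.snapshot_every == 0:
                self.snapshot = self.offset  # automated snapshot
        return processed

c = SnapshotConsumer(["m0", "m1", "m2", "m3"])
out = c.run(fail_at=3)
# After the crash at offset 3, processing resumes from snapshot offset 2,
# so m2 is delivered again (at-least-once semantics).
print(out)  # ['m0', 'm1', 'm2', 'm2', 'm3']
```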


Cloud Table Producer Operator


  • New Operator to write data to Cloud Table Storage

  • Support Snowflake (Staging S3 or WASB)

  • Support Google Big Query (Staging GCS)

  • After loading, the staging files (Parquet) are automatically deleted
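The staging lifecycle described above (write staging files, load them into the warehouse, then clean the staging area up automatically) can be sketched as follows. This is a hypothetical illustration using local temporary files as a stand-in for S3/WASB/GCS; it is not the Cloud Table Producer implementation.

```python
import os
import tempfile

# Toy sketch of the staging lifecycle: write staging files, "load" them
# into the target table, then delete the staging files automatically.
# Hypothetical; a local directory stands in for the object store.

def load_via_staging(records, load_into_table):
    staging_dir = tempfile.mkdtemp(prefix="staging_")
    path = os.path.join(staging_dir, "part-0000.parquet")
    try:
        with open(path, "w") as f:          # stand-in for a Parquet writer
            f.write("\n".join(map(str, records)))
        load_into_table(path)               # e.g. a warehouse bulk load
    finally:
        # After loading, the staging files are deleted automatically.
        if os.path.exists(path):
            os.remove(path)
        os.rmdir(staging_dir)
    return staging_dir

loaded = []
gone = load_via_staging([1, 2, 3], lambda p: loaded.append(open(p).read()))
print(os.path.exists(gone))  # False: staging files were cleaned up
```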



  • Increased flexibility to write into additional non-SAP cloud data warehouse solutions from SAP Data Intelligence.


Delete Mode for Table Producer

  • Allows deleting records in a target table

    • Records are deleted based on the provided values in the input records

    • Matching can be based on values in a single column or in several columns (configured using column mapping)
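The matching behavior above can be sketched in a few lines: target rows whose mapped key columns match an incoming record are removed. All names and data below are hypothetical; this is an illustration of the matching idea, not the Table Producer's code.

```python
# Toy sketch of the delete mode: remove target rows whose mapped key
# columns match the values in the incoming records. Hypothetical names.

def delete_matching(target_rows, input_records, column_mapping):
    """column_mapping: target_column -> input_column (one or several)."""
    keys = {
        tuple(rec[src] for src in column_mapping.values())
        for rec in input_records
    }
    return [
        row for row in target_rows
        if tuple(row[tgt] for tgt in column_mapping) not in keys
    ]

target = [
    {"ID": 1, "REGION": "EU"},
    {"ID": 2, "REGION": "US"},
    {"ID": 3, "REGION": "EU"},
]
deletes = [{"id": 1}, {"id": 3}]
remaining = delete_matching(target, deletes, {"ID": "id"})
print([r["ID"] for r in remaining])  # [2]
```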





This topic area includes all services that are provided by the system – like administration, user management or system management.


Solution Download

  • Solutions can be downloaded via System Management UI:

    • Download action within the "Tenant Management" view

    • Users can only download non-SAP solutions

      • System solutions are shown as before but have an inactivated download link

    • Note: Solution can be downloaded and uploaded via System Management Command Line Client (vctl) as before



Policy Mapping

Allows mapping attributes from a connected Identity Provider (IdP) to DI policies in DI Cloud

  • By this, DI policies can be assigned to users outside of DI using the corresponding IdP tools

  • Configured with the vctl Command Line Tool using the policy-mapping command

  • Allows setting the policy assignment mode:

    • OnlyManual: Policies can only be assigned within DI

    • OnlyMapping: Policies can only be assigned by policy mappings from the IdP

    • ManualAndMapping: Both policy assignments are used
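The three modes above can be summarized as a small resolution rule: which set of policies a user effectively gets, given their manually assigned policies and the IdP-mapped ones. The mode names come from the blog; the resolution logic and policy names below are a hypothetical illustration.

```python
# Toy sketch of how the three assignment modes could combine manually
# assigned DI policies with IdP-mapped ones. Hypothetical illustration.

def effective_policies(mode, manual, mapped):
    if mode == "OnlyManual":
        return set(manual)
    if mode == "OnlyMapping":
        return set(mapped)
    if mode == "ManualAndMapping":
        return set(manual) | set(mapped)
    raise ValueError(f"unknown mode: {mode}")

manual = {"sap.dh.member"}           # assigned inside DI (example name)
mapped = {"sap.dh.metadata"}         # mapped from IdP attributes (example)
print(sorted(effective_policies("ManualAndMapping", manual, mapped)))
# ['sap.dh.member', 'sap.dh.metadata']
```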


These are the new functions, features and enhancements in SAP Data Intelligence, cloud edition DI:2113 release.

We hope you like them and, by reading the above descriptions, have already identified some areas you would like to try out.

If you are interested, please refer to What’s New in Data Intelligence – Central Blog Post.

For more updates, follow the tag SAP Data Intelligence.

We recommend visiting our SAP Data Intelligence community topic page to check helpful resources, links and what other community members post. If you have a question, feel free to check out the Q&A area and ask a question here.

Thank you & Best Regards,

Eduardo and the SAP Data Intelligence PM team