SAP Data Intelligence, cloud edition DI:2110 will soon be available.

In this blog post you will find updates on the latest enhancements in DI:2110, describing the new functions and features of SAP Data Intelligence for the Q4 2021 release.

If you would like to review what was made available in the previous release, please have a look at this blog post.


This section gives you a quick preview of the main developments in each topic area. The details are described in the following sections for each individual topic area.

SAP Data Intelligence 2110


Metadata & Governance

In this topic area you will find all features for discovering metadata, working with it, and preparing data. You may occasionally see similar information repeated under newly supported systems; this is intentional, so that readers who only look at one topic area do not miss relevant information.

Rule column in remediation preparation UI


  • Run a rulebook and create a preparation based on failed records

  • Open preparation to remediate the records that failed the defined rules

  • Use self-service data preparation to correct, standardize, and enrich data



  • Improve quality and trustworthiness of data

  • Identify and remediate data to comply with organizational standards

  • Easy to use and intuitive data preparation to correct data
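The rulebook-to-remediation flow above can be sketched in a few lines. This is a minimal illustration of the concept, not the actual Metadata Explorer implementation: run a rule over records, collect the failures into a "preparation", then correct and standardize them.

```python
# Minimal sketch of the rulebook -> remediation flow (illustrative only):
# run a rule over records, collect failures, then remediate them.

def run_rule(records, rule):
    """Split records into passed/failed according to a rule predicate."""
    passed = [r for r in records if rule(r)]
    failed = [r for r in records if not rule(r)]
    return passed, failed

def remediate(failed_records, fix):
    """Apply a correction to each failed record (self-service data prep)."""
    return [fix(dict(r)) for r in failed_records]

# Example rule: country codes must be 2-letter upper-case ISO codes.
rule = lambda r: len(r["country"]) == 2 and r["country"].isupper()

records = [
    {"id": 1, "country": "DE"},
    {"id": 2, "country": "germany"},   # fails the rule
    {"id": 3, "country": "us"},        # fails the rule
]

passed, failed = run_rule(records, rule)

# Standardize the failed records: map known names, upper-case the rest.
def fix(r):
    r["country"] = {"germany": "DE"}.get(r["country"], r["country"].upper())
    return r

remediated = remediate(failed, fix)
print([r["country"] for r in remediated])  # → ['DE', 'US']
```

In the product, the failed records come from a rulebook run and the corrections are applied interactively in the preparation UI rather than in code.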

Schedule Profiling Tasks in Metadata Explorer


  • Scheduling of profiling tasks in Metadata Explorer

  • Select a profiling task to run on a schedule to ensure your data insights are current

  • Ability to create a profile task and set up a schedule to automatically run it on a recurring basis


  • Always view the most updated rulebook results

  • Automate the profiling tasks with the latest metadata

  • Provide scheduling to create, modify, and view scheduled profiling tasks

  • View end-to-end tracking of scheduled events

Support Custom Attributes with Rules


  • Support custom attributes within rules to add extra insight to a defined rule

  • Ability to import SAP Information Steward's custom attributes for a rule into Metadata Explorer



  • Gain greater insight and understanding around data quality and validation rules

  • Tightly link glossary terms and custom attributes to rules

  • Reuse existing and approved custom attributes from SAP Information Steward via the IS connection type or a zip file

Copy Rule and Rule Bindings


  • Ability to copy an existing rule to save time from having to recreate shared functionality

  • Copy existing rule bindings to a dataset instead of having to recreate a dataset and binding to columns in the dataset

  • Copy rule binding(s) from one rulebook to another rulebook



  • Save time and protect users from missing mapping input parameters

  • Save time by copying rule binding(s) from one rulebook to another instead of recreating them from scratch

Public (Rule) APIs in Metadata Explorer for Metadata Exchange


  • Export Data Intelligence rules

  • Create public API to access rule definitions, rulebooks, and rule results

  • Gain access to rule results outside of SAP Data Intelligence

  • Ability to access bound rules through rule APIs, so users can create their own reports



  • Ability to share definitions of rules, rulebooks, and rule results with external reporting tools

  • Capacity to build custom data quality reports

  • Ability to view quality improvements of data over time

  • Identify datasets with poor data quality – untrustworthy data
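To show what consuming such an API from an external reporting tool could look like, here is a hypothetical sketch: the endpoint path, parameters, and payload shape below are assumptions for illustration, so consult the SAP Data Intelligence API reference for the actual contract.

```python
# Hypothetical sketch of consuming rule results over a public API.
# Endpoint path and payload fields are illustrative assumptions.
import json
import urllib.request

BASE_URL = "https://di-tenant.example.com/api"   # placeholder tenant host

def build_rule_results_request(rulebook_id, token):
    """Assemble an authenticated GET request for a rulebook's results."""
    url = f"{BASE_URL}/rulebooks/{rulebook_id}/results"
    return urllib.request.Request(
        url, headers={"Authorization": f"Bearer {token}"}
    )

def summarize(payload):
    """Reduce a (hypothetical) results payload to pass/fail counts."""
    results = json.loads(payload)["results"]
    passed = sum(r["passedCount"] for r in results)
    failed = sum(r["failedCount"] for r in results)
    return {"passed": passed, "failed": failed}

# Sample payload such a reporting script might receive:
sample = '{"results": [{"rule": "country_iso", "passedCount": 980, "failedCount": 20}]}'
print(summarize(sample))  # → {'passed': 980, 'failed': 20}
```

A custom data quality report would call such summaries per rulebook over time to visualize quality trends.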

UX Refactoring of Glossary Term View / Edit


  • Combined view and edit of term modes into one screen

  • Support to include a tabular view for relationships



  • Seamless experience when switching between viewing and editing a term

  • Improve user experience and usability for glossary


Delta Extraction Support for Data Lineage and
Add Profiling – Add Data Lineage – Add Publishing… within Metadata Explorer


  • Extract only the necessary information that changed for the lineage of artifacts

  • Added functionality support for sources **

Note: ** denotes new with DI:2110



  • Improved performance of lineage extraction using lineage delta extraction

  • Ability to schedule publication to regularly and efficiently refresh extracted metadata and lineage

  • Expanded functionality support for sources **
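Conceptually, delta extraction means re-extracting lineage only for artifacts that changed since the last run, instead of re-processing everything. A toy sketch of that idea (illustrative only; the Metadata Explorer tracks modification state internally):

```python
# Sketch of delta extraction: only re-extract lineage for artifacts
# whose modification timestamp is newer than the last extraction run.

def delta_extract(artifacts, last_run):
    """Return names of artifacts modified after `last_run` (a timestamp)."""
    return [a["name"] for a in artifacts if a["modified"] > last_run]

artifacts = [
    {"name": "orders_view", "modified": 100},
    {"name": "sales_table", "modified": 250},
    {"name": "kpi_report",  "modified": 300},
]

changed = delta_extract(artifacts, last_run=200)
print(changed)  # → ['sales_table', 'kpi_report']
```

Combined with scheduled publications, this keeps extracted metadata and lineage fresh without full re-extraction.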


Connectivity & Integration

This topic area focuses mainly on all kinds of connection and integration capabilities which are used across the product – for example: in the Metadata Explorer or on operator level in the Pipeline Modeler.

Support of Kafka Connection via Cloud Connector


  • Support usage of Kafka Connections via SAP Cloud Connector for DI Cloud Customers.

  • Kafka pipeline operators can connect to a Kafka messaging broker running inside a corporate network/VPC via the SAP Cloud Connector in DI Cloud



  • Avoid establishing a VPN or VPC peering to Kafka

  • Deploy SAP Cloud connector on premise, so a DI Cloud cluster can connect to a Kafka messaging broker via SAP Cloud Connector
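In practice this means the Kafka connection is configured to route through the Cloud Connector instead of reaching the broker directly. The sketch below uses illustrative field names, not the actual connection schema; refer to the SAP Data Intelligence connection management documentation for the real fields.

```json
{
  "description": "Kafka connection routed through SAP Cloud Connector (illustrative field names)",
  "type": "KAFKA",
  "brokers": "virtual-kafka-host:9092",
  "useCloudConnector": true,
  "cloudConnectorLocationId": "my-scc-location"
}
```

The broker address refers to a virtual host exposed by the Cloud Connector, so no VPN or VPC peering to the corporate network is required.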

Support client certificate for WebSocket RFC 


  • Allow usage of client certificates for WebSocket RFC connections for S/4HANA Cloud systems and S/4HANA on-premise (starting from 1909) used as a source system



  • Adds an additional layer of security by providing another authentication mode for WebSocket RFC connections

Support of additional ABAP Data Format Conversions

Please note that this functionality depends on the connected ABAP system and is not directly included in DI 2110 as a feature. Check SAP Note 3105880 for more details.


  • Support of different data type conversions when using SAP source systems with SLT / ECC and S/4HANA source systems using Generation 1 operators



  • When extracting data out of a legacy ABAP system to a target system, users can switch between different pre-defined format conversions

Simplified Operators for Core Integration Scenarios

Introduction of new operators with unified table type and efficient recovery support to implement core integration scenarios

  • Connections: HANA, 3rd Party Databases (Table Consumer),
    File Readers, and Application Consumer

  • Processing: Structured Data Transform, Python, and standardized Table Encode/Decoder (CSV, JSON, Parquet, ORC)

Operators are available as a new "Generation 2" category and do not interfere with existing graphs and operators.


System Management

This topic area includes all services provided by the system – like administration, user management, or system management.

Monitoring API for Tenant Administrators

Prometheus API to connect external monitoring applications to SAP Data Intelligence Cloud tenants

  • Allows for detailed resource and status monitoring of tenants

  • SAP Note number 3098656: “Integrating the 3rd party Grafana Software with SAP Data Intelligence Cloud"
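A Prometheus server could scrape such a tenant endpoint with a job along these lines. The metrics path and authentication details below are assumptions for illustration; see SAP Note 3098656 for the documented setup.

```yaml
# Illustrative Prometheus scrape job for a DI Cloud tenant's
# monitoring endpoint (path and auth details are assumptions).
scrape_configs:
  - job_name: "sap-di-tenant"
    scheme: https
    metrics_path: /monitoring/metrics        # hypothetical path
    bearer_token_file: /etc/prometheus/di-token
    static_configs:
      - targets: ["di-tenant.example.com"]   # placeholder tenant host
```

Once scraped, the tenant's resource and status metrics can be visualized in tools such as Grafana.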

Pipeline Modeling

This topic area covers new operators or enhancements of existing operators, improvements and new functionalities of the Pipeline Modeler, and the development of pipelines.

Automated recovery of pipelines

  • Pipelines can automatically re-start in case of (temporary) failures

    • Configured by maximum number of re-starts in recovery interval

  • Pipelines can be paused and re-started with given configuration

  • Restarted pipelines are grouped in Modeler and can be fully analyzed (traces, logs, metrics)
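The restart policy described above can be illustrated with a toy sketch: re-run a failing pipeline until it succeeds, giving up after a maximum number of restarts. The real scheduler lives inside the Modeler; this only demonstrates the semantics.

```python
# Toy sketch of automated recovery: restart a failing pipeline
# (a callable) on failure, up to a configured maximum.

def run_with_recovery(pipeline, max_restarts):
    """Run `pipeline`, restarting on failure up to `max_restarts` times."""
    attempts = 0
    while True:
        try:
            return attempts, pipeline()
        except RuntimeError:
            attempts += 1
            if attempts > max_restarts:
                raise

# A flaky pipeline that fails twice (temporary failures), then succeeds.
state = {"calls": 0}
def flaky_pipeline():
    state["calls"] += 1
    if state["calls"] < 3:
        raise RuntimeError("temporary failure")
    return "done"

attempts, result = run_with_recovery(flaky_pipeline, max_restarts=3)
print(attempts, result)  # → 2 done
```

In the product, the equivalent of `max_restarts` is configured per pipeline together with the recovery interval, and each restart remains visible in the Modeler for analysis.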


New Data Types for Low-code Scenarios

  • Introduction of general table type to simplify data processing pipelines

  • Replacement of the “message” types used in the existing operators

  • New underlying Data Type Framework:

    • Existing core data types (int16, int64, …)

    • Ability to build complex data types (structs)

    • Primary table type for structured data processing

  • Read and write of data using streams for lower memory consumption
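The memory benefit of stream-based reads can be shown with a simple sketch: process a table in fixed-size batches instead of materializing all rows at once. This is a generic illustration, not the Generation 2 table type itself.

```python
# Sketch of stream-based table reads: yield fixed-size batches of rows
# so memory use stays bounded regardless of the table size.
import csv
import io

def stream_rows(fileobj, batch_size):
    """Yield lists of at most `batch_size` CSV rows (a simple stream)."""
    reader = csv.reader(fileobj)
    batch = []
    for row in reader:
        batch.append(row)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:
        yield batch   # flush the final, possibly smaller batch

data = io.StringIO("1,a\n2,b\n3,c\n4,d\n5,e\n")
batches = list(stream_rows(data, batch_size=2))
print([len(b) for b in batches])  # → [2, 2, 1]
```

A downstream operator consumes one batch at a time, so only `batch_size` rows are ever held in memory.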

Stateful Pipelines for efficient recovery

  • Operators save snapshots with processing state (e.g., offset state in HANA reader)

  • On failure, the pipeline can be re-started from the last saved snapshot for efficient recovery

  • Allows implementing resilient, long-running pipelines

  • Introduced as part of the new operator category (Generation 2); existing pipelines and operators are not affected
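Snapshot-based recovery can be illustrated with a toy reader that records the offset of the last processed row; after a crash it resumes from the snapshot instead of re-reading everything. The real Generation 2 operators persist such state internally.

```python
# Toy illustration of snapshot-based recovery: record an offset after
# each processed row, and resume from it after a failure.

def process(rows, snapshot, fail_at=None):
    """Process rows from snapshot['offset'], optionally crashing at an index."""
    out = []
    for i in range(snapshot["offset"], len(rows)):
        if fail_at is not None and i == fail_at:
            raise RuntimeError("crash")
        out.append(rows[i])
        snapshot["offset"] = i + 1   # persist progress after each row
    return out

rows = ["r0", "r1", "r2", "r3"]
snap = {"offset": 0}

try:
    process(rows, snap, fail_at=2)   # crashes after processing r0, r1
except RuntimeError:
    pass

resumed = process(rows, snap)        # restarts from the saved snapshot
print(snap["offset"], resumed)       # → 4 ['r2', 'r3']
```

The key property: after the restart, only the unprocessed rows are read again, which is what makes recovery efficient for long-running pipelines.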


Deployment & Delivery

Within this focus area, all functions and features dealing with the setup process, installation, or deployment are described.

AWS Singapore

  • SAP Data Intelligence Cloud is now available in the AWS Singapore region


These are the new functions, features and enhancements in SAP Data Intelligence, cloud edition DI:2110 release.

We hope you like them and, by reading the above descriptions, have already identified some areas you would like to try out.

If you are interested, please refer to What’s New in Data Intelligence – Central Blog Post.

For more updates, follow the tag SAP Data Intelligence.

We recommend visiting our SAP Data Intelligence community topic page to check helpful resources, links and what other community members post. If you have a question, feel free to check out the Q&A area and ask a question here.

Thank you & Best Regards,

Eduardo and the SAP Data Intelligence PM team