Moving data from one system to another with some transformations is Data Integration. This is fine for ad hoc integrations, but if such a pipeline is executed regularly and is part of an entire ecosystem, I would add more requirements to a data integrati...
A frequently asked question is how to ingest data available in Kafka into Hana, e.g. for analytics. Should it be done from Hana or from Kafka?
Hana reads from Kafka
SAP Hana has a wonderful data federation feature that can be used to make remote data ...
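To make the virtual-table idea concrete, here is a minimal sketch using the node.js Hana client (@sap/hana-client). It assumes a Kafka-capable SDI adapter has already been registered as a remote source; the source name "KAFKASOURCE", the topic "topic1", the schema "MYSCHEMA" and the connection details are all hypothetical:

```ts
// Sketch only: remote source, topic and credentials are made-up names.
import * as hana from "@sap/hana-client";

const conn = hana.createConnection();
conn.connect(
  { serverNode: "hanahost:30015", uid: "DBUSER", pwd: "secret" },
  (err) => {
    if (err) throw err;
    // Expose the Kafka topic as a virtual table; how the remote object is
    // addressed depends on the adapter, so the four-part name is illustrative.
    conn.exec(
      'CREATE VIRTUAL TABLE "MYSCHEMA"."VT_TOPIC1" AT "KAFKASOURCE"."<NULL>"."<NULL>"."topic1"',
      (err) => {
        if (err) throw err;
        // From here on the topic can be queried like any local Hana table.
        conn.exec('SELECT TOP 10 * FROM "MYSCHEMA"."VT_TOPIC1"', (err, rows) => {
          if (err) throw err;
          console.log(rows);
          conn.disconnect();
        });
      }
    );
  }
);
```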
SAP Data Services is an ETL tool to Extract raw data, Transform it into business meaning and Load the result into a target for reporting, Business Intelligence tools, Data Migration or any other generic data integration requirement. Building such dat...
Today's UIs must be visually appealing and SAPUI5 does come with charting capabilities.
Note: This is a complete rewrite utilizing the latest amCharts 5 version and a different concept.
In my projects I need much more than that. More types of char...
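For a flavor of what the amCharts 5 API looks like, a minimal sketch (chart type and data are placeholders; the div id "chartdiv" is assumed to exist in the view):

```ts
import * as am5 from "@amcharts/amcharts5";
import * as am5xy from "@amcharts/amcharts5/xy";

// Root element bound to an existing <div id="chartdiv"> in the page.
const root = am5.Root.new("chartdiv");

// A simple XY chart with one category axis and one value axis.
const chart = root.container.children.push(am5xy.XYChart.new(root, {}));
const xAxis = chart.xAxes.push(
  am5xy.CategoryAxis.new(root, {
    categoryField: "month",
    renderer: am5xy.AxisRendererX.new(root, {}),
  })
);
const yAxis = chart.yAxes.push(
  am5xy.ValueAxis.new(root, { renderer: am5xy.AxisRendererY.new(root, {}) })
);

// One column series fed with static sample data.
const series = chart.series.push(
  am5xy.ColumnSeries.new(root, {
    xAxis,
    yAxis,
    valueYField: "revenue",
    categoryXField: "month",
  })
);
const data = [
  { month: "Jan", revenue: 100 },
  { month: "Feb", revenue: 140 },
];
xAxis.data.setAll(data);
series.data.setAll(data);
```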
The point I do not get is how the user permissions and the buttons/links of UI applications work together from a technical point of view.
Imagine the following situation: The Sales Order screen shows data and the UI has an Edit button to modify the s...
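The pattern I have seen most often is just this (a sketch only, not tied to any specific framework; the endpoint and field names are made up): the backend exposes the user's permissions, the UI merely reflects them in the button state, and the server enforces them again on every write.

```ts
// Hypothetical permission check: endpoint and payload shape are assumptions.
interface Permissions {
  salesOrderEdit: boolean;
}

async function fetchPermissions(): Promise<Permissions> {
  const resp = await fetch("/api/me/permissions");
  return resp.json();
}

async function initSalesOrderScreen(editButton: HTMLButtonElement) {
  const perms = await fetchPermissions();
  // Disabling the button is a convenience for the user only; the server
  // must re-check the permission on the actual update call, otherwise
  // anyone could craft the request manually.
  editButton.disabled = !perms.salesOrderEdit;
}
```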
As you create the schema based on the table structure, what is the Avro schema field name if the table column is called "/BIC/ABCD"? Avro names are limited to the characters A-Z, a-z, 0-9 and _. No other characters are allowed. No double-byte chars, no minus, no $...
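One way out (a sketch of my own, not what the connector does) is to replace every forbidden character and keep the original column name as a custom schema property, which Avro readers tolerate:

```ts
// Map an arbitrary column name like "/BIC/ABCD" to a valid Avro field name.
function toAvroFieldName(columnName: string): string {
  let name = columnName.replace(/[^A-Za-z0-9_]/g, "_");
  if (/^[0-9]/.test(name)) {
    name = "_" + name; // Avro names must not start with a digit
  }
  return name;
}

const column = "/BIC/ABCD";
const field = {
  name: toAvroFieldName(column), // "_BIC_ABCD"
  type: "string",
  originalname: column, // custom attribute; Avro parsers ignore unknown attributes
};
console.log(JSON.stringify(field));
```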
And some questions:
- What if my table has the two primary key columns user id and valid_from_date and the user id value is "werner_daehn"?
- What happens if the user executes an update statement on a primary key column, e.g. update table1 set id = 2 where...
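The first question matters because a naive string concatenation of the key columns is ambiguous, as this sketch shows (column names from the question, the separator choice is mine):

```ts
// Naive record key: concatenate the primary key columns with "_".
function naiveKey(userId: string, validFromDate: string): string {
  return `${userId}_${validFromDate}`;
}

// The user id itself contains the separator, so the key cannot be
// split back unambiguously:
console.log(naiveKey("werner_daehn", "2024-01-01")); // "werner_daehn_2024-01-01"
// Splitting at the first "_" would yield user id "werner" and
// "date" daehn_2024-01-01 -- a different row.

// A structured key (JSON here, an Avro record in practice) keeps the
// columns separate and avoids the ambiguity:
const key = JSON.stringify({ user_id: "werner_daehn", valid_from_date: "2024-01-01" });
console.log(key);
```

The second question is about change-capture semantics: on a log-compacted topic keyed by the primary key, updating a key column typically has to be emitted as a tombstone for the old key plus an insert under the new key, otherwise the stale row survives compaction.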
Suggestion: Please look into this Kafka Avro library to get additional metadata into the Avro schema when the source is a table: https://github.com/rtdi/KafkaAvro
Your Confluent Connector maps an nvarchar(1) to a string, and a string can be of any length...
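What such metadata could look like, as a sketch of my own (the attribute names are hypothetical and not necessarily what rtdi/KafkaAvro uses; spec-compliant Avro parsers ignore unknown attributes and logical types, so the schema stays valid for every consumer):

```ts
// Carry the database length along as extra attributes in the field schema,
// so an aware consumer can recreate nvarchar(1) on the target side instead
// of an unbounded string.
const field = {
  name: "STATUS",
  type: { type: "string", logicalType: "VARCHAR", length: 1 }, // hypothetical convention
  doc: "source column STATUS, nvarchar(1)",
};
console.log(JSON.stringify(field, null, 2));
```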
Is the Avro encoding available when using a private Kafka cluster with the Schema Registry? Or does the Kafka connector support Avro/Schema Registry by now as well? And why are there actually two Kafka producers, Kafka and Confluent?
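On the first question: Avro encoding does not depend on a hosted Confluent cluster; any reachable Schema Registry instance works. A sketch with kafkajs and @kafkajs/confluent-schema-registry against a self-hosted registry (all hosts, topic and record names are placeholders):

```ts
import { Kafka } from "kafkajs";
import { SchemaRegistry, SchemaType } from "@kafkajs/confluent-schema-registry";

const kafka = new Kafka({ brokers: ["kafka.internal:9092"] });
const registry = new SchemaRegistry({ host: "http://schema-registry.internal:8081" });

const schema = JSON.stringify({
  type: "record",
  name: "Table1",
  fields: [{ name: "id", type: "int" }],
});

async function produce() {
  // Register the schema (idempotent per subject) and get its id back.
  const { id } = await registry.register({ type: SchemaType.AVRO, schema });
  const producer = kafka.producer();
  await producer.connect();
  await producer.send({
    topic: "table1-topic",
    // encode() prepends the registry's wire-format header (magic byte + schema id).
    messages: [{ value: await registry.encode(id, { id: 1 }) }],
  });
  await producer.disconnect();
}

produce().catch(console.error);
```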