Kafka Component Release Notes
This page lists the main features added to the Kafka component.
Feature Highlights
Version 2.0.2
Synchronous sending mode
Previously, the Kafka component sent messages asynchronously, without waiting for an acknowledgement from the Kafka broker ("fire and forget").
This allowed better performance when sending a lot of messages, but prevented the proper computation of message status and statistics.
A new mode has been added to choose how messages are sent, which also allows sending messages synchronously when needed.
- The new synchronous mode is now the default mode. It allows better management and proper tracking of message status and statistics.
- The asynchronous mode is still available if you want better performance and do not need statistics about the messages sent.
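To illustrate the difference between the two modes, here is a minimal sketch using the plain Apache Kafka Java producer API. It is not the component's internal implementation; the broker address, topic name, key, and value are placeholders.

import java.util.Properties;
import java.util.concurrent.Future;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.clients.producer.RecordMetadata;
import org.apache.kafka.common.serialization.StringSerializer;

public class SendModeExample {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            ProducerRecord<String, String> record =
                    new ProducerRecord<>("example-topic", "key", "value");

            // Asynchronous ("fire and forget"): send() returns immediately,
            // giving higher throughput but no delivery status at this point.
            producer.send(record);

            // Synchronous: block on the returned Future until the broker
            // acknowledges the record, so status and statistics can be tracked.
            Future<RecordMetadata> future = producer.send(record);
            RecordMetadata metadata = future.get();
            System.out.printf("Written to %s-%d at offset %d%n",
                    metadata.topic(), metadata.partition(), metadata.offset());
        }
    }
}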
Defining the writing mode in Metadata
A new attribute is available in Metadata to choose the writing mode.
This allows defining a default value for the Mappings and Processes that use this Metadata.
Note that you can define a default value in Metadata and override it in Mappings easily, as described in the next section.
Defining the writing mode in Mapping
The writing mode can also be customized in Mappings, on the Kafka Integration Templates.
The value defined on the Template overrides the default value defined in Metadata.
If no value is defined, the Metadata default value is used.
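The precedence rule can be summarized with a small, purely hypothetical sketch; the WriteMode enum and resolve method below are illustrative and are not part of the component's API.

public class WriteModeResolver {
    enum WriteMode { SYNCHRONOUS, ASYNCHRONOUS }

    // Hypothetical helper: the value set on the Integration Template wins;
    // otherwise the default defined in the Metadata applies.
    static WriteMode resolve(WriteMode templateValue, WriteMode metadataDefault) {
        return templateValue != null ? templateValue : metadataDefault;
    }

    public static void main(String[] args) {
        System.out.println(resolve(null, WriteMode.SYNCHRONOUS));                   // SYNCHRONOUS (Metadata default)
        System.out.println(resolve(WriteMode.ASYNCHRONOUS, WriteMode.SYNCHRONOUS)); // ASYNCHRONOUS (Template override)
    }
}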
INTEGRATION Kafka To File Template
New Batch Size parameter
A new "Batch Size" parameter has been created to define how many messages must be read from the topic before writing to the file.
When reading, messages are written to the file every "n" messages received, "n" being the batch size.
When you define a large timeout so that the Mapping reads data continuously, make sure to set an appropriate value for this parameter.
For example, set it to "1" if you want each message to be written to the file as soon as it is read.
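As an illustration of this batching behaviour, here is a minimal sketch using the plain Kafka Java consumer rather than the Template itself; the topic, consumer group, output file, and batch size are placeholder values.

import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;
import java.time.Duration;
import java.util.ArrayList;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class BatchToFileExample {
    public static void main(String[] args) throws IOException {
        int batchSize = 100; // plays the role of the "Batch Size" parameter; 1 writes each message immediately

        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "example-group");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        List<String> buffer = new ArrayList<>();
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
             BufferedWriter writer = Files.newBufferedWriter(
                     Paths.get("output.txt"), StandardOpenOption.CREATE, StandardOpenOption.APPEND)) {
            consumer.subscribe(List.of("example-topic"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    buffer.add(record.value());
                    // Flush the buffer to the file every "batchSize" messages.
                    if (buffer.size() >= batchSize) {
                        for (String line : buffer) {
                            writer.write(line);
                            writer.newLine();
                        }
                        writer.flush();
                        buffer.clear();
                    }
                }
            }
        }
    }
}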
Change Data Capture (CDC)
Multiple improvements have been made to homogenize the usage of Change Data Capture (CDC) across the various Components.
Parameters have been homogenized, so that all Templates now have the same CDC Parameters, with the same feature support.
Multiple fixes have also been made to correct CDC issues. Refer to the change log below for the exact list of changes.
Change Log
Version 3.0.0 (Component Pack)
New Features
- DI-3701: Allow Components to contribute to Designer monitored statistics
- DI-4053: Query Editor menu renamed to "Launch Query Editor"
- DI-4508: Update Components and Designer to take into account dedicated license permissions
- DI-4813: Rebranding: Drivers classes and URLs
- DI-4962: Improved component dependencies and requirements management
Version 2.0.2 (Kafka Component)
New Features
- DI-1298: INTEGRATION Kafka to File Template - Addition of the "Batch Size" parameter to define how many messages must be read from the topic before writing to the file
- DI-1940: Support synchronous mode when sending messages (a new attribute and parameter are available in Metadata and Templates to define the write mode)
- DI-1909: Templates updated - New Parameters 'Unlock Cdc Table' and 'Lock Cdc Table' to configure the behaviour of CDC table locking
Bug Fixes
- DI-1913: An error was thrown when trying to connect to a secure Kafka server, such as "KafkaException: javax.security.auth.login.LoginException: unable to find LoginModule class: org.apache.kafka.common.security.plain.PlainLoginModule"
- DI-2162: The Component was not working on Topics having a dot in the name
- DI-2658: Kafka Metadata - 'connect to database' context menu was not working
- DI-1908: Templates updated - The 'Cdc Subscriber' parameter was ignored in some Templates on Lock / Unlock CDC steps
- DI-1907: Templates updated - The 'Cdc Subscriber' parameter was ignored in some Templates when querying the source data