Commit 9acd043

Merge pull request #1323 from datanav/IS-18610-a
IS-18605: Removed references to key_schema and value_schema properties
2 parents: bf37937 + bee22b3

2 files changed: 2 additions & 2 deletions

hub/documentation/service-configuration/pipes/configuration-sinks-kafka.rst

Lines changed: 1 addition & 1 deletion

```diff
@@ -5,7 +5,7 @@ Kafka sink
 
 The Kafka sink produces data to a Kafka topic.
 
-Entities sent to this sink will use the ``"key"``, ``"value"``, ``"headers"``, ``"key_schema"`` and ``"value_schema"`` properties to produce the messages sent to the Kafka topic. The latter two properties are only relevant if the ``"confluent_schema_json"`` serializer is used. The properties ``"key"`` and ``"value"`` are mandatory. The ``"headers"`` property is optional, but it must be an object with string keys and string or bytes values if present.
+Entities sent to this sink will use the ``"key"``, ``"value"`` and ``"headers"`` properties to produce the messages sent to the Kafka topic. The properties ``"key"`` and ``"value"`` are mandatory. The ``"headers"`` property is optional, but it must be an object with string keys and string or bytes values if present.
 
 The properties used match the properties emitted by the :ref:`Kafka source <kafka_source>`. This means that it should be possible to consume a topic and produce to a new topic in a pipe with no DTL.
 
```
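As an illustration of the sink's expected input after this change (not part of the commit; all identifiers and values below are made up), an entity routed to the Kafka sink could look like this:

```json
{
    "_id": "order-42",
    "key": "order-42",
    "value": {"status": "shipped", "amount": 99.5},
    "headers": {"trace-id": "abc123"}
}
```

Here ``"key"`` and ``"value"`` are the mandatory properties, while ``"headers"`` is optional and maps string keys to string or bytes values.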

hub/documentation/service-configuration/pipes/configuration-sources-kafka.rst

Lines changed: 1 addition & 1 deletion

```diff
@@ -5,7 +5,7 @@ Kafka source
 
 The Kafka source consumes data from a Kafka topic. The consumer does not use a consumer group, but instead stores the offset in the pipe, and it does not commit the consumer offset back to Kafka.
 
-The entities emitted from this source has the properties ``"offset"``, ``"partition"``, ``"timestamp"``, ``"key"``, ``"key_schema"``, ``"value"``, ``"value_schema"``, ``"headers"``. If key deserialization fails and ``"strict"`` is ``false`` then the entity will also have an ``"invalid_key"`` property. Similarly if value deserialization fails and ``"strict"`` is ``false`` then the entity will also have an ``"invalid_value"`` property. ``"headers"`` is optional and will only be present if the message has headers. If present the ``"headers"`` property is an object with string keys and string or bytes values. If the header value is of type bytes then it means that the header value couldn't be deserialized as a string.
+The entities emitted from this source have the properties ``"offset"``, ``"partition"``, ``"timestamp"``, ``"key"``, ``"value"`` and ``"headers"``. If key deserialization fails and ``"strict"`` is ``false``, the entity will also have an ``"invalid_key"`` property. Similarly, if value deserialization fails and ``"strict"`` is ``false``, the entity will also have an ``"invalid_value"`` property. ``"headers"`` is optional and will only be present if the message has headers. If present, the ``"headers"`` property is an object with string keys and string or bytes values. If a header value is of type bytes, it means that the header value couldn't be deserialized as a string.
 
 .. NOTE::
 
```
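For illustration only (hypothetical values, not taken from the commit), an entity emitted by the Kafka source after this change might have this shape:

```json
{
    "offset": 1042,
    "partition": 0,
    "timestamp": 1700000000000,
    "key": "order-42",
    "value": {"status": "shipped", "amount": 99.5},
    "headers": {"trace-id": "abc123"}
}
```

Because these properties match what the Kafka sink consumes, such an entity could be passed straight to a Kafka sink with no DTL transformation.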
