(WIP) KAFKA-5142: Added support for record headers, reusing Kafka client's interfaces #4077
rhauch wants to merge 1 commit into `apache:trunk`
Conversation
See #2942 for an earlier proposal that uses a Converter-based approach.
@rhauch The reason for the Converter-based approach is that, just like with the `byte[]` key or value in the record, you need to know what the bytes represent. If within an org you only ever use String headers, then you can simply use a String serializer. But as you note, in a more complex setup where different headers hold different types, how do you know what to expect a header to be? One idea originally proposed in the headers KIP was to have typed (primitive) headers using leading type bits, as in JMS/AMQP; unfortunately, plain `byte[]` won that discussion. A way to deal with this is very similar to how you deal with the key or value in the record: use a schema registry, where the subject can be topic + header key. Using serdes/converters means we can reuse existing ones and keep it flexible for users, which is what my PR allows (it also differs from the KIP, but came off the back of the KIP discussion on the mailing list).
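The per-header serde idea in the comment above could be sketched roughly as follows. Everything here is a hypothetical illustration, not code from this PR or from the schema registry: `HeaderDeserializers`, the `topic + headerKey` subject convention, and the plain UTF-8 conversion standing in for a real serializer are all assumptions.

```java
import java.nio.charset.StandardCharsets;
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Hypothetical registry mapping a "subject" (topic + header key) to a
// deserializer, mirroring how a schema registry subject could be used to
// resolve the type of an opaque byte[] header value.
public class HeaderDeserializers {
    private final Map<String, Function<byte[], Object>> bySubject = new HashMap<>();

    public void register(String topic, String headerKey, Function<byte[], Object> deser) {
        bySubject.put(topic + "-" + headerKey, deser);
    }

    public Object deserialize(String topic, String headerKey, byte[] value) {
        Function<byte[], Object> deser = bySubject.get(topic + "-" + headerKey);
        if (deser == null) {
            // Unknown header type: fall back to the raw bytes.
            return value;
        }
        return deser.apply(value);
    }

    public static void main(String[] args) {
        HeaderDeserializers registry = new HeaderDeserializers();
        // If an org only uses String headers, a single deserializer suffices.
        registry.register("orders", "trace-id",
                b -> new String(b, StandardCharsets.UTF_8));

        byte[] raw = "abc-123".getBytes(StandardCharsets.UTF_8);
        Object decoded = registry.deserialize("orders", "trace-id", raw);
        System.out.println(decoded); // prints "abc-123"
    }
}
```

A real implementation would resolve the subject against a schema registry instead of a local map, but the lookup shape is the same.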
Closing this since the issue was resolved by #4319.
This is still a work in progress and should not be merged.
This is a proposed PR that implements most of KIP-145, but with some changes. The Kafka client library's `Headers` and `Header` interfaces are used directly so as to minimize the overhead of converting instances to a Connect-specific object. However, a new `ConnectHeaders` class is proposed to provide a fluent builder for easily constructing headers, either in source connectors or in SMTs that need to add/remove/modify headers, along with a reader utility component for reading header values and converting them to primitives.

Note that KIP-145 is still undergoing discussion, so this is provided merely as one possible approach.