(WIP) KAFKA-5142: Expose Record Headers in Kafka Connect (DO NOT MERGE) #2942
michaelandrepearce wants to merge 1 commit into apache:trunk
Conversation
Add the ability to convert header data:
- a new ConnectHeader that carries the header value's schema
- a new SubjectConverter, which exposes a subject (in this case the subject is the header key); this can be used to register the header type in repositories like Schema Registry
- a primitive SubjectConverter that allows configuring header types, backed by an in-memory repository; this could be replaced by a remote global repository, a la a schema repo
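A minimal sketch of what such a subject-keyed converter might look like. All names here (SubjectConverter, PrimitiveSubjectConverter, register, and the in-memory type registry) are hypothetical illustrations of the idea in the commit message, not the actual code in this PR:

```java
import java.nio.charset.StandardCharsets;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical sketch: a converter keyed by a "subject" (here, the header key),
// with an in-memory type registry standing in for a remote schema repository.
interface SubjectConverter {
    byte[] fromConnectData(String subject, Object value);
    Object toConnectData(String subject, byte[] bytes);
}

class PrimitiveSubjectConverter implements SubjectConverter {
    // In-memory registry mapping a subject (header key) to its declared type.
    private final Map<String, String> typeRegistry = new ConcurrentHashMap<>();

    public void register(String subject, String type) {
        typeRegistry.put(subject, type);
    }

    @Override
    public byte[] fromConnectData(String subject, Object value) {
        // Serialize as a UTF-8 string; a real converter would branch on the type.
        return String.valueOf(value).getBytes(StandardCharsets.UTF_8);
    }

    @Override
    public Object toConnectData(String subject, byte[] bytes) {
        // Deserialize based on the registered type; default to string.
        String raw = new String(bytes, StandardCharsets.UTF_8);
        String type = typeRegistry.getOrDefault(subject, "string");
        switch (type) {
            case "int":    return Integer.valueOf(raw);
            case "long":   return Long.valueOf(raw);
            case "string":
            default:       return raw;
        }
    }
}
```

The in-memory map is the piece the commit message suggests could be swapped for a remote, global repository so all workers agree on each header key's type.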
@michaelandrepearce, this is an interesting WIP proposal. Are you still working on this, or interested in continuing it? I've got another attempt in #4077, though I'm not happy with it and it doesn't help much on the sink side. I think there are two semi-distinct aspects of headers. The first is simple serialization/deserialization, and here I think using AK's serializers and deserializers makes a lot of sense. (It would be good to also support Connect's converters for the complex types.) The other challenge is conversion. Source connectors will generate headers, each with a particular type, but that might not match what you want in your Kafka topics. Sink connectors will likewise expect headers with particular types, but again that might not match what you have in your messages. We could handle this conversion with SMTs, but SMTs are complicated, and it would be interesting if Connect could provide support for conversion at the connector level. Either way, this conversion step seems like it will be pretty important; the challenge is whether it will be possible to convert between types. Numbers and strings are fairly obvious, but conversion between other types maybe not so much. Anyway, nice work so far. Hope you're willing to continue the discussion!
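The type-conversion concern raised above can be illustrated with a small sketch. This HeaderValueConverter is hypothetical (not part of Connect or this PR); it shows why number/string conversions are straightforward while other combinations have to fail:

```java
// Hypothetical sketch of the header type-conversion step discussed above:
// coercing a header value to the type a sink connector expects.
final class HeaderValueConverter {
    static Object convertTo(Class<?> target, Object value) {
        if (target.isInstance(value)) {
            return value;               // already the right type
        }
        if (target == String.class) {
            return String.valueOf(value); // anything renders as a string
        }
        if (value instanceof Number) {
            Number n = (Number) value;  // numeric widening/narrowing
            if (target == Integer.class) return n.intValue();
            if (target == Long.class)    return n.longValue();
            if (target == Double.class)  return n.doubleValue();
        }
        if (value instanceof String) {
            String s = (String) value;  // parse numeric strings
            if (target == Integer.class) return Integer.valueOf(s);
            if (target == Long.class)    return Long.valueOf(s);
            if (target == Double.class)  return Double.valueOf(s);
        }
        // No sensible conversion exists (e.g. a struct to a number).
        throw new IllegalArgumentException(
            "Cannot convert " + value.getClass().getName()
                + " to " + target.getName());
    }
}
```

Everything outside the number/string square of this table falls into the "maybe not so much" case the comment mentions.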
@rhauch this was to complement a KIP and its discussion: https://cwiki.apache.org/confluence/display/KAFKA/KIP-145+-+Expose+Record+Headers+in+Kafka+Connect The KIP is paused, as not many supported it at the time. I'm happy to pick it back up, but it needs support on the KIP via the mailing lists (e.g. +1's). Please reignite the KIP discussion on the mailing lists, and I'd be more than happy to finish this work off.
@rhauch I was working on my phone and accidentally put my reply (about why this approach uses serdes/converters) on your PR, as I had it open. I'm sure you'll pick it up :)
Closing since this was resolved by #4319. |
As per KIP-145:
- Add constructors to Connect/Source/SinkRecord that take Iterable<Header>
- Add an accessor method Headers headers() to ConnectRecord
- Update WorkerSource/Sink to pass headers to/from Producer/ConsumerRecords