Describe the change you'd like to see
An example where Kafka is the default channel and a Trigger filters on event attributes.
Additionally, the real source of events is application Kafka topics (not the Kafka channel topics).
What would the complete setup be in this case? Is a KafkaSource still needed to read from the application-level Kafka topics and publish to the Broker (with Kafka-backed channels)?
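For concreteness, here is a sketch of the setup I am asking about. All names, namespaces, topic names and API versions below are placeholders I made up, not taken from the samples:

```yaml
# A KafkaSource reads from the application-level topic and forwards
# events to a Broker; a Trigger then filters on an event attribute.
# (Assumes the Broker's default channel has been configured to be
# KafkaChannel via the config-br-default-channel ConfigMap.)
apiVersion: sources.knative.dev/v1beta1
kind: KafkaSource
metadata:
  name: app-topic-source
spec:
  bootstrapServers:
    - my-cluster-kafka-bootstrap.kafka:9092   # placeholder broker address
  topics:
    - app-events                              # the application-level topic
  sink:
    ref:
      apiVersion: eventing.knative.dev/v1
      kind: Broker
      name: default
---
apiVersion: eventing.knative.dev/v1
kind: Trigger
metadata:
  name: order-created
spec:
  broker: default
  filter:
    attributes:
      type: com.example.order.created         # filter on an event field
  subscriber:
    ref:
      apiVersion: serving.knative.dev/v1
      kind: Service
      name: order-handler
```

Whether the KafkaSource hop is actually required here, or the channel topics can be consumed directly, is exactly what the requested example should clarify.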
Any guidance on what the schema of the application-level Kafka topic should look like to support Trigger filters?
Options:
- CloudEvents Kafka protocol binding, binary content mode, with Avro serialization:
https://github.com/cloudevents/spec/blob/master/kafka-protocol-binding.md#32-binary-content-mode
- CloudEvents Kafka protocol binding, structured content mode, with Avro serialization:
https://github.com/cloudevents/spec/blob/master/kafka-protocol-binding.md#33-structured-content-mode
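To illustrate what the two options mean for the records on the application-level topic, here is a minimal sketch of how the same event maps to a Kafka record under each content mode, per the protocol binding linked above. JSON stands in for Avro to keep the sketch self-contained; all attribute values are invented examples:

```python
import json

def to_binary_kafka_record(attrs, data):
    """Binary content mode: each CloudEvents attribute becomes a
    'ce_'-prefixed Kafka header, and the event data travels as the
    raw record value. The data's media type goes in the standard
    'content-type' header."""
    headers = [("ce_" + k, str(v).encode("utf-8")) for k, v in attrs.items()]
    headers.append(("content-type", b"application/avro"))
    return headers, data

def to_structured_kafka_record(attrs, data):
    """Structured content mode: the whole envelope (attributes plus
    data) is serialized into the record value; JSON here, but an
    Avro-encoded envelope would have the same shape."""
    envelope = dict(attrs, data=data)
    value = json.dumps(envelope).encode("utf-8")
    headers = [("content-type", b"application/cloudevents+json")]
    return headers, value

# Placeholder event attributes -- the 'type' attribute is what a
# Trigger filter like the one in this request would match on.
attrs = {
    "specversion": "1.0",
    "id": "1234",
    "type": "com.example.order.created",
    "source": "/orders",
}
```

The practical difference for Trigger filters: in binary mode the filterable attributes are plain Kafka headers, readable without deserializing the Avro payload; in structured mode they are buried inside the (possibly Avro-encoded) value. Guidance on which mode the Kafka channel/source machinery expects would be the useful part of the example.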
Additional context
https://github.com/knative/docs/tree/master/docs/eventing/samples/kafka
contains examples for:
- 'kafka source' -> 'service'
- 'broker' -> 'trigger' -> 'service'