From 6a71cd93ca417132f0176f26735c1c017ecae630 Mon Sep 17 00:00:00 2001
From: Clemens Hutter
Date: Thu, 7 Aug 2025 16:13:15 +0200
Subject: [PATCH 1/2] Remove SPAM URL

the previous URL http://lambda-architecture.net/ seems to now be
controlled by spammers
---
 docs/streams/core-concepts.html | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/streams/core-concepts.html b/docs/streams/core-concepts.html
index c400ca08453c6..7baaea2b0c183 100644
--- a/docs/streams/core-concepts.html
+++ b/docs/streams/core-concepts.html
@@ -279,7 +279,7 @@
-        to the stream processing pipeline, known as the <a href="http://lambda-architecture.net/">Lambda Architecture</a>. Prior to 0.11.0.0, Kafka only provides at-least-once delivery guarantees and hence any stream processing systems that leverage it as the backend storage could not guarantee end-to-end exactly-once semantics. In fact, even for those stream processing systems that claim to support exactly-once processing, as long as they are reading from / writing to Kafka as the source / sink, their applications cannot actually guarantee that no duplicates will be generated throughout the pipeline.
+        to the stream processing pipeline, known as the <a href="http://en.wikipedia.org/wiki/Lambda_architecture">Lambda Architecture</a>. Prior to 0.11.0.0, Kafka only provides at-least-once delivery guarantees and hence any stream processing systems that leverage it as the backend storage could not guarantee end-to-end exactly-once semantics. In fact, even for those stream processing systems that claim to support exactly-once processing, as long as they are reading from / writing to Kafka as the source / sink, their applications cannot actually guarantee that no duplicates will be generated throughout the pipeline.
From 1e63616a418565ef7a99f0ff666ac350225631f6 Mon Sep 17 00:00:00 2001
From: Clemens Hutter
Date: Thu, 7 Aug 2025 21:55:46 +0200
Subject: [PATCH 2/2] Update docs/streams/core-concepts.html

change URL to use HTTPS

Co-authored-by: Shashank
---
 docs/streams/core-concepts.html | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/streams/core-concepts.html b/docs/streams/core-concepts.html
index 7baaea2b0c183..a2d1b7209b551 100644
--- a/docs/streams/core-concepts.html
+++ b/docs/streams/core-concepts.html
@@ -279,7 +279,7 @@
-        to the stream processing pipeline, known as the <a href="http://en.wikipedia.org/wiki/Lambda_architecture">Lambda Architecture</a>. Prior to 0.11.0.0, Kafka only provides at-least-once delivery guarantees and hence any stream processing systems that leverage it as the backend storage could not guarantee end-to-end exactly-once semantics. In fact, even for those stream processing systems that claim to support exactly-once processing, as long as they are reading from / writing to Kafka as the source / sink, their applications cannot actually guarantee that no duplicates will be generated throughout the pipeline.
+        to the stream processing pipeline, known as the <a href="https://en.wikipedia.org/wiki/Lambda_architecture">Lambda Architecture</a>. Prior to 0.11.0.0, Kafka only provides at-least-once delivery guarantees and hence any stream processing systems that leverage it as the backend storage could not guarantee end-to-end exactly-once semantics. In fact, even for those stream processing systems that claim to support exactly-once processing, as long as they are reading from / writing to Kafka as the source / sink, their applications cannot actually guarantee that no duplicates will be generated throughout the pipeline.