🎉 Introducing Icecake.Ingest.Streaming - High-Performance Snowpipe Streaming Ingestion for .NET #1
jesinity announced in Announcements
I'm excited to announce the first public release of Icecake.Ingest.Streaming, a modern, high-performance .NET library designed to make Snowflake Snowpipe Streaming ingestion fast, reliable, and developer-friendly.
Snowpipe Streaming (v2 API) is Snowflake's low-latency ingestion channel for real-time workloads, and this library wraps it in a clean, safe, and highly optimized .NET experience.
✨ Key Features
🔥 High-throughput ingestion (internals)
• Efficient double-buffer design
• Zero-overlap flushes via a single-flight semaphore
• Minimal lock contention
• Reusable buffers to reduce GC pressure
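To make the double-buffer and single-flight ideas concrete, here is a minimal C# sketch. All names here (DoubleBuffer, Append, FlushAsync) are illustrative assumptions, not Icecake.Ingest.Streaming's actual API:

```csharp
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;

// Hypothetical sketch, not the library's real implementation.
public sealed class DoubleBuffer
{
    private readonly object _gate = new();
    private List<string> _active = new();
    private List<string> _standby = new();
    private readonly SemaphoreSlim _flushGate = new(1, 1); // single-flight guard

    public void Append(string row)
    {
        lock (_gate) { _active.Add(row); } // writers only ever touch _active
    }

    public async Task FlushAsync(Func<IReadOnlyList<string>, Task> send)
    {
        await _flushGate.WaitAsync(); // at most one flush in flight
        try
        {
            List<string> toSend;
            lock (_gate) // the swap is the only step done under the lock
            {
                toSend = _active;
                _active = _standby;
                _standby = toSend; // old buffer is reused after draining (no GC churn)
            }
            if (toSend.Count > 0)
            {
                await send(toSend); // network I/O runs outside the lock
                toSend.Clear();     // emptied for reuse as the next standby
            }
        }
        finally { _flushGate.Release(); }
    }
}
```

The semaphore guarantees flushes never overlap, so only the flusher ever reads the drained buffer, and the lock is held just long enough to swap two references.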
🚀 Smart batching
• Flush automatically on:
  • max rows
  • max bytes
  • flush interval
• Client-supplied or automatic offset tokens
• Optional stateful ingestion support
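The three flush triggers can be sketched as a single predicate. The option names and default values below are assumptions for illustration, not the library's actual configuration surface:

```csharp
using System;

// Hypothetical names and defaults, for illustration only.
public sealed class BatchPolicy
{
    public int MaxRows { get; init; } = 10_000;
    public long MaxBytes { get; init; } = 16 * 1024 * 1024;
    public TimeSpan FlushInterval { get; init; } = TimeSpan.FromSeconds(1);

    // A flush fires as soon as ANY one of the three limits is reached.
    public bool ShouldFlush(int bufferedRows, long bufferedBytes, TimeSpan sinceLastFlush) =>
        bufferedRows >= MaxRows
        || bufferedBytes >= MaxBytes
        || sinceLastFlush >= FlushInterval;
}
```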
🧱 Battle-tested concurrency model
• Fully async/await .NET HTTP ingestion (of course!)
• Flushing happens outside locks
• Single flush at a time (safe continuation-token handling)
• Background timers for:
  • periodic flush
  • health monitoring
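A background flush timer of this kind might look like the following sketch using .NET's PeriodicTimer; the wiring and names are an assumption, not the library's internals:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

public static class FlushLoop
{
    // Runs flushAsync once per interval until the token is cancelled.
    public static async Task RunAsync(Func<Task> flushAsync, TimeSpan interval, CancellationToken ct)
    {
        using var timer = new PeriodicTimer(interval);
        try
        {
            while (await timer.WaitForNextTickAsync(ct))
            {
                // A failed tick is swallowed here so the loop keeps running;
                // a real client would log it and let health monitoring surface repeats.
                try { await flushAsync(); }
                catch { }
            }
        }
        catch (OperationCanceledException) { /* normal shutdown path */ }
    }
}
```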
💡 Adaptive offset polling
• FetchLatestCommittedOffsetAsync dynamically adjusts polling frequency
• Uses Snowflake's SnowflakeAvgProcessingLatencyMs to avoid hammering the endpoint under load
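One way such adaptive polling could work is to scale the wait by the server-reported latency, clamped to a sane range. SnowflakeAvgProcessingLatencyMs is the field named above; everything else here (the 2x factor, the clamp, the method name) is a hypothetical sketch, not the library's actual logic:

```csharp
using System;

public static class OffsetPolling
{
    // Hypothetical helper: turns the server's average-processing-latency hint
    // into the delay before the next committed-offset poll.
    public static TimeSpan NextPollDelay(double? avgProcessingLatencyMs, TimeSpan floor, TimeSpan ceiling)
    {
        if (avgProcessingLatencyMs is not double ms || ms <= 0)
            return floor; // no load hint: poll at the base rate

        // A busier endpoint (higher reported latency) is polled less often.
        var delay = TimeSpan.FromMilliseconds(ms * 2);
        if (delay < floor) return floor;
        if (delay > ceiling) return ceiling;
        return delay;
    }
}
```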