---
sidebar_position: 1
title: Introduction
---
Laminar is a high-performance, distributed stream ingestion and processing platform that combines the power of SQL with the flexibility of modern stream processing. Laminar enables you to build, deploy, and manage streaming applications that process millions of events per second with low latency and fine-grained scalability.

## Key Features
- SQL-based stream processing - Write streaming pipelines using familiar SQL syntax
- Real-time analytics - Process and analyze data as it arrives
- Scalable architecture - Horizontally scale to handle any data volume
- Fault tolerance - Automatic recovery with exactly-once processing guarantees
- Rich connector ecosystem - Connect to various data sources and sinks
## Core Concepts

### Pipelines

Pipelines are the core abstraction in Laminar. A pipeline defines:
- The data sources to read from
- The transformations to apply
- The destinations to write to
Pipelines are defined using SQL and can include:
- Filtering and projection
- Aggregations and windowing
- Joins between streams
- User-defined functions (UDFs)
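As a sketch, a pipeline combining filtering, windowing, and aggregation might look like the following. The table name, fields, and the `tumble()` window function are illustrative assumptions — windowing syntax varies by engine and is not taken from Laminar's documented API:

```sql
-- Hypothetical pipeline: count page views per URL in 1-minute tumbling windows,
-- filtering out internal traffic. `page_views` is assumed to be a previously
-- defined connection table.
CREATE VIEW views_per_minute AS
SELECT
    url,
    tumble(interval '1 minute') AS window,
    count(*) AS view_count
FROM page_views
WHERE user_id NOT LIKE 'internal-%'
GROUP BY url, window;
```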
### Connections

Connections define how Laminar interacts with external systems:
- Connection Profiles - Reusable configuration for connecting to systems
- Connection Tables - Schema definitions for sources and sinks
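For example, a Kafka-backed source table might be declared like this. The connector name, option keys, and `WITH (...)` syntax are assumptions for illustration, not Laminar's documented API:

```sql
-- Hypothetical connection table reading JSON events from a Kafka topic.
CREATE TABLE page_views (
    url TEXT,
    user_id TEXT,
    viewed_at TIMESTAMP
) WITH (
    connector = 'kafka',            -- assumed connector name
    bootstrap_servers = 'broker:9092',
    topic = 'page-views',
    format = 'json',
    type = 'source'
);
```

A connection profile would typically hold the reusable parts of this configuration (such as `bootstrap_servers` and credentials), so multiple connection tables can share them.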
### Jobs

Jobs are running instances of pipelines. Each job:
- Executes the pipeline logic
- Maintains state and checkpoints
- Provides metrics and monitoring
- Can be stopped, started, and scaled
## Use Cases

Laminar is ideal for:
- Dashboard and monitoring systems
- Real-time reporting
- Metric aggregation
- Anomaly detection
- Change Data Capture (CDC)
- ETL/ELT pipelines
- Data synchronization
- Event streaming
- Event-driven architectures
- Complex event processing
- Real-time ML feature computation
- IoT data processing
## Next Steps

- Understand the Architecture of Laminar
- Learn Pipeline Basics
- Explore Kafka Integration
- Implement Change Data Capture