
Kaskada is a next-generation streaming engine that connects AI models to real-time and historical data. It provides real-time aggregation, event detection, and history replay, allowing users to precompute model inputs from streaming data and trigger proactive AI behaviors. Built for scale and reliability, Kaskada is implemented in Rust and uses Apache Arrow for efficient execution of queries over large historical datasets and high-throughput streams. The platform is designed for ease of use, enabling users to connect and compute over databases and streaming data without provisioning infrastructure. Kaskada's cloud-native design supports partitioned execution, making it suitable for the growing needs of real-time AI applications.
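To make "precompute model inputs from streaming data" concrete, here is a minimal illustrative sketch in plain Python (not Kaskada's actual API): it derives a per-entity feature, the count of an entity's events within a trailing time window of its latest event, from a time-ordered event stream. The function name, window size, and event layout are all assumptions for illustration.

```python
from collections import defaultdict
from datetime import datetime, timedelta

def trailing_count(events, window=timedelta(hours=1)):
    """Illustrative feature computation (not Kaskada's API).

    events: iterable of (timestamp, entity_id), assumed time-ordered.
    Returns {entity_id: count of that entity's events that fall within
    `window` of the entity's latest event}.
    """
    by_entity = defaultdict(list)
    for ts, entity in events:
        by_entity[entity].append(ts)
    features = {}
    for entity, stamps in by_entity.items():
        latest = stamps[-1]
        # Only events inside the trailing window contribute to the feature.
        features[entity] = sum(1 for t in stamps if latest - t <= window)
    return features

events = [
    (datetime(2023, 1, 1, 12, 0), "user_a"),
    (datetime(2023, 1, 1, 12, 30), "user_a"),
    (datetime(2023, 1, 1, 14, 0), "user_a"),  # earlier events fall outside the 1h window
    (datetime(2023, 1, 1, 12, 5), "user_b"),
]
print(trailing_count(events))  # {'user_a': 1, 'user_b': 1}
```

A streaming engine like Kaskada maintains this kind of aggregate incrementally as events arrive, rather than recomputing over the full history as this batch sketch does.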

Product: Streaming engine for computing, storing, and serving ML features from real-time and historical event data
Tech: Implemented in Rust, uses Apache Arrow, integrates with Python
HQ: Seattle, Washington
Founders: Ben Chambers and Davor Bonaci
Outcome: Acquired by DataStax
Feature engineering and real-time ML/data pipelines for event-based systems
Machine learning infrastructure
Crunchbase lists a Series A as the company's most recent funding round; total funding is reported as $9,780,000
Investors include Founders' Co-op and NextGen Venture Partners