Overview
- A new Real-Time Context Engine, offered as a managed service, streams materialized views to AI agents and applications via the Model Context Protocol; it is available in early access (a client-side sketch of consuming a view over MCP follows this list).
- Streaming Agents run natively on Apache Flink to observe, decide, and act on live event streams, with Anthropic’s Claude set as the default large language model.
- Confluent added an Agent Definition service for creating production-ready agents in a few lines of code, with built-in observability and debugging.
- Built-in machine learning functions for anomaly detection, forecasting, model inference, and real-time visualization are exposed through Flink SQL on Confluent Cloud (see the streaming SQL sketch after this list).
- Confluent says that delivering context through continuous streams can cut token and API costs and lets organizations reuse existing governance, compliance, and security controls for AI data flows.
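
To make the Context Engine bullet more concrete, here is a minimal, hypothetical sketch of what consuming a materialized view over the Model Context Protocol could look like on the agent side. It assumes the open-source MCP Python SDK; the endpoint URL, resource URI, and the idea that the Context Engine exposes views as MCP resources are illustrative assumptions, not details confirmed by Confluent, and the real service's transport, authentication, and naming may differ.

```python
"""Sketch: an agent reading a materialized view exposed over MCP.

Assumes the open-source MCP Python SDK (pip install mcp). The server URL and
resource URI below are invented for illustration only.
"""
import asyncio

from pydantic import AnyUrl

from mcp import ClientSession
from mcp.client.sse import sse_client


async def main() -> None:
    # Connect to a hypothetical MCP endpoint (URL is a placeholder).
    async with sse_client("https://context-engine.example.com/mcp") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover which resources the server exposes.
            listing = await session.list_resources()
            for res in listing.resources:
                print(res.uri, res.name)

            # Read one materialized view as a resource (URI is an assumption).
            view = await session.read_resource(
                AnyUrl("view://orders/last-hour-revenue")
            )
            for content in view.contents:
                print(content)


if __name__ == "__main__":
    asyncio.run(main())
```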
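
The built-in ML functions bullet maps onto ordinary streaming SQL. The sketch below uses local PyFlink with a synthetic datagen source and a hand-rolled rolling-average deviation as a stand-in for a managed anomaly-detection function; the exact names and signatures of Confluent's built-in functions are not reproduced here, and on Confluent Cloud the equivalent statements would run against Kafka-backed tables in a managed Flink SQL workspace.

```python
"""PyFlink sketch of the kind of streaming SQL built-in ML functions plug into.

Requires: pip install apache-flink. Table and field names are illustrative.
"""
from pyflink.table import EnvironmentSettings, TableEnvironment

env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

# A synthetic stream of order amounts; in practice this would be a
# Kafka-backed table on Confluent Cloud.
env.execute_sql("""
    CREATE TABLE orders (
        order_id BIGINT,
        amount   DOUBLE,
        ts AS PROCTIME()
    ) WITH (
        'connector' = 'datagen',
        'rows-per-second' = '5',
        'fields.amount.min' = '1',
        'fields.amount.max' = '500'
    )
""")

# Flag amounts far from the recent rolling average. A managed
# anomaly-detection function would replace the hand-rolled arithmetic
# in this OVER window.
env.execute_sql("""
    SELECT
        order_id,
        amount,
        amount - AVG(amount) OVER w AS deviation_from_recent_avg
    FROM orders
    WINDOW w AS (
        ORDER BY ts
        RANGE BETWEEN INTERVAL '1' MINUTE PRECEDING AND CURRENT ROW
    )
""").print()
```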