SDK Documentation
Complete developer guide for integrating real-time machine learning inference into Apache Flink streaming applications. Now with Flink SQL support!
Core Concepts
Learn about the architecture, key components, and design patterns used in Otter Streams SDK for real-time ML inference.
Flink SQL Guide
Use ML inference directly in Flink SQL with UDFs, table functions, and a declarative API.
API Reference
Detailed API documentation for all classes, methods, and interfaces in the Otter Streams SDK.
Core Concepts
Otter Streams SDK is designed to integrate machine learning inference seamlessly into Apache Flink streaming applications, and now supports both the DataStream and Flink SQL APIs. This section covers the fundamental concepts and architecture.
New: Flink SQL Integration
Otter Streams now provides full Flink SQL support with UDFs, table functions, and declarative ML inference. This allows you to use ML models directly in your SQL queries without writing Java code.
Architecture Overview
Learn how Otter Streams integrates with Apache Flink's streaming architecture to enable real-time ML inference.
Flink SQL Integration
Use ML inference directly in SQL with UDFs, table functions, and a declarative API.
ML Inference Engine
Understand the core inference engine that powers model execution across different frameworks.
Configuration
Learn how to configure and optimize ML inference pipelines for your specific use case.
Getting Started
Follow these steps to integrate Otter Streams SDK into your Apache Flink application:
Prerequisites
Before you begin, ensure you have Apache Flink 1.15+ and Java 11 or later installed.
Step 1: Add Dependency
<dependency>
    <groupId>com.codedstreams</groupId>
    <artifactId>ml-inference-core</artifactId>
    <version>1.0.16</version>
</dependency>
<dependency>
    <groupId>com.codedstreams</groupId>
    <artifactId>otter-stream-onnx</artifactId>
    <version>1.0.16</version>
</dependency>
Step 2: Basic Usage
// Create the ML inference operator (explicit type arguments, since the
// diamond operator cannot be inferred through the builder's method chain)
MLInferenceOperator<MyInput, MyOutput> inferenceOp =
    new MLInferenceOperator.Builder<MyInput, MyOutput>()
        .withModelPath("path/to/model.onnx")
        .withInputType(MyInput.class)
        .withOutputType(MyOutput.class)
        .build();

// Use it in a Flink DataStream
DataStream<MyOutput> resultStream =
    inputStream.process(inferenceOp);
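The operator above is generic over user-defined input and output types. Flink serializes such types most efficiently when they are POJOs: public classes with a public no-argument constructor and public fields (or getters and setters). As a minimal sketch, with hypothetical field names loosely matching the fraud-detection example later in this guide, MyInput and MyOutput could look like:

```java
public class PojoTypes {

    /** Input record for the inference operator; field names are illustrative. */
    public static class MyInput {
        public long transactionId;
        public double amount;
        public String category;

        // No-argument constructor required by Flink's POJO serializer
        public MyInput() {}

        public MyInput(long transactionId, double amount, String category) {
            this.transactionId = transactionId;
            this.amount = amount;
            this.category = category;
        }
    }

    /** Output record carrying the model's prediction; fields are illustrative. */
    public static class MyOutput {
        public long transactionId;
        public double fraudScore;

        public MyOutput() {}
    }
}
```

Top-level classes work just as well; the classes are nested here only to keep the sketch in a single file.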
SQL API Benefits
The Flink SQL API provides a declarative way to use ML inference, making it easier to integrate with existing SQL-based workflows, BI tools, and data exploration platforms.
Step 1: Add SQL Dependency
<dependency>
    <groupId>com.codedstreams</groupId>
    <artifactId>otter-stream-sql</artifactId>
    <version>1.0.16</version>
</dependency>
Step 2: Register ML Functions
-- Register the ML inference UDF in SQL
CREATE FUNCTION FraudDetector
AS 'com.codedstreams.otter.MLInferenceUDF'
USING JAR '/path/to/otter-stream-sql.jar';

// Or register it from Java code
tEnv.createTemporarySystemFunction(
    "ML_PREDICT",
    new MLInferenceUDF("models/fraud_model.onnx")
);
Step 3: Use in SQL Queries
-- Real-time fraud detection with ML
SELECT
    transaction_id,
    amount,
    FraudDetector(amount, category, merchant) AS fraud_score
FROM transactions
WHERE FraudDetector(amount, category, merchant) > 0.8;

-- Table function for complex outputs
SELECT
    t.transaction_id,
    p.score,
    p.confidence
FROM transactions t,
    LATERAL TABLE(ML_PREDICT_TABLE(t.features)) AS p(score, confidence);
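The queries above assume a `transactions` source table is already registered. As an illustrative sketch only — the connector choice and schema are assumptions, not part of the SDK — such a table could be declared as:

```sql
-- Hypothetical source table backing the fraud-detection queries above.
CREATE TABLE transactions (
    transaction_id BIGINT,
    amount         DOUBLE,
    category       STRING,
    merchant       STRING,
    features       ARRAY<DOUBLE>
) WITH (
    'connector' = 'kafka',
    'topic'     = 'transactions',
    'properties.bootstrap.servers' = 'localhost:9092',
    'format'    = 'json'
);
```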
For detailed instructions, visit the Getting Started guide.
Module Documentation
Explore detailed documentation for each module:
Core Module
Foundation classes and interfaces for ML inference in Flink streams.
Flink SQL
SQL integration with UDFs, table functions, and declarative ML inference.
ONNX Runtime
Support for ONNX models with optimized inference execution.
TensorFlow
TensorFlow model support for real-time inference pipelines.
PyTorch
PyTorch model integration with Flink streaming.
XGBoost
Gradient boosting models for real-time predictions.
PMML
PMML model support for standardized ML deployments.
Next Steps
Explore Modules
Dive deeper into specific modules to understand their capabilities and usage patterns.
Try SQL Examples
Check out SQL examples to see ML inference in action with declarative queries.
API Reference
Explore the complete API documentation for detailed class and method references.