otter-stream-tensorflow
SavedModel
Native TensorFlow SavedModel integration with automatic signature discovery and GPU acceleration.
Module Overview
The TensorFlow module provides production-grade inference using TensorFlow's official Java API with SavedModel format support. It features automatic signature parsing, input/output discovery, and GPU acceleration when available.
TensorFlow Engine
SavedModel: Production-grade TensorFlow inference using the official Java API with the SavedModel format.
- SavedModel format support (TensorFlow 2.x standard)
- Automatic signature parsing and input/output discovery
- Float32 and Int32 tensor support
- GPU acceleration when TensorFlow GPU version available
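Float32 and Int32 inputs ultimately have to be laid out as flat, row-major buffers before they can back a TensorFlow tensor. The sketch below illustrates that packing step with plain JDK NIO buffers; the class and method names are illustrative, not part of otter-stream-tensorflow.

```java
import java.nio.FloatBuffer;
import java.nio.IntBuffer;

// Illustrative sketch: flattening nested Java arrays into the row-major
// buffer layout TensorFlow tensors expect. Hypothetical helper names.
public class TensorPacking {

    // Flatten a 2-D Float32 input (batch x features) into a FloatBuffer.
    public static FloatBuffer packFloat32(float[][] batch) {
        int rows = batch.length;
        int cols = rows == 0 ? 0 : batch[0].length;
        FloatBuffer buf = FloatBuffer.allocate(rows * cols);
        for (float[] row : batch) {
            buf.put(row); // row-major order, matching tensor shape {rows, cols}
        }
        buf.flip(); // prepare the buffer for reading
        return buf;
    }

    // Same idea for Int32 inputs such as token ids.
    public static IntBuffer packInt32(int[][] batch) {
        int rows = batch.length;
        int cols = rows == 0 ? 0 : batch[0].length;
        IntBuffer buf = IntBuffer.allocate(rows * cols);
        for (int[] row : batch) {
            buf.put(row);
        }
        buf.flip();
        return buf;
    }

    public static void main(String[] args) {
        FloatBuffer f = packFloat32(new float[][]{{0.1f, 0.2f}, {0.3f, 0.4f}});
        System.out.println(f.remaining()); // 4 elements for a 2x2 input
    }
}
```

The same flattened buffer, together with the shape `{rows, cols}`, is what a tensor constructor would consume on the engine side.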
Session Management
Core: Efficient TensorFlow session lifecycle management with automatic tensor cleanup.
- SavedModelBundle loading and management
- Signature-based tensor name caching
- Automatic resource cleanup to prevent memory leaks
- Thread-safe session runners
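The lifecycle pattern behind these bullets can be sketched with plain JDK types: a native-backed resource released via try-with-resources, and a concurrent cache for signature-derived tensor names. The classes below are stand-ins for illustration, not the real SavedModelBundle/Session or otter-stream API.

```java
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Sketch of session lifecycle + name caching, using a stand-in resource
// instead of the real TensorFlow classes. All names are illustrative.
public class SessionLifecycle {

    // Stand-in for a native-backed session that must be released explicitly.
    static class FakeSession implements AutoCloseable {
        boolean closed = false;
        List<Float> run(List<Float> input) { return input; }
        @Override public void close() { closed = true; }
    }

    // Signature-derived tensor names are resolved once, then cached;
    // ConcurrentHashMap keeps the lookup safe across threads.
    private static final Map<String, String> TENSOR_NAME_CACHE = new ConcurrentHashMap<>();

    public static String resolveTensorName(String signatureKey) {
        return TENSOR_NAME_CACHE.computeIfAbsent(
            signatureKey, key -> key + ":0"); // e.g. "input_1" -> "input_1:0"
    }

    public static void main(String[] args) {
        // try-with-resources guarantees the handle is released even on error,
        // which is the pattern that prevents native memory leaks.
        try (FakeSession session = new FakeSession()) {
            session.run(List.of(0.1f, 0.2f));
        }
        System.out.println(resolveTensorName("input_1"));
    }
}
```

The key design point is that cleanup is structural (try-with-resources) rather than relying on garbage collection, since TensorFlow tensors hold native memory the JVM cannot see.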
Implementing TensorFlow Inference
A complete guide to integrating TensorFlow SavedModels into streaming pipelines.
Add Maven Dependency
```xml
<dependency>
    <groupId>com.codedstreams</groupId>
    <artifactId>otter-stream-tensorflow</artifactId>
    <version>1.0.16</version>
</dependency>
```
Export TensorFlow Model as SavedModel
```python
import tensorflow as tf

# Save your trained model
model.save('/path/to/saved_model', save_format='tf')

# SavedModel directory structure:
# saved_model_directory/
# ├── saved_model.pb
# ├── variables/
# └── assets/
```
Configure and Use Engine
```java
ModelConfig config = ModelConfig.builder()
    .modelPath("/models/saved_model")
    .modelId("tensorflow-classifier")
    .format(ModelFormat.TENSORFLOW_SAVEDMODEL)
    .build();

TensorFlowInferenceEngine engine = new TensorFlowInferenceEngine();
engine.initialize(config);

// Get discovered tensor names
List<String> inputNames = engine.getCachedInputNames();
List<String> outputNames = engine.getCachedOutputNames();

// Perform inference
Map<String, Object> inputs = Map.of(
    inputNames.get(0), new float[]{0.1f, 0.2f, 0.3f}
);
InferenceResult result = engine.infer(inputs);
```
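For a classifier, the raw output is typically a vector of scores that still needs post-processing. A minimal sketch of the argmax step, assuming the result exposes its scores as a `float[]` (the helper name here is hypothetical, not otter-stream API):

```java
// Hedged sketch: mapping a raw float[] of class scores (as might be
// read out of an InferenceResult) to a predicted label via argmax.
public class OutputPostprocess {

    // Index of the largest score wins; ties go to the earlier index.
    public static int argmax(float[] scores) {
        int best = 0;
        for (int i = 1; i < scores.length; i++) {
            if (scores[i] > scores[best]) best = i;
        }
        return best;
    }

    public static void main(String[] args) {
        float[] scores = {0.1f, 0.7f, 0.2f};
        String[] labels = {"cat", "dog", "otter"};
        System.out.println(labels[argmax(scores)]); // prints "dog"
    }
}
```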