Class TensorFlowSavedModelEngine

java.lang.Object
  com.codedstream.otterstream.inference.engine.LocalInferenceEngine<org.tensorflow.SavedModelBundle>
    com.codedstream.otterstream.sql.runtime.TensorFlowSavedModelEngine

All Implemented Interfaces:
    InferenceEngine<org.tensorflow.SavedModelBundle>

public class TensorFlowSavedModelEngine
extends LocalInferenceEngine<org.tensorflow.SavedModelBundle>

TensorFlow SavedModel inference engine.
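The typical lifecycle implied by this page — construct the engine, call infer with a map of input name to value, then close — can be sketched with stand-in types. The real classes live in com.codedstream.otterstream.inference.engine and infer returns an InferenceResult; the toy "model" below just doubles a float so the sketch is runnable on its own.

```java
import java.util.HashMap;
import java.util.Map;

public class EngineLifecycleSketch {
    // Stand-in for the engine contract described on this page; the real
    // class is TensorFlowSavedModelEngine and infer returns InferenceResult.
    interface Engine extends AutoCloseable {
        Map<String, Object> infer(Map<String, Object> inputs);
        @Override
        void close();
    }

    public static void main(String[] args) {
        // Toy engine that doubles a float input; a real engine would run
        // the TensorFlow SavedModel session instead.
        Engine engine = new Engine() {
            public Map<String, Object> infer(Map<String, Object> inputs) {
                float x = (Float) inputs.get("input_1");
                Map<String, Object> out = new HashMap<>();
                out.put("output", x * 2.0f);
                return out;
            }
            public void close() { /* release the SavedModelBundle here */ }
        };

        // Inputs are keyed by input name, as in infer(Map<String,Object>).
        Map<String, Object> inputs = new HashMap<>();
        inputs.put("input_1", 3.0f);

        // try-with-resources guarantees close() runs after inference.
        try (Engine e = engine) {
            System.out.println(e.infer(inputs).get("output")); // 6.0
        }
    }
}
```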
Nested Class Summary

Nested classes/interfaces inherited from interface com.codedstream.otterstream.inference.engine.InferenceEngine:
    InferenceEngine.EngineCapabilities
Field Summary

Fields inherited from class com.codedstream.otterstream.inference.engine.LocalInferenceEngine:
    initialized, loadedModel, modelConfig, modelLoader
Constructor Summary

Constructor                     Description
TensorFlowSavedModelEngine()
Method Summary

All Methods  Instance Methods  Concrete Methods

Modifier and Type                    Method and Description
void                                 close()
                                     Closes the inference engine and releases all resources.
InferenceEngine.EngineCapabilities   getCapabilities()
                                     Gets the capabilities of this inference engine.
ModelMetadata                        getMetadata()
                                     Gets metadata about the loaded model.
InferenceResult                      infer(Map<String,Object> inputs)
                                     Performs inference on a single input.
InferenceResult                      inferBatch(Map<String,Object>[] batchInputs)
                                     Performs batch inference by running each input separately.
protected void                       loadModelDirectly(ModelConfig config)
                                     Override this method for engines that handle their own model loading.

Methods inherited from class com.codedstream.otterstream.inference.engine.LocalInferenceEngine:
    getModelConfig, initialize, isReady
Method Detail
loadModelDirectly

protected void loadModelDirectly(ModelConfig config)
                          throws InferenceException

Description copied from class: LocalInferenceEngine
Override this method for engines that handle their own model loading. Called during LocalInferenceEngine.initialize(ModelConfig) if no ModelLoader is provided.

Overrides:
    loadModelDirectly in class LocalInferenceEngine<org.tensorflow.SavedModelBundle>
Parameters:
    config - model configuration containing path and options
Throws:
    InferenceException - if model loading fails
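This hook follows a template-method pattern: initialize calls loadModelDirectly when no ModelLoader is supplied. A minimal stand-alone sketch of that pattern, using simplified stand-in types and a hypothetical path (the real engine would load a SavedModelBundle from the configured path):

```java
public class LoadHookSketch {
    // Simplified stand-in for LocalInferenceEngine: initialize() delegates
    // loading to the subclass hook, mirroring the contract above.
    static abstract class LocalEngine {
        protected boolean initialized;

        void initialize(String modelPath) {
            loadModelDirectly(modelPath);   // subclass loads the model
            initialized = true;
        }

        protected abstract void loadModelDirectly(String modelPath);
    }

    // Stand-in for TensorFlowSavedModelEngine's override.
    static class SavedModelEngine extends LocalEngine {
        String loadedFrom;

        @Override
        protected void loadModelDirectly(String modelPath) {
            // The real engine would load the SavedModel here; this sketch
            // just records where it would have loaded from.
            loadedFrom = modelPath;
        }
    }

    public static void main(String[] args) {
        SavedModelEngine e = new SavedModelEngine();
        e.initialize("/models/my_model");   // hypothetical model path
        System.out.println(e.initialized + " " + e.loadedFrom);
    }
}
```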
infer

public InferenceResult infer(Map<String,Object> inputs)
                      throws InferenceException

Description copied from interface: InferenceEngine
Performs inference on a single input.

Specified by:
    infer in interface InferenceEngine<org.tensorflow.SavedModelBundle>
Specified by:
    infer in class LocalInferenceEngine<org.tensorflow.SavedModelBundle>
Parameters:
    inputs - map of input name to input value
Returns:
    inference result containing predictions
Throws:
    InferenceException - if inference fails
inferBatch

public InferenceResult inferBatch(Map<String,Object>[] batchInputs)
                           throws InferenceException

Performs batch inference by running each input separately and collecting the per-input predictions into a single InferenceResult.

Specified by:
    inferBatch in interface InferenceEngine<org.tensorflow.SavedModelBundle>
Specified by:
    inferBatch in class LocalInferenceEngine<org.tensorflow.SavedModelBundle>
Parameters:
    batchInputs - array of input maps
Returns:
    inference result containing batch predictions
Throws:
    InferenceException - if inference fails
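The "running each input separately" semantics amounts to a loop over the single-input path. A plain-Java sketch of that behavior — a toy inferOne stands in for infer, and no OtterStream types are used:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class BatchSketch {
    // Toy single-input path; stands in for infer(Map<String,Object>).
    static Map<String, Object> inferOne(Map<String, Object> in) {
        Map<String, Object> out = new HashMap<>();
        out.put("prediction", ((Integer) in.get("x")) * 2); // toy model
        return out;
    }

    @SuppressWarnings("unchecked")
    public static void main(String[] args) {
        // An array of input maps, matching the batchInputs parameter shape.
        Map<String, Object>[] batch = new Map[2];
        batch[0] = new HashMap<>(); batch[0].put("x", 1);
        batch[1] = new HashMap<>(); batch[1].put("x", 2);

        // Run each input through the single-input path and collect results.
        List<Map<String, Object>> results = new ArrayList<>();
        for (Map<String, Object> inputs : batch) {
            results.add(inferOne(inputs));  // one call per input
        }

        System.out.println(results.get(0).get("prediction")); // 2
        System.out.println(results.get(1).get("prediction")); // 4
    }
}
```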
getCapabilities

public InferenceEngine.EngineCapabilities getCapabilities()

Description copied from interface: InferenceEngine
Gets the capabilities of this inference engine.

Specified by:
    getCapabilities in interface InferenceEngine<org.tensorflow.SavedModelBundle>
Specified by:
    getCapabilities in class LocalInferenceEngine<org.tensorflow.SavedModelBundle>
Returns:
    engine capabilities (batching, GPU support, etc.)
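A typical use of capabilities is to gate the batch path on what the engine advertises. This page does not show EngineCapabilities' members, so the field names below (supportsBatching, supportsGpu) are assumptions for illustration:

```java
public class CapabilitiesSketch {
    // Stand-in for InferenceEngine.EngineCapabilities; the real member
    // names are not documented on this page, so these are assumed.
    static class EngineCapabilities {
        final boolean supportsBatching;
        final boolean supportsGpu;

        EngineCapabilities(boolean batching, boolean gpu) {
            this.supportsBatching = batching;
            this.supportsGpu = gpu;
        }
    }

    public static void main(String[] args) {
        EngineCapabilities caps = new EngineCapabilities(true, false);
        // Route to the batch path only when the engine advertises it.
        String path = caps.supportsBatching ? "batch" : "single";
        System.out.println(path);
    }
}
```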
getMetadata

public ModelMetadata getMetadata()

Description copied from interface: InferenceEngine
Gets metadata about the loaded model.

Returns:
    model metadata including inputs, outputs, and format
close

public void close()
           throws InferenceException

Description copied from interface: InferenceEngine
Closes the inference engine and releases all resources. After calling this method, the engine should not be used again.

Specified by:
    close in interface InferenceEngine<org.tensorflow.SavedModelBundle>
Overrides:
    close in class LocalInferenceEngine<org.tensorflow.SavedModelBundle>
Throws:
    InferenceException - if cleanup fails
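A sketch of the close() contract: once closed, the engine must reject further use. The guard field and the exception type below are assumptions for illustration (the real engine declares InferenceException):

```java
public class CloseSketch {
    // Stand-in engine that enforces the "should not be used again" rule.
    static class Engine implements AutoCloseable {
        private boolean closed;

        String infer(String input) {
            if (closed) throw new IllegalStateException("engine is closed");
            return "ok:" + input;
        }

        @Override
        public void close() {
            closed = true;  // the real engine releases the SavedModelBundle
        }
    }

    public static void main(String[] args) {
        Engine e = new Engine();
        System.out.println(e.infer("a")); // ok:a
        e.close();
        try {
            e.infer("b");                 // must fail after close()
        } catch (IllegalStateException ex) {
            System.out.println("rejected after close");
        }
    }
}
```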