Uses of Class
com.codedstream.otterstream.inference.config.ModelConfig
Uses of ModelConfig in com.codedstream.otterstream.inference.config
Methods in com.codedstream.otterstream.inference.config that return ModelConfig

  ModelConfig   ModelConfig.Builder.build()
                Builds the ModelConfig instance.
  ModelConfig   InferenceConfig.getModelConfig()

Methods in com.codedstream.otterstream.inference.config with parameters of type ModelConfig

  InferenceConfig.Builder   InferenceConfig.Builder.modelConfig(ModelConfig modelConfig)
                            Sets the model configuration.

Constructors in com.codedstream.otterstream.inference.config with parameters of type ModelConfig

  InferenceConfig(ModelConfig modelConfig, int batchSize, long timeoutMs, int maxRetries, boolean enableMetrics, Map<String,Object> engineOptions)
      Constructs inference configuration.
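The build() and modelConfig(...) methods above describe a standard builder flow for wiring a ModelConfig into an InferenceConfig. A minimal sketch: the ModelConfig.Builder constructor and the modelPath(...) setter are assumptions, since only build(), modelConfig(...), and the InferenceConfig constructor appear on this page.

```java
import java.util.Map;

class ConfigExample {
    static InferenceConfig buildConfig() {
        // ModelConfig.Builder's no-arg constructor and the modelPath(...)
        // setter are hypothetical; build() is documented above.
        ModelConfig modelConfig = new ModelConfig.Builder()
                .modelPath("/models/example.onnx")  // assumed setter
                .build();

        // The documented InferenceConfig constructor.
        return new InferenceConfig(
                modelConfig,
                32,         // batchSize
                5_000L,     // timeoutMs
                3,          // maxRetries
                true,       // enableMetrics
                Map.of());  // engineOptions
    }
}
```

InferenceConfig.Builder.modelConfig(...) offers the same wiring in builder form for callers that prefer not to pass six positional arguments.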
Uses of ModelConfig in com.codedstream.otterstream.inference.engine
Fields in com.codedstream.otterstream.inference.engine declared as ModelConfig

  protected ModelConfig   LocalInferenceEngine.modelConfig

Methods in com.codedstream.otterstream.inference.engine that return ModelConfig

  ModelConfig   InferenceEngine.getModelConfig()
                Gets the configuration used to initialize this engine.
  ModelConfig   LocalInferenceEngine.getModelConfig()

Methods in com.codedstream.otterstream.inference.engine with parameters of type ModelConfig

  void             InferenceEngine.initialize(ModelConfig config)
                   Initializes the inference engine with the given configuration.
  void             LocalInferenceEngine.initialize(ModelConfig config)
  protected void   LocalInferenceEngine.loadModelDirectly(ModelConfig config)
                   Override this method for engines that handle their own model loading.
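The loadModelDirectly hook above suggests that custom local engines override it while the base class drives the lifecycle via initialize(...). A sketch, assuming LocalInferenceEngine can be subclassed this way (type parameters, if any, are elided) and leaving the loading logic undocumented here:

```java
// Hypothetical engine relying on the documented loadModelDirectly hook
// and the protected modelConfig field declared above.
class CustomLocalEngine extends LocalInferenceEngine {
    @Override
    protected void loadModelDirectly(ModelConfig config) {
        // Open and parse the model referenced by config here.
        // (The loading details are not documented on this page.)
    }
}
```

After engine.initialize(config), the documented getModelConfig() accessor returns the configuration the engine was initialized with.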
Uses of ModelConfig in com.codedstream.otterstream.inference.model
Methods in com.codedstream.otterstream.inference.model with parameters of type ModelConfig

  T         ModelLoader.loadModel(ModelConfig config)
            Loads a model from the path specified in configuration.
  T         ModelLoader.loadModel(InputStream inputStream, ModelConfig config)
            Loads a model from an input stream.
  boolean   ModelLoader.validateModel(T model, ModelConfig config)
            Validates that a loaded model matches the configuration.
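The three ModelLoader methods above compose into a load-then-validate flow over the generic model type T. A sketch in which MyModel stands in for T, the file path is illustrative, and any checked exceptions the loader declares are not shown on this page:

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;

class LoaderExample {
    // MyModel is a placeholder for the loader's type parameter T.
    static MyModel load(ModelLoader<MyModel> loader, ModelConfig config)
            throws IOException {
        // Stream-based overload documented above; the path is illustrative.
        try (InputStream in = Files.newInputStream(Path.of("/models/m.bin"))) {
            MyModel model = loader.loadModel(in, config);
            if (!loader.validateModel(model, config)) {
                throw new IllegalStateException("model/config mismatch");
            }
            return model;
        }
    }
}
```

The single-argument loadModel(config) covers the common case where the model path lives inside the configuration itself.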
Uses of ModelConfig in com.codedstream.otterstream.onnx
Methods in com.codedstream.otterstream.onnx with parameters of type ModelConfig

  void               OnnxInferenceEngine.initialize(ModelConfig config)
                     Initializes the ONNX inference engine with the provided configuration.
  InferenceSession   OnnxModelLoader.loadModel(ModelConfig config)
                     Loads an ONNX model from the path specified in configuration.
  InferenceSession   OnnxModelLoader.loadModel(InputStream inputStream, ModelConfig config)
                     Loads an ONNX model from an input stream.
  boolean            OnnxModelLoader.validateModel(InferenceSession model, ModelConfig config)
                     Validates that a loaded ONNX model matches the configuration.
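OnnxModelLoader binds the generic ModelLoader contract to InferenceSession. A sketch: the no-arg constructor is an assumption, while the method signatures come from this page.

```java
class OnnxExample {
    static InferenceSession open(ModelConfig config) {
        OnnxModelLoader loader = new OnnxModelLoader();       // assumed ctor
        InferenceSession session = loader.loadModel(config);  // documented
        if (!loader.validateModel(session, config)) {         // documented
            throw new IllegalStateException("ONNX model/config mismatch");
        }
        return session;
    }
}
```

Alternatively, OnnxInferenceEngine.initialize(config) hands the whole load-and-validate lifecycle to the engine.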
Uses of ModelConfig in com.codedstream.otterstream.pmml
Methods in com.codedstream.otterstream.pmml with parameters of type ModelConfig

  void   PmmlInferenceEngine.initialize(ModelConfig config)
         Initializes the PMML inference engine by loading and parsing a PMML model file.
Uses of ModelConfig in com.codedstream.otterstream.pytorch
Methods in com.codedstream.otterstream.pytorch with parameters of type ModelConfig

  void   TorchScriptInferenceEngine.initialize(ModelConfig config)
         Initializes the PyTorch inference engine by loading a TorchScript model.
Uses of ModelConfig in com.codedstream.otterstream.remote
Fields in com.codedstream.otterstream.remote declared as ModelConfig

  protected ModelConfig   RemoteInferenceEngine.modelConfig

Methods in com.codedstream.otterstream.remote with parameters of type ModelConfig

  void   RemoteInferenceEngine.initialize(ModelConfig config)
         Initializes the remote inference engine with endpoint configuration.
Uses of ModelConfig in com.codedstream.otterstream.remote.http
Methods in com.codedstream.otterstream.remote.http that return ModelConfig

  ModelConfig   HttpInferenceClient.getModelConfig()
                Gets the model configuration.

Methods in com.codedstream.otterstream.remote.http with parameters of type ModelConfig

  void   HttpInferenceClient.initialize(ModelConfig config)
         Initializes the HTTP inference client with connection configuration.
Uses of ModelConfig in com.codedstream.otterstream.remote.sagemaker
Methods in com.codedstream.otterstream.remote.sagemaker that return ModelConfig

  ModelConfig   SageMakerInferenceClient.getModelConfig()
                Gets the model configuration.

Methods in com.codedstream.otterstream.remote.sagemaker with parameters of type ModelConfig

  void   SageMakerInferenceClient.initialize(ModelConfig config)
         Initializes the SageMaker inference client with AWS configuration.
Uses of ModelConfig in com.codedstream.otterstream.remote.vertex
Methods in com.codedstream.otterstream.remote.vertex that return ModelConfig

  ModelConfig   VertexAIInferenceClient.getModelConfig()
                Gets the model configuration.

Methods in com.codedstream.otterstream.remote.vertex with parameters of type ModelConfig

  void   VertexAIInferenceClient.initialize(ModelConfig modelConfig)
         Initializes the Vertex AI inference client with Google Cloud configuration.
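The three remote clients above (HTTP, SageMaker, Vertex AI) share the same shape: initialize(ModelConfig) plus a getModelConfig() accessor. A sketch for the HTTP variant; the no-arg constructor is an assumption.

```java
class RemoteExample {
    static ModelConfig connect(ModelConfig config) {
        HttpInferenceClient client = new HttpInferenceClient();  // assumed ctor
        client.initialize(config);       // documented: connection configuration
        return client.getModelConfig();  // documented accessor
    }
}
```

The SageMaker and Vertex AI clients follow the same pattern with AWS and Google Cloud configuration, respectively.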
Uses of ModelConfig in com.codedstream.otterstream.tensorflow
Methods in com.codedstream.otterstream.tensorflow with parameters of type ModelConfig

  void   TensorFlowInferenceEngine.initialize(ModelConfig config)
         Initializes the TensorFlow inference engine by loading a SavedModel.
Uses of ModelConfig in com.codedstream.otterstream.xgboost
Methods in com.codedstream.otterstream.xgboost with parameters of type ModelConfig

  void   XGBoostInferenceEngine.initialize(ModelConfig config)
         Initializes the XGBoost inference engine by loading a model file.
Uses of ModelConfig in com.codedstreams.otterstreams.sql.runtime
Methods in com.codedstreams.otterstreams.sql.runtime with parameters of type ModelConfig

  static InferenceEngine<?>   InferenceEngineFactory.createEngine(ModelConfig config)
  protected void              TensorFlowGraphDefEngine.loadModelDirectly(ModelConfig config)
  protected void              TensorFlowSavedModelEngine.loadModelDirectly(ModelConfig config)
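InferenceEngineFactory.createEngine above implies that a single ModelConfig can select among the engines listed on this page (ONNX, PMML, TorchScript, TensorFlow, XGBoost, remote). How the factory dispatches, for example on a model-type field, is not documented here. A sketch using only the documented signatures:

```java
class FactoryExample {
    static InferenceEngine<?> create(ModelConfig config) {
        // Documented static factory; the dispatch criterion (model type,
        // file extension, ...) is not specified on this page.
        InferenceEngine<?> engine = InferenceEngineFactory.createEngine(config);
        engine.initialize(config);  // documented entry point
        return engine;
    }
}
```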