Class InferenceEngine

public class InferenceEngine

Android LRE Inference Engine.
Constructor Summary
Constructors:
InferenceEngine(Context context, InferenceOptions inferenceOptions)
    Initializes a Latent Runtime Engine (LRE) instance from a model generated by LEIP Design or LEIP Optimize, with model inference options.
-
Method Summary
<T> T runInference(T data)
    Runs inference on the provided input(s).
InferenceOptions getInferenceOptions()
    Gets the current running model's inference options.
synchronized void loadModel()
    Loads the model.
synchronized void closeModel()
    Closes the model.
Model getCurrentModel()
    Gets the current running model.
-
Constructor Detail
-
InferenceEngine
InferenceEngine(Context context, InferenceOptions inferenceOptions)
Initializes a Latent Runtime Engine (LRE) instance from a model generated by LEIP Design or LEIP Optimize, with model inference options.
- Parameters:
context - the application context
inferenceOptions - an InferenceOptions object containing configuration settings for the runtime engine; expected to include a String modelName and a DataType type.
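For illustration, constructing an engine might look like the sketch below. Only the Context and InferenceOptions parameters, and the String modelName and DataType type members, are documented here; the no-arg InferenceOptions constructor, the direct field assignments, and the DataType.FLOAT32 value are assumptions, not part of this reference.

```java
import android.content.Context;

public class EngineFactory {
    // Sketch only: InferenceOptions construction is not documented in this
    // reference, so the no-arg constructor and field assignments are assumptions.
    public static InferenceEngine create(Context context) {
        InferenceOptions options = new InferenceOptions(); // assumed no-arg constructor
        options.modelName = "my_model";                    // String modelName (documented member)
        options.type = DataType.FLOAT32;                   // DataType type (enum value assumed)
        return new InferenceEngine(context, options);
    }
}
```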
-
-
Method Detail
-
runInference
<T> T runInference(T data)
Runs inference on the provided input(s).
- Parameters:
data - the image data being loaded for inference.
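Because runInference is generic (<T> T), the return value has the same type T as the input. A minimal sketch, assuming the model accepts preprocessed image data as a float[]; the input type and the preprocess helper are assumptions, not part of this API:

```java
// Sketch: the float[] input/output type and the preprocess() helper are assumptions.
float[] input = preprocess(bitmap);           // hypothetical image-to-tensor helper
float[] output = engine.runInference(input);  // return type matches the input type T
```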
-
getInferenceOptions
InferenceOptions getInferenceOptions()
Gets the current running model's inference options.
-
loadModel
synchronized void loadModel()
Loads the model.
-
closeModel
synchronized void closeModel()
Closes the model.
-
getCurrentModel
Model getCurrentModel()
Gets the current running model.
- Returns:
current running model
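Taken together, the methods above suggest a load/run/close lifecycle. A hedged sketch using only the documented methods; the float[] input type is an assumption:

```java
// Lifecycle sketch using only the methods documented in this reference.
InferenceEngine engine = new InferenceEngine(context, options);
engine.loadModel();                                    // synchronized model load
Model model = engine.getCurrentModel();                // current running model
InferenceOptions opts = engine.getInferenceOptions();  // current inference options
float[] result = engine.runInference(inputData);       // float[] input is an assumption
engine.closeModel();                                   // synchronized model close
```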
-