Class InferenceEngine
public class InferenceEngine
Android LRE Inference Engine.
Constructor Summary
InferenceEngine(Context context, InferenceOptions inferenceOptions)
    Initialize a Latent Runtime Engine (LRE) instance from a model generated by LEIP Design or LEIP Optimize with model inference options.
Method Summary
<T> T runInference(T data)
    Runs inference on the provided input(s).

InferenceOptions getInferenceOptions()
    Gets the inference options of the currently running model.

synchronized void loadModel()
    Loads the model.

synchronized void closeModel()
    Closes the model.

Model getCurrentModel()
    Gets the currently running model.
Constructor Detail
InferenceEngine
InferenceEngine(Context context, InferenceOptions inferenceOptions)
Initialize a Latent Runtime Engine (LRE) instance from a model generated by LEIP Design or LEIP Optimize with model inference options.
Parameters:
    context - the application context
    inferenceOptions - an InferenceOptions object containing configuration settings for the runtime engine; it is expected to include a String modelName and a DataType type.
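A minimal construction sketch, assuming an Android component provides the Context. The InferenceOptions setter names, the DataType constant, and the model name shown here are illustrative assumptions; only the constructor signature comes from this class.

```java
// Sketch: creating an InferenceEngine inside an Android component.
// setModelName/setType and DataType.FLOAT32 are assumed names for
// illustration; the documented requirement is only that InferenceOptions
// carries a String modelName and a DataType type.
InferenceOptions options = new InferenceOptions();
options.setModelName("mobilenet_v2");   // String modelName (assumed setter)
options.setType(DataType.FLOAT32);      // DataType type (assumed setter and constant)

// Construct the engine with the application context and the options.
InferenceEngine engine = new InferenceEngine(getApplicationContext(), options);
```

Passing the application context (rather than an Activity) avoids leaking a short-lived component into the engine's lifetime.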
Method Detail
runInference
<T> T runInference(T data)
Runs inference on the provided input(s).
Parameters:
    data - the image data being loaded for inference.
getInferenceOptions
InferenceOptions getInferenceOptions()
Gets the inference options of the currently running model.
loadModel
synchronized void loadModel()
Loads the model.
closeModel
synchronized void closeModel()
Closes the model.
getCurrentModel
Model getCurrentModel()
Gets the currently running model.
Returns:
    the currently running model
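The methods above can be combined into a load-run-close lifecycle. A sketch, assuming an already-constructed engine; the float[] input type is an assumption for illustration (the generic signature <T> T runInference(T data) implies the result has the same type as the input).

```java
// Sketch of the engine lifecycle: load, run, inspect, close.
// 'engine' is an InferenceEngine built as in the constructor detail;
// 'imageTensor' (a float[]) is an assumed, already-preprocessed input.
engine.loadModel();                          // synchronized void loadModel()
try {
    float[] output = engine.runInference(imageTensor);  // <T> T runInference(T data)
    Model model = engine.getCurrentModel();             // inspect the running model
    InferenceOptions opts = engine.getInferenceOptions();
} finally {
    engine.closeModel();                     // synchronized void closeModel()
}
```

Closing in a finally block ensures the model is released even if inference throws; since loadModel() and closeModel() are synchronized, concurrent callers serialize on the engine instance.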