A recipe defines an intersection in the search space of machine learning models, data formats, optimization schemes, and deployment targets. It streamlines development: you select a model to use with your data, and the recipe supplies pre-configured steps and settings that deliver strong performance of that model on the desired target platform. The next few sections of this guide walk you through each step of the process.
Step One: Train and Evaluate on the Host and Export
The first stage of a LEIP Recipe covers the machine learning portion of the workflow, up to the point where a traced model is exported. The LEIP toolkit lets you train the model on your own data, evaluate it, and then export a traced model. Because this stage is pre-configured, the exported model is fully compatible with the LEIP SDK. We will initially skip training by using a pre-trained model; training on your own dataset is covered later in this guide as an option after the first four steps.
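The train, evaluate, and export flow above can be sketched as three phases. This is an illustrative skeleton only: the function names (`train`, `evaluate`, `export_traced`) and the stand-in model are placeholders for this sketch, not LEIP SDK APIs.

```python
def train(model, data):
    """Fine-tune on your own data; skipped when starting from a pre-trained model."""
    return model

def evaluate(model, data):
    """Compute accuracy over labelled (input, label) pairs before export."""
    correct = sum(1 for x, y in data if model(x) == y)
    return correct / len(data)

def export_traced(model, path):
    """Stand-in for exporting a traced model artifact (e.g. via tracing in
    the training framework); here it just returns the artifact path."""
    return path

# Pre-trained stand-in model: classifies a number by its sign.
pretrained = lambda x: int(x > 0)
data = [(2, 1), (-3, 0), (5, 1), (-1, 0)]

acc = evaluate(pretrained, data)            # host-side evaluation
artifact = export_traced(pretrained, "model_traced.pt")
```

The key point is the ordering: evaluation happens on the host first, so you have a baseline accuracy to compare against after compilation and on-target evaluation.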
Step Two: Compile and Optimize
The second stage of a LEIP Recipe provides a pre-configured build operation, with compilation and optimization settings chosen to achieve the best performance for the model. For a GPU target such as the AGX, this stage also performs cross compilation, producing the model binary artifact and the compute engine.
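Conceptually, the recipe maps each target to a pre-selected set of build settings. The sketch below illustrates that idea only; the keys, values, and target names are invented for this example and do not reflect the actual LEIP configuration schema.

```python
# Hypothetical per-target build settings, keyed by target name.
# For a GPU target like the AGX, cross compilation is enabled and the
# build emits both the model binary and the compute engine.
RECIPE_BUILD_SETTINGS = {
    "agx":    {"cross_compile": True,  "device": "cuda",
               "artifacts": ["model.bin", "compute_engine"]},
    "x86_64": {"cross_compile": False, "device": "cpu",
               "artifacts": ["model.bin"]},
}

def build_config(target):
    """Look up the pre-configured compile/optimize settings for a target."""
    return RECIPE_BUILD_SETTINGS[target]
```

Keeping the settings keyed by target is what makes a recipe pre-configured: selecting the platform selects the whole build behind it.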
Step Three: Evaluate on Target
The third stage of a LEIP Recipe runs on the target device: it verifies the model and reports accuracy and inference speed measured on the device itself.
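The two on-target metrics, accuracy and inference speed, can be gathered in a single pass over the evaluation data. This is a minimal stdlib sketch assuming the deployed model is exposed as a plain callable; the real on-target runtime comes from the LEIP tooling.

```python
import time

def evaluate_on_target(model, data):
    """Return (accuracy, mean latency in seconds) for a deployed model.

    `model` is any inference callable; `data` is labelled (input, label)
    pairs. Each call is timed individually so the mean latency reflects
    per-inference speed on the device.
    """
    correct, latencies = 0, []
    for x, y in data:
        t0 = time.perf_counter()
        pred = model(x)
        latencies.append(time.perf_counter() - t0)
        correct += int(pred == y)
    return correct / len(data), sum(latencies) / len(latencies)
```

Comparing the accuracy reported here with the host-side figure from the first stage is how you verify that compilation and optimization did not degrade the model.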
Step Four: Deploy
The final stage of a LEIP Recipe is deploying the model within your own application. Python and C++ examples are provided to help you get started.
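One common deployment pattern is to wrap the compiled artifact behind a small application-facing class, so the rest of the application never touches the runtime directly. The sketch below is a hypothetical shape for such a wrapper; the `runner` callable is a stand-in, since the actual runtime bindings come from the provided Python and C++ examples.

```python
class ModelService:
    """Application-facing wrapper around a compiled model artifact."""

    def __init__(self, artifact_path, runner):
        self.artifact_path = artifact_path  # path to the compiled model binary
        self.runner = runner                # inference callable from the runtime

    def predict(self, inputs):
        """Run inference on a batch of inputs and return predictions."""
        return [self.runner(x) for x in inputs]

# Stand-in runner for illustration; a real application would obtain this
# from the deployed runtime rather than a lambda.
service = ModelService("model.bin", runner=lambda x: x * 2)
print(service.predict([1, 2, 3]))  # [2, 4, 6]
```

Isolating the runtime behind one class keeps the application code identical whether the model runs on the host or on the target device.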
The LEIP Recipe process flow is shown below.
We will now explore how to use LEIP Recipes to export and evaluate a model. We have provided the following tutorials: