A Recipe defines a specific combination of machine learning model, data format, optimization scheme, and deployment target within a broader search space. A Recipe streamlines development by letting you pair a chosen model with your data, and provides pre-configured steps and settings that deliver strong performance of that model on the desired target platform. The next few pages of this guide walk you through each step of the process.
Step One: Train, Evaluate on Host and Export
The first stage of a LEIP Recipe handles the machine learning portion of the Recipe, up to the point where a traced model is exported. The LEIP toolkit lets you train the model on your own data, evaluate it on the host, and then export a traced model. The ML portion of the Recipe ensures that the exported model is fully compatible with the LEIP SDK. In this initial tutorial, we will skip the training stage by using a pre-trained model; training on your own dataset is covered later under Adding your own data to the Recipe.
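To make the "traced model" export concrete, here is a minimal sketch using plain PyTorch. The model, input shape, and file name are illustrative assumptions, not part of the LEIP toolkit, which wraps the equivalent steps for you.

```python
# Minimal sketch: tracing and exporting a model with plain PyTorch.
# The model, input shape, and file name are illustrative assumptions;
# the LEIP toolkit performs the equivalent export steps for you.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(8, 10),
)
model.eval()  # tracing should capture inference-mode behavior

example_input = torch.randn(1, 3, 32, 32)
traced = torch.jit.trace(model, example_input)  # record ops on an example input
traced.save("model_traced.pt")                  # serialized, portable artifact
```

Tracing records the operations executed on the example input, so the exported artifact no longer depends on the original Python model definition.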
Step Two: Compile and Optimize
The second stage of a LEIP Recipe provides a pre-configured build operation, with compilation and optimization settings selected to achieve the best model performance. For a GPU target such as the AGX, this stage performs cross-compilation and produces both the model binary artifact and the compute engine.
Step Three: Evaluate on Target
The third stage of the LEIP Recipe runs on the target, verifying the model and reporting accuracy and inference speed on the device itself.
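Conceptually, on-target evaluation amounts to running the model over a labeled dataset while timing each inference. The following generic sketch shows the idea; the function, model, and synthetic data are illustrative assumptions, as the LEIP tooling computes and reports these metrics for you.

```python
# Generic sketch of on-target evaluation: accuracy plus mean inference
# latency. Names and data are illustrative assumptions; LEIP's evaluation
# stage reports these metrics for you on the actual device.
import time
import torch
import torch.nn as nn

def evaluate(model, batches):
    """Return (accuracy, mean inference latency in seconds per batch)."""
    correct, total, elapsed = 0, 0, 0.0
    model.eval()
    with torch.no_grad():
        for inputs, labels in batches:
            start = time.perf_counter()
            outputs = model(inputs)          # timed inference call
            elapsed += time.perf_counter() - start
            preds = outputs.argmax(dim=1)    # predicted class per sample
            correct += (preds == labels).sum().item()
            total += labels.numel()
    return correct / total, elapsed / len(batches)

# Tiny synthetic check with a stand-in model and random batches.
model = nn.Linear(4, 3)
batches = [(torch.randn(8, 4), torch.randint(0, 3, (8,))) for _ in range(5)]
accuracy, latency = evaluate(model, batches)
```

Measuring latency on the device itself matters because host-side numbers rarely predict target performance.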
Step Four: Deploy
The final step of a Recipe is deploying the model with your own application. We provide Python and C++ examples to help you get started.
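At its simplest, a deployment integration loads the exported artifact once at startup and then runs inference per request. Here is a minimal Python sketch; the file name, model, and shapes are assumptions standing in for the real artifact, and the shipped Python and C++ examples show the actual integration.

```python
# Minimal sketch of deploying a traced model inside a Python application.
# File name, model, and shapes are illustrative assumptions; see the
# shipped Python and C++ examples for the real integration.
import torch
import torch.nn as nn

# Stand-in for the artifact produced by the earlier stages:
# trace and save a tiny model so this sketch is self-contained.
torch.jit.trace(nn.Linear(4, 2).eval(), torch.randn(1, 4)).save("model.pt")

# Application side: load once at startup, then call per request.
model = torch.jit.load("model.pt")
with torch.no_grad():
    prediction = model(torch.randn(1, 4))
```

Loading the serialized model means the application does not need the original training code, only the runtime.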
The recipe process flow is shown below.
Next, let's explore how to use LEIP Recipes to export and evaluate a model.