LEIP Recipes are a set of pre-configured components and settings that enable a machine learning engineer to quickly train, compile, compress, and package models for optimized deployment at the edge. Latent AI is building a deep set of recipes starting with Computer Vision applications targeting many common edge computing architectures. This initial release provides a recipe for training and deploying YOLOv5* on an AGX machine with best-in-class INT8 performance. Additional recipes for classification, detection, and segmentation on other GPU and CPU targets will quickly follow in future releases.
The machine learning components of these recipes are packaged modularly, enabling models, datasets, optimization schemes, and deployment targets to be swapped easily. As Latent AI releases future components, the machine learning engineer will be able to quickly and reproducibly run experiments on different models and optimizations across various hardware targets to efficiently determine the best combination, thereby reducing time to market.
LEIP Recipes endorse a Bring Your Own Data (BYOD) approach. This means that you add your own dataset to the environment as a Recipe component and subsequently train these models for your specific application. LEIP Recipes currently support the Common Objects in Context (COCO) dataset format; other formats will follow in future releases. The documentation provides instructions for configuring your own data for use with these reusable components.
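As a rough orientation for the BYOD workflow, the sketch below shows the top-level shape of a COCO-format annotation file. The field names follow the public COCO specification; the file names, category, and coordinates are illustrative placeholders, not values taken from any actual recipe.

```python
import json

# Minimal COCO-style annotation structure (illustrative values only).
coco = {
    "images": [
        {"id": 1, "file_name": "frame_0001.jpg", "width": 640, "height": 480},
    ],
    "categories": [
        {"id": 1, "name": "person", "supercategory": "person"},
    ],
    "annotations": [
        # bbox is [x, y, width, height] in pixels from the image's top-left corner
        {"id": 1, "image_id": 1, "category_id": 1,
         "bbox": [48, 32, 120, 240], "area": 120 * 240, "iscrowd": 0},
    ],
}

# COCO annotations are stored as a single JSON file alongside the images.
annotation_json = json.dumps(coco, indent=2)
print(sorted(coco.keys()))
```

A dataset in this shape (one JSON file per split, plus an image directory) is the kind of input the COCO-format support described above is designed to consume.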
The best way to experience LEIP Recipes is to try it yourself. The following sections will walk you through the process of installing and using recipes with the provided MS COCO dataset to evaluate our optimized YOLOv5 model on an Nvidia AGX, and then retraining the model with a BYOD approach. Contact Latent AI if you are interested in using recipes with other models or hardware targets.
Let’s get started by installing LEIP Recipes.
*Our initial YOLOv5 recipe uses the yolov5-rt-stack implementation of the Ultralytics YOLOv5 model. The rt-stack model uses the same structure as Ultralytics while being optimized for cross-platform deployment.