LEIP Optimize

LEIP Optimize simplifies quantizing and compiling machine learning models for specific hardware targets. It ingests models from a variety of sources into a unified Relay representation, and it is both approachable and customizable, offering a suite of tools for more advanced users.

Unlike compiler tooling that demands in-depth hardware expertise and time-consuming optimization work, LEIP Optimize enables rapid prototyping across a variety of model-hardware combinations.

Under the hood, LEIP Optimize uses a tool called Forge, an advanced extension of TVM that's designed to simplify the editing, manipulation, and introspection of the Relay Intermediate Representation (IR). You'll interact with Forge directly via the API, and you can learn about Forge, TVM, IR, and more in the Explanation section.
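
To make the notion of a Relay representation concrete, the sketch below ingests a model into Relay IR and compiles it using the open-source TVM Python API directly, not the LEIP Optimize or Forge API; the model file, input name, and shape are placeholders.

    # Illustrative only: plain TVM, with a placeholder model and input shape.
    import onnx
    import tvm
    from tvm import relay

    onnx_model = onnx.load("model.onnx")              # placeholder model file
    shape_dict = {"input": (1, 3, 224, 224)}          # placeholder input name/shape

    # Convert the framework model into a unified Relay module and inspect its IR.
    mod, params = relay.frontend.from_onnx(onnx_model, shape_dict)
    print(mod)

    # Compile the Relay module for a specific hardware target (CPU via LLVM here).
    with tvm.transform.PassContext(opt_level=3):
        lib = relay.build(mod, target="llvm", params=params)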

You should use LEIP Optimize if you already have a machine learning model and want to deploy it to the edge, but lack specific hardware expertise. To begin, check out the Getting Started guide.

In this Documentation

  • Getting Started

Step-by-step instructions for installing and getting started with LEIP Optimize

  • How-to Guides

    Guides that cover key operations and common tasks

  • API Reference

    Reference documentation for the LEIP Optimize API

  • Explanation

    Learn more about machine learning compilation, intermediate representation, quantization, and optimization