Using LEIP

Once you have installed the LEIP SDK you are ready to start optimizing your model.

The following guide will help you get familiar with the SDK and some of its core features. The SDK also provides access to the Latent AI model zoo with sample models and scripts to help get you on your way. If you need further assistance, please contact support@latentai.com.

This guide assumes you are running the LEIP SDK container and will reference examples and files installed inside that container.

Preparing a Model

It is highly recommended that you get started with the SDK by first running through a sample model from the Latent AI Model Zoo. This will help you become familiar with the SDK and ensure you can successfully run the LEIP workflow from start to finish. When using your own trained model, please refer to the Supportability Matrix to verify that it meets the model requirements.

The following example will use a model from the Latent AI Model Zoo.

Selecting a Zoo Model

Use the leip zoo list command to query the available models and datasets. You will notice that each model includes a model_id and variant_id; you will need to pass those values from the model you choose to the leip zoo download command in the next step.

CODE
# List available models and datasets
leip zoo list

The examples that follow use the inceptionv3 model with the keras-open-images-10-classes variant.

Downloading a Zoo Model

Once you have the model_id and variant_id, you can download an unfrozen model to your local workspace by running the leip zoo download command. Again, these values were obtained from the output of the leip zoo list command.

CODE
# download the model variant to our local workspace
leip zoo download --model_id inceptionv3 --variant_id keras-open-images-10-classes

# notice the directory structure and contents
ls workspace/models/inceptionv3/keras-open-images-10-classes/

# The model weights and class names
class_names.txt  model.h5  model_schema.json

# The model_schema.json tracks common model attributes
cat workspace/models/inceptionv3/keras-open-images-10-classes/model_schema.json
{
  "dataset": "custom",
  "preprocessor": "imagenet",
  "metadata": {
    "name": "inceptionv3",
    "variant": "keras-open-images-10-classes",
    "full_name": "Inception V3",
    "description": "Inception V3 is a convolutional neural network developed by Google to perform image classificaiton tasks.",
    "type": "Image Classification",
    "source": "https://github.com/latentai/model-zoo-models/tree/master/inceptionv3"
  }
}
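
If you want to pull a single field out of the schema from the command line, a JSON tool such as jq can be used. This is an optional convenience and assumes jq is available inside the container; the field name and value below come from the model_schema.json shown above.

CODE
# Optional: extract an individual field from the schema (assumes jq is installed in the container)
jq -r '.preprocessor' workspace/models/inceptionv3/keras-open-images-10-classes/model_schema.json
imagenet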

Downloading a Zoo Dataset

You will also need the dataset. The dataset for this model is the open-images-10-classes:

CODE
# Download the evaluation dataset
leip zoo download --dataset_id open-images-10-classes --variant_id eval

# All downloads are saved to your workspace
ls workspace/datasets/open-images-10-classes/eval/

Please see the LEIP Zoo section for more detailed information about accessing and using zoo models and datasets.

Using LEIP Optimize

LEIP Optimize is a state-of-the-art model optimizer that applies post-training quantization algorithms to a model and produces a binary representation for the target specified by the user. The binary takes the form of a shared object file, which is loaded by a small runtime for execution.

Run the leip optimize --help command to see the supported options, or view the CLI Reference for LEIP Optimize.

The following example uses asymmetric quantization at 8 bits with a target data type of uint8 and saves the output model to an optimizedModel folder. Calibration is performed using the files listed in a rep_dataset.txt file. The original model is left untouched.

CODE
# Create a representative dataset file, in this case with one example item
echo workspace/datasets/open-images-10-classes/eval/Apple/06e47f3aa0036947.jpg > rep_dataset.txt

# Using the previously downloaded inceptionv3 instance, optimize the model
leip optimize --input_path workspace/models/inceptionv3/keras-open-images-10-classes \
              --output_path optimizedModel/ \
              --quantizer asymmetric \
              --bits 8 \
              --rep_dataset rep_dataset.txt \
              --preprocessor imagenet

# After a few minutes, LEIP optimize will complete the process
ls optimizedModel/
modelLibrary.so  model_schema.json  results.json

The command line arguments are documented in LEIP Optimize, but it is worth noting how they are used in this example:

  • The --quantizer is set to asymmetric to tell leip optimize to cast the tensors to 8-bit integer representation using per-tensor asymmetric quantization.

  • During the casting, a file containing newline-separated paths to sample input files, known as the “representative dataset”, is supplied with the --rep_dataset option. This tells leip optimize to run inference on these inputs to gather the quantization parameters for the output tensors. A single image from the dataset downloaded above is used here, but in practice the representative dataset does not need to come from the training or test dataset; a fuller example is sketched after this list.

  • Finally, the --preprocessor option specifies the name of the preprocessor callback function to use during inference on the representative dataset files.
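
The following is a minimal sketch of how you might build a larger representative dataset from the evaluation images downloaded earlier. The directory layout is the one shown above, and the sample size of 20 images is an arbitrary choice.

CODE
# Sketch: collect a handful of evaluation images as the representative dataset
# (the sample size of 20 is arbitrary)
find workspace/datasets/open-images-10-classes/eval -name '*.jpg' | head -20 > rep_dataset.txt

# The file contains one newline-separated image path per line
wc -l rep_dataset.txt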

After leip optimize completes, the folder specified by --output_path contains an 8-bit integer version of the model in the form of a shared object file (.so). In this file, both the parameters and the operations are integer typed.
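
For a quick sanity check of the optimized artifacts, standard shell tools are sufficient; the commands below assume only the output directory created above and that the file utility is present in the container.

CODE
# Quick sanity check on the optimized artifacts
ls -lh optimizedModel/                 # modelLibrary.so, model_schema.json, results.json
file optimizedModel/modelLibrary.so    # should report a shared object built for the compilation target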

Using LEIP Evaluate

You can use LEIP to evaluate the accuracy of a model given a dataset, optionally specifying class names and batch size. Run the leip evaluate --help command to see the supported options.

The evaluate command is useful for comparing the accuracy of a model at different points in the pipeline. For example, you can evaluate the accuracy of the model before and after running leip optimize. The following example illustrates this.

In this example, we will compare the results of the original inceptionv3 model with those of the optimized inceptionv3 model, both trained on the Open Images dataset.

CODE
# Evaluate the original model and note the results
leip evaluate --input_path workspace/models/inceptionv3/keras-open-images-10-classes/ \
              --test_path workspace/datasets/open-images-10-classes/eval/index.txt \
              --preprocessor imagenet

# Run evaluate again, this time on the previously optimized model, and note the results
leip evaluate --input_path optimizedModel/ \
              --test_path workspace/datasets/open-images-10-classes/eval/index.txt

Additional information about leip evaluate can be found in the CLI Reference for LEIP Evaluate.
