
Supportability Matrix

This page lists the generally supported model file formats and versions for each tool in the LEIP SDK.

Depending on the operators used in a particular model architecture, you may encounter unforeseen compatibility issues with specific tools.

Model Requirements

TensorFlow

  • Compatible with TensorFlow 2.3

  • Must not have been frozen

PyTorch

  • Compatible with PyTorch 1.7

  • When using LEIP Optimize or LEIP Compile, your input model should be traced and saved using torch.jit.save (recommended); alternatively, the whole eager model may be saved using torch.save, but it must still be traceable. A sketch of both save paths follows this list.

  • If you are using LEIP Optimize with the -use_legacy_quantizer true option, your model should be a quantizable and traceable eager model (e.g., torchvision.models.quantization.resnet50(pretrained=True)).
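As an illustration, here is a minimal sketch of both save paths; the resnet50 architecture, input shape, and file names are placeholders, not requirements of the LEIP SDK:

```python
# Minimal sketch: preparing a PyTorch 1.7 model for LEIP Optimize/Compile.
# The model, input shape, and file names are illustrative placeholders.
import torch
import torchvision

model = torchvision.models.resnet50(pretrained=True)
model.eval()

# Recommended: trace the model and save the TorchScript artifact.
example_input = torch.rand(1, 3, 224, 224)
traced = torch.jit.trace(model, example_input)
torch.jit.save(traced, "resnet50_traced.pt")

# Alternative: save the whole eager model (it must still be traceable).
torch.save(model, "resnet50_eager.pt")
```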

Modules

LEIP Train

For the leip_train command and the QGT API (input and output): Keras models from TensorFlow 2.x onwards, in eager execution mode only.
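For example, a model like the following, defined and saved under TensorFlow 2.x eager execution, fits this requirement; the architecture and save path are illustrative only:

```python
# Minimal sketch: a Keras model defined in TF 2.x eager mode (the default).
# Layer sizes and the save path are illustrative placeholders.
import tensorflow as tf

assert tf.executing_eagerly()  # eager execution mode is required

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(224, 224, 3)),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.save("my_keras_model")  # SavedModel directory in TF 2.x
```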

LEIP Optimize
LEIP Compile

The leip_optimize and leip_compile commands support the following input formats; the output format depends on the compilation target. A sketch producing two of these formats follows the list.

  • TF (SavedModel)

  • TF (Keras)

  • TF (Graph Proto)

  • TF (ckpt meta)

  • TFLite

  • PyTorch 1.7
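As a reference point, here is a minimal sketch producing two of these input formats (SavedModel and TFLite) from a stock Keras model; the MobileNetV2 model and file paths are placeholders:

```python
# Minimal sketch: producing a TF SavedModel and a TFLite file,
# two of the input formats listed above. Paths are placeholders.
import tensorflow as tf

model = tf.keras.applications.MobileNetV2(weights="imagenet")

# TF (SavedModel)
tf.saved_model.save(model, "mobilenet_savedmodel")

# TFLite, converted from the SavedModel directory
converter = tf.lite.TFLiteConverter.from_saved_model("mobilenet_savedmodel")
with open("mobilenet.tflite", "wb") as f:
    f.write(converter.convert())
```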

LEIP Evaluate
LEIP Run

The leip_evaluate and leip_run commands can currently execute in the following inference frameworks, which are included in the LEIP SDK Docker images:

  • TensorFlow v2.3

  • TensorFlow Lite v2.3

  • LRE (LEIP Runtime Environment)

  • PyTorch 1.7

Supported Input Formats

TensorFlow v2.3

  • TF (ckpt meta)

  • TF (Graph Proto)

  • TF (SavedModel)

  • TF (Keras)

TensorFlow Lite v2.3

Any .tflite file converted by TensorFlow or produced by leip compress.

PyTorch v1.7

A complete PyTorch module (eager or traced), saved with a .pt extension.
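The SavedModel, Keras, TFLite, and PyTorch formats are covered by the sketches above. For the ckpt meta format, a TF1-style checkpoint written through TensorFlow 2.x's compat API looks like the following; the tiny graph and checkpoint path are illustrative only:

```python
# Minimal sketch: writing a TF1-style checkpoint (the "ckpt meta" format).
# The graph below is a trivial placeholder; Saver.save emits model.ckpt.meta
# alongside the variable data files.
import tensorflow as tf

tf.compat.v1.disable_eager_execution()
with tf.compat.v1.Session() as sess:
    x = tf.compat.v1.placeholder(tf.float32, [None, 4], name="input")
    w = tf.Variable(tf.ones([4, 2]), name="weights")
    y = tf.matmul(x, w, name="output")
    sess.run(tf.compat.v1.global_variables_initializer())
    saver = tf.compat.v1.train.Saver()
    saver.save(sess, "model.ckpt")
```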
