You have trained a YOLOv5 model on your own data. Now you want to evaluate its performance, visualize predictions, and export the traced model for use with the LEIP SDK.

Your trained checkpoint is stored at the location reported in the training logs. It will be inside /latentai/outputs/{date}_BYOD_recipe/checkpoints/*.
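As a convenience, a minimal sketch for locating the most recent checkpoint, assuming the default output layout above (the `CKPT_ROOT` variable and glob pattern are illustrative, not part of the recipe):

```shell
# Sketch: pick the newest checkpoint under the assumed output layout.
CKPT_ROOT=/latentai/outputs
CKPT=$(ls -t "$CKPT_ROOT"/*_BYOD_recipe/checkpoints/*.ckpt 2>/dev/null | head -n 1)
echo "Latest checkpoint: $CKPT"
```

You can then pass `$CKPT` as the absolute path in the export command below.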

Export the BYOD Model

Perform the following to export your trained model for use with the LEIP SDK:

af --config-name=yolov5_L_RT command=export +checkpoint="'/absolute/path/to.ckpt'" \
 task.moniker="BYOD_recipe"
BASH

Your traced model will be found at /latentai/artifacts/BYOD_recipe_yolov5l_640-640.pt

Note that both double and single quotes are required around the checkpoint path, as shown above: the outer double quotes are consumed by the shell, while the inner single quotes ensure Hydra receives the path as a quoted string literal.
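You can see the effect of the nested quoting with a plain `echo`; the shell strips the outer double quotes, and the inner single quotes pass through to the command unchanged:

```shell
# The shell consumes the outer double quotes; the inner single quotes
# survive, so the argument arrives as a quoted string literal.
echo +checkpoint="'/absolute/path/to.ckpt'"
# prints: +checkpoint='/absolute/path/to.ckpt'
```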

That is it. You can now take the exported model and proceed to compiling or optimizing it. Before doing so, however, you may wish to evaluate the accuracy of the model in your host environment using the other recipe commands.

Evaluate the BYOD Model on Host

Perform the following to evaluate your trained model in your host environment:

af --config-name=yolov5_L_RT \
    command=evaluate \
    +checkpoint="'/full/path/to.ckpt'" \
    data=coco-like-template \
    'hydra.searchpath=[file://custom-configs]' \
    task.moniker="BYOD_recipe"
BASH

Use the same data configuration you used to train the model.

Visualize the BYOD Model Predictions on Host

Perform the following to visualize the bounding box predictions of your trained YOLOv5 model:

af --config-name=yolov5_L_RT \
    command=predict \
    +checkpoint="'/full/path/to.ckpt'" \
    data=coco-like-template \
    'hydra.searchpath=[file://custom-configs]' \
    task.moniker="BYOD_recipe"
BASH

Use the same data configuration you used to train the model.

The images with predicted bounding boxes drawn on them are saved under /latentai/artifacts/predictions/coco-detection.
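As a quick sanity check, you can count how many annotated images were produced (the directory name comes from the path above; the image extensions are assumed):

```shell
# Sketch: count rendered prediction images (extensions assumed to be .jpg/.png).
PRED_DIR=/latentai/artifacts/predictions/coco-detection
COUNT=$(find "$PRED_DIR" \( -name '*.jpg' -o -name '*.png' \) 2>/dev/null | wc -l)
echo "Rendered $COUNT prediction images"
```

The count should match the number of images in the evaluation split of your data configuration.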