
PyTorch to TFLite

This chapter describes how to convert and export PyTorch models to TFLite models.

Preparation

Environment Configuration

As with the Training step, we recommend performing the model export in a virtual environment. In the sscma virtual environment, make sure that the Installation - Prerequisites - Install Extra Dependencies step has been completed.

tip

If you have configured a virtual environment but not activated it, you can activate it with the following command.

conda activate sscma
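
If you want to double-check that the environment is ready before exporting, a minimal sanity check might look like the following. This is only a sketch: it assumes PyTorch and TensorFlow are among the installed extra dependencies, so adjust the imports to match what your installation actually provides.

# PyTorch is required to load the checkpoint being exported
python3 -c "import torch; print('torch', torch.__version__)"
# TensorFlow availability is an assumption here; it is commonly used for TFLite conversion
python3 -c "import tensorflow as tf; print('tensorflow', tf.__version__)"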

Models and Weights

You also need to prepare the PyTorch model and its weights before exporting. The model is already preconfigured and can be found in the Config section. For the weights, you can refer to the following steps to obtain them.
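
By default, the examples later in this chapter read the checkpoint path from a last_checkpoint file under work_dirs. Assuming that default layout (the fomo_mobnetv2_0.35_x8_abl_coco directory is just the example model used below), you can locate the trained weights like this:

# list the training outputs and show the path of the most recent checkpoint
ls work_dirs/fomo_mobnetv2_0.35_x8_abl_coco/
cat work_dirs/fomo_mobnetv2_0.35_x8_abl_coco/last_checkpoint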

tip

Exporting a TFLite model requires a training set as the representative dataset; if it is not found, the program will download it automatically. However, for some large datasets this can take a long time, so please be patient.

Export Model

For model transformation (convert and export), the relevant commands with some common parameters are listed below.

python3 tools/export.py \
"<CONFIG_FILE_PATH>" \
"<CHECKPOINT_FILE_PATH>" \
--target tflite
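
Here, <CONFIG_FILE_PATH> is the model config and <CHECKPOINT_FILE_PATH> is the trained PyTorch weights prepared above. For the full list of supported parameters, refer to the source code of tools/export.py or, assuming it follows the same convention as tools/inference.py, run:

python3 tools/export.py --help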

TFLite Export Examples

Here are some model conversion examples (int8 precision) for reference.

python3 tools/export.py \
configs/fomo/fomo_mobnetv2_0.35_x8_abl_coco.py \
"$(cat work_dirs/fomo_mobnetv2_0.35_x8_abl_coco/last_checkpoint)" \
--target tflite \
--cfg-options \
data_root='datasets/mask'
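
After the export finishes, the converted model should be placed next to the PyTorch checkpoint; as the validation example below suggests, the int8 TFLite file reuses the checkpoint name with an _int8.tflite suffix. Assuming that naming scheme, you can quickly confirm the file was produced:

ls -lh "$(cat work_dirs/fomo_mobnetv2_0.35_x8_abl_coco/last_checkpoint | sed -e 's/.pth/_int8.tflite/g')"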

Model Validation

During export, SSCMA applies some optimizations to the model using tools such as model pruning and distillation. Although we have tested and evaluated the model weights during the training process, we recommend that you validate the exported model again.

python3 tools/inference.py \
"<CONFIG_FILE_PATH>" \
"<CHECKPOINT_FILE_PATH>" \
--show \
--cfg-options "<CFG_OPTIONS>"

tip

For more supported parameters, please refer to the source code of tools/inference.py or run python3 tools/inference.py --help.

Model Validation Example

Here are some examples of validating the converted model (int8 precision), for reference only.

python3 tools/inference.py \
configs/fomo/fomo_mobnetv2_0.35_x8_abl_coco.py \
"$(cat work_dirs/fomo_mobnetv2_0.35_x8_abl_coco/last_checkpoint | sed -e 's/.pth/_int8.tflite/g')" \
--show \
--cfg-options \
data_root='datasets/mask'
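
Beyond the SSCMA inference script, you can also load the exported file directly with the TensorFlow Lite interpreter to confirm it parses and to inspect its input and output tensors. This is only a sketch and assumes the tensorflow package is available in the environment; it is not required by the SSCMA tooling itself.

python3 - "$(cat work_dirs/fomo_mobnetv2_0.35_x8_abl_coco/last_checkpoint | sed -e 's/.pth/_int8.tflite/g')" <<'EOF'
import sys
import tensorflow as tf

# Load the exported int8 TFLite model whose path is passed as the first argument.
interpreter = tf.lite.Interpreter(model_path=sys.argv[1])
interpreter.allocate_tensors()

# Print tensor shapes and dtypes to confirm the model loads correctly.
for detail in interpreter.get_input_details():
    print("input:", detail["shape"], detail["dtype"])
for detail in interpreter.get_output_details():
    print("output:", detail["shape"], detail["dtype"])
EOF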