This application is designed to convert Keras models to TensorFlow Lite (TFLite) models using quantization and to benchmark multiple models.
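Under the hood, a Keras-to-TFLite conversion with post-training quantization typically looks like the sketch below. This is a minimal, hypothetical example built on the standard `tf.lite.TFLiteConverter` API, not the actual code in `Quantization_tools/`; the function name, output path, and `sample_inputs` argument are illustrative assumptions.

```python
import numpy as np
import tensorflow as tf

def quantize_keras_model(keras_model, sample_inputs, output_path="model_int8.tflite"):
    """Hypothetical sketch: convert a Keras model to a quantized TFLite model."""
    converter = tf.lite.TFLiteConverter.from_keras_model(keras_model)
    converter.optimizations = [tf.lite.Optimize.DEFAULT]

    # A representative dataset lets the converter calibrate activation ranges.
    def representative_dataset():
        for sample in sample_inputs:
            yield [np.expand_dims(sample, axis=0).astype(np.float32)]

    converter.representative_dataset = representative_dataset

    tflite_model = converter.convert()
    with open(output_path, "wb") as f:
        f.write(tflite_model)
    return output_path
```

`Optimize.DEFAULT` on its own yields dynamic-range quantization of the weights; supplying a representative dataset additionally lets the converter quantize activations.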
The repository is organized as follows:

- `Benchmarks/`: Contains all the code used for benchmarking models.
- `Quantization_tools/`: Contains the code to perform quantization on models.
- `Models/`: Contains the models you want to convert or quantize. Each model directory must follow the implementation examples (e.g., `Mnist` or `CloudNet`).
  - Each model directory should include a `run.py` script with a class representing your model that extends `BaseModel`. This class must have `convert` and `benchmark` functions to handle conversion to TFLite and benchmarking, respectively (a minimal sketch follows this list).
- `run_model.py`: The main script of the application, used to launch the conversion and benchmark processes.
  - Usage: `python run_model.py {model_name} {action}`
    - `{model_name}`: The name of the model directory (e.g., `mnist`).
    - `{action}`: The action to perform (`benchmark` or `convert`).
  - Example: to start the benchmark function for the MNIST model, run:

    ```bash
    python run_model.py mnist benchmark
    ```
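A minimal sketch of what a model's `run.py` could look like. The import path for `BaseModel`, the class name, the output filename, and the dummy-data benchmark loop are all illustrative assumptions rather than the repository's actual interface:

```python
# models/example/run.py -- hypothetical sketch; import path and filenames are assumptions.
import time

import numpy as np
import tensorflow as tf

from quantization_tools.base_model import BaseModel  # assumed import path


class ExampleModel(BaseModel):
    """Wraps a small Keras model and exposes the convert/benchmark hooks."""

    TFLITE_PATH = "example.tflite"

    def __init__(self):
        # Assumed: a tiny Keras classifier; replace with your own architecture/weights.
        self.keras_model = tf.keras.Sequential([
            tf.keras.layers.Input(shape=(28, 28)),
            tf.keras.layers.Flatten(),
            tf.keras.layers.Dense(10, activation="softmax"),
        ])

    def convert(self):
        # Convert the Keras model to a (dynamic-range) quantized TFLite model.
        converter = tf.lite.TFLiteConverter.from_keras_model(self.keras_model)
        converter.optimizations = [tf.lite.Optimize.DEFAULT]
        tflite_model = converter.convert()
        with open(self.TFLITE_PATH, "wb") as f:
            f.write(tflite_model)

    def benchmark(self, runs=100):
        # Time TFLite inference on dummy input data.
        interpreter = tf.lite.Interpreter(model_path=self.TFLITE_PATH)
        interpreter.allocate_tensors()
        input_details = interpreter.get_input_details()[0]
        dummy_input = np.zeros(input_details["shape"], dtype=input_details["dtype"])

        start = time.perf_counter()
        for _ in range(runs):
            interpreter.set_tensor(input_details["index"], dummy_input)
            interpreter.invoke()
        avg_ms = (time.perf_counter() - start) / runs * 1000
        print(f"Average inference time over {runs} runs: {avg_ms:.3f} ms")
```

Presumably `run_model.py` locates the model directory by name, imports this class, and calls `convert` or `benchmark` depending on the action argument.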
To set up and run the project:

- Clone the repository:

  ```bash
  git clone https://github.com/Alex-Delaveau/TFLite-Optimize-Bench.git
  ```
- Install the required dependencies:

  ```bash
  pip install -r requirements.txt
  ```
- (Optional) If you want to work on the CloudNet model, install its specific dependencies:

  ```bash
  pip install -r models/cloudnet/requirements.txt
  ```
- Run the desired model and action:

  ```bash
  python run_model.py {model_name} {action}
  ```

  Replace `{model_name}` with the name of your model (e.g., `mnist`) and `{action}` with the desired action (`benchmark` or `convert`).