AI Model Efficiency Toolkit (AIMET) Forum

How to deploy quantized model to mobile device?

Hi there, I used AIMET to optimize my TensorFlow model and exported the result as a checkpoint (.meta) file.
Can I create a frozen graph (float32) and then convert it to a .tflite file with post-training quantization, so that I can deploy my model (uint8) on an Android device?
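The workflow I have in mind is sketched below with standard TF 1.x-style APIs (not AIMET-specific). The tiny in-graph model, the tensor names "input"/"logits", and the random calibration data are placeholders standing in for my real AIMET-optimized checkpoint:

```python
import numpy as np
import tensorflow as tf

tf.compat.v1.disable_eager_execution()

# Placeholder model: in my case this graph would be restored from the
# AIMET-exported checkpoint via tf.compat.v1.train.import_meta_graph().
g = tf.Graph()
with g.as_default():
    x = tf.compat.v1.placeholder(tf.float32, [1, 4], name="input")
    w = tf.Variable(tf.ones([4, 2]), name="w")
    y = tf.identity(tf.matmul(x, w), name="logits")
    with tf.compat.v1.Session() as sess:
        sess.run(tf.compat.v1.global_variables_initializer())
        # Step 1: freeze variables into constants (float32 frozen graph).
        frozen = tf.compat.v1.graph_util.convert_variables_to_constants(
            sess, g.as_graph_def(), ["logits"])

with tf.io.gfile.GFile("frozen.pb", "wb") as f:
    f.write(frozen.SerializeToString())

# Step 2: convert the frozen graph to TFLite with full-integer
# post-training quantization (uint8 input/output).
converter = tf.compat.v1.lite.TFLiteConverter.from_frozen_graph(
    "frozen.pb", input_arrays=["input"], output_arrays=["logits"])
converter.optimizations = [tf.lite.Optimize.DEFAULT]

def representative_dataset():
    # Calibration samples; real input data would go here.
    for _ in range(10):
        yield [np.random.rand(1, 4).astype(np.float32)]

converter.representative_dataset = representative_dataset
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.uint8
converter.inference_output_type = tf.uint8
tflite_model = converter.convert()

with open("model_uint8.tflite", "wb") as f:
    f.write(tflite_model)
```

My concern is whether this plain post-training quantization path preserves the benefit of the AIMET optimizations, or whether the quantization parameters need to be carried over some other way.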