AI Model Efficiency Toolkit (AIMET) Forum

How to load the exported model?

Hi! I’m following the API docs tutorial here to save a quantized model:

https://quic.github.io/aimet-pages/AimetDocs/api_docs/torch_quantsim.html#api-torch-quantsim

Specifically, the command used to export the quantized model is:

sim.export(path='./', filename_prefix='quantized_mnist', input_shape=input_shape)

This command exports three files: quantized_mnist.pth, quantized_mnist.onnx, and quantized_mnist.encodings.
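One thing worth noting about those files: the .encodings file is plain JSON, so it can be inspected on any machine, with or without AIMET. Here is a minimal sketch of reading it with the standard library; the schema shown (tensor name mapping to bitwidth/min/max/scale/offset) is a simplified assumption for illustration, not the exact file AIMET produces:

```python
import json

# Assumed, simplified example of what an exported .encodings file contains.
# In practice you would open the real file, e.g.:
#   with open("quantized_mnist.encodings") as f:
#       encodings = json.load(f)
sample_encodings = """
{
  "activation_encodings": {
    "conv1.output": [
      {"bitwidth": 8, "min": -1.0, "max": 1.0, "scale": 0.007843, "offset": -128}
    ]
  },
  "param_encodings": {}
}
"""

encodings = json.loads(sample_encodings)

# Pull out the quantization parameters for one tensor.
enc = encodings["activation_encodings"]["conv1.output"][0]
print(enc["bitwidth"], enc["scale"])
```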

My question is: how do I load the quantized model again? Is it possible to load the quantized model on a machine without AIMET installed?

Thanks!

Hi Flavio,

Can you tell me your use case?

Instead of export, the save_checkpoint and load_checkpoint APIs can be used to save and load a QuantizationSimModel.
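A minimal sketch of that round trip, assuming AIMET (aimet_torch) is installed and that save_checkpoint/load_checkpoint live in aimet_torch.quantsim as the docs describe; the helper name and default path here are my own:

```python
def roundtrip_quantsim(sim, path="./quantsim_checkpoint.pkl"):
    """Save a QuantizationSimModel to disk and load it back (sketch)."""
    # Imported lazily so the sketch only needs AIMET when actually called.
    from aimet_torch.quantsim import save_checkpoint, load_checkpoint

    # Persists the whole sim object, including quantizer state.
    save_checkpoint(quant_sim_model=sim, file_path=path)
    # Restores it for continued use within AIMET.
    return load_checkpoint(file_path=path)
```

Note that this restores the sim for use inside AIMET; it does not by itself answer the "no AIMET on the target machine" case, where the exported .onnx plus .encodings pair is the intended hand-off to a target runtime.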

I think @flaviomb already asked this. I have a similar question: we want to load the quantized model on an edge device. How can we do that?