How to check whether the layers of a model are quantized?

Hi AIMET Team,

I am quantizing a PyTorch model using the QuantSim API. When I export the model, I see quantization artifact files such as the encodings file being generated. However, there are many warnings that certain PyTorch layers are not mapped to AIMET. Is there an API to check, for a given model, which layers can and cannot be quantized? Any help is appreciated. Thanks.

Hi @Selventhiran
I suggest trying the latest version of AIMET, e.g. Release version 1.16.1 · quic/aimet · GitHub

We recently changed the way PyTorch layer names get mapped to ONNX, which might help in your scenario.

Other than that, you can simply call “print(sim)”, where sim is an instance of the QuantizationSimModel class. This will give you a good report of which layers had quantized wrappers assigned and which quantizers were enabled/disabled, etc.
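
A minimal sketch of what that looks like, assuming an AIMET ~1.16 style constructor (argument names such as dummy_input may differ slightly between releases, so please check the QuantizationSimModel docs for your version), with torchvision's resnet18 used purely as a stand-in for your model:

```python
import torch
from torchvision.models import resnet18
from aimet_torch.quantsim import QuantizationSimModel

# Stand-in model; replace with your own PyTorch model.
model = resnet18().eval()
dummy_input = torch.randn(1, 3, 224, 224)

# Wrap the model; AIMET inserts quantized wrappers around the layers it supports
# and logs warnings for layers it cannot map.
sim = QuantizationSimModel(model, dummy_input=dummy_input)

# Prints a per-layer report: whether the layer received a quantized wrapper and
# which input/output/parameter quantizers are enabled or disabled.
print(sim)
```

Any layer that does not appear with a quantized wrapper in that report is one that AIMET did not quantize, which should line up with the warnings you are seeing at export time.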