Hi AIMET Team,
I am converting a model from PyTorch using the QuantSim API. When I export the model, I see quantization artifact files such as the encodings file being generated. However, there are many warnings that certain PyTorch layers are not mapped to AIMET. Is there an API to check, for a given model, which layers can and cannot be quantized? Any help is appreciated. Thanks.
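In the meantime, I put together a rough workaround that walks the model's leaf modules and flags any whose type is not in an allow-list. The allow-list below (`SUPPORTED`) is just my guess at a few quantizable layer types, not AIMET's actual mapping, so treat this as an illustrative sketch only:

```python
import torch.nn as nn

# Hypothetical allow-list of layer types assumed to be quantizable.
# The real mapping lives inside AIMET and may differ.
SUPPORTED = (nn.Conv2d, nn.Linear, nn.ReLU)

def report_leaf_modules(model: nn.Module) -> dict:
    """Map each leaf module's name to whether its type is in the allow-list."""
    results = {}
    for name, module in model.named_modules():
        # Only consider leaf modules (no children); containers are skipped.
        if len(list(module.children())) == 0:
            results[name] = isinstance(module, SUPPORTED)
    return results

# Example: Flatten is not in the assumed allow-list, so it gets flagged.
model = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3),
    nn.ReLU(),
    nn.Flatten(),
    nn.Linear(8 * 30 * 30, 10),
)
for name, ok in report_leaf_modules(model).items():
    print(f"{name}: {'supported' if ok else 'NOT mapped (assumed)'}")
```

This at least lets me see which layer types trigger the warnings before export, but an official API for this check would be much better.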