Post-training quantization of AdapNet++

Hi,
I am trying to quantize AdapNet++ with AIMET. While creating the quantization simulator, I run into the following error:

 quant_scheme=QuantScheme.post_training_tf_enhanced)
  File "/home/mcw/anaconda3/envs/qual_tf/lib/python3.6/site-packages/aimet_tensorflow/quantsim.py", line 185, in __init__
    config_file)
  File "/home/mcw/anaconda3/envs/qual_tf/lib/python3.6/site-packages/aimet_tensorflow/quantsim.py", line 602, in _add_and_configure_quant_nodes
    config_file)
  File "/home/mcw/anaconda3/envs/qual_tf/lib/python3.6/site-packages/aimet_tensorflow/quantsim.py", line 249, in configure_quantization_ops
    activation_op_names)
  File "/home/mcw/anaconda3/envs/qual_tf/lib/python3.6/site-packages/aimet_tensorflow/utils/quantsim.py", line 101, in create_op_to_quant_ops_dict
    assert param_quantizer.type in ['QcQuantize', 'QcQuantizeRecurrentParam']
AssertionError
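
For reference, this is roughly how I create the simulator. The checkpoint path and the input/output op names below are placeholders for illustration, not the exact values I use:

    import tensorflow as tf
    from aimet_common.defs import QuantScheme
    from aimet_tensorflow.quantsim import QuantizationSimModel

    # Load the trained AdapNet++ graph into a TF1-style session.
    # 'model.ckpt' and the op names are placeholders.
    tf.compat.v1.reset_default_graph()
    sess = tf.compat.v1.Session()
    saver = tf.compat.v1.train.import_meta_graph('model.ckpt.meta')
    saver.restore(sess, 'model.ckpt')

    # This is the call that raises the AssertionError shown above.
    sim = QuantizationSimModel(session=sess,
                               starting_op_names=['data'],
                               output_op_names=['logits'],
                               quant_scheme=QuantScheme.post_training_tf_enhanced,
                               default_output_bw=8,
                               default_param_bw=8)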

I printed param_quantizer.name and param_quantizer.type and found that the assertion fails on this particular node:

name: resnet_v2_50/block2/unit_4/bottleneck_v1/conv2/split
type: Split 

Can you give me some pointers on how to write the wrapper function needed to enable AIMET for custom TensorFlow ops?

Hi @mcw_qc_aimet, sorry about the delayed response. Could you please share your model checkpoint? Also, what do you mean by custom TF ops?

The checkpoint is from here.
In the AdapNet-pp repository, the authors define a custom layer named ‘split_conv2d’; the issue described above comes from that layer.
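
For context, here is a rough, hypothetical sketch of what I think such a split convolution does (my guess, not the repository's actual code): the feature maps and the kernel are split with tf.split, each half is convolved with a different dilation rate, and the halves are concatenated again. The tf.split call is presumably the ‘Split’ op that AIMET encounters where it expects a QcQuantize node on the parameter input.

    import tensorflow as tf

    def split_conv2d_sketch(x, kernel, rate1=1, rate2=2):
        # Hypothetical grouped convolution with two dilation rates.
        # x:      [N, H, W, C]
        # kernel: [kh, kw, C // 2, O]  (each half of x gets half of the output filters)
        x1, x2 = tf.split(x, num_or_size_splits=2, axis=3)      # produces the 'Split' op
        k1, k2 = tf.split(kernel, num_or_size_splits=2, axis=3)
        y1 = tf.nn.atrous_conv2d(x1, k1, rate=rate1, padding='SAME')
        y2 = tf.nn.atrous_conv2d(x2, k2, rate=rate2, padding='SAME')
        return tf.concat([y1, y2], axis=3)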