As I'm still working on pruning some models, I printed the weights of a model after pruning it by 70% and got the following results. Can someone help me interpret them? What do the kernel, gamma, and beta mean? And why does count in normalization show 100% sparsity even though I only wanted to prune 70%?

normalization/mean:0: 0.00% sparsity (0/3)
normalization/variance:0: 0.00% sparsity (0/3)
normalization/count:0: 100.00% sparsity (1/1)
stem_conv/kernel:0: 70.02% sparsity (605/864)
stem_bn/gamma:0: 0.00% sparsity (0/32)
stem_bn/beta:0: 0.00% sparsity (0/32)

Thank you in advance

lgusm
June 6, 2022, 10:51am
#3
Can you share a bit more information about what you did?
Did you use the Model Optimization Toolkit?

Did you do the optimization during training or post-training?

Hello @lgusm

I actually followed this tutorial (Pruning preserving quantization aware training (PQAT)). In the example they used a pruning rate of 50%, and when they printed the weights they got the following results:

conv2d/kernel:0: 50.00% sparsity (54/108)
conv2d/bias:0: 0.00% sparsity (0/12)
dense/kernel:0: 50.00% sparsity (10140/20280)
dense/bias:0: 0.00% sparsity (0/10)

I only changed the pruning rate from 50% to 70%, so I applied pruning the way they did: pruning, then training.
I used this code if it helps.

```
def pruning(layer):
    # Leave layers that shouldn't be pruned untouched.
    if (isinstance(layer, tf.keras.layers.Rescaling) or
            isinstance(layer, tf.keras.layers.Normalization) or
            isinstance(layer, tf.keras.layers.Multiply)):
        return layer
    # Wrap everything else for magnitude-based pruning.
    return tfmot.sparsity.keras.prune_low_magnitude(layer, **pruning_params)
```

And got this :

normalization/mean:0: 0.00% sparsity (0/3)
normalization/variance:0: 0.00% sparsity (0/3)
normalization/count:0: 100.00% sparsity (1/1)
stem_conv/kernel:0: 70.02% sparsity (605/864)
stem_bn/gamma:0: 0.00% sparsity (0/32)
stem_bn/beta:0: 0.00% sparsity (0/32)
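For what it's worth, the sparsity percentages in printouts like the one above are just the fraction of exactly-zero entries in each weight tensor. A minimal NumPy sketch of that computation (the helper name and the synthetic kernel are mine, not from the tutorial):

```python
import numpy as np

def sparsity(weights):
    """Fraction of exactly-zero entries in a weight array."""
    w = np.asarray(weights)
    return np.count_nonzero(w == 0) / w.size

# Mirror the stem_conv/kernel line above: 605 zeros out of 864 entries.
kernel = np.ones(864, dtype=np.float32)
kernel[:605] = 0.0
print(f"stem_conv/kernel: {sparsity(kernel):.2%} sparsity (605/864)")
```

605/864 rounds to 70.02%, which is why the reported sparsity is slightly above the requested 70%.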

yyoon
June 13, 2022, 5:24pm
#7
@Rino_Lee Could you take a look at this inquiry? Thanks.

hi @Mohamed

The only prunable weight in your example is the weight matrix (kernel) of the Conv2D op, so it is pruned to 70% sparsity.

The batch norm and Rescaling layers are not targets of pruning by default, so they are not pruned at all.


hello @Rino_Lee

Thank you for your answer, but what do the gamma and beta actually mean, if you have any idea?

They are scale and offset; see the BatchNormalization layer documentation for more details.
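Concretely, batch normalization at inference computes y = gamma * (x - mean) / sqrt(variance + eps) + beta, so gamma is a learned per-channel scale on the normalized activations and beta is a learned offset. A small NumPy sketch (the function name and values are illustrative, not from the thread):

```python
import numpy as np

def batch_norm_inference(x, mean, variance, gamma, beta, eps=1e-3):
    # Normalize using the moving statistics collected during training,
    # then apply the learned scale (gamma) and offset (beta).
    x_hat = (x - mean) / np.sqrt(variance + eps)
    return gamma * x_hat + beta

x = np.array([0.0, 1.0, 2.0])
# gamma=1, beta=0 would give just the normalized input;
# other values rescale and shift it.
y = batch_norm_inference(x, mean=1.0, variance=1.0, gamma=2.0, beta=0.5)
```

That is also why gamma and beta show 0% sparsity in the printout: pruning targets kernels, not these small per-channel parameters.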
