Reputation: 53
I need to compress a neural network's layer weights using tensor decomposition with the TensorLy library in Python. (Yes, I know it is far more efficient to train the network with decomposed layers and the rank as a parameter, as in Tednet, but that is not allowed here.) So I try to decompose the weights of Linear and Conv2d layers. I take the weights and decompose them with Tensor Train:
tl.decomposition.matrix_product_state(weights_tensor, rank = ranks)
then restore the tensor from the decomposition, replace the weights in the model, and check the model's accuracy. For a small network this works fine: the quality degradation is acceptable, and the difference between the restored tensor and the original is actually small. For larger models and larger matrices it gets worse, and one thing bothers me: I just pass arbitrary ranks to tl.decomposition.matrix_product_state.
The question is: is there an existing method to estimate the ranks of the future decomposition?
I chose Tensor Train based on advice from lectures on the topic; maybe you have different experience?
And generally, is there a more efficient way, or other libraries, to decompose Linear weight matrices and Conv2d weight tensors?
Thanks a lot.
Upvotes: 0
Views: 57