John

Reputation: 3070

Is the size of a caffemodel proportional to the number of parameters?

The *.caffemodel file is the output of a network after the training phase. Is its size proportional to the number of parameters? That is, if network A produces a *.caffemodel of 10 MB on disk while network B produces a *.caffemodel of 20 MB, is it correct to say that network A has fewer learnable parameters than network B?

Upvotes: 1

Views: 347

Answers (1)

Prune

Reputation: 77837

Not quite. The size mostly depends on the memory needed to store the parameters (weights). That includes not only how many there are, but also the data type used to store each one. If one model is set up to work with short fixed-point arithmetic while another uses ordinary floats, there will be a large discrepancy in storage needs even for the same parameter count.
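A rough back-of-the-envelope sketch (the parameter count here is hypothetical, and real *.caffemodel files carry some protobuf overhead on top of the raw weights):

```python
# Same number of parameters, different storage precision.
num_params = 1_000_000  # hypothetical parameter count

bytes_float32 = num_params * 4  # ordinary 32-bit floats
bytes_int16 = num_params * 2    # 16-bit fixed-point values

# Halving the per-parameter width halves the raw storage,
# independent of the architecture.
print(bytes_float32, "bytes vs", bytes_int16, "bytes")
```

So two networks with identical parameter counts can still serialize to files of very different sizes.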

Also, layer connectivity has a lot to do with the file size. A fully-connected layer has far more weights than a convolution layer connecting inputs and outputs of the same dimensions, because a convolution shares its kernel weights across spatial positions.
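To make the connectivity point concrete, here is a comparison with hypothetical layer shapes: a fully-connected layer mapping a flattened 32×32×64 input to 256 outputs, versus a 3×3 convolution mapping 64 channels to 256 channels (biases omitted for simplicity):

```python
# Fully-connected: one weight per (input, output) pair.
fc_weights = (32 * 32 * 64) * 256

# Convolution: one 3x3x64 kernel per output channel,
# reused at every spatial position.
conv_weights = 3 * 3 * 64 * 256

print(fc_weights, "vs", conv_weights)  # the FC layer needs ~100x more weights
```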

Does that help sort things out a little?

Upvotes: 2
