mehrdadjavadi

Reputation: 1

What do the numbers in parentheses of a multi-layer perceptron, like (64, 128, 1024), mean?

Hello everyone, I'm reading the PointNet paper but I can't understand the numbers in the network architecture. Can you explain this to me?

[image: PointNet architecture figure from the paper]

Upvotes: -1

Views: 29

Answers (1)

Sarvesh Thakur

Reputation: 157

It means three fully connected layers, with 64, 128, and 1024 neurons respectively. Between each layer there is batch normalization followed by a ReLU activation (as implemented in the paper).
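Here is a minimal sketch of what "mlp (64, 128, 1024)" could look like in PyTorch. Only the layer sizes and the BatchNorm + ReLU pattern come from the paper; the module name, the default input channels, and the use of Conv1d with kernel size 1 as a point-wise "shared" MLP are illustrative assumptions.

```python
import torch
import torch.nn as nn

class SharedMLP(nn.Module):
    # Assumed helper name; applies the same MLP to every point independently.
    def __init__(self, in_channels=3, sizes=(64, 128, 1024)):
        super().__init__()
        layers = []
        prev = in_channels
        for out_channels in sizes:
            # kernel_size=1 means the same fully connected weights are
            # applied to each point, followed by BatchNorm and ReLU.
            layers += [
                nn.Conv1d(prev, out_channels, kernel_size=1),
                nn.BatchNorm1d(out_channels),
                nn.ReLU(),
            ]
            prev = out_channels
        self.net = nn.Sequential(*layers)

    def forward(self, x):           # x: (batch, in_channels, num_points)
        return self.net(x)          # -> (batch, 1024, num_points)

# Example: a batch of 32 point clouds, 1024 points each, xyz coordinates
points = torch.rand(32, 3, 1024)
features = SharedMLP()(points)      # shape: (32, 1024, 1024)
```

In other words, each point is lifted from 3 input dimensions to 64, then 128, then 1024 features before the max-pooling step aggregates them.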

Upvotes: 0
