Reputation: 99
MXNet and TensorFlow both claim to have an auto-differentiation feature.
In MXNet, I need to define the backward pass when creating a new op (such as a loss function), but not in TensorFlow.
As I understand it, auto-differentiation means I don't need to care about the backward pass. So, does MXNet really have auto-differentiation?
Upvotes: 0
Views: 267
Reputation: 1478
Yes, MXNet has auto-differentiation through its autograd package; you only need to write the forward computation, and gradients are computed for you.
Here is a tutorial: http://gluon.mxnet.io/chapter01_crashcourse/autograd.html
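As a minimal sketch (assuming the mxnet.autograd API described in that tutorial), you record the forward computation and call backward(), with no manual backward definition:

    import mxnet as mx
    from mxnet import nd, autograd

    x = nd.array([[1.0, 2.0], [3.0, 4.0]])
    x.attach_grad()               # allocate space for the gradient of x

    with autograd.record():       # record the forward computation graph
        y = 2 * x * x             # only the forward pass is written by hand

    y.backward()                  # gradients are derived automatically
    print(x.grad)                 # dy/dx = 4 * x

The need to write a backward method only arises when you implement a brand-new custom operator whose gradient MXNet cannot derive from existing ops; ordinary models and loss functions built from existing operators are differentiated automatically.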
Upvotes: 2