contrib.autograd
Autograd for NDArray.
Classes

TrainingStateScope – Scope for managing training state.

Functions

backward – Compute the gradients of outputs w.r.t. variables.
compute_gradient – Deprecated.
grad – Return a function that computes the gradient of the arguments.
grad_and_loss – Return a function that computes both the gradient of the arguments and the loss value.
mark_variables – Mark NDArrays as variables whose gradients autograd should compute.
set_is_training – Set status to training/not training.
test_section – Returns a testing scope context to be used in a 'with' statement; captures testing code.
train_section – Returns a training scope context to be used in a 'with' statement; captures training code.
class mxnet.contrib.autograd.TrainingStateScope(enter_state)
Bases: object

Scope for managing training state.

Example:

    with TrainingStateScope(True):
        y = model(x)
        compute_gradient([y])
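In practice this scope is rarely constructed directly; the train_section() and test_section() helpers listed above provide it preconfigured for training and testing. A minimal sketch of the equivalent idiom (model and x are placeholders, not part of this module):

    from mxnet.contrib import autograd

    # Equivalent to `with TrainingStateScope(True):`
    with autograd.train_section():
        y = model(x)                    # `model` and `x` are placeholders
        autograd.compute_gradient([y])  # deprecated; backward([y]) is the replacement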
mxnet.contrib.autograd.backward(outputs, out_grads=None, retain_graph=False)
Compute the gradients of outputs w.r.t. variables.

Parameters
outputs (list of NDArray) – the outputs whose gradients are computed.
out_grads (list of NDArray or None) – head gradients for the outputs; when None, gradients of all ones are assumed.
retain_graph (bool) – whether to retain the graph so backward can be run on it again.
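A hedged end-to-end sketch of how backward() combines with mark_variables() and train_section() (the values are illustrative; the helper names come from this module):

    import mxnet.ndarray as nd
    from mxnet.contrib import autograd

    x = nd.array([[1, 2, 3], [4, 5, 6]])
    dx = nd.zeros_like(x)               # buffer that receives the gradient of x
    autograd.mark_variables([x], [dx])

    with autograd.train_section():      # record the graph for gradient computation
        y = x * 2

    autograd.backward([y])              # d(y)/d(x) is written into dx
    print(dx.asnumpy())                 # every entry equals 2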
mxnet.contrib.autograd.grad(func, argnum=None)
Return a function that computes the gradient of the arguments.

Parameters
func (a python function) – The forward (loss) function.
argnum (an int or a list of int) – The index (or indices) of the arguments to calculate gradients for.

Returns
grad_func – A function that computes the gradient of the arguments.

Return type
a python function
Examples
>>> # autograd supports dynamic graph which is changed
>>> # every instance
>>> def func(x):
>>>     r = random.randint(0, 1)
>>>     if r % 2:
>>>         return x**2
>>>     else:
>>>         return x/3
>>> # use `grad(func)` to get the gradient function
>>> for x in range(10):
>>>     grad_func = grad(func)
>>>     inputs = nd.array([[1, 2, 3], [4, 5, 6]])
>>>     grad_vals = grad_func(inputs)
mxnet.contrib.autograd.grad_and_loss(func, argnum=None)
Return a function that computes both the gradient of the arguments and the loss value.

Parameters
func (a python function) – The forward (loss) function.
argnum (an int or a list of int) – The index (or indices) of the arguments to calculate gradients for.

Returns
grad_and_loss_func – A function that computes both the gradient of the arguments and the loss value.

Return type
a python function
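grad_and_loss() has no example of its own; a minimal sketch of the expected call pattern, assuming a simple squared forward function (illustrative, not part of this module):

    import mxnet.ndarray as nd
    from mxnet.contrib import autograd

    def loss_func(x):
        return x ** 2                   # illustrative forward (loss) function

    grad_and_loss_func = autograd.grad_and_loss(loss_func)
    inputs = nd.array([1.0, 2.0, 3.0])
    grad_vals, loss = grad_and_loss_func(inputs)
    # grad_vals: a list with one NDArray per argument (here d(loss)/d(inputs) = 2*inputs)
    # loss: the value returned by loss_func(inputs)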
mxnet.contrib.autograd.mark_variables(variables, gradients, grad_reqs='write')
Mark NDArrays as variables whose gradients autograd should compute.

Parameters
variables (list of NDArray) – the NDArrays to compute gradients for.
gradients (list of NDArray) – buffers that receive the computed gradients, one per variable.
grad_reqs (list of string) – how each gradient buffer is updated: 'write' overwrites it, 'add' accumulates into it, 'null' skips it.
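A short sketch of pairing a variable with its gradient buffer (names are illustrative):

    import mxnet.ndarray as nd
    from mxnet.contrib import autograd

    x = nd.ones((2, 3))
    dx = nd.zeros_like(x)   # 'write' overwrites dx on each backward pass; 'add' accumulates
    autograd.mark_variables([x], [dx], grad_reqs='write')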
mxnet.contrib.autograd.set_is_training(is_train)
Set status to training/not training. When training, a graph will be constructed for gradient computation. Operators will also run with ctx.is_train=True. For example, Dropout drops inputs randomly when is_train=True and simply passes them through when is_train=False.

Parameters
is_train (bool) –

Returns
The previous state before this set.

Return type
bool
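Since the previous state is returned, callers can restore it afterwards, which is presumably how TrainingStateScope manages the flag; a minimal sketch:

    from mxnet.contrib import autograd

    prev = autograd.set_is_training(True)   # enable training mode, remember old state
    # ... forward/backward passes; e.g. Dropout now drops inputs randomly ...
    autograd.set_is_training(prev)          # restore the previous state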