org.apache.clojure-mxnet.module

arg-params

(arg-params mod)

aux-params

(aux-params mod)

backward

(backward mod out-grads)(backward mod)
Backward computation.
out-grads -  Gradient on the outputs to be propagated back.
             This parameter is only needed when bind is called
             on outputs that are not a loss function.

bind

(bind mod {:keys [data-shapes label-shapes for-training inputs-need-grad force-rebind shared-module grad-req], :as opts, :or {for-training true, inputs-need-grad false, force-rebind false, grad-req "write"}})
Bind the symbols to construct executors. This is necessary before one
can perform computation with the module.
mod : module
map of opts:
  :data-shapes Typically `(provide-data-desc data-iter)`. Data shapes must be in the
               form of io/data-desc, which is a map of :name, :shape, :dtype and :layout
  :label-shapes Typically `(provide-label-desc data-iter)`. A map of :name, :shape,
                :dtype and :layout
  :for-training Default is `true`. Whether the executors should be bound for training.
  :inputs-need-grad Default is `false`.
                    Whether the gradients to the input data need to be computed.
                    Typically this is not needed.
                    But this might be needed when implementing composition of modules.
  :force-rebind Default is `false`.
                This function does nothing if the executors are already bound.
                But with this `true`, the executors will be forced to rebind.
  :shared-module Default is nil. This is used in bucketing.
                 When not nil, the shared module essentially corresponds to
                 a different bucket -- a module with a different symbol
                 but with the same sets of parameters
                 (e.g. unrolled RNNs with different lengths).
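
A sketch of a typical call, assuming a symbol `net` and a data iterator `train-iter` already exist (hypothetical names) and that `provide-data-desc`/`provide-label-desc` are resolved from the io namespace:

```clojure
;; Sketch: bind a module for training. `net` and `train-iter` are assumed
;; to be defined elsewhere; the aliases follow common clojure-mxnet conventions.
(require '[org.apache.clojure-mxnet.module :as m]
         '[org.apache.clojure-mxnet.io :as mx-io])

(def mod
  (-> (m/module net)
      (m/bind {:data-shapes (mx-io/provide-data-desc train-iter)
               :label-shapes (mx-io/provide-label-desc train-iter)
               :for-training true})))
```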

borrow-optimizer

(borrow-optimizer mod shared-module)
Borrow optimizer from a shared module. Used in bucketing, where exactly the same
optimizer (esp. kvstore) is used.
- mod module
- shared-module the module to borrow the optimizer from

data-names

(data-names mod)

data-shapes

(data-shapes mod)

exec-group

(exec-group mod)

fit

(fit mod {:keys [train-data eval-data num-epoch fit-params], :as opts, :or {num-epoch 1, fit-params (new FitParams)}})
Train the module parameters.
- mod module
- train-data (data-iterator)
- eval-data (data-iterator) If not nil, will be used as a validation set to evaluate
                the performance after each epoch.
- num-epoch Number of epochs to run training.
- fit-params Extra parameters for training (See fit-params).
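
A minimal sketch of a fit call, assuming a module `mod` and data iterators `train-iter`/`test-iter` exist:

```clojure
;; Sketch: train for 5 epochs, scoring on a validation iterator after each.
(require '[org.apache.clojure-mxnet.module :as m])

(m/fit mod {:train-data train-iter
            :eval-data test-iter
            :num-epoch 5})
```

`fit` typically takes care of binding and of parameter and optimizer initialization internally, so it can be called on a freshly constructed module.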

fit-params

(fit-params {:keys [eval-metric kvstore optimizer initializer arg-params aux-params allow-missing force-rebind force-init begin-epoch validation-metric monitor batch-end-callback], :as opts, :or {eval-metric (eval-metric/accuracy), kvstore "local", optimizer (optimizer/sgd), initializer (initializer/uniform 0.01), allow-missing false, force-rebind false, force-init false, begin-epoch 0}})(fit-params)
Fit Params

forward

(forward mod data-batch is-train)(forward mod data-batch-map)
Forward computation.
data-batch - input data in the form of io/data-batch, either a map or a DataBatch
is-train - Default is nil, which means `is-train` takes the value of `:for-training` given at bind time.

forward-backward

(forward-backward mod data-batch)
A convenient function that calls both `forward` and `backward`.

get-params

(get-params mod)

grad-arrays

(grad-arrays mod)

init-optimizer

(init-optimizer mod {:keys [kvstore optimizer reset-optimizer force-init], :as opts, :or {kvstore "local", optimizer (optimizer/sgd), reset-optimizer true, force-init false}})(init-optimizer mod)
 Install and initialize optimizers.
- mod Module
- option map of
  :kvstore Default "local".
  :optimizer Default (optimizer/sgd).
  :reset-optimizer Default `true`, indicating whether we should set
    `rescaleGrad` & `idx2name` for the optimizer according to the executorGroup.
  :force-init Default `false`, indicating whether we should force
    re-initializing the optimizer in the case an optimizer is already installed.

init-params

(init-params mod {:keys [initializer arg-params aux-params allow-missing force-init allow-extra], :as opts, :or {initializer (initializer/uniform 0.01), allow-missing false, force-init false, allow-extra false}})(init-params mod)
 Initialize the parameters and auxiliary states.
options map
  :initializer - Called to initialize parameters if needed.
  :arg-params -  If not nil, should be a map of existing arg-params.
                  Initialization will be copied from that.
  :aux-params - If not nil, should be a map of existing aux-params.
                 Initialization will be copied from that.
  :allow-missing - If true, params could contain missing values,
                    and the initializer will be called to fill those missing params.
  :force-init -  If true, will force re-initialize even if already initialized.
  :allow-extra -  Whether to allow extra parameters that are not needed by the symbol.
          If true, no error will be thrown when arg-params or aux-params
          contain extra parameters that are not needed by the executor.
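
For example (a sketch; `mod` is an already bound module, and the xavier initializer is assumed to be available in the initializer namespace):

```clojure
(require '[org.apache.clojure-mxnet.module :as m]
         '[org.apache.clojure-mxnet.initializer :as initializer])

;; Fill the module's parameters using Xavier initialization instead of
;; the default (initializer/uniform 0.01).
(m/init-params mod {:initializer (initializer/xavier)})
```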

input-grads

(input-grads mod)
  Get the gradients to the inputs, computed in the previous backward computation.
In the case when data-parallelism is used, the gradients will be collected from
multiple devices. The results will look like
`[[grad1_dev1, grad1_dev2], [grad2_dev1, grad2_dev2]]`;
those `NDArray`s might live on different devices.

input-grads-merged

(input-grads-merged mod)
 Get the gradients to the inputs, computed in the previous backward computation.
In the case when data-parallelism is used, the gradients will be merged from
multiple devices, as if they came from a single executor.
The results will look like `[grad1, grad2]`.

install-monitor

(install-monitor mod monitor)
Install monitor on all executors

label-shapes

(label-shapes mod)

load-checkpoint

(load-checkpoint {:keys [prefix epoch load-optimizer-states data-names label-names contexts workload-list fixed-param-names], :as opts, :or {load-optimizer-states false, data-names ["data"], label-names ["softmax_label"], contexts [(context/cpu)], workload-list nil, fixed-param-names nil}})(load-checkpoint prefix epoch)
Create a model from previously saved checkpoint.
- opts map of
  -  prefix Path prefix of saved model files. You should have prefix-symbol.json,
              prefix-xxxx.params, and optionally prefix-xxxx.states,
              where xxxx is the epoch number.
  -  epoch Epoch to load.
  - load-optimizer-states Whether to load optimizer states.
                       The checkpoint needs to have been saved with :save-opt-states true
  - data-names Input data names.
  - label-names Input label names.
  - contexts Default is (context/cpu).
  -  workload-list  Default nil, indicating uniform workload.
  - fixed-param-names Default nil, indicating no network parameters are fixed.
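
For example, to restore a model checkpointed with prefix "my-model" at epoch 2 (illustrative names; the files my-model-symbol.json and my-model-0002.params must exist):

```clojure
(require '[org.apache.clojure-mxnet.module :as m])

;; Sketch: rebuild the module from the saved symbol and params files.
(def restored-mod (m/load-checkpoint {:prefix "my-model" :epoch 2}))
```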

load-optimizer-states

(load-optimizer-states mod fname)
Load optimizer (updater) state from file
- mod module
- fname Path to input states file.

module

(module sym {:keys [data-names label-names contexts workload-list fixed-param-names], :as opts, :or {data-names ["data"], label-names ["softmax_label"], contexts [(context/default-context)]}})(module sym data-names label-names contexts)(module sym)
Module is a basic module that wraps a symbol.
sym : Symbol definition.
map of options
    :data-names - Input data names.
    :label-names - Input label names.
    :contexts - Default is (context/cpu).
    :workload-list - Default nil, indicating uniform workload.
    :fixed-param-names - Default nil, indicating no network parameters are fixed.
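
A sketch of wrapping a small network in a module. The layer names and sizes are illustrative, and the symbol-building calls follow common clojure-mxnet usage:

```clojure
(require '[org.apache.clojure-mxnet.module :as m]
         '[org.apache.clojure-mxnet.symbol :as sym]
         '[org.apache.clojure-mxnet.context :as context])

;; A tiny MLP: data -> fc(128) -> relu -> fc(10) -> softmax
(def net
  (as-> (sym/variable "data") data
    (sym/fully-connected "fc1" {:data data :num-hidden 128})
    (sym/activation "relu1" {:data data :act-type "relu"})
    (sym/fully-connected "fc2" {:data data :num-hidden 10})
    (sym/softmax-output "softmax" {:data data})))

(def mod (m/module net {:contexts [(context/cpu)]}))
```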

output-names

(output-names mod)

output-shapes

(output-shapes mod)

outputs

(outputs mod)
 Get outputs of the previous forward computation.
In the case when data-parallelism is used, the outputs will be collected from
multiple devices. The results will look like
`[[out1_dev1, out1_dev2], [out2_dev1, out2_dev2]]`;
those `NDArray`s might live on different devices.

outputs-merged

(outputs-merged mod)
 Get outputs of the previous forward computation.
In the case when data-parallelism is used, the outputs will be merged from
multiple devices, as if they came from a single executor.
The results will look like `[out1, out2]`.

params

(params mod)

predict

(predict mod {:keys [eval-data num-batch reset], :as opts, :or {num-batch -1, reset true}})
Run prediction and collect the outputs.
- mod module
- option map with
  - :eval-data
  - :num-batch Default is -1, indicating running all the batches in the data iterator.
  - :reset Default is `true`, indicating whether we should reset the data iter before
            starting prediction.
 The return value will be a vector of NDArrays `[out1, out2, out3]`,
        where each element is the concatenation of the outputs for all the mini-batches.
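
For instance (a sketch; `mod` is a trained module and `test-iter` a data iterator, both assumed to exist):

```clojure
;; Run every batch of test-iter through the network; returns a vector
;; with one NDArray per network output, mini-batches concatenated.
(def preds (m/predict mod {:eval-data test-iter}))
```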

predict-batch

(predict-batch mod data-batch)
Run prediction on a data batch
- mod module
- data-batch data-batch

predict-every-batch

(predict-every-batch mod {:keys [eval-data num-batch reset], :as opts, :or {num-batch -1, reset true}})
 Run prediction and collect the outputs.
- module
- option map with
  :eval-data
  :num-batch Default is -1, indicating running all the batches in the data iterator.
  :reset Default is `true`, indicating whether we should reset the data iter before
            starting prediction.
 The return value will be a nested list like
`[[out1_batch1, out2_batch1, ...], [out1_batch2, out2_batch2, ...]]`.
This mode is useful because in some cases (e.g. bucketing)
 the module does not necessarily produce the same number of outputs.

reshape

(reshape mod data-shapes label-shapes)(reshape mod data-shapes)
 Reshapes the module for new input shapes.
- mod module
- data-shapes Typically is `(provide-data data-iter)`
- label-shapes Typically is `(provide-label data-iter)`.

save-checkpoint

(save-checkpoint mod {:keys [prefix epoch save-opt-states], :as opts, :or {save-opt-states false}})(save-checkpoint mod prefix epoch)
 Save current progress to checkpoint.
Use mx.callback.module_checkpoint as epoch_end_callback to save during training.
- mod Module
-  opt-map with
   :prefix The file prefix to checkpoint to
   :epoch The current epoch number
   :save-opt-states Whether to save optimizer states to allow continuing training
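
For example (a sketch, with an illustrative prefix):

```clojure
;; Writes my-model-symbol.json, my-model-0002.params and, because
;; :save-opt-states is true, my-model-0002.states.
(m/save-checkpoint mod {:prefix "my-model" :epoch 2 :save-opt-states true})
```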

save-optimizer-states

(save-optimizer-states mod fname)
Save optimizer (updater) state to file
- mod module
- fname Path to output states file.

score

(score mod {:keys [eval-data eval-metric num-batch reset epoch], :as opts, :or {num-batch Integer/MAX_VALUE, reset true, epoch 0}})
 Run prediction on `eval-data` and evaluate the performance according to `eval-metric`.
- mod module
- option map with
  :eval-data : DataIter
  :eval-metric : EvalMetric
  :num-batch Number of batches to run. Default is `Integer.MAX_VALUE`,
                indicating run until the `DataIter` finishes.
  :batch-end-callback - not supported yet
  :reset Default `true`,
              indicating whether we should reset `eval-data` before starting evaluation.
  :epoch Default 0. For compatibility, this will be passed to callbacks (if any).
             During training, this will correspond to the training epoch number.
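
A sketch of scoring accuracy on held-out data (`mod` and `test-iter` are assumed to exist):

```clojure
(require '[org.apache.clojure-mxnet.module :as m]
         '[org.apache.clojure-mxnet.eval-metric :as eval-metric])

(m/score mod {:eval-data test-iter
              :eval-metric (eval-metric/accuracy)})
```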

set-params

(set-params mod {:keys [arg-params aux-params allow-missing force-init allow-extra], :as opts, :or {allow-missing false, force-init true, allow-extra false}})
 Assign parameter and aux state values.
 - mod module
 - arg-params : map
         map of name to value (`NDArray`) mapping.
 - aux-params : map
        map of name to value (`NDArray`) mapping.
 - allow-missing : bool
         If true, params could contain missing values, and the initializer will be
         called to fill those missing params.
 - force-init : bool
         If true, will force re-initialize even if already initialized.
 - allow-extra : bool
         Whether to allow extra parameters that are not needed by the symbol.
         If true, no error will be thrown when arg-params or aux-params
         contain extra parameters that are not needed by the executor.

symbol

(symbol mod)

update

(update mod)
Update parameters according to the installed optimizer and the gradients computed
in the previous forward-backward batch.
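
Together with `forward`, `backward` and `update-metric`, this composes into the manual training loop that `fit` automates. A sketch, assuming `mod` is bound with parameters and optimizer initialized, `train-iter` is a data iterator, and `do-batches`/`batch-label` are taken from the io namespace:

```clojure
(require '[org.apache.clojure-mxnet.module :as m]
         '[org.apache.clojure-mxnet.io :as mx-io]
         '[org.apache.clojure-mxnet.eval-metric :as eval-metric])

;; One epoch of manual training: forward, backward, update per batch,
;; accumulating accuracy along the way.
(let [metric (eval-metric/accuracy)]
  (mx-io/do-batches train-iter
    (fn [batch]
      (-> mod
          (m/forward batch)
          (m/backward)
          (m/update)
          (m/update-metric metric (mx-io/batch-label batch))))))
```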

update-metric

(update-metric mod eval-metric labels)
Evaluate and accumulate evaluation metric on outputs of the last forward computation.
- mod module
- eval-metric the evaluation metric to update
- labels the labels of the most recent batch