Class

ml.dmlc.mxnet.optimizer

AdaDelta

class AdaDelta extends Optimizer

AdaDelta optimizer, as described in Matthew D. Zeiler (2012), "ADADELTA: An Adaptive Learning Rate Method". http://arxiv.org/abs/1212.5701
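The update rule can be illustrated with a minimal, self-contained sketch (`AdaDeltaSketch` is a hypothetical class, not part of the MXNet API; the real optimizer applies the same math elementwise to NDArrays and additionally handles gradient rescaling, clipping, and weight decay):

```scala
// Hedged sketch of the AdaDelta rule from Zeiler (2012) on a single Float
// weight. accG and accDelta play the role of the two state NDArrays
// returned by createState.
class AdaDeltaSketch(rho: Float = 0.05f, epsilon: Float = 1e-8f) {
  private var accG: Float = 0f      // running average of squared gradients
  private var accDelta: Float = 0f  // running average of squared updates

  def step(weight: Float, grad: Float): Float = {
    accG = rho * accG + (1f - rho) * grad * grad
    val delta = -(math.sqrt(accDelta + epsilon) / math.sqrt(accG + epsilon)).toFloat * grad
    accDelta = rho * accDelta + (1f - rho) * delta * delta
    weight + delta
  }
}
```

Because the step size adapts from the accumulated statistics, no global learning rate needs to be tuned by hand.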

Linear Supertypes
Optimizer, Serializable, AnyRef, Any

Instance Constructors

  1. new AdaDelta(rho: Float = 0.05f, rescaleGradient: Float = 1.0f, epsilon: Float = 1e-8f, wd: Float = 0.0f, clipGradient: Float = 0f)

    rho

    Decay rate for both the squared gradients and the squared parameter updates (delta x).

    rescaleGradient

    Rescaling factor applied to each gradient.

    epsilon

    A small constant for numerical stability, as described in the paper.

    wd

    L2 regularization coefficient added to all the weights.

    clipGradient

    Clip gradients into the range [-clipGradient, clipGradient]; ignored when set to 0.
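How the constructor arguments shape each gradient before the AdaDelta step can be sketched as follows (a hedged illustration, not MXNet's internal code; `scaledGrad` is a hypothetical helper):

```scala
// Hypothetical helper showing the effect of rescaleGradient, clipGradient
// and wd on a single Float gradient before the AdaDelta update is applied.
def scaledGrad(grad: Float, weight: Float,
               rescaleGradient: Float = 1.0f,
               clipGradient: Float = 0f,  // 0 means no clipping (assumption)
               wd: Float = 0.0f): Float = {
  var g = grad * rescaleGradient
  if (clipGradient != 0f) g = math.max(-clipGradient, math.min(clipGradient, g))
  g + wd * weight  // L2 regularization adds wd * weight to the gradient
}
```

For example, `scaledGrad(10f, 0f, 1f, 2f, 0f)` clips the gradient to `2f`.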

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  5. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  6. def createState(index: Int, weight: NDArray): (NDArray, NDArray)

    Definition Classes
    AdaDelta → Optimizer
  7. def deserializeState(bytes: Array[Byte]): AnyRef

    Definition Classes
    AdaDelta → Optimizer
  8. def disposeState(state: AnyRef): Unit

    Definition Classes
    AdaDelta → Optimizer
  9. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  10. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  11. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  12. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  13. def getLr(index: Int, lr: Float): Float

    Attributes
    protected
    Definition Classes
    Optimizer
  14. def getWd(index: Int, wd: Float): Float

    Attributes
    protected
    Definition Classes
    Optimizer
  15. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  16. var idx2name: Map[Int, String]

    Attributes
    protected
    Definition Classes
    Optimizer
  17. val indexUpdateCount: Map[Int, Int]

    Attributes
    protected
    Definition Classes
    Optimizer
  18. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  19. val lrMult: Map[Either[Int, String], Float]

    Attributes
    protected
    Definition Classes
    Optimizer
  20. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  21. final def notify(): Unit

    Definition Classes
    AnyRef
  22. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  23. var numUpdate: Int

    Attributes
    protected
    Definition Classes
    Optimizer
  24. var rescaleGrad: Float

    Attributes
    protected
    Definition Classes
    Optimizer
  25. def serializeState(state: AnyRef): Array[Byte]

    Definition Classes
    AdaDelta → Optimizer
  26. def setArgNames(argNames: Seq[String]): Unit

    Definition Classes
    Optimizer
  27. def setIdx2Name(paramIdx2Name: Map[Int, String]): Unit

    Definition Classes
    Optimizer
  28. def setLrMult(argsLrMult: Map[Either[Int, String], Float]): Unit

    Sets an individual learning rate multiplier for each parameter. If a multiplier is specified for a parameter, the learning rate of that parameter is set to the product of the global learning rate and its multiplier.

    Note: the default learning rate multiplier of a Variable can be set with the lr_mult argument in its constructor.

    Definition Classes
    Optimizer
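    The effect of setLrMult can be illustrated with a small self-contained sketch (the keys and values below are hypothetical):

```scala
// Parameters may be keyed by index (Left) or by name (Right); parameters
// without an entry keep the implicit multiplier of 1.
val lrMult: Map[Either[Int, String], Float] =
  Map(Left(0) -> 0.1f, Right("fc1_bias") -> 2.0f)

// Effective learning rate = global learning rate * per-parameter multiplier.
def effectiveLr(globalLr: Float, key: Either[Int, String]): Float =
  globalLr * lrMult.getOrElse(key, 1.0f)
```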
  29. def setRescaleGrad(rescaleGrad: Float): Unit

    Definition Classes
    Optimizer
  30. def setSymbol(sym: Symbol): Unit

    Definition Classes
    Optimizer
  31. def setWdMult(argsWdMult: Map[Either[Int, String], Float]): Unit

    Sets an individual weight decay multiplier for each parameter.

    If setIdx2Name has been called, the weight decay multiplier defaults to 0 for all parameters whose names don't end with _weight or _gamma.

    Note: the default weight decay multiplier of a Variable can be set with the wd_mult argument in its constructor.

    Definition Classes
    Optimizer
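    The default rule for setWdMult can be sketched as follows (a hedged illustration; `defaultWdMult` is a hypothetical helper, not an MXNet method):

```scala
// Once idx2name is known, parameters whose names don't end with _weight
// or _gamma default to a weight decay multiplier of 0.
def defaultWdMult(name: String): Float =
  if (name.endsWith("_weight") || name.endsWith("_gamma")) 1.0f else 0.0f
```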
  32. var specialized: Boolean

    Attributes
    protected
    Definition Classes
    Optimizer
  33. var symbol: Symbol

    Attributes
    protected
    Definition Classes
    Optimizer
  34. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  35. def toString(): String

    Definition Classes
    AnyRef → Any
  36. def update(index: Int, weight: NDArray, grad: NDArray, state: AnyRef): Unit

    Updates the parameters.

    index

    A unique integer key used to index the parameters

    weight

    The weight NDArray

    grad

    The gradient NDArray

    state

    The auxiliary state used in optimization (NDArray or other objects returned by createState)

    Definition Classes
    AdaDelta → Optimizer
  37. def updateCount(index: Int): Unit

    Updates the update count (numUpdate).

    index

    The index to be updated

    Attributes
    protected
    Definition Classes
    Optimizer
  38. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  39. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  40. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  41. val wdMult: Map[Either[Int, String], Float]

    Attributes
    protected
    Definition Classes
    Optimizer
  42. val weightSet: Set[Int]

    Attributes
    protected
    Definition Classes
    Optimizer

Deprecated Value Members

  1. def setLrScale(lrScale: Map[Int, Float]): Unit

    Definition Classes
    Optimizer
    Annotations
    @deprecated
    Deprecated

    Use setLrMult instead.
