org.apache.mxnet.optimizer

AdaDelta

class AdaDelta extends Optimizer

AdaDelta optimizer, as described in Matthew D. Zeiler, ADADELTA: An Adaptive Learning Rate Method, 2012. http://arxiv.org/abs/1212.5701

Linear Supertypes
Optimizer, Serializable, AnyRef, Any

Instance Constructors

  1. new AdaDelta(rho: Float = 0.05f, rescaleGradient: Float = 1.0f, epsilon: Float = 1e-8f, wd: Float = 0.0f, clipGradient: Float = 0f)

    rho

    Decay rate for both squared gradients and delta x.

    rescaleGradient

Rescaling factor applied to the gradient.

    epsilon

The small constant for numerical stability, as described in the paper.

    wd

L2 regularization coefficient applied to all weights.

    clipGradient

Clip the gradient to the range [-clipGradient, clipGradient].
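
A minimal usage sketch (hyperparameter values are illustrative, and the NDArray construction assumes the org.apache.mxnet.NDArray factory methods shown): construct the optimizer, create per-parameter state with createState, and apply one step with update, which modifies the weight in place.

  import org.apache.mxnet.{NDArray, Shape}
  import org.apache.mxnet.optimizer.AdaDelta

  object AdaDeltaSketch {
    def main(args: Array[String]): Unit = {
      // Hyperparameter values are illustrative, not recommendations.
      val opt = new AdaDelta(rho = 0.05f, rescaleGradient = 1.0f,
                             epsilon = 1e-8f, wd = 0.0f, clipGradient = 0f)

      // One parameter (index 0) and a gradient of the same shape.
      val weight = NDArray.ones(Shape(2, 2))
      val grad = NDArray.ones(Shape(2, 2)) * 0.1f

      // createState returns the accumulated squared gradient and the
      // accumulated squared update for this parameter.
      val state = opt.createState(0, weight)

      // Apply one AdaDelta step; `weight` is updated in place.
      opt.update(0, weight, grad, state)

      // Optimizer state can be round-tripped for checkpointing.
      val bytes = opt.serializeState(state)
      val restored = opt.deserializeState(bytes)

      // Release the auxiliary NDArrays once they are no longer needed.
      opt.disposeState(restored)
      opt.disposeState(state)
    }
  }

In typical training the same optimizer instance is shared across all parameters, with a distinct index and state tuple per parameter.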

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  5. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  6. def createState(index: Int, weight: NDArray): (NDArray, NDArray)

    Definition Classes
AdaDelta → Optimizer
  7. def deserializeState(bytes: Array[Byte]): AnyRef

    Definition Classes
AdaDelta → Optimizer
  8. def disposeState(state: AnyRef): Unit

    Definition Classes
AdaDelta → Optimizer
  9. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  10. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  11. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  12. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  13. def getLr(index: Int, lr: Float): Float

    Attributes
    protected
    Definition Classes
    Optimizer
  14. def getWd(index: Int, wd: Float): Float

    Attributes
    protected
    Definition Classes
    Optimizer
  15. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  16. var idx2name: Map[Int, String]

    Attributes
    protected
    Definition Classes
    Optimizer
  17. val indexUpdateCount: Map[Int, Int]

    Attributes
    protected
    Definition Classes
    Optimizer
  18. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  19. val lrMult: Map[Either[Int, String], Float]

    Attributes
    protected
    Definition Classes
    Optimizer
  20. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  21. final def notify(): Unit

    Definition Classes
    AnyRef
  22. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  23. var numUpdate: Int

    Attributes
    protected
    Definition Classes
    Optimizer
  24. var rescaleGrad: Float

    Attributes
    protected
    Definition Classes
    Optimizer
  25. def serializeState(state: AnyRef): Array[Byte]

    Definition Classes
AdaDelta → Optimizer
  26. def setArgNames(argNames: Seq[String]): Unit

    Definition Classes
    Optimizer
  27. def setIdx2Name(paramIdx2Name: Map[Int, String]): Unit

    Definition Classes
    Optimizer
  28. def setLrMult(argsLrMult: Map[Either[Int, String], Float]): Unit

    Sets an individual learning rate multiplier for each parameter.

If you specify a learning rate multiplier for a parameter, the learning rate for that parameter is set to the product of the global learning rate and its multiplier. Note: the default learning rate multiplier of a Variable can be set with the lr_mult argument in its constructor. See the usage sketch following this member list.

    Definition Classes
    Optimizer
  29. def setRescaleGrad(rescaleGrad: Float): Unit

    Definition Classes
    Optimizer
  30. def setSymbol(sym: Symbol): Unit

    Definition Classes
    Optimizer
  31. def setWdMult(argsWdMult: Map[Either[Int, String], Float]): Unit

    Sets an individual weight decay multiplier for each parameter.

By default, if you have called setIdx2Name to set idx2name, the weight decay multiplier is set to 0 for all parameters whose names do not end with _weight or _gamma.

Note: the default weight decay multiplier for a Variable can be set with its wd_mult argument in the constructor. See the usage sketch following this member list.

    Definition Classes
    Optimizer
  32. var specialized: Boolean

    Attributes
    protected
    Definition Classes
    Optimizer
  33. var symbol: Symbol

    Attributes
    protected
    Definition Classes
    Optimizer
  34. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  35. def toString(): String

    Definition Classes
    AnyRef → Any
  36. def update(index: Int, weight: NDArray, grad: NDArray, state: AnyRef): Unit

Update the parameters.

Applies one AdaDelta step to the given weight using its gradient and the auxiliary state returned by createState.

    index

A unique integer key used to index the parameters.

    weight

    weight ndarray

    grad

    grad ndarray

    state

The auxiliary state used in optimization: an NDArray or other object returned by createState.

    Definition Classes
AdaDelta → Optimizer
  37. def updateCount(index: Int): Unit

Update num_update.

Update num_update for the given parameter index.

index

The index of the parameter being updated.

    Attributes
    protected
    Definition Classes
    Optimizer
  38. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  39. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  40. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  41. val wdMult: Map[Either[Int, String], Float]

    Attributes
    protected
    Definition Classes
    Optimizer
  42. val weightSet: Set[Int]

    Attributes
    protected
    Definition Classes
    Optimizer
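
A brief sketch of the multiplier setters above (setLrMult, setWdMult): keys are of type Either[Int, String], so parameters can be addressed either by index or by name. The parameter names below are hypothetical.

  import org.apache.mxnet.optimizer.AdaDelta

  object MultiplierSketch extends App {
    val opt = new AdaDelta()

    // Address parameters by name (Right) or by index (Left);
    // "fc1_weight" and "fc1_bias" are hypothetical parameter names.
    opt.setLrMult(Map(
      Right("fc1_weight") -> 0.1f, // train this parameter 10x slower
      Left(3)             -> 0.0f  // freeze the parameter at index 3
    ))

    // Exclude a hypothetical bias term from weight decay.
    opt.setWdMult(Map(Right("fc1_bias") -> 0.0f))
  }

As noted in setLrMult above, the effective learning rate for a parameter is the global learning rate times its multiplier.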

Deprecated Value Members

  1. def setLrScale(lrScale: Map[Int, Float]): Unit

    Definition Classes
    Optimizer
    Annotations
    @deprecated
    Deprecated

    Use setLrMult instead.
