Class ml.dmlc.mxnet.optimizer.Adam

class Adam extends Optimizer

Adam optimizer, as described in [King2014].

[King2014] Diederik Kingma and Jimmy Ba, "Adam: A Method for Stochastic Optimization", http://arxiv.org/abs/1412.6980

Linear Supertypes
Optimizer, Serializable, AnyRef, Any

Instance Constructors

  1. new Adam(learningRate: Float = 0.002f, beta1: Float = 0.9f, beta2: Float = 0.999f, epsilon: Float = 1e-8f, decayFactor: Float = 1-1e-8f, wd: Float = 0.0f, clipGradient: Float = 0f, lrScheduler: LRScheduler = null)

    learningRate

    Float, step size.

    beta1

    Float, exponential decay rate for the first moment estimates.

    beta2

    Float, exponential decay rate for the second moment estimates.

    epsilon

    Float, a small constant added for numerical stability.

    decayFactor

    Float

    wd

    Float, L2 regularization coefficient added to all the weights.

    clipGradient

    Float, clips the gradient to the range [-clipGradient, clipGradient].

    lrScheduler

    The learning rate scheduler.
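The roles of these hyperparameters can be illustrated with a minimal, self-contained sketch of a single Adam step on a scalar weight. This is plain Scala mirroring the update rule from [King2014], not the NDArray-based implementation; the names here (`AdamStepSketch`, `step`) are hypothetical.

```scala
object AdamStepSketch {
  // One Adam step for a scalar weight, using the constructor defaults above.
  // Returns the new weight and the updated first/second moment estimates.
  def step(w: Double, grad: Double, m: Double, v: Double, t: Int,
           lr: Double = 0.002, beta1: Double = 0.9, beta2: Double = 0.999,
           eps: Double = 1e-8, wd: Double = 0.0, clip: Double = 0.0): (Double, Double, Double) = {
    // Optional gradient clipping to [-clip, clip] when clip > 0
    val g0 = if (clip > 0) math.max(-clip, math.min(clip, grad)) else grad
    // L2 weight decay (wd) folds into the gradient
    val g = g0 + wd * w
    // Biased first (mean) and second (uncentered variance) moment estimates
    val mNew = beta1 * m + (1 - beta1) * g
    val vNew = beta2 * v + (1 - beta2) * g * g
    // Bias correction for the zero-initialized moments at step t
    val mHat = mNew / (1 - math.pow(beta1, t))
    val vHat = vNew / (1 - math.pow(beta2, t))
    (w - lr * mHat / (math.sqrt(vHat) + eps), mNew, vNew)
  }
}
```

Because the bias-corrected moments cancel for the first step, the very first update moves the weight by almost exactly `learningRate`, regardless of the gradient's magnitude.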

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  5. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  6. def createState(index: Int, weight: NDArray): (NDArray, NDArray)

    Definition Classes
    Adam → Optimizer
  7. def deserializeState(bytes: Array[Byte]): AnyRef

    Definition Classes
    Adam → Optimizer
  8. def disposeState(state: AnyRef): Unit

    Definition Classes
    Adam → Optimizer
  9. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  10. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  11. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  12. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  13. def getLr(index: Int, lr: Float): Float

    Attributes
    protected
    Definition Classes
    Optimizer
  14. def getWd(index: Int, wd: Float): Float

    Attributes
    protected
    Definition Classes
    Optimizer
  15. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  16. var idx2name: Map[Int, String]

    Attributes
    protected
    Definition Classes
    Optimizer
  17. val indexUpdateCount: Map[Int, Int]

    Attributes
    protected
    Definition Classes
    Optimizer
  18. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  19. val learningRate: Float

    Float, step size.

  20. val lrMult: Map[Either[Int, String], Float]

    Attributes
    protected
    Definition Classes
    Optimizer
  21. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  22. final def notify(): Unit

    Definition Classes
    AnyRef
  23. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  24. var numUpdate: Int

    Attributes
    protected
    Definition Classes
    Optimizer
  25. var rescaleGrad: Float

    Attributes
    protected
    Definition Classes
    Optimizer
  26. def serializeState(state: AnyRef): Array[Byte]

    Definition Classes
    Adam → Optimizer
  27. def setArgNames(argNames: Seq[String]): Unit

    Definition Classes
    Optimizer
  28. def setIdx2Name(paramIdx2Name: Map[Int, String]): Unit

    Definition Classes
    Optimizer
  29. def setLrMult(argsLrMult: Map[Either[Int, String], Float]): Unit

    Sets an individual learning rate multiplier for each parameter. If you specify a multiplier for a parameter, its learning rate is the product of the global learning rate and that multiplier.

    Note: the default learning rate multiplier of a Variable can be set with the lr_mult argument in its constructor.

    Definition Classes
    Optimizer
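The way a per-parameter multiplier combines with the global rate can be sketched in plain Scala (a hypothetical illustration of the documented behavior; it does not call the MXNet API):

```scala
object LrMultSketch {
  // Per-parameter learning-rate multipliers, keyed by index or by name,
  // mirroring the Map[Either[Int, String], Float] accepted by setLrMult.
  val lrMult: Map[Either[Int, String], Float] =
    Map(Left(0) -> 0.1f, Right("fc1_bias") -> 2.0f)

  // Effective learning rate = global rate * multiplier; parameters with no
  // entry keep the global rate (multiplier defaults to 1.0).
  def effectiveLr(globalLr: Float, key: Either[Int, String]): Float =
    globalLr * lrMult.getOrElse(key, 1.0f)
}
```

So with a global rate of 0.002f, parameter 0 above trains at 0.0002f while any parameter without an entry trains at 0.002f.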
  30. def setRescaleGrad(rescaleGrad: Float): Unit

    Definition Classes
    Optimizer
  31. def setSymbol(sym: Symbol): Unit

    Definition Classes
    Optimizer
  32. def setWdMult(argsWdMult: Map[Either[Int, String], Float]): Unit

    Sets an individual weight decay multiplier for each parameter.

    By default, if you have called setIdx2Name, the weight decay multiplier is set to 0 for all parameters whose names don't end with _weight or _gamma.

    Note: the default weight decay multiplier for a Variable can be set with the wd_mult argument in its constructor.

    Definition Classes
    Optimizer
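The default-multiplier rule above can be sketched in plain Scala (hypothetical names, not the MXNet implementation):

```scala
object WdMultSketch {
  // Default rule described above: multiplier 0 for parameters whose names
  // do not end with "_weight" or "_gamma", 1 otherwise. In practice this
  // exempts biases and batch-norm betas from weight decay.
  def defaultWdMult(name: String): Float =
    if (name.endsWith("_weight") || name.endsWith("_gamma")) 1.0f else 0.0f

  // Effective decay = global wd * multiplier, with explicit overrides
  // (as passed to setWdMult) taking precedence over the default rule.
  def effectiveWd(globalWd: Float, name: String,
                  overrides: Map[String, Float] = Map.empty): Float =
    globalWd * overrides.getOrElse(name, defaultWdMult(name))
}
```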
  33. var specialized: Boolean

    Attributes
    protected
    Definition Classes
    Optimizer
  34. var symbol: Symbol

    Attributes
    protected
    Definition Classes
    Optimizer
  35. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  36. var time: Int

    Attributes
    protected
  37. var timeFirstIndex: Option[Int]

    Attributes
    protected
  38. def toString(): String

    Definition Classes
    AnyRef → Any
  39. def update(index: Int, weight: NDArray, grad: NDArray, state: AnyRef): Unit

    Updates the parameters.

    index

    A unique integer key used to index the parameters

    weight

    weight ndarray

    grad

    gradient ndarray

    state

    The auxiliary state used in optimization: an NDArray or other object returned by createState

    Definition Classes
    Adam → Optimizer
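The contract above (state created once per parameter, then threaded through successive update calls) can be illustrated with a plain-Scala sketch that minimizes f(w) = w² with the scalar Adam rule. The names and the scalar `State` are hypothetical stand-ins for the (NDArray, NDArray) mean/variance pair:

```scala
object AdamLoopSketch {
  // Scalar stand-in for the (mean, variance) state returned by createState.
  final case class State(mean: Double, variance: Double)

  def createState(): State = State(0.0, 0.0)

  // One update, mirroring update(index, weight, grad, state); returns the
  // new weight and the new state instead of mutating NDArrays in place.
  def update(w: Double, grad: Double, s: State, t: Int,
             lr: Double = 0.002, b1: Double = 0.9, b2: Double = 0.999,
             eps: Double = 1e-8): (Double, State) = {
    val m = b1 * s.mean + (1 - b1) * grad
    val v = b2 * s.variance + (1 - b2) * grad * grad
    val mHat = m / (1 - math.pow(b1, t))
    val vHat = v / (1 - math.pow(b2, t))
    (w - lr * mHat / (math.sqrt(vHat) + eps), State(m, v))
  }

  // Minimize f(w) = w^2 (gradient 2w), threading the state through
  // each call exactly as an executor-driven training loop would.
  def minimize(steps: Int): Double = {
    var w = 1.0
    var s = createState()
    for (t <- 1 to steps) {
      val (w2, s2) = update(w, 2 * w, s, t)
      w = w2; s = s2
    }
    w
  }
}
```

With the default step size of 0.002, the weight drifts toward the minimum at 0 by roughly 0.002 per step while the gradient keeps its sign, then settles near 0.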
  40. def updateCount(index: Int): Unit

    Updates num_update.

    index

    The index to be updated

    Attributes
    protected
    Definition Classes
    Optimizer
  41. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  42. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  43. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  44. val wdMult: Map[Either[Int, String], Float]

    Attributes
    protected
    Definition Classes
    Optimizer
  45. val weightSet: Set[Int]

    Attributes
    protected
    Definition Classes
    Optimizer

Deprecated Value Members

  1. def setLrScale(lrScale: Map[Int, Float]): Unit

    Definition Classes
    Optimizer
    Annotations
    @deprecated
    Deprecated

    Use setLrMult instead.

Inherited from Optimizer

Inherited from Serializable

Inherited from AnyRef

Inherited from Any
