Class ml.dmlc.mxnet.FactorScheduler

class FactorScheduler extends LRScheduler

Class for reducing the learning rate by a fixed factor.

Assume the weight has been updated n times; the learning rate is then base_lr * factor^(floor(n / step)).
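The schedule above can be sketched in plain Scala. This is a self-contained illustration of the documented formula only, not the mxnet implementation; the object and method names here are hypothetical:

```scala
// Sketch of the documented schedule: lr(n) = baseLR * factor^floor(n / step).
// FactorScheduleDemo is a hypothetical name, not the mxnet class.
object FactorScheduleDemo {
  def lr(baseLR: Float, factor: Float, step: Int, numUpdate: Int): Float = {
    require(step >= 1, "step must be at least 1")
    // Integer division gives floor(numUpdate / step) for non-negative inputs.
    (baseLR * math.pow(factor.toDouble, (numUpdate / step).toDouble)).toFloat
  }

  def main(args: Array[String]): Unit = {
    // With baseLR = 0.1, factor = 0.5, step = 100:
    println(lr(0.1f, 0.5f, 100, 0))    // updates 0..99 keep the base rate
    println(lr(0.1f, 0.5f, 100, 100))  // halved once after 100 updates
    println(lr(0.1f, 0.5f, 100, 250))  // halved twice after 200+ updates
  }
}
```

Note that the rate stays constant within each window of `step` updates and drops by `factor` at each window boundary.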

Linear Supertypes
LRScheduler, AnyRef, Any

Instance Constructors

  1. new FactorScheduler(step: Int, factor: Float)

    step

    Int, reduce the learning rate after every `step` updates

    factor

    Float, the factor by which the learning rate is reduced

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. def apply(numUpdate: Int): Float

    Return the learning rate for the given training progress.

    The training progress is represented by numUpdate, which can be roughly viewed as the number of minibatches executed so far. Its value is non-decreasing and increases by at most one per call.

    The exact value is the upper bound of the number of updates applied to any weight/index.

    numUpdate

    Int, the maximal number of updates applied to a weight.

    Definition Classes
    FactorScheduler → LRScheduler
  5. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  6. var baseLR: Float

    Definition Classes
    LRScheduler
  7. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  8. var count: Int

    Attributes
    protected
  9. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  10. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  11. var factor: Float

    Float, the factor for reducing the learning rate

    Attributes
    protected
  12. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  13. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  14. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  15. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  16. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  17. final def notify(): Unit

    Definition Classes
    AnyRef
  18. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  19. var step: Int

    Int, reduce the learning rate after every `step` updates

    Attributes
    protected
  20. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  21. def toString(): String

    Definition Classes
    AnyRef → Any
  22. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  23. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  24. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
