"""Scheduling learning rate."""
import logging


class LRScheduler(object):
    """Base class of a learning rate scheduler.

    A scheduler returns a new learning rate based on the number of updates
    that have been performed.

    Parameters
    ----------
    base_lr : float, optional
        The initial learning rate.
    """
    def __init__(self, base_lr=0.01):
        self.base_lr = base_lr

    def __call__(self, num_update):
        """Return a new learning rate.

        The ``num_update`` is the upper bound of the number of updates applied
        to every weight.

        Assume the optimizer has updated the *i*-th weight by *k_i* times,
        namely ``optimizer.update(i, weight_i)`` has been called *k_i* times.
        Then::

            num_update = max([k_i for all i])

        Parameters
        ----------
        num_update : int
            The maximal number of updates applied to any weight.
        """
        raise NotImplementedError("must override this")


class FactorScheduler(LRScheduler):
    """Reduce the learning rate by a factor every *n* updates.

    It returns a new learning rate computed by::

        base_lr * pow(factor, floor(num_update / step))

    Parameters
    ----------
    step : int
        Change the learning rate every *n* updates.
    factor : float, optional
        The factor by which to change the learning rate.
    stop_factor_lr : float, optional
        Stop updating the learning rate once it drops below this value.
    """
    def __init__(self, step, factor=1, stop_factor_lr=1e-8):
        super(FactorScheduler, self).__init__()
        if step < 1:
            raise ValueError("Schedule step must be greater than or equal to 1")
        if factor > 1.0:
            raise ValueError("Factor must be no more than 1 to make lr reduce")
        self.step = step
        self.factor = factor
        self.stop_factor_lr = stop_factor_lr
        self.count = 0

    def __call__(self, num_update):
        # Use `while` rather than `if` so that resumed training, where
        # num_update may jump ahead, applies every pending reduction at once.
        while num_update > self.count + self.step:
            self.count += self.step
            self.base_lr *= self.factor
            if self.base_lr < self.stop_factor_lr:
                self.base_lr = self.stop_factor_lr
                logging.info("Update[%d]: now learning rate arrived at %0.5e, "
                             "will not change in the future",
                             num_update, self.base_lr)
            else:
                logging.info("Update[%d]: Change learning rate to %0.5e",
                             num_update, self.base_lr)
        return self.base_lr


class MultiFactorScheduler(LRScheduler):
    """Reduce the learning rate according to a list of steps.

    Assume there exists *k* such that::

        step[k] <= num_update and num_update < step[k+1]

    Then the new learning rate is computed by::

        base_lr * pow(factor, k+1)

    Parameters
    ----------
    step : list of int
        The list of update counts at which to change the learning rate.
    factor : float
        The factor by which to change the learning rate.
    """
    def __init__(self, step, factor=1):
        super(MultiFactorScheduler, self).__init__()
        assert isinstance(step, list) and len(step) >= 1
        for i, _step in enumerate(step):
            if i != 0 and step[i] <= step[i - 1]:
                raise ValueError("Schedule step must be an increasing integer list")
            if _step < 1:
                raise ValueError("Schedule step must be greater than or equal to 1")
        if factor > 1.0:
            raise ValueError("Factor must be no more than 1 to make lr reduce")
        self.step = step
        self.cur_step_ind = 0
        self.factor = factor
        self.count = 0

    def __call__(self, num_update):
        # Use `while` rather than `if` so that resumed training, where
        # num_update may jump ahead, applies every pending reduction at once.
        while self.cur_step_ind <= len(self.step) - 1:
            if num_update > self.step[self.cur_step_ind]:
                self.count = self.step[self.cur_step_ind]
                self.cur_step_ind += 1
                self.base_lr *= self.factor
                logging.info("Update[%d]: Change learning rate to %0.5e",
                             num_update, self.base_lr)
            else:
                return self.base_lr
        return self.base_lr
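

# A minimal usage sketch: simulate the calls an optimizer would make, passing
# the running update count and reading back the scheduled learning rate. The
# step sizes, factors, and base_lr below are illustrative values chosen for
# this demo, not defaults of the module.
if __name__ == '__main__':
    logging.basicConfig(level=logging.INFO)

    # Halve the learning rate every 100 updates, starting from 0.1.
    factor_sched = FactorScheduler(step=100, factor=0.5)
    factor_sched.base_lr = 0.1
    for update in (1, 101, 201, 301):
        print('FactorScheduler      update %4d -> lr %.5e'
              % (update, factor_sched(update)))

    # Divide the learning rate by 10 after update 200 and again after 400.
    multi_sched = MultiFactorScheduler(step=[200, 400], factor=0.1)
    multi_sched.base_lr = 0.1
    for update in (1, 201, 401):
        print('MultiFactorScheduler update %4d -> lr %.5e'
              % (update, multi_sched(update)))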