# coding: utf-8
"""Autograd for NDArray."""
from __future__ import absolute_import
from __future__ import division

import traceback
import ctypes
from ctypes import c_int, c_void_p, CFUNCTYPE, POINTER, cast
from threading import Lock

from .base import _LIB, check_call, string_types
from .base import mx_uint, NDArrayHandle, c_array, MXCallbackList, SymbolHandle
from .ndarray import NDArray
from .symbol import _GRAD_REQ_MAP, Symbol


def set_recording(is_recording):
    """Set status to recording/not recording. When recording, graph will be constructed
    for gradient computation.

    Parameters
    ----------
    is_recording: bool

    Returns
    -------
    previous state before this set.
    """
    prev = ctypes.c_int()
    check_call(_LIB.MXAutogradSetIsRecording(
        ctypes.c_int(is_recording), ctypes.byref(prev)))
    return bool(prev.value)


def set_training(train_mode):
    """Set status to training/predicting. This affects ctx.is_train in operator
    running context. For example, Dropout will drop inputs randomly when
    train_mode=True while simply passing through if train_mode=False.

    Parameters
    ----------
    train_mode: bool

    Returns
    -------
    previous state before this set.
    """
    prev = ctypes.c_int()
    check_call(_LIB.MXAutogradSetIsTraining(
        ctypes.c_int(train_mode), ctypes.byref(prev)))
    return bool(prev.value)


def is_recording():
    """Get status on recording/not recording.

    Returns
    -------
    Current state of recording.
    """
    curr = ctypes.c_bool()
    check_call(_LIB.MXAutogradIsRecording(ctypes.byref(curr)))
    return curr.value


def is_training():
    """Get status on training/predicting.

    Returns
    -------
    Current state of training/predicting.
    """
    curr = ctypes.c_bool()
    check_call(_LIB.MXAutogradIsTraining(ctypes.byref(curr)))
    return curr.value


class _RecordingStateScope(object):
    """Scope for managing training state.

    Example::

        with _RecordingStateScope(True, True):
            y = model(x)
            backward([y])

    """
    def __init__(self, is_record, train_mode):
        self._enter_is_record = is_record
        self._enter_train_mode = train_mode
        self._prev_is_record = None
        self._prev_train_mode = None

    def __enter__(self):
        if self._enter_is_record is not None:
            self._prev_is_record = set_recording(self._enter_is_record)
        if self._enter_train_mode is not None:
            self._prev_train_mode = set_training(self._enter_train_mode)

    def __exit__(self, ptype, value, trace):
        if self._enter_is_record is not None and self._prev_is_record != self._enter_is_record:
            set_recording(self._prev_is_record)
        if self._enter_train_mode is not None and self._prev_train_mode != self._enter_train_mode:
            set_training(self._prev_train_mode)


def record(train_mode=True):
    """Returns an autograd recording scope context to be used in 'with' statement
    and captures code that needs gradients to be calculated.

    .. note:: When forwarding with train_mode=False, the corresponding backward
              should also use train_mode=False, otherwise the gradient is undefined.

    Example::

        with autograd.record():
            y = model(x)
            backward([y])
        metric.update(...)
        optim.step(...)

    Parameters
    ----------
    train_mode: bool, default True
        Whether the forward pass is in training or predicting mode. This controls the behavior
        of some layers such as Dropout, BatchNorm.
    """
    return _RecordingStateScope(True, train_mode)


def pause(train_mode=False):
    """Returns a scope context to be used in 'with' statement for code that does not need
    gradients to be calculated.

    Example::

        with autograd.record():
            y = model(x)
            backward([y])
            with autograd.pause():
                # testing, IO, gradient updates...

    Parameters
    ----------
    train_mode: bool, default False
        Whether to do forward for training or predicting.
    """
    return _RecordingStateScope(False, train_mode)
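# Illustrative sketch (comment only, not part of the library): how the scopes
# above compose. `model` and `metric` are hypothetical callables on NDArrays;
# `backward` is defined later in this module.
#
#     with record():             # operations inside are taped for autograd
#         y = model(x)
#         with pause():          # temporarily stop taping, e.g. for metrics/IO
#             metric.update(y)
#         backward([y])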
def train_mode():
    """Returns a scope context to be used in 'with' statement
    in which forward pass behavior is set to training mode,
    without changing the recording states.

    Example::

        y = model(x)
        with autograd.train_mode():
            y = dropout(y)

    """
    return _RecordingStateScope(None, True)


def predict_mode():
    """Returns a scope context to be used in 'with' statement
    in which forward pass behavior is set to inference mode,
    without changing the recording states.

    Example::

        with autograd.record():
            y = model(x)
            with autograd.predict_mode():
                y = sampling(y)
                backward([y])
    """
    return _RecordingStateScope(None, False)


def mark_variables(variables, gradients, grad_reqs='write'):
    """Mark NDArrays as variables to compute gradient for autograd.

    Parameters
    ----------
    variables: NDArray or list of NDArray
    gradients: NDArray or list of NDArray
    grad_reqs: str or list of str
    """
    if isinstance(variables, NDArray):
        assert isinstance(gradients, NDArray)
        variables = [variables]
        gradients = [gradients]

    variable_handles = []
    gradient_handles = []
    for var, gradvar in zip(variables, gradients):
        variable_handles.append(var.handle)
        gradient_handles.append(gradvar.handle)
    if isinstance(grad_reqs, string_types):
        grad_reqs = [_GRAD_REQ_MAP[grad_reqs]] * len(variables)
    else:
        grad_reqs = [_GRAD_REQ_MAP[i] for i in grad_reqs]

    check_call(_LIB.MXAutogradMarkVariables(
        len(variable_handles),
        c_array(NDArrayHandle, variable_handles),
        c_array(mx_uint, grad_reqs),
        c_array(NDArrayHandle, gradient_handles)))


def backward(heads, head_grads=None, retain_graph=False, train_mode=True):
    """Compute the gradients of heads w.r.t. previously marked variables.

    Parameters
    ----------
    heads: NDArray or list of NDArray
        Output NDArray(s)
    head_grads: NDArray or list of NDArray or None
        Gradients with respect to heads.
    retain_graph: bool, optional
        Whether to keep the recorded graph so that backward
        can be called again on the same graph.
    train_mode: bool, optional
        Whether to do backward for training or predicting.
    """
    if isinstance(heads, NDArray):
        assert head_grads is None or isinstance(head_grads, NDArray)
        heads = [heads]
        head_grads = [head_grads] if head_grads is not None else None

    output_handles = []
    for arr in heads:
        output_handles.append(arr.handle)

    if head_grads is None:
        check_call(_LIB.MXAutogradBackwardEx(
            len(output_handles),
            c_array(NDArrayHandle, output_handles),
            ctypes.c_void_p(0),
            ctypes.c_int(retain_graph),
            ctypes.c_int(train_mode)))
        return

    ograd_handles = []
    for arr in head_grads:
        if arr is not None:
            ograd_handles.append(arr.handle)
        else:
            ograd_handles.append(NDArrayHandle(0))
    assert len(ograd_handles) == len(output_handles), \
        "heads and head_grads must have the same length"

    check_call(_LIB.MXAutogradBackwardEx(
        len(output_handles),
        c_array(NDArrayHandle, output_handles),
        c_array(NDArrayHandle, ograd_handles),
        ctypes.c_int(retain_graph),
        ctypes.c_int(train_mode)))


def get_symbol(x):
    """Retrieve recorded computation history as `Symbol`.

    Parameters
    ----------
    x : NDArray
        Array representing the head of computation graph.

    Returns
    -------
    Symbol
        The retrieved Symbol.
    """
    hdl = SymbolHandle()
    check_call(_LIB.MXAutogradGetSymbol(x.handle, ctypes.byref(hdl)))
    return Symbol(hdl)
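# Illustrative sketch (comment only, not part of the library): the low-level
# flow using the functions above. Assumes a working `mxnet` install imported
# as `mx`.
#
#     x = mx.nd.ones((2,))
#     dx = mx.nd.zeros_like(x)
#     mark_variables([x], [dx])   # dx will receive the gradient of x
#     with record():
#         y = x * x
#     backward([y])               # dx now holds dy/dx == 2*x
#     sym = get_symbol(y)         # recorded history as a Symbol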
class Function(object):
    """User-defined differentiable function.

    Function allows defining both forward and backward computation for
    custom operators. During gradient computation, the user-defined
    backward function will be used instead of the default chain-rule.
    You can also cast to numpy array and back for some operations in
    forward and backward.

    For example, a stable sigmoid function can be defined as::

        class sigmoid(Function):
            def forward(self, x):
                y = 1 / (1 + mx.nd.exp(-x))
                self.save_for_backward(y)
                return y

            def backward(self, dy):
                # backward takes as many inputs as forward's return value,
                # and returns as many NDArrays as forward's arguments.
                y, = self.saved_tensors
                return dy * y * (1 - y)
    """
    _bwd_functype = CFUNCTYPE(c_int, c_int, c_int, POINTER(c_void_p),
                              POINTER(c_int), c_int, c_void_p)
    _del_functype = CFUNCTYPE(c_int, c_void_p)

    class _Registry(object):
        """CustomOp registry."""
        def __init__(self):
            self.ref_holder = {}
            self.counter = 0
            self.lock = Lock()

        def inc(self):
            """Get index for new entry."""
            self.lock.acquire()
            cur = self.counter
            self.counter += 1
            self.lock.release()
            return cur

    _registry = _Registry()

    def __init__(self):
        self._used = False
        self.saved_tensors = ()

    def save_for_backward(self, *args):
        self.saved_tensors = args

    def __call__(self, *inputs):
        assert not self._used, \
            "Each Function instance can only be called once. " \
            "Please create another instance."
        self._used = True

        prev_recording = set_recording(False)
        outputs = self.forward(*inputs)
        set_recording(prev_recording)

        if not prev_recording:
            return outputs

        ret_outputs = outputs
        if isinstance(outputs, NDArray):
            outputs = (outputs,)

        key = Function._registry.inc()

        def backward_entry(num_ograds, num_igrads, ptrs, reqs, is_train, _):
            """Entry point for backward."""
            # pylint: disable=W0613
            try:
                output_grads = [NDArray(ctypes.cast(i, NDArrayHandle), writable=False) \
                                for i in ptrs[:num_ograds]]
                input_grads = [NDArray(ctypes.cast(i, NDArrayHandle), writable=True) \
                               for i in ptrs[num_ograds:num_ograds+num_igrads]]
                reqs = [reqs[i] for i in range(num_igrads)]
                rets = self.backward(*output_grads)
                if isinstance(rets, NDArray):
                    rets = (rets,)
                assert len(rets) == len(input_grads), \
                    "%s.backward must return exactly the same number " \
                    "of NDArrays as the number of NDArrays arguments to forward. " \
                    "Expecting %d got %d"%(
                        self.__class__.__name__, len(input_grads), len(rets))
                for igrad, ret, req in zip(input_grads, rets, reqs):
                    assert isinstance(ret, NDArray), \
                        "autograd.Function.backward must return NDArrays, not %s"%type(ret)
                    if req == 0:  # null: gradient not needed for this input
                        continue
                    elif req in (1, 2):  # write or inplace
                        igrad[:] = ret
                    elif req == 3:  # add (kAddTo in _GRAD_REQ_MAP)
                        igrad[:] += ret
            except Exception:  # pylint: disable=broad-except
                print('Error in Function.backward: %s' % traceback.format_exc())
                return False
            return True

        def delete_entry(_):
            """C callback for CustomFunction::delete."""
            try:
                del Function._registry.ref_holder[key]
            except Exception:  # pylint: disable=broad-except
                print('Error in autograd.Function.delete: %s' % traceback.format_exc())
                return False
            return True

        input_handles = [x.handle for x in inputs]
        output_handles = [x.handle for x in outputs]
        callbacks = [Function._bwd_functype(backward_entry),
                     Function._del_functype(delete_entry)]
        callbacks = [cast(cb, CFUNCTYPE(c_int)) for cb in callbacks]
        context = MXCallbackList(c_int(len(callbacks)),
                                 cast(c_array(CFUNCTYPE(c_int), callbacks),
                                      POINTER(CFUNCTYPE(c_int))),
                                 cast(c_array(c_void_p, [None]*len(callbacks)),
                                      POINTER(c_void_p)))
        check_call(_LIB.MXCustomFunctionRecord(
            c_int(len(inputs)),
            c_array(NDArrayHandle, input_handles),
            c_int(len(outputs)),
            c_array(NDArrayHandle, output_handles),
            ctypes.byref(context)))

        Function._registry.ref_holder[key] = context

        return ret_outputs

    def forward(self, *inputs):
        """Forward computation."""
        raise NotImplementedError

    def backward(self, *output_grads):
        """Backward computation.

        Takes as many inputs as forward's outputs,
        and returns as many NDArrays as forward's inputs.
        """
        raise NotImplementedError
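
# Minimal runnable smoke test: a sketch, not part of the library API. Assumes
# a working `mxnet` installation; `_Sigmoid` and the input data below are
# purely illustrative.
if __name__ == '__main__':
    import mxnet as mx

    class _Sigmoid(Function):
        """Numerically stable sigmoid with a custom backward."""
        def forward(self, x):
            y = 1 / (1 + mx.nd.exp(-x))
            self.save_for_backward(y)
            return y

        def backward(self, dy):
            y, = self.saved_tensors
            return dy * y * (1 - y)   # chain rule through sigmoid

    data = mx.nd.uniform(shape=(4,))
    grad = mx.nd.zeros_like(data)
    mark_variables([data], [grad])    # grad receives d(out)/d(data)
    with record():
        out = _Sigmoid()(data)
    backward([out])
    print(grad.asnumpy())             # equals (out * (1 - out)).asnumpy()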