# coding: utf-8
"""Autograd for NDArray."""
from __future__ import absolute_import
from __future__ import division

import ctypes
import functools
from ..base import _LIB, check_call, string_types
from ..base import mx_uint, NDArrayHandle, c_array
from ..ndarray import NDArray, zeros_like
from ..symbol import _GRAD_REQ_MAP


def set_is_training(is_train):
    """Set status to training/not training. When training, a graph will be
    constructed for gradient computation. Operators will also run with
    ctx.is_train=True. For example, Dropout will drop inputs randomly when
    is_train=True, while simply passing through if is_train=False.

    Parameters
    ----------
    is_train: bool

    Returns
    -------
    previous state before this set.
    """
    prev = ctypes.c_int()
    check_call(_LIB.MXAutogradSetIsTraining(
        ctypes.c_int(is_train), ctypes.byref(prev)))
    check_call(_LIB.MXAutogradSetIsRecording(
        ctypes.c_int(is_train), ctypes.byref(prev)))
    return bool(prev.value)


class TrainingStateScope(object):
    """Scope for managing training state.

    Example::

        with TrainingStateScope(True):
            y = model(x)
            compute_gradient([y])
    """
    def __init__(self, enter_state):
        self._enter_state = enter_state
        self._prev = None

    def __enter__(self):
        self._prev = set_is_training(self._enter_state)

    def __exit__(self, ptype, value, trace):
        if self._prev != self._enter_state:
            set_is_training(self._prev)


def train_section():
    """Returns a training scope context to be used in a 'with' statement
    and captures training code.

    Example::

        with autograd.train_section():
            y = model(x)
            compute_gradient([y])
        metric.update(...)
        optim.step(...)
    """
    return TrainingStateScope(True)


def test_section():
    """Returns a testing scope context to be used in a 'with' statement
    and captures testing code.

    Example::

        with autograd.train_section():
            y = model(x)
            compute_gradient([y])
        with autograd.test_section():
            # testing, IO, gradient updates...
    """
    return TrainingStateScope(False)


def mark_variables(variables, gradients, grad_reqs='write'):
    """Mark NDArrays as variables to compute gradient for autograd.

    Parameters
    ----------
    variables: list of NDArray
    gradients: list of NDArray
        Buffers that will hold the computed gradients, one per variable.
    grad_reqs: list of string, or string, optional
        Gradient requirement applied to all variables if a single string,
        or per variable if a list. Defaults to 'write'.
    """
    variable_handles = []
    gradient_handles = []
    for var, gradvar in zip(variables, gradients):
        variable_handles.append(var.handle)
        gradient_handles.append(gradvar.handle)
    if isinstance(grad_reqs, string_types):
        grad_reqs = [_GRAD_REQ_MAP[grad_reqs]] * len(variables)
    else:
        grad_reqs = [_GRAD_REQ_MAP[i] for i in grad_reqs]

    check_call(_LIB.MXAutogradMarkVariables(
        len(variable_handles),
        c_array(NDArrayHandle, variable_handles),
        c_array(mx_uint, grad_reqs),
        c_array(NDArrayHandle, gradient_handles)))


def backward(outputs, out_grads=None, retain_graph=False):
    """Compute the gradients of outputs w.r.t. variables.

    Parameters
    ----------
    outputs: list of NDArray
    out_grads: list of NDArray or None
        Head gradients for the outputs; None uses the library default.
    retain_graph: bool
        Whether to keep the recorded graph after this backward pass.
    """
    assert isinstance(outputs, (list, tuple)), \
        "outputs must be a list or tuple of NDArrays"
    output_handles = []
    for arr in outputs:
        output_handles.append(arr.handle)

    if out_grads is None:
        check_call(_LIB.MXAutogradBackward(
            len(output_handles),
            c_array(NDArrayHandle, output_handles),
            ctypes.c_void_p(0),
            ctypes.c_int(retain_graph)))
        return

    ograd_handles = []
    for arr in out_grads:
        if arr is not None:
            ograd_handles.append(arr.handle)
        else:
            ograd_handles.append(NDArrayHandle(0))
    assert len(ograd_handles) == len(output_handles), \
        "outputs and out_grads must have the same length"

    check_call(_LIB.MXAutogradBackward(
        len(output_handles),
        c_array(NDArrayHandle, output_handles),
        c_array(NDArrayHandle, ograd_handles),
        ctypes.c_int(retain_graph)))


def compute_gradient(outputs):
    """Deprecated. Please use backward."""
    backward(outputs)
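
# A minimal sketch of the low-level flow above (mark gradient buffers, record a
# training section, then run backward). Illustrative only, assuming the module
# is importable as `mxnet.contrib.autograd` and `mxnet.ndarray` is available;
# not part of the original file:
#
#     import mxnet.ndarray as nd
#     from mxnet.contrib import autograd
#
#     x = nd.array([1.0, 2.0, 3.0])
#     dx = nd.zeros_like(x)                 # buffer that will receive dy/dx
#     autograd.mark_variables([x], [dx])    # request 'write' gradient for x
#     with autograd.train_section():        # record the graph
#         y = x * x
#     autograd.backward([y])                # no explicit head gradients
#     # dx now holds dy/dx = 2 * x
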
def grad_and_loss(func, argnum=None):
    """Return function that computes both gradient of arguments and loss value.

    Parameters
    ----------
    func: a python function
        The forward (loss) function.
    argnum: an int or a list of int
        The index of argument to calculate gradient for.

    Returns
    -------
    grad_and_loss_func: a python function
        A function that computes both the gradients of the arguments and the
        loss value.
    """
    @functools.wraps(func)
    def wrapped(*args):
        """Wrapped function."""
        variables = args
        if argnum is not None:
            argnum_ = argnum if isinstance(argnum, list) else [argnum]
            variables = [args[i] for i in argnum_]
        for x in variables:
            assert isinstance(x, NDArray), "type of autograd input should be NDArray."
        grads = [zeros_like(x) for x in variables]
        mark_variables(variables, grads)
        with train_section():
            outputs = func(*args)
        compute_gradient([outputs] if isinstance(outputs, NDArray) else outputs)
        return grads, outputs
    return wrapped


def grad(func, argnum=None):
    """Return function that computes gradient of arguments.

    Parameters
    ----------
    func: a python function
        The forward (loss) function.
    argnum: an int or a list of int
        The index of argument to calculate gradient for.

    Returns
    -------
    grad_func: a python function
        A function that computes the gradient of the arguments.

    Examples
    --------
    >>> # autograd supports dynamic graphs, which can change
    >>> # on every invocation
    >>> def func(x):
    >>>     r = random.randint(0, 1)
    >>>     if r % 2:
    >>>         return x**2
    >>>     else:
    >>>         return x/3
    >>> # use `grad(func)` to get the gradient function
    >>> for x in range(10):
    >>>     grad_func = grad(func)
    >>>     inputs = nd.array([[1, 2, 3], [4, 5, 6]])
    >>>     grad_vals = grad_func(inputs)
    """
    grad_with_loss_func = grad_and_loss(func, argnum)

    @functools.wraps(grad_with_loss_func)
    def wrapped(*args):
        return grad_with_loss_func(*args)[0]
    return wrapped
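
# A minimal sketch of the high-level wrappers above (`grad_and_loss` / `grad`).
# Illustrative only, assuming a working mxnet install and the simple quadratic
# loss defined here; not part of the original file:
#
#     import mxnet.ndarray as nd
#     from mxnet.contrib import autograd
#
#     def loss(x):
#         return nd.sum(x * x)              # simple quadratic loss
#
#     grad_and_loss_func = autograd.grad_and_loss(loss)
#     grads, loss_val = grad_and_loss_func(nd.array([[1.0, 2.0], [3.0, 4.0]]))
#     # grads[0] holds d(loss)/dx; loss_val is the NDArray returned by `loss`
#
#     grad_func = autograd.grad(loss)       # same, but returns only the grads
#     grads = grad_func(nd.array([1.0, 2.0, 3.0]))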