# mxnet/contrib/autograd.py
"""Autograd for NDArray."""
from __future__ import absolute_import
from __future__ import division

from array import array
import ctypes
import functools

from ..base import _LIB, check_call, string_types
from ..base import mx_uint, NDArrayHandle, c_array, c_array_buf, c_handle_array
from ..ndarray import NDArray, zeros_like, _GRAD_REQ_MAP


def set_is_training(is_train):
    """Set status to training/not training. When training, graph will be constructed
    for gradient computation. Operators will also run with ctx.is_train=True. For example,
    Dropout will drop inputs randomly when is_train=True while simply passing through
    if is_train=False.

    Parameters
    ----------
    is_train: bool

    Returns
    -------
    previous state before this set.
    """
    prev = ctypes.c_int()
    check_call(_LIB.MXAutogradSetIsTraining(
        ctypes.c_int(is_train), ctypes.byref(prev)))
    check_call(_LIB.MXAutogradSetIsRecording(
        ctypes.c_int(is_train), ctypes.byref(prev)))
    return bool(prev.value)


class TrainingStateScope(object):
    """Scope for managing training state.

    Example::

        with TrainingStateScope(True):
            y = model(x)
            compute_gradient([y])
    """
    def __init__(self, enter_state):
        self._enter_state = enter_state
        self._prev = None

    def __enter__(self):
        self._prev = set_is_training(self._enter_state)

    def __exit__(self, ptype, value, trace):
        if self._prev != self._enter_state:
            set_is_training(self._prev)


def train_section():
    """Returns a training scope context to be used in 'with' statement
    and captures training code.

    Example::

        with autograd.train_section():
            y = model(x)
            compute_gradient([y])
        metric.update(...)
        optim.step(...)
    """
    return TrainingStateScope(True)


def test_section():
    """Returns a testing scope context to be used in 'with' statement
    and captures testing code.

    Example::

        with autograd.train_section():
            y = model(x)
            compute_gradient([y])
            with autograd.test_section():
                # testing, IO, gradient updates...
    """
    return TrainingStateScope(False)


def mark_variables(variables, gradients, grad_reqs='write'):
    """Mark NDArrays as variables to compute gradient for autograd.

    Parameters
    ----------
    variables: list of NDArray
    gradients: list of NDArray
    grad_reqs: list of string
    """
    if isinstance(grad_reqs, string_types):
        grad_reqs = [_GRAD_REQ_MAP[grad_reqs]] * len(variables)
    else:
        grad_reqs = [_GRAD_REQ_MAP[i] for i in grad_reqs]

    check_call(_LIB.MXAutogradMarkVariables(
        len(variables),
        c_handle_array(variables),
        c_array_buf(mx_uint, array('I', grad_reqs)),
        c_handle_array(gradients)))


def backward(outputs, out_grads=None, retain_graph=False):
    """Compute the gradients of outputs w.r.t variables.

    Parameters
    ----------
    outputs: list of NDArray
    out_grads: list of NDArray or None
    """
    assert isinstance(outputs, (list, tuple)), \
        "outputs must be a list or tuple of NDArrays"

    if out_grads is None:
        check_call(_LIB.MXAutogradBackward(
            len(outputs),
            c_handle_array(outputs),
            ctypes.c_void_p(0),
            ctypes.c_int(retain_graph)))
        return

    ograd_handles = []
    for arr in out_grads:
        if arr is not None:
            ograd_handles.append(arr.handle)
        else:
            ograd_handles.append(NDArrayHandle(0))
    assert len(ograd_handles) == len(outputs), \
        "outputs and out_grads must have the same length"

    check_call(_LIB.MXAutogradBackward(
        len(outputs),
        c_handle_array(outputs),
        c_array(NDArrayHandle, ograd_handles),
        ctypes.c_int(retain_graph)))


def compute_gradient(outputs):
    """Deprecated. Please use backward"""
    backward(outputs)


def grad_and_loss(func, argnum=None):
    """Return function that computes both gradient of arguments and loss value.

    Parameters
    ----------
    func: a python function
        The forward (loss) function.
    argnum: an int or a list of int
        The index of argument to calculate gradient for.

    Returns
    -------
    grad_and_loss_func: a python function
        A function that would compute both the gradient of arguments and loss value.
    """
    @functools.wraps(func)
    def wrapped(*args):
        """Wrapped function."""
        variables = args
        if argnum is not None:
            argnum_ = argnum if isinstance(argnum, list) else [argnum]
            variables = [args[i] for i in argnum_]
        for x in variables:
            assert isinstance(x, NDArray), "type of autograd input should NDArray."
        grads = [zeros_like(x) for x in variables]
        mark_variables(variables, grads)
        with train_section():
            outputs = func(*args)
        compute_gradient([outputs] if isinstance(outputs, NDArray) else outputs)
        return grads, outputs
    return wrapped


def grad(func, argnum=None):
    """Return function that computes gradient of arguments.

    Parameters
    ----------
    func: a python function
        The forward (loss) function.
    argnum: an int or a list of int
        The index of argument to calculate gradient for.

    Returns
    -------
    grad_func: a python function
        A function that would compute the gradient of arguments.

    Examples
    --------
    >>> # autograd supports dynamic graph which is changed
    >>> # every instance
    >>> def func(x):
    >>>     r = random.randint(0, 1)
    >>>     if r % 2:
    >>>         return x**2
    >>>     else:
    >>>         return x/3
    >>> # use `grad(func)` to get the gradient function
    >>> for x in range(10):
    >>>     grad_func = grad(func)
    >>>     inputs = nd.array([[1, 2, 3], [4, 5, 6]])
    >>>     grad_vals = grad_func(inputs)
    """
    grad_with_loss_func = grad_and_loss(func, argnum)

    @functools.wraps(grad_with_loss_func)
    def wrapped(*args):
        return grad_with_loss_func(*args)[0]
    return wrapped
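
# ---------------------------------------------------------------------------
# Usage sketch (not part of the original module): a minimal end-to-end example
# mirroring the pattern used inside `grad_and_loss` above -- mark the inputs,
# record the forward pass inside `train_section`, then call `backward`.  It
# assumes an mxnet 1.3.x install that exposes this file as
# `mxnet.contrib.autograd`; copy it into a standalone script to run it.
#
#     import mxnet.ndarray as nd
#     from mxnet.contrib import autograd
#
#     x = nd.array([[1., 2., 3.], [4., 5., 6.]])
#     dx = nd.zeros_like(x)                # buffer that receives the gradient
#     autograd.mark_variables([x], [dx])   # register x as a differentiable input
#
#     with autograd.train_section():       # record the graph while computing
#         y = x * x
#     autograd.backward([y])               # head gradients default to ones
#
#     print(dx.asnumpy())                  # -> 2 * x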