# coding: utf-8
"""Symbolic Executor component of MXNet."""
from __future__ import absolute_import

from array import array as py_array
import ctypes
import copy
import numpy as np

from .base import _LIB
from .base import mx_uint, NDArrayHandle, ExecutorHandle, py_str
from .base import check_call, c_handle_array, c_array_buf, c_str_array
from .ndarray import NDArray
from .ndarray import _ndarray_cls
from .executor_manager import _split_input_slice, _check_arguments, \
    _load_data, _load_label  # pylint: disable=unused-import


def _monitor_callback_wrapper(callback):
    """A wrapper for the user-defined handle."""
    def callback_handle(name, array, _):
        """ctypes function"""
        callback(name, array)
    return callback_handle


class Executor(object):
    """Executor is the object providing efficient symbolic graph execution and optimization.

    Examples
    --------
    >>> # typical approach to create an executor is to bind symbol
    >>> a = mx.sym.Variable('a')
    >>> b = mx.sym.Variable('b')
    >>> c = 2 * a + b
    >>> texec = c.bind(mx.cpu(), {'a': mx.nd.array([1,2]), 'b': mx.nd.array([2,3])})
    """
    def __init__(self, handle, symbol, ctx, grad_req, group2ctx):
        """Constructor; use Symbol.bind and Symbol.simple_bind instead.

        Parameters
        ----------
        handle: ExecutorHandle
            ExecutorHandle generated by calling `bind`.

        See Also
        --------
        Symbol.bind : to create executor.
        """
        if not isinstance(handle, ExecutorHandle):
            raise TypeError("Handle type error")
        self.handle = handle
        self.arg_arrays = []
        self.grad_arrays = []
        self.aux_arrays = []
        self.outputs = self._get_outputs()
        self._symbol = copy.deepcopy(symbol)
        self._optimized_symbol = None
        self._arg_dict = None
        self._grad_dict = None
        self._aux_dict = None
        self._output_dict = None
        self._monitor_callback = None
        self._ctx = copy.deepcopy(ctx)
        self._grad_req = copy.deepcopy(grad_req)
        self._group2ctx = copy.deepcopy(group2ctx)

    def __del__(self):
        check_call(_LIB.MXExecutorFree(self.handle))

    @staticmethod
    def _get_dict(names, ndarrays):
        """Get the dictionary given name and ndarray pairs."""
        nset = set()
        for nm in names:
            if nm in nset:
                raise ValueError('Duplicate names detected, %s' % str(names))
            nset.add(nm)
        return dict(zip(names, ndarrays))

    def _get_outputs(self):
        """List all the output NDArray.

        Returns
        -------
        A list of ndarray bound to the heads of executor.
        """
        out_size = mx_uint()
        handles = ctypes.POINTER(NDArrayHandle)()
        check_call(_LIB.MXExecutorOutputs(self.handle,
                                          ctypes.byref(out_size),
                                          ctypes.byref(handles)))
        num_output = out_size.value
        outputs = [_ndarray_cls(NDArrayHandle(handles[i]))
                   for i in range(num_output)]
        return outputs

    def forward(self, is_train=False, **kwargs):
        """Calculate the outputs specified by the bound symbol.

        Parameters
        ----------
        is_train: bool, optional
            Whether this forward pass is for training. If True,
            a backward call is expected to follow.

        **kwargs
            Additional specification of input arguments.

        Examples
        --------
        >>> # doing forward by specifying data
        >>> texec.forward(is_train=True, data=mydata)
        >>> # doing forward by not specifying things, but copy to the executor before hand
        >>> mydata.copyto(texec.arg_dict['data'])
        >>> texec.forward(is_train=True)
        >>> # doing forward by specifying data and get outputs
        >>> outputs = texec.forward(is_train=True, data=mydata)
        >>> print(outputs[0].asnumpy())
        """
        if len(kwargs) != 0:
            arg_dict = self.arg_dict
            for name, array in kwargs.items():
                if not isinstance(array, (NDArray, np.ndarray)):
                    raise ValueError('only accept keyword argument of NDArrays and numpy.ndarray')
                if name not in arg_dict:
                    raise TypeError('Unknown argument %s' % name)
                if arg_dict[name].shape != array.shape:
                    raise ValueError('Shape not match! Argument %s, need: %s, received: %s'
                                     % (name, str(arg_dict[name].shape), str(array.shape)))
                arg_dict[name][:] = array

        check_call(_LIB.MXExecutorForward(
            self.handle,
            ctypes.c_int(int(is_train))))
        return self.outputs

    def backward(self, out_grads=None, is_train=True):
        """Do backward pass to get the gradient of arguments.

        Parameters
        ----------
        out_grads : NDArray or list of NDArray or dict of str to NDArray, optional
            Gradient on the outputs to be propagated back.
            This parameter is only needed when bind is called
            on outputs that are not a loss function.

        is_train : bool, default True
            Whether this backward is for training or inference. Note that in rare
            cases you want to call backward with is_train=False to get gradient
            during inference.

        Examples
        --------
        >>> # Example for binding on loss function symbol, which gives the loss value of the model.
        >>> # Equivalently it gives the head gradient for backward pass.
        >>> # In this example the built-in SoftmaxOutput is used as loss function.
        >>> # MakeLoss can be used to define customized loss function symbol.
        >>> net = mx.sym.Variable('data')
        >>> net = mx.sym.FullyConnected(net, name='fc', num_hidden=6)
        >>> net = mx.sym.Activation(net, name='relu', act_type="relu")
        >>> net = mx.sym.SoftmaxOutput(net, name='softmax')
        >>> args = {'data': mx.nd.ones((1, 4)), 'fc_weight': mx.nd.ones((6, 4)),
        >>>         'fc_bias': mx.nd.array((1, 4, 4, 4, 5, 6)), 'softmax_label': mx.nd.ones((1))}
        >>> args_grad = {'fc_weight': mx.nd.zeros((6, 4)), 'fc_bias': mx.nd.zeros((6))}
        >>> texec = net.bind(ctx=mx.cpu(), args=args, args_grad=args_grad)
        >>> out = texec.forward(is_train=True)[0].copy()
        >>> print(out.asnumpy())
        [[ 0.00378404  0.07600445  0.07600445  0.07600445  0.20660152  0.5616011 ]]
        >>> texec.backward()
        >>> print(texec.grad_arrays[1].asnumpy())
        [[ 0.00378404  0.00378404  0.00378404  0.00378404]
         [-0.92399555 -0.92399555 -0.92399555 -0.92399555]
         [ 0.07600445  0.07600445  0.07600445  0.07600445]
         [ 0.07600445  0.07600445  0.07600445  0.07600445]
         [ 0.20660152  0.20660152  0.20660152  0.20660152]
         [ 0.5616011   0.5616011   0.5616011   0.5616011 ]]
        >>>
        >>> # Example for binding on non-loss function symbol.
        >>> # Here the binding symbol is neither built-in loss function
        >>> # nor customized loss created by MakeLoss.
        >>> # As a result the head gradient is not automatically provided.
        >>> a = mx.sym.Variable('a')
        >>> b = mx.sym.Variable('b')
        >>> # c is not a loss function symbol
        >>> c = 2 * a + b
        >>> args = {'a': mx.nd.array([1,2]), 'b': mx.nd.array([2,3])}
        >>> args_grad = {'a': mx.nd.zeros((2)), 'b': mx.nd.zeros((2))}
        >>> texec = c.bind(ctx=mx.cpu(), args=args, args_grad=args_grad)
        >>> out = texec.forward(is_train=True)[0].copy()
        >>> print(out.asnumpy())
        [ 4.  7.]
        >>> # out_grads is the head gradient in backward pass.
        >>> # Here we define 'c' as loss function.
        >>> # Then 'out' is passed as head gradient of backward pass.
        >>> texec.backward(out)
        >>> print(texec.grad_arrays[0].asnumpy())
        [ 8.  14.]
        >>> print(texec.grad_arrays[1].asnumpy())
        [ 4.  7.]
        """
        if out_grads is None:
            out_grads = []
        elif isinstance(out_grads, NDArray):
            out_grads = [out_grads]
        elif isinstance(out_grads, dict):
            out_grads = [out_grads[k] for k in self._symbol.list_outputs()]

        for obj in out_grads:
            if not isinstance(obj, NDArray):
                raise TypeError("inputs must be NDArray")

        check_call(_LIB.MXExecutorBackwardEx(
            self.handle,
            mx_uint(len(out_grads)),
            c_handle_array(out_grads),
            ctypes.c_int(is_train)))

    def set_monitor_callback(self, callback):
        """Install callback for monitor.

        Parameters
        ----------
        callback : function
            Takes a string and an NDArrayHandle.

        Examples
        --------
        >>> def mon_callback(*args, **kwargs):
        >>>     print("Do your stuff here.")
        >>>
        >>> texe.set_monitor_callback(mon_callback)
        """
        cb_type = ctypes.CFUNCTYPE(None, ctypes.c_char_p, NDArrayHandle, ctypes.c_void_p)
        self._monitor_callback = cb_type(_monitor_callback_wrapper(callback))
        check_call(_LIB.MXExecutorSetMonitorCallback(
            self.handle,
            self._monitor_callback,
            None))

    @property
    def arg_dict(self):
        """Get dictionary representation of argument arrays.

        Returns
        -------
        arg_dict : dict of str to NDArray
            The dictionary that maps the names of arguments to NDArrays.

        Raises
        ------
        ValueError : if there are duplicated names in the arguments.
        """
        if self._arg_dict is None:
            self._arg_dict = Executor._get_dict(
                self._symbol.list_arguments(), self.arg_arrays)
        return self._arg_dict

    @property
    def grad_dict(self):
        """Get dictionary representation of gradient arrays.

        Returns
        -------
        grad_dict : dict of str to NDArray
            The dictionary that maps name of arguments to gradient arrays.
        """
        if self._grad_dict is None:
            self._grad_dict = Executor._get_dict(
                self._symbol.list_arguments(), self.grad_arrays)
        return self._grad_dict

    @property
    def aux_dict(self):
        """Get dictionary representation of auxiliary states arrays.

        Returns
        -------
        aux_dict : dict of str to NDArray
            The dictionary that maps name of auxiliary states to NDArrays.

        Raises
        ------
        ValueError : if there are duplicated names in the auxiliary states.
        """
        if self._aux_dict is None:
            self._aux_dict = Executor._get_dict(
                self._symbol.list_auxiliary_states(), self.aux_arrays)
        return self._aux_dict

    @property
    def output_dict(self):
        """Get dictionary representation of output arrays.

        Returns
        -------
        output_dict : dict of str to NDArray
            The dictionary that maps name of output names to NDArrays.

        Raises
        ------
        ValueError : if there are duplicated names in the outputs.
        """
        if self._output_dict is None:
            self._output_dict = Executor._get_dict(
                self._symbol.list_outputs(), self.outputs)
        return self._output_dict

    def copy_params_from(self, arg_params, aux_params=None, allow_extra_params=False):
        """Copy parameters from arg_params, aux_params into executor's internal array.

        Parameters
        ----------
        arg_params : dict of str to NDArray
            Parameters, dict of name to NDArray of arguments.

        aux_params : dict of str to NDArray, optional
            Parameters, dict of name to NDArray of auxiliary states.

        allow_extra_params : boolean, optional
            Whether to allow extra parameters that are not needed by symbol.
            If this is True, no error will be thrown when arg_params or aux_params
            contain extra parameters that are not needed by the executor.

        Raises
        ------
        ValueError
            If there are additional parameters in the dict but ``allow_extra_params=False``.

        Examples
        --------
        >>> # set parameters with existing model checkpoint
        >>> model_prefix = 'mx_mlp'
        >>> sym, arg_params, aux_params = mx.model.load_checkpoint(model_prefix, 0)
        >>> texec.copy_params_from(arg_params, aux_params)
        """
        for name, array in arg_params.items():
            if name in self.arg_dict:
                dst = self.arg_dict[name]
                array.astype(dst.dtype).copyto(dst)
            elif not allow_extra_params:
                raise ValueError('Find name "%s" that is not in the arguments' % name)

        if aux_params is None:
            return

        for name, array in aux_params.items():
            if name in self.aux_dict:
                dst = self.aux_dict[name]
                array.astype(dst.dtype).copyto(dst)
            elif not allow_extra_params:
                raise ValueError('Find name %s that is not in the auxiliary states' % name)

    def reshape(self, partial_shaping=False, allow_up_sizing=False, **kwargs):
        """Return a new executor with the same symbol and shared memory,
        but different input/output shapes.
        For runtime reshaping, variable length sequences, etc.
        The returned executor shares state with the current one,
        and cannot be used in parallel with it.

        Parameters
        ----------
        partial_shaping : bool
            Whether to allow changing the shape of unspecified arguments.
        allow_up_sizing : bool
            Whether to allow allocating new ndarrays that are larger than the originals.
        kwargs : dict of string to tuple of int
            New shape for arguments.

        Returns
        -------
        exec : Executor
            A new executor that shares memory with self.

        Examples
        --------
        >>> a = mx.sym.Variable('a')
        >>> b = mx.sym.Variable('b')
        >>> c = 2 * a + b
        >>> texec = c.bind(mx.cpu(), {'a': mx.nd.zeros((2, 1)), 'b': mx.nd.ones((2,1))})
        >>> new_shape = {'a': (4, 2), 'b': (4, 2)}
        >>> texec.reshape(allow_up_sizing=True, **new_shape)
        """
        # shape data flattened into one list, with an index array delimiting
        # the per-argument spans, plus the matching argument names
        provided_arg_shape_data = []
        provided_arg_shape_idx = [0]
        provided_arg_shape_names = []
        for k, v in kwargs.items():
            if isinstance(v, tuple):
                provided_arg_shape_names.append(k)
                provided_arg_shape_data.extend(v)
                provided_arg_shape_idx.append(len(provided_arg_shape_data))

        ctx_map_keys = []
        ctx_map_dev_types = []
        ctx_map_dev_ids = []

        if self._group2ctx:
            for key, val in self._group2ctx.items():
                ctx_map_keys.append(key)
                ctx_map_dev_types.append(val.device_typeid)
                ctx_map_dev_ids.append(val.device_id)

        handle = ExecutorHandle()
        shared_handle = self.handle

        num_in_args = ctypes.c_uint()
        in_arg_handles = ctypes.POINTER(NDArrayHandle)()
        arg_grad_handles = ctypes.POINTER(NDArrayHandle)()
        num_aux_states = ctypes.c_uint()
        aux_state_handles = ctypes.POINTER(NDArrayHandle)()

        check_call(_LIB.MXExecutorReshape(ctypes.c_int(int(partial_shaping)),
                                          ctypes.c_int(int(allow_up_sizing)),
                                          ctypes.c_int(self._ctx.device_typeid),
                                          ctypes.c_int(self._ctx.device_id),
                                          mx_uint(len(ctx_map_keys)),
                                          c_str_array(ctx_map_keys),
                                          c_array_buf(ctypes.c_int,
                                                      py_array('i', ctx_map_dev_types)),
                                          c_array_buf(ctypes.c_int,
                                                      py_array('i', ctx_map_dev_ids)),
                                          mx_uint(len(provided_arg_shape_names)),
                                          c_str_array(provided_arg_shape_names),
                                          c_array_buf(mx_uint,
                                                      py_array('I', provided_arg_shape_data)),
                                          c_array_buf(mx_uint,
                                                      py_array('I', provided_arg_shape_idx)),
                                          ctypes.byref(num_in_args),
                                          ctypes.byref(in_arg_handles),
                                          ctypes.byref(arg_grad_handles),
                                          ctypes.byref(num_aux_states),
                                          ctypes.byref(aux_state_handles),
                                          shared_handle,
                                          ctypes.byref(handle)))

        arg_arrays = [_ndarray_cls(NDArrayHandle(in_arg_handles[i]))
                      for i in range(num_in_args.value)]
        grad_arrays = [_ndarray_cls(NDArrayHandle(arg_grad_handles[i]))
                       if arg_grad_handles[i] is not None
                       else None for i in range(num_in_args.value)]
        aux_arrays = [_ndarray_cls(NDArrayHandle(aux_state_handles[i]))
                      for i in range(num_aux_states.value)]

        executor = Executor(handle, self._symbol, self._ctx, self._grad_req, self._group2ctx)
        executor.arg_arrays = arg_arrays
        executor.grad_arrays = grad_arrays
        executor.aux_arrays = aux_arrays
        return executor

    def debug_str(self):
        """Get a debug string about internal execution plan.

        Returns
        -------
        debug_str : string
            Debug string of the executor.

        Examples
        --------
        >>> a = mx.sym.Variable('a')
        >>> b = mx.sym.sin(a)
        >>> c = 2 * a + b
        >>> texec = c.bind(mx.cpu(), {'a': mx.nd.array([1,2]), 'b': mx.nd.array([2,3])})
        >>> print(texec.debug_str())
        Symbol Outputs:
                output[0]=_plus0(0)
        Variable:a
        --------------------
        Op:_mul_scalar, Name=_mulscalar0
        Inputs:
                arg[0]=a(0) version=0
        Attrs:
                scalar=2
        --------------------
        Op:sin, Name=sin0
        Inputs:
                arg[0]=a(0) version=0
        --------------------
        Op:elemwise_add, Name=_plus0
        Inputs:
                arg[0]=_mulscalar0(0)
                arg[1]=sin0(0)
        Total 0 MB allocated
        Total 11 TempSpace resource requested
        """
        debug_str = ctypes.c_char_p()
        check_call(_LIB.MXExecutorPrint(
            self.handle, ctypes.byref(debug_str)))
        return py_str(debug_str.value)
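The monitor-callback machinery above hinges on two details: `_monitor_callback_wrapper` adapts a two-argument user callback to the three-argument signature the C API invokes, and the resulting `CFUNCTYPE` object is stored on `self._monitor_callback` so it is not garbage-collected while the C library still holds the function pointer. The pattern can be sketched in pure `ctypes`, with no MXNet install assumed; `user_callback` and the simulated invocation are illustrative stand-ins, and `c_void_p` stands in for `NDArrayHandle` (itself a `c_void_p` subclass in `mxnet.base`):

```python
import ctypes

def monitor_callback_wrapper(callback):
    """Adapt callback(name, array) to the 3-arg shape the C side calls."""
    def callback_handle(name, array, _):
        callback(name, array)
    return callback_handle

# C-side signature: void (*)(const char *name, NDArrayHandle arr, void *ctx)
CB_TYPE = ctypes.CFUNCTYPE(None, ctypes.c_char_p, ctypes.c_void_p, ctypes.c_void_p)

seen = []
def user_callback(name, handle):
    # In MXNet this would wrap `handle` in an NDArray; here we just record it.
    seen.append((name, handle))

# Keep a reference (as Executor does via self._monitor_callback): if this
# CFUNCTYPE object were collected, the C library would hold a dangling pointer.
c_callback = CB_TYPE(monitor_callback_wrapper(user_callback))

# Simulate the C library invoking the callback for one monitored array.
c_callback(b"fc1_output", None, None)
print(seen)  # [(b'fc1_output', None)]
```

The same keep-a-reference rule applies to any `ctypes` callback registered with a C library, which is why `Executor.__init__` initializes `_monitor_callback` to `None` rather than leaving the slot undefined.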