"""Executor manager."""
from __future__ import absolute_import

import logging
import numpy as np

from .base import mx_real_t
from . import ndarray as nd
from .context import cpu
from .io import DataDesc


def _split_input_slice(batch_size, work_load_list):
    """Get input slice from the input shape.

    Parameters
    ----------
    batch_size : int
        The number of samples in a mini-batch.
    work_load_list : list of float or int, optional
        The list of work load for different devices,
        in the same order as `ctx`.

    Returns
    -------
    slices : list of slice
        The split slices to get a specific slice.

    Raises
    ------
    ValueError
        In case of too many splits, leading to some empty slices.
    """
    total_work_load = sum(work_load_list)
    batch_num_list = [round(work_load * batch_size / total_work_load)
                      for work_load in work_load_list]
    batch_num_sum = sum(batch_num_list)
    if batch_num_sum != batch_size:
        batch_num_list[-1] += batch_size - batch_num_sum
    slices = []
    end = 0
    for batch_num in batch_num_list:
        begin = int(min((end, batch_size)))
        end = int(min((begin + batch_num, batch_size)))
        if begin >= end:
            raise ValueError('Too many slices. Some splits are empty.')
        slices.append(slice(begin, end))
    return slices


def _check_arguments(symbol):
    """Check the argument names of a symbol for duplication.

    Parameters
    ----------
    symbol : Symbol
        The network configuration.
    """
    arg_set = set()
    arg_names = symbol.list_arguments()
    for name in arg_names:
        if name in arg_set:
            raise ValueError(('Find duplicated argument name "%s", '
                              'please make the weight name non-duplicated (using name arguments), '
                              'arguments are %s') % (name, str(arg_names)))
        arg_set.add(name)

    aux_set = set()
    aux_names = symbol.list_auxiliary_states()
    for name in aux_names:
        if name in aux_set:
            raise ValueError(('Find duplicated auxiliary param name "%s", '
                              'please make the weight name non-duplicated (using name arguments), '
                              'arguments are %s') % (name, str(aux_names)))
        aux_set.add(name)


def _load_general(data, targets):
    """Load a list of arrays into a list of arrays specified by slices."""
    for d_src, d_targets in zip(data, targets):
        if isinstance(d_targets, nd.NDArray):
            d_src.copyto(d_targets)
        else:
            assert d_targets[-1][0].stop == d_src.shape[0], \
                "Batch size miss match. Expected %d, got %d" % \
                (d_targets[-1][0].stop, d_src.shape[0])
            for slice_idx, d_dst in d_targets:
                d_src[slice_idx].copyto(d_dst)


def _load_data(batch, targets):
    """Load data into sliced arrays."""
    _load_general(batch.data, targets)


def _load_label(batch, targets):
    """Load label into sliced arrays."""
    _load_general(batch.label, targets)


def _bind_exec(sym, ctx, input_shapes, param_names, need_grad=False,
               base_exec=None, shared_data_arrays=None, input_types=None,
               logger=logging):
    """bind executor for bucketing, potentially sharing data with an existing executor."""
    arg_shape, _, aux_shape = sym.infer_shape(**input_shapes)
    assert arg_shape is not None
    if input_types is None:
        input_types = {k: mx_real_t for k in input_shapes.keys()}
    arg_types, _, aux_types = sym.infer_type(**input_types)
    assert arg_types is not None

    arg_arrays = []
    grad_arrays = {} if need_grad != False else None

    arg_names = sym.list_arguments()

    if need_grad is False:
        need_grad = set()
    elif need_grad is True:
        need_grad = set(arg_names) - set(input_shapes.keys())
    elif isinstance(need_grad, set):
        pass
    else:
        raise AssertionError("need_grad must be boolean or set.")
    grad_req = {name: ('write' if name in need_grad else 'null') for name in arg_names}

    # create or borrow arguments and gradients
    for i, name in enumerate(arg_names):
        if name not in param_names:
            # data or label: try to reuse a previously allocated input array
            if shared_data_arrays is not None and name in shared_data_arrays:
                arg_arr = shared_data_arrays[name]
                if np.prod(arg_arr.shape) >= np.prod(arg_shape[i]):
                    # good, we can share this memory
                    assert arg_types[i] == arg_arr.dtype
                    arg_arr = arg_arr.reshape(arg_shape[i])
                else:
                    logger.warning(('bucketing: data "%s" has a shape %s' % (name, arg_shape[i])) +
                                   (', which is larger than already allocated ') +
                                   ('shape %s' % (arg_arr.shape,)) +
                                   ('. Need to re-allocate. Consider putting ') +
                                   ('default_bucket_key to be the bucket taking the largest ') +
                                   ('input for better memory sharing.'))
                    arg_arr = nd.zeros(arg_shape[i], ctx, dtype=arg_types[i])
                    # replace the existing shared array with the bigger one
                    shared_data_arrays[name] = arg_arr
            else:
                arg_arr = nd.zeros(arg_shape[i], ctx, dtype=arg_types[i])
                if shared_data_arrays is not None:
                    shared_data_arrays[name] = arg_arr
            arg_arrays.append(arg_arr)
        else:
            # model parameter: allocate fresh, or borrow from the base executor
            if base_exec is None:
                arg_arr = nd.zeros(arg_shape[i], ctx, dtype=arg_types[i])
                if name in need_grad:
                    grad_arr = nd.zeros(arg_shape[i], ctx, dtype=arg_types[i])
                    grad_arrays[name] = grad_arr
            else:
                arg_arr = base_exec.arg_dict[name]
                assert arg_arr.shape == arg_shape[i]
                assert arg_arr.dtype == arg_types[i]
                if name in need_grad:
                    grad_arrays[name] = base_exec.grad_dict[name]
            arg_arrays.append(arg_arr)

    # create or borrow aux variables
    if base_exec is None:
        aux_arrays = [nd.zeros(s, ctx, dtype=t) for s, t in zip(aux_shape, aux_types)]
    else:
        for i, a in enumerate(base_exec.aux_arrays):
            assert aux_shape[i] == a.shape
            assert aux_types[i] == a.dtype
        aux_arrays = [a for a in base_exec.aux_arrays]

    executor = sym.bind(ctx=ctx, args=arg_arrays,
                        args_grad=grad_arrays,
                        aux_states=aux_arrays,
                        grad_req=grad_req, shared_exec=base_exec)
    return executor
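# Illustrative sketch (an editorial addition, not part of the original source):
# how _bind_exec is typically driven by the executor group below. Each device
# keeps its own dict of shareable input arrays, so executors bound for
# different bucket sizes on the same device can reuse input memory, while
# `base_exec` lets a new executor borrow parameter and gradient arrays.
# `contexts` and `input_shapes` here are hypothetical caller-supplied values:
#
#   shared = [{} for _ in contexts]          # one input-array cache per device
#   execs = [_bind_exec(sym, dev, input_shapes, param_names,
#                       need_grad=True, base_exec=None,
#                       shared_data_arrays=shared[i])
#            for i, dev in enumerate(contexts)]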
class DataParallelExecutorGroup(object):
    """A group of executors living on different devices, for data parallelization.

    Parameters
    ----------
    sym: Symbol
        The network configuration.
    arg_names: list of str
        Equals `sym.list_arguments()`
    param_names: list of str
        List of names of all trainable parameters.
    ctx: list of Context
        List of devices for training (data parallelization).
    slices: list of int
        Describes how the data parallelization splits data into different devices.
    train_data: DataIter (or DataBatch)
        The dataset for training. It could be any object with `provide_data` and
        `provide_label` properties. Loading of actual data is not necessarily needed
        at this stage.
    shared_group: DataParallelExecutorGroup
        An existing executor group, if to share parameters with it.
    """
    def __init__(self, sym, arg_names, param_names, ctx, slices, train_data, shared_group=None):
        # make sure the architecture is valid
        _check_arguments(sym)

        if shared_group is None:
            self.shared_data_arrays = [{} for _ in ctx]
        else:
            self.shared_data_arrays = shared_group.shared_data_arrays

        self.data_names = [x[0] for x in train_data.provide_data]
        self.label_names = [x[0] for x in train_data.provide_label]
        self.aux_names = sym.list_auxiliary_states()
        self.param_idx = [i for i in range(len(arg_names)) if arg_names[i] in param_names]
        self.param_names = [arg_names[i] for i in self.param_idx]

        self.train_execs = []
        for i, ctxi in enumerate(ctx):
            data_shapes = {}
            data_types = {}
            for x in train_data.provide_data + train_data.provide_label:
                data_shapes[x[0]] = tuple([slices[i].stop - slices[i].start] + list(x[1][1:]))
                if isinstance(x, DataDesc):
                    data_types[x.name] = x.dtype
                else:
                    data_types[x[0]] = mx_real_t
            shared_exec = None if shared_group is None else shared_group.train_execs[i]
            train_exec = _bind_exec(sym, ctxi, data_shapes, self.param_names,
                                    need_grad=True, base_exec=shared_exec,
                                    shared_data_arrays=self.shared_data_arrays[i],
                                    input_types=data_types)
            self.train_execs.append(train_exec)

        # data structure
        self.data_arrays = [[(slices[i], e.arg_dict[name]) for i, e in enumerate(self.train_execs)]
                            for name in self.data_names]
        self.label_arrays = [[(slices[i], e.arg_dict[name]) for i, e in enumerate(self.train_execs)]
                             for name in self.label_names]

        self.param_arrays = [[e.arg_arrays[i] for e in self.train_execs]
                             for i in self.param_idx]
        self.grad_arrays = [[e.grad_arrays[i] for e in self.train_execs]
                            for i in self.param_idx]

        self.aux_arrays = [[e.aux_arrays[i] for e in self.train_execs]
                           for i in range(len(self.aux_names))]

        self.slices = slices

    def load_data_batch(self, data_batch):
        """Load data and labels into arrays."""
        _load_data(data_batch, self.data_arrays)
        _load_label(data_batch, self.label_arrays)

    def forward(self, is_train=False):
        """Perform a forward pass on each executor."""
        for texec in self.train_execs:
            texec.forward(is_train=is_train)

    def backward(self):
        """Perform a backward pass on each executor."""
        for texec in self.train_execs:
            texec.backward()

    def update_metric(self, metric, labels, pre_sliced=False):
        """Update evaluation metric with label and current outputs."""
        for current_exec, (texec, islice) in enumerate(zip(self.train_execs, self.slices)):
            if not pre_sliced:
                labels_slice = [label[islice] for label in labels]
            else:
                labels_slice = labels[current_exec]
            metric.update(labels_slice, texec.outputs)


class DataParallelExecutorManager(object):
    """Helper class to manage multiple executors for data parallelism.

    Parameters
    ----------
    symbol : Symbol
        Output symbol.
    ctx : list of Context
        Devices to run on.
    param_names: list of str
        Name of all trainable parameters of the network.
    arg_names: list of str
        Name of all arguments of the network.
    aux_names: list of str
        Name of all auxiliary states of the network.
    train_data : DataIter
        Training data iterator.
    work_load_list : list of float or int, optional
        The list of work load for different devices,
        in the same order as `ctx`.
    logger : logging logger
        When not specified, default logger will be used.
    sym_gen : A function that generate new Symbols depending on different
        input shapes. Used only for bucketing.
    """
    def __init__(self, symbol, ctx, train_data,
                 arg_names, param_names, aux_names,
                 work_load_list=None, logger=None, sym_gen=None):
        if logger is None:
            logger = logging
        # preparation
        num_device = len(ctx)
        logger.info('Start training with %s', str(ctx))

        if work_load_list is None:
            work_load_list = [1] * num_device
        assert isinstance(work_load_list, list) and len(work_load_list) == num_device, \
            "Invalid settings for work load. "

        slices = _split_input_slice(train_data.batch_size, work_load_list)
        self.slices = slices

        self.arg_names = arg_names
        self.param_names = param_names
        self.aux_names = aux_names
        self.ctx = ctx

        self.execgrp = DataParallelExecutorGroup(symbol, self.arg_names, self.param_names,
                                                 self.ctx, self.slices, train_data)

        self.symbol = symbol

        self.sym_gen = sym_gen
        self.curr_execgrp = None  # this is set when data is loaded
        if self.sym_gen is not None:
            self.execgrp_bucket = {train_data.default_bucket_key: self.execgrp}

    def install_monitor(self, monitor):
        """Install monitor on all executors."""
        if self.sym_gen is not None:
            raise NotImplementedError("Monitoring is not implemented for bucketing")
        for train_exec in self.execgrp.train_execs:
            monitor.install(train_exec)

    def set_params(self, arg_params, aux_params):
        """Set parameter and aux values.

        Parameters
        ----------
        arg_params : list of NDArray
            Source parameter arrays.
        aux_params : list of NDArray
            Source aux arrays.
        """
        for texec in self.execgrp.train_execs:
            texec.copy_params_from(arg_params, aux_params)

    def copy_to(self, arg_params, aux_params):
        """Copy data from each executor to ``arg_params`` and ``aux_params``.

        Parameters
        ----------
        arg_params : list of NDArray
            Target parameter arrays.
        aux_params : list of NDArray
            Target aux arrays.

        Notes
        -----
        - This function will inplace update the NDArrays in arg_params and aux_params.
        """
        for name, block in zip(self.param_names, self.param_arrays):
            weight = sum(w.copyto(cpu()) for w in block) / len(block)
            weight.astype(arg_params[name].dtype).copyto(arg_params[name])
        for name, block in zip(self.aux_names, self.aux_arrays):
            weight = sum(w.copyto(cpu()) for w in block) / len(block)
            weight.astype(aux_params[name].dtype).copyto(aux_params[name])

    @property
    def param_arrays(self):
        """Shared parameter arrays."""
        # param arrays should be shared by all executor groups
        return self.execgrp.param_arrays

    @property
    def grad_arrays(self):
        """Shared gradient arrays."""
        # grad arrays should be shared by all executor groups
        return self.execgrp.grad_arrays

    @property
    def aux_arrays(self):
        """Shared aux states."""
        # aux arrays are also shared by all executor groups
        return self.execgrp.aux_arrays

    def load_data_batch(self, data_batch):
        """Load data and labels into arrays."""
        if self.sym_gen is not None:
            key = data_batch.bucket_key
            if key not in self.execgrp_bucket:
                # create a new bucket entry, sharing parameters with the default group
                symbol = self.sym_gen(key)
                execgrp = DataParallelExecutorGroup(symbol, self.arg_names,
                                                    self.param_names, self.ctx,
                                                    self.slices, data_batch,
                                                    shared_group=self.execgrp)
                self.execgrp_bucket[key] = execgrp

            self.curr_execgrp = self.execgrp_bucket[key]
        else:
            self.curr_execgrp = self.execgrp

        self.curr_execgrp.load_data_batch(data_batch)

    def forward(self, is_train=False):
        """Run forward on the current executor."""
        self.curr_execgrp.forward(is_train=is_train)

    def backward(self):
        """Run backward on the current executor."""
        self.curr_execgrp.backward()

    def update_metric(self, metric, labels, pre_sliced=False):
        """Update metric with the current executor."""
        self.curr_execgrp.update_metric(metric, labels, pre_sliced)
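# Minimal usage sketch (an editorial addition, not part of the original
# module). A typical fit loop drives the manager as below; `symbol`,
# `contexts`, `train_data`, `param_names`, `arg_params`, `aux_params`, and
# `metric` are hypothetical caller-supplied values:
#
#   mgr = DataParallelExecutorManager(symbol, contexts, train_data,
#                                     arg_names=symbol.list_arguments(),
#                                     param_names=param_names,
#                                     aux_names=symbol.list_auxiliary_states())
#   mgr.set_params(arg_params, aux_params)
#   for batch in train_data:
#       mgr.load_data_batch(batch)      # picks/creates the bucket's exec group
#       mgr.forward(is_train=True)
#       mgr.backward()
#       # ... apply an optimizer to mgr.param_arrays / mgr.grad_arrays ...
#       mgr.update_metric(metric, batch.label)
#
# For the slicing helper alone: _split_input_slice(8, [3, 1]) yields
# [slice(0, 6), slice(6, 8)], i.e. the batch is divided 3:1 across two devices.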