# coding: utf-8
"""`SequentialModule` is a container module that chains a number of modules together."""

import logging
import copy

from ..initializer import Uniform
from .base_module import BaseModule


class SequentialModule(BaseModule):
    """A SequentialModule is a container module that can chain multiple modules together.

    .. note::

        Building a computation graph with this kind of imperative container is less
        flexible and less efficient than the symbolic graph. So, this should be only
        used as a handy utility.
    """

    META_TAKE_LABELS = 'take_labels'
    META_AUTO_WIRING = 'auto_wiring'

    def __init__(self, logger=logging):
        super(SequentialModule, self).__init__(logger=logger)
        self._modules = []
        self._metas = []

        self._label_shapes = None
        self._data_shapes = None
        self._meta_keys = set([getattr(SequentialModule, x)
                               for x in dir(SequentialModule)
                               if x.startswith('META_')])

    def add(self, module, **kwargs):
        """Adds a module to the chain.

        Parameters
        ----------
        module : BaseModule
            The new module to add.
        kwargs : **keywords
            All the keyword arguments are saved as meta information
            for the added module. The currently known meta includes

            - `take_labels`: indicating whether the module expects to
              take labels when doing computation. Note any module in
              the chain can take labels (not necessarily only the top
              most one), and they all take the same labels passed
              from the original data batch for the `SequentialModule`.

        Returns
        -------
        self
            This function returns `self` to allow us to easily chain a
            series of `add` calls.

        Examples
        --------
        >>> # An example of adding two modules to a chain.
        >>> seq_mod = mx.mod.SequentialModule()
        >>> seq_mod.add(mod1)
        >>> seq_mod.add(mod2)
        """
        self._modules.append(module)

        # a sanity check to avoid typos in meta keys
        for key in kwargs:
            assert key in self._meta_keys, 'Unknown meta "%s", a typo?' % key

        self._metas.append(kwargs)

        # after adding a new module, we are reset back to raw states; we need
        # to bind, init_params, etc. again
        self.binded = False
        self.params_initialized = False
        self.optimizer_initialized = False

        return self

    @property
    def data_names(self):
        """A list of names for data required by this module."""
        if len(self._modules) > 0:
            return self._modules[0].data_names
        return []

    @property
    def output_names(self):
        """A list of names for the outputs of this module."""
        if len(self._modules) > 0:
            return self._modules[-1].output_names
        return []

    @property
    def data_shapes(self):
        """Gets data shapes.

        Returns
        -------
        list
            A list of `(name, shape)` pairs. The data shapes of the first module
            are the data shapes of the `SequentialModule`.
        """
        assert self.binded
        return self._modules[0].data_shapes

    @property
    def label_shapes(self):
        """Gets label shapes.

        Returns
        -------
        list
            A list of `(name, shape)` pairs. The return value could be `None` if
            the module does not need labels, or if the module is not bound for
            training (in this case, label information is not available).
        """
        assert self.binded
        return self._label_shapes

    @property
    def output_shapes(self):
        """Gets output shapes.

        Returns
        -------
        list
            A list of `(name, shape)` pairs. The output shapes of the last module
            are the output shapes of the `SequentialModule`.
        """
        assert self.binded
        return self._modules[-1].output_shapes

    def get_params(self):
        """Gets current parameters.

        Returns
        -------
        (arg_params, aux_params)
            A pair of dictionaries each mapping parameter names to NDArray values.
            This is a merged dictionary of all the parameters in the modules.
        """
        assert self.binded and self.params_initialized

        arg_params = dict()
        aux_params = dict()

        for module in self._modules:
            arg, aux = module.get_params()
            arg_params.update(arg)
            aux_params.update(aux)

        return (arg_params, aux_params)

    def init_params(self, initializer=Uniform(0.01), arg_params=None, aux_params=None,
                    allow_missing=False, force_init=False, allow_extra=False):
        """Initializes parameters.

        Parameters
        ----------
        initializer : Initializer
        arg_params : dict
            Default ``None``. Existing parameters. This has higher priority
            than `initializer`.
        aux_params : dict
            Default ``None``. Existing auxiliary states. This has higher priority
            than `initializer`.
        allow_missing : bool
            Allow missing values in `arg_params` and `aux_params` (if not ``None``).
            In this case, missing values will be filled with `initializer`.
        force_init : bool
            Default ``False``.
        allow_extra : boolean, optional
            Whether to allow extra parameters that are not needed by the symbol.
            If this is True, no error will be thrown when `arg_params` or `aux_params`
            contain extra parameters that are not needed by the executor.
        """
        if self.params_initialized and not force_init:
            return

        assert self.binded, 'call bind before initializing the parameters'

        for module in self._modules:
            module.init_params(initializer=initializer, arg_params=arg_params,
                               aux_params=aux_params, allow_missing=allow_missing,
                               force_init=force_init, allow_extra=allow_extra)

        # make sure we do not have duplicated parameter names
        def _check_name(known_names, new_names, modules, i):
            """Internal function to help check for duplicated parameter names."""
            for name in new_names:
                assert not name in known_names, \
                    'Duplicated parameter names: ' + \
                    ('name "%s" in layer %d (%s) is already used in layer %d (%s).' %
                     (name, i, type(modules[i]), known_names[name],
                      type(modules[known_names[name]])))
                known_names[name] = i

        arg_names = dict()
        aux_names = dict()
        for i_layer, module in enumerate(self._modules):
            arg_params, aux_params = module.get_params()
            _check_name(arg_names, arg_params.keys(), self._modules, i_layer)
            _check_name(aux_names, aux_params.keys(), self._modules, i_layer)

        self.params_initialized = True

    def bind(self, data_shapes, label_shapes=None, for_training=True,
             inputs_need_grad=False, force_rebind=False, shared_module=None,
             grad_req='write'):
        """Binds the symbols to construct executors. This is necessary before one
        can perform computation with the module.

        Parameters
        ----------
        data_shapes : list of (str, tuple)
            Typically is `data_iter.provide_data`.
        label_shapes : list of (str, tuple)
            Typically is `data_iter.provide_label`.
        for_training : bool
            Default is ``True``. Whether the executors should be bound for training.
        inputs_need_grad : bool
            Default is ``False``. Whether the gradients to the input data need to be
            computed. Typically this is not needed. But this might be needed when
            implementing composition of modules.
        force_rebind : bool
            Default is ``False``. This function does nothing if the executors are
            already bound. But with this ``True``, the executors will be forced to
            rebind.
        shared_module : Module
            Default is ``None``. Currently a shared module is not supported for
            `SequentialModule`.
        grad_req : str, list of str, dict of str to str
            Requirement for gradient accumulation. Can be 'write', 'add', or 'null'
            (defaults to 'write'). Can be specified globally (str) or for each
            argument (list, dict).
        """
        if self.binded and not force_rebind:
            self.logger.warning('Already bound, ignoring bind()')
            return

        if inputs_need_grad:
            assert for_training is True
        assert shared_module is None, 'Shared module is not supported'
        assert len(self._modules) > 0, 'Attempting to bind an empty SequentialModule'

        self.binded = True

        # the same label shapes are used for all chained modules
        self._label_shapes = label_shapes

        my_data_shapes = data_shapes
        anybody_ever_needs_label = False
        for i_layer, module in enumerate(self._modules):
            meta = self._metas[i_layer]
            if SequentialModule.META_TAKE_LABELS in meta and \
                    meta[SequentialModule.META_TAKE_LABELS]:
                my_label_shapes = label_shapes
                anybody_ever_needs_label = True
            else:
                my_label_shapes = None

            my_inputs_need_grad = bool(inputs_need_grad or
                                       (for_training and i_layer > 0))

            if meta.get(SequentialModule.META_AUTO_WIRING, False):
                data_names = module.data_names
                assert len(data_names) == len(my_data_shapes)
                my_data_shapes = [(new_name, shape) for (new_name, (_, shape))
                                  in zip(data_names, my_data_shapes)]

            module.bind(data_shapes=my_data_shapes, label_shapes=my_label_shapes,
                        for_training=for_training, inputs_need_grad=my_inputs_need_grad,
                        force_rebind=force_rebind, shared_module=None, grad_req=grad_req)

            # the output of the previous module is the data of the next module
            my_data_shapes = module.output_shapes

        if not anybody_ever_needs_label:
            # then I do not need labels either
            self._label_shapes = None

    def init_optimizer(self, kvstore='local', optimizer='sgd',
                       optimizer_params=(('learning_rate', 0.01),), force_init=False):
        """Installs and initializes optimizers.

        Parameters
        ----------
        kvstore : str or KVStore
            Default `'local'`.
        optimizer : str or Optimizer
            Default `'sgd'`.
        optimizer_params : dict
            Default ``(('learning_rate', 0.01),)``. The default value is not a
            dictionary, just to avoid a pylint warning about dangerous default values.
        force_init : bool
            Default ``False``, indicating whether we should force re-initializing
            the optimizer in the case an optimizer is already installed.
        """
        assert self.binded and self.params_initialized
        if self.optimizer_initialized and not force_init:
            self.logger.warning('optimizer already initialized, ignoring.')
            return

        for module in self._modules:
            module.init_optimizer(kvstore=kvstore, optimizer=optimizer,
                                  optimizer_params=optimizer_params,
                                  force_init=force_init)

        self.optimizer_initialized = True

    def forward(self, data_batch, is_train=None):
        """Forward computation.

        Parameters
        ----------
        data_batch : DataBatch
        is_train : bool
            Default is ``None``, in which case `is_train` is taken as
            ``self.for_training``.
        """
        assert self.binded and self.params_initialized

        # make a shallow copy, just to maintain necessary properties (if any)
        data_batch = copy.copy(data_batch)

        for i_layer, module in enumerate(self._modules):
            module.forward(data_batch, is_train=is_train)

            if i_layer + 1 == len(self._modules):
                # the last module, no need to prepare data for a next one
                break

            data_batch.data = module.get_outputs()
            if hasattr(data_batch, 'provide_data'):
                # need to update this, in case the internal module is using bucketing
                data_names = [x[0] for x in module.output_shapes]
                assert len(data_names) == len(data_batch.data)
                data_batch.provide_data = [(name, x.shape) for name, x in
                                           zip(data_names, data_batch.data)]

    def backward(self, out_grads=None):
        """Backward computation."""
        assert self.binded and self.params_initialized

        for i_layer, module in reversed(list(zip(range(len(self._modules)),
                                                 self._modules))):
            module.backward(out_grads=out_grads)
            if i_layer == 0:
                break

            out_grads = module.get_input_grads()

    def update(self):
        """Updates parameters according to the installed optimizer and the gradients
        computed in the previous forward-backward cycle.
        """
        assert self.binded and self.params_initialized and self.optimizer_initialized

        for module in self._modules:
            module.update()

    def get_outputs(self, merge_multi_context=True):
        """Gets outputs from a previous forward computation.

        Parameters
        ----------
        merge_multi_context : bool
            Default is ``True``. In the case when data-parallelism is used, the
            outputs will be collected from multiple devices. A ``True`` value
            indicates that we should merge the collected results so that they look
            like from a single executor.

        Returns
        -------
        list of NDArray or list of list of NDArray
            If `merge_multi_context` is ``True``, it is like ``[out1, out2]``.
            Otherwise, it is like ``[[out1_dev1, out1_dev2], [out2_dev1, out2_dev2]]``.
            All the output elements are numpy arrays.
        """
        assert self.binded and self.params_initialized
        return self._modules[-1].get_outputs(merge_multi_context=merge_multi_context)

    def get_input_grads(self, merge_multi_context=True):
        """Gets the gradients with respect to the inputs of the module.

        Parameters
        ----------
        merge_multi_context : bool
            Default is ``True``. In the case when data-parallelism is used, the
            outputs will be collected from multiple devices. A ``True`` value
            indicates that we should merge the collected results so that they look
            like from a single executor.

        Returns
        -------
        list of NDArrays or list of list of NDArrays
            If `merge_multi_context` is ``True``, it is like ``[grad1, grad2]``.
            Otherwise, it is like
            ``[[grad1_dev1, grad1_dev2], [grad2_dev1, grad2_dev2]]``.
            All the output elements are `NDArray`.
        """
        assert self.binded and self.params_initialized and self.inputs_need_grad
        return self._modules[0].get_input_grads(merge_multi_context=merge_multi_context)

    def update_metric(self, eval_metric, labels):
        """Evaluates and accumulates evaluation metric on outputs of the last
        forward computation.

        Parameters
        ----------
        eval_metric : EvalMetric
        labels : list of NDArray
            Typically ``data_batch.label``.
        """
        assert self.binded and self.params_initialized

        for meta, module in zip(self._metas, self._modules):
            if SequentialModule.META_TAKE_LABELS in meta and \
                    meta[SequentialModule.META_TAKE_LABELS]:
                module.update_metric(eval_metric, labels)

    def install_monitor(self, mon):
        """Installs monitor on all executors."""
        assert self.binded
        for module in self._modules:
            module.install_monitor(mon)