"""`SequentialModule` is a container module that chains a number of modules together."""

import logging
import copy

from ..initializer import Uniform
from .base_module import BaseModule


class SequentialModule(BaseModule):
    """A SequentialModule is a container module that can chain multiple modules together.

    .. note::

        Building a computation graph with this kind of imperative container is less
        flexible and less efficient than the symbolic graph. So, this should be only
        used as a handy utility.
    """

    META_TAKE_LABELS = 'take_labels'
    META_AUTO_WIRING = 'auto_wiring'

    def __init__(self, logger=logging):
        super(SequentialModule, self).__init__(logger=logger)
        self._modules = []
        self._metas = []
        self._label_shapes = None
        self._data_shapes = None
        self._meta_keys = set([getattr(SequentialModule, x)
                               for x in dir(SequentialModule)
                               if x.startswith('META_')])

    def add(self, module, **kwargs):
        """Adds a module to the chain.

        Parameters
        ----------
        module : BaseModule
            The new module to add.
        kwargs : **keywords
            All the keyword arguments are saved as meta information
            for the added module. The currently known meta includes

            - `take_labels`: indicating whether the module expects to
              take labels when doing computation. Note any module in
              the chain can take labels (not necessarily only the top
              most one), and they all take the same labels passed
              from the original data batch for the `SequentialModule`.

        Returns
        -------
        self
            This function returns `self` to allow us to easily chain a
            series of `add` calls.

        Examples
        --------
        >>> # An example of adding two modules to a chain.
        >>> seq_mod = mx.mod.SequentialModule()
        >>> seq_mod.add(mod1)
        >>> seq_mod.add(mod2)
        """
        self._modules.append(module)

        # a sanity check to avoid typos in meta keywords
        for key in kwargs:
            assert key in self._meta_keys, 'Unknown meta "%s", a typo?' % key
        self._metas.append(kwargs)

        # after adding new modules, we are reset back to raw states; need
        # to bind, init_params, etc. again
        self.binded = False
        self.params_initialized = False
        self.optimizer_initialized = False

        return self

    @property
    def data_names(self):
        """A list of names for data required by this module."""
        if len(self._modules) > 0:
            return self._modules[0].data_names
        return []

    @property
    def output_names(self):
        """A list of names for the outputs of this module."""
        if len(self._modules) > 0:
            return self._modules[-1].output_names
        return []

    @property
    def data_shapes(self):
        """Gets data shapes.

        Returns
        -------
        list
            A list of `(name, shape)` pairs. The data shapes of the first module
            are the data shapes of a `SequentialModule`.
        """
        assert self.binded
        return self._modules[0].data_shapes

    @property
    def label_shapes(self):
        """Gets label shapes.

        Returns
        -------
        list
            A list of `(name, shape)` pairs. The return value could be `None` if
            the module does not need labels, or if the module is not bound for
            training (in this case, label information is not available).
        """
        assert self.binded
        return self._label_shapes

    @property
    def output_shapes(self):
        """Gets output shapes.

        Returns
        -------
        list
            A list of `(name, shape)` pairs. The output shapes of the last module
            are the output shapes of a `SequentialModule`.
        """
        assert self.binded
        return self._modules[-1].output_shapes

    def get_params(self):
        """Gets current parameters.

        Returns
        -------
        (arg_params, aux_params)
            A pair of dictionaries each mapping parameter names to NDArray values.
            This is a merged dictionary of all the parameters in the modules.
        """
        assert self.binded and self.params_initialized

        arg_params = dict()
        aux_params = dict()

        for module in self._modules:
            arg, aux = module.get_params()
            arg_params.update(arg)
            aux_params.update(aux)

        return (arg_params, aux_params)

    def init_params(self, initializer=Uniform(0.01), arg_params=None, aux_params=None,
                    allow_missing=False, force_init=False, allow_extra=False):
        """Initializes parameters.

        Parameters
        ----------
        initializer : Initializer
        arg_params : dict
            Default ``None``. Existing parameters. This has higher priority
            than `initializer`.
        aux_params : dict
            Default ``None``. Existing auxiliary states. This has higher priority
            than `initializer`.
        allow_missing : bool
            Allow missing values in `arg_params` and `aux_params` (if not ``None``).
            In this case, missing values will be filled with `initializer`.
        force_init : bool
            Default ``False``.
        allow_extra : boolean, optional
            Whether to allow extra parameters that are not needed by the symbol.
            If this is True, no error will be thrown when arg_params or aux_params
            contain extra parameters that are not needed by the executor.
        """
        if self.params_initialized and not force_init:
            return
        assert self.binded, 'call bind before initializing the parameters'

        for module in self._modules:
            module.init_params(initializer=initializer, arg_params=arg_params,
                               aux_params=aux_params, allow_missing=allow_missing,
                               force_init=force_init, allow_extra=allow_extra)

        # make sure we do not have duplicated parameter names
        def _check_name(known_names, new_names, modules, i):
            """Internal function to help check for duplicated names."""
            for name in new_names:
                assert not name in known_names, "Duplicated parameter names: " + \
                    ('name "%s" in layer %d (%s) is already ' % (name, i, type(modules[i]))) + \
                    ('used in layer %d (%s).' % (known_names[name],
                                                 type(modules[known_names[name]])))
                known_names[name] = i

        arg_names = dict()
        aux_names = dict()
        for i_layer, module in enumerate(self._modules):
            arg_params, aux_params = module.get_params()
            _check_name(arg_names, arg_params, self._modules, i_layer)
            _check_name(aux_names, aux_params, self._modules, i_layer)

        self.params_initialized = True

    def bind(self, data_shapes, label_shapes=None, for_training=True,
             inputs_need_grad=False, force_rebind=False, shared_module=None,
             grad_req='write'):
        """Binds the symbols to construct executors. This is necessary before one
        can perform computation with the module.

        Parameters
        ----------
        data_shapes : list of (str, tuple)
            Typically is `data_iter.provide_data`.
        label_shapes : list of (str, tuple)
            Typically is `data_iter.provide_label`.
        for_training : bool
            Default is ``True``. Whether the executors should be bound for training.
        inputs_need_grad : bool
            Default is ``False``. Whether the gradients to the input data need to be
            computed. Typically this is not needed, but it might be needed when
            implementing composition of modules.
        force_rebind : bool
            Default is ``False``. This function does nothing if the executors are
            already bound. But with this ``True``, the executors will be forced to
            rebind.
        shared_module : Module
            Default is ``None``. Currently shared modules are not supported for
            `SequentialModule`.
        grad_req : str, list of str, dict of str to str
            Requirement for gradient accumulation. Can be 'write', 'add', or 'null'
            (default to 'write'). Can be specified globally (str) or for each
            argument (list, dict).
        """
        if self.binded and not force_rebind:
            self.logger.warning('Already bound, ignoring bind()')
            return

        if inputs_need_grad:
            assert for_training is True
        assert shared_module is None, 'Shared module is not supported'
        assert len(self._modules) > 0, 'Attempting to bind an empty SequentialModule'

        self.binded = True

        # the same label shapes are used for all chained modules
        self._label_shapes = label_shapes

        my_data_shapes = data_shapes
        anybody_ever_needs_label = False
        for i_layer, module in enumerate(self._modules):
            meta = self._metas[i_layer]
            if SequentialModule.META_TAKE_LABELS in meta and \
                    meta[SequentialModule.META_TAKE_LABELS]:
                my_label_shapes = label_shapes
                anybody_ever_needs_label = True
            else:
                my_label_shapes = None

            my_inputs_need_grad = bool(inputs_need_grad or
                                       (for_training and i_layer > 0))

            if meta.get(SequentialModule.META_AUTO_WIRING, False):
                data_names = module.data_names
                assert len(data_names) == len(my_data_shapes)
                my_data_shapes = [(new_name, shape) for (new_name, (_, shape))
                                  in zip(data_names, my_data_shapes)]

            module.bind(data_shapes=my_data_shapes, label_shapes=my_label_shapes,
                        for_training=for_training, inputs_need_grad=my_inputs_need_grad,
                        force_rebind=force_rebind, shared_module=None, grad_req=grad_req)

            # the output of the previous module is the data of the next module
            my_data_shapes = module.output_shapes

        if not anybody_ever_needs_label:
            # then I do not need labels either
            self._label_shapes = None

    def init_optimizer(self, kvstore='local', optimizer='sgd',
                       optimizer_params=(('learning_rate', 0.01),),
                       force_init=False):
        """Installs and initializes optimizers.

        Parameters
        ----------
        kvstore : str or KVStore
            Default `'local'`.
        optimizer : str or Optimizer
            Default `'sgd'`.
        optimizer_params : dict
            Default ``(('learning_rate', 0.01),)``. The default value is not a
            dictionary, just to avoid a pylint warning about dangerous default values.
        force_init : bool
            Default ``False``, indicating whether we should force re-initializing
            the optimizer in the case an optimizer is already installed.
        """
        assert self.binded and self.params_initialized
        if self.optimizer_initialized and not force_init:
            self.logger.warning('optimizer already initialized, ignoring.')
            return

        for module in self._modules:
            module.init_optimizer(kvstore=kvstore, optimizer=optimizer,
                                  optimizer_params=optimizer_params,
                                  force_init=force_init)

        self.optimizer_initialized = True

    def forward(self, data_batch, is_train=None):
        """Forward computation.

        Parameters
        ----------
        data_batch : DataBatch
        is_train : bool
            Default is ``None``, in which case `is_train` is taken as
            ``self.for_training``.
        """
        assert self.binded and self.params_initialized

        # make a shallow copy, just to maintain necessary properties (if any) like
        # bucket_key, pad, etc.
        data_batch = copy.copy(data_batch)

        for i_layer, module in enumerate(self._modules):
            module.forward(data_batch, is_train=is_train)

            if i_layer + 1 == len(self._modules):
                # the last layer; no need to feed outputs forward
                break

            data_batch.data = module.get_outputs()
            if hasattr(data_batch, 'provide_data'):
                # need to update this, in case the internal module is using
                # bucketing or whatever
                data_names = [x[0] for x in module.output_shapes]
                assert len(data_names) == len(data_batch.data)
                data_batch.provide_data = [(name, x.shape) for name, x in
                                           zip(data_names, data_batch.data)]

    def backward(self, out_grads=None):
        """Backward computation."""
        assert self.binded and self.params_initialized

        for i_layer, module in reversed(list(zip(range(len(self._modules)),
                                                 self._modules))):
            module.backward(out_grads=out_grads)
            if i_layer == 0:
                break

            out_grads = module.get_input_grads()

    def update(self):
        """Updates parameters according to the installed optimizer and the gradient
        computed in the previous forward-backward cycle.
        """
        assert self.binded and self.params_initialized and self.optimizer_initialized

        for module in self._modules:
            module.update()

    def get_outputs(self, merge_multi_context=True):
        """Gets outputs from a previous forward computation.

        Parameters
        ----------
        merge_multi_context : bool
            Default is ``True``. In the case when data-parallelism is used, the
            outputs will be collected from multiple devices. A ``True`` value
            indicates that we should merge the collected results so that they look
            like from a single executor.

        Returns
        -------
        list of NDArray or list of list of NDArray
            If `merge_multi_context` is ``True``, it is like ``[out1, out2]``.
            Otherwise, it is like ``[[out1_dev1, out1_dev2], [out2_dev1, out2_dev2]]``.
            All the output elements are numpy arrays.
        """
        assert self.binded and self.params_initialized
        return self._modules[-1].get_outputs(merge_multi_context=merge_multi_context)

    def get_input_grads(self, merge_multi_context=True):
        """Gets the gradients with respect to the inputs of the module.

        Parameters
        ----------
        merge_multi_context : bool
            Default is ``True``. In the case when data-parallelism is used, the
            outputs will be collected from multiple devices. A ``True`` value
            indicates that we should merge the collected results so that they look
            like from a single executor.

        Returns
        -------
        list of NDArrays or list of list of NDArrays
            If `merge_multi_context` is ``True``, it is like ``[grad1, grad2]``.
            Otherwise, it is like
            ``[[grad1_dev1, grad1_dev2], [grad2_dev1, grad2_dev2]]``. All the
            output elements are `NDArray`.
        """
        assert self.binded and self.params_initialized and self.inputs_need_grad
        return self._modules[0].get_input_grads(merge_multi_context=merge_multi_context)

    def update_metric(self, eval_metric, labels, pre_sliced=False):
        """Evaluates and accumulates evaluation metric on outputs of the last
        forward computation.

        Parameters
        ----------
        eval_metric : EvalMetric
        labels : list of NDArray
            Typically ``data_batch.label``.
        """
        assert self.binded and self.params_initialized

        for meta, module in zip(self._metas, self._modules):
            if SequentialModule.META_TAKE_LABELS in meta and \
                    meta[SequentialModule.META_TAKE_LABELS]:
                module.update_metric(eval_metric, labels, pre_sliced)

    def install_monitor(self, mon):
        """Installs monitor on all executors."""
        assert self.binded
        for module in self._modules:
            module.install_monitor(mon)
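The chaining pattern above — a fluent `add()` that returns `self`, plus a forward pass that pipes each module's output into the next module's input — can be sketched without mxnet. The `Stage` and `Chain` names below are hypothetical stand-ins for `BaseModule` and `SequentialModule`, kept deliberately tiny to show only the wiring, not binding, parameters, or gradients:

```python
class Stage(object):
    """A toy stand-in for a BaseModule: applies one function to its input."""
    def __init__(self, fn):
        self.fn = fn

    def forward(self, data):
        return self.fn(data)


class Chain(object):
    """A toy stand-in for SequentialModule's container behavior."""
    META_TAKE_LABELS = 'take_labels'

    def __init__(self):
        self._modules = []
        self._metas = []

    def add(self, module, **kwargs):
        # sanity-check meta keywords, mirroring SequentialModule.add
        for key in kwargs:
            assert key == Chain.META_TAKE_LABELS, 'Unknown meta "%s", a typo?' % key
        self._modules.append(module)
        self._metas.append(kwargs)
        return self  # returning self enables fluent chaining: chain.add(a).add(b)

    def forward(self, data):
        # the output of each module becomes the input of the next
        for module in self._modules:
            data = module.forward(data)
        return data


chain = Chain().add(Stage(lambda x: x + 1)).add(Stage(lambda x: x * 2))
print(chain.forward(3))  # (3 + 1) * 2 = 8
```

The real `SequentialModule` does the same piping at bind time (each module's `output_shapes` becomes the next module's `data_shapes`) and at run time (`get_outputs()` of layer *i* becomes `data_batch.data` for layer *i + 1*), with `take_labels` marking which modules also receive the batch's labels.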