U Dx`@sdZddlmZddlZddlmZmZddlZddlZddl Z ddl m Z ddl m Z mZmZmZmZmZmZmZmZddlZddlZddlmZmZddlmZmZdd l m!Z!dd l"m#Z#m$Z$m%Z%m&Z&m'Z'dd l(m)Z)dd l*m+Z+dd l,m-Z-ddl.m/Z/ddl0m1Z1m2Z2m3Z3m4Z4m5Z5m6Z6m7Z7m8Z8m9Z9m:Z:ddl;mZ>m?Z?m@Z@mAZAmBZBmCZCmDZDmEZEmFZFmGZGddlHmIZImJZJmKZKddlLmMmNZOddlPmQZQmRZRddlSmTZTddlUmVZVddlWmXZXddlYmZZZm[Z[e rddl\m]Z]m^Z^m_Z_dZ`dZaddZbddZcdd ZdeQZeefd!d"d#ZgGd$d%d%ehZiGd&d'd'ehZjGd(d)d)ekZld*ZmGd+d,d,ekZnd-ZoGd.d/d/ekZpd0Zqd1Zrd2d2d3d3d4Zse>dgiZtd5Zud6Zvewd78ejxd8d9euejyd:ejxd;devezd2d3dgd:W5QRXda{d9a|defe@e>dtdudvZeee~efdwdxdyZdeeje@feee~efeeeje?fdzd{d|Ze~e@e~e~ed}d~dZe~e~e~eeje@fdddZe~dddZeje~e~ejdddZeje~e~ejdddZeje~e~e~dddZe~e~e~dddZe~edddZe~eefe~dddZe~e~dddZe#dddZGdddZdS)zY High level interface to PyTables for reading and writing pandas data structures to disk )suppressN)datetzinfo)dedent) TYPE_CHECKINGAnyDictListOptionalSequenceTupleTypeUnion)config get_option)libwriters) timezones) ArrayLike FrameOrSeriesFrameOrSeriesUnionLabelShape)import_optional_dependency) patch_pickle)PerformanceWarning)cache_readonly) ensure_objectis_categorical_dtypeis_complex_dtypeis_datetime64_dtypeis_datetime64tz_dtypeis_extension_array_dtype is_list_likeis_string_dtypeis_timedelta64_dtypeneeds_i8_conversion)array_equivalent) DataFrame DatetimeIndexIndex Int64Index MultiIndex PeriodIndexSeriesTimedeltaIndexconcatisna) Categorical DatetimeArray PeriodArray) PyTablesExprmaybe_expression) extract_array) ensure_index)stringify_path)adjoin pprint_thing)ColFileNodez0.15.2UTF-8cCst|tjr|d}|S)z* if we have bytes, decode them to unicode r?) isinstancenpbytes_decode)srE9/tmp/pip-target-zr53vnty/lib/python/pandas/io/pytables.py_ensure_decodedSs  rGcCs|dkr t}|SN)_default_encodingencodingrErErF_ensure_encodingZsrLcCst|trt|}|S)z Ensure that an index / column name is a str (python 3); otherwise they may be np.string dtype. Non-string dtypes are passed through unchanged. https://github.com/pandas-dev/pandas/issues/13492 )r@strnamerErErF _ensure_strbs rP scope_levelcsV|dt|ttfr*fdd|D}nt|r>t|d}|dksNt|rR|SdS)z Ensure that the where is a Term or a list of Term. This makes sure that we are capturing the scope of variables that are passed create the terms here with a frame_level=2 (we are 2 levels down) cs0g|](}|dk rt|r(t|ddn|qS)NrSrQ)r6Term).0ZtermlevelrErF |sz _ensure_term..rQN)r@listtupler6rTlen)whererRrErVrF _ensure_termqs   r]c@s eZdZdS)PossibleDataLossErrorN__name__ __module__ __qualname__rErErErFr^sr^c@s eZdZdS)ClosedFileErrorNr_rErErErFrcsrcc@s eZdZdS)IncompatibilityWarningNr_rErErErFrdsrdz where criteria is being ignored as this version [%s] is too old (or not-defined), read the file in and write it out to a new file to upgrade (with the copy_to method) c@s eZdZdS)AttributeConflictWarningNr_rErErErFresrezu the [%s] attribute of the existing index is [%s] which conflicts with the new [%s], resetting the attribute to None c@s eZdZdS)DuplicateWarningNr_rErErErFrfsrfz; duplicate entries in table, taking most recently appended z your performance may suffer as PyTables will pickle object types that it cannot map directly to c-types [inferred_type->%s,key->%s] [items->%s] fixedtable)frgtrhz; : boolean drop ALL nan rows when appending to a table z~ : format default format writing format, if None, then put will default to 'fixed' and append will default to 'table' zio.hdfZ dropna_tableF)Z validatordefault_formatc Cs8tdkr4ddl}|att|jjdkaW5QRXtS)Nrstrict) _table_modtablesrAttributeErrorfileZ_FILE_OPEN_POLICY!_table_file_open_policy_is_strict)rnrErErF_tabless  rraTrl) keyvaluemode complevelcomplibappendformatindex min_itemsizedropna data_columnserrorsrKc s|r$ f dd}n f dd}t|}t|trzt||||d}||W5QRXn||dS)z- store this object, close it if we opened it c s|j d S)N)rzr{r|nan_repr}r~rrKrystore r~r}rKrrzr{rtr|rrurErFszto_hdf..c s|j d S)N)rzr{r|rr~rrKr}putrrrErFrs)rvrwrxN)r9r@rMHDFStore) path_or_bufrtrurvrwrxryrzr{r|rr}r~rrKrirrErrFto_hdfs   rr)rvrstartstop chunksizec  Ks|dkrtd|d|dk r,t|dd}t|trN|jsDtd|} d} nvt|}t|tshtd zt j |} Wnt tfk rd} YnX| st d |d t|f||d | } d } zx|dkr"| }t|dkrtd|d}|ddD]}t||stdq|j}| j||||||| | dWStt tfk rt|ts|tt| W5QRXYnXdS)a Read from the store, close it if we opened it. Retrieve pandas object stored in file, optionally based on where criteria. .. warning:: Pandas uses PyTables for reading and writing HDF5 files, which allows serializing object-dtype data with pickle when using the "fixed" format. Loading pickled data received from untrusted sources can be unsafe. See: https://docs.python.org/3/library/pickle.html for more. Parameters ---------- path_or_buf : str, path object, pandas.HDFStore or file-like object Any valid string path is acceptable. The string could be a URL. Valid URL schemes include http, ftp, s3, and file. For file URLs, a host is expected. A local file could be: ``file://localhost/path/to/table.h5``. If you want to pass in a path object, pandas accepts any ``os.PathLike``. Alternatively, pandas accepts an open :class:`pandas.HDFStore` object. By file-like object, we refer to objects with a ``read()`` method, such as a file handle (e.g. via builtin ``open`` function) or ``StringIO``. key : object, optional The group identifier in the store. Can be omitted if the HDF file contains a single pandas object. mode : {'r', 'r+', 'a'}, default 'r' Mode to use when opening the file. Ignored if path_or_buf is a :class:`pandas.HDFStore`. Default is 'r'. errors : str, default 'strict' Specifies how encoding and decoding errors are to be handled. See the errors argument for :func:`open` for a full list of options. where : list, optional A list of Term (or convertible) objects. start : int, optional Row number to start selection. stop : int, optional Row number to stop selection. columns : list, optional A list of columns names to return. iterator : bool, optional Return an iterator object. chunksize : int, optional Number of rows to include in an iteration when using an iterator. **kwargs Additional keyword arguments passed to HDFStore. Returns ------- item : object The selected object. Return type depends on the object stored. See Also -------- DataFrame.to_hdf : Write a HDF file from a DataFrame. HDFStore : Low-level access to HDF files. Examples -------- >>> df = pd.DataFrame([[1, 1.0, 'a']], columns=['x', 'y', 'z']) >>> df.to_hdf('./store.h5', 'data') >>> reread = pd.read_hdf('./store.h5') )rr+rszmode zG is not allowed while performing a read. Allowed modes are r, r+ and a.NrSrQz&The HDFStore must be open for reading.Fz5Support for generic buffers has not been implemented.zFile z does not exist)rvrTrz]Dataset(s) incompatible with Pandas data types, not table, or no datasets found in HDF5 file.z?key must be provided when HDF5 file contains multiple datasets.)r\rrcolumnsiteratorr auto_close) ValueErrorr]r@ris_openOSErrorr9rMNotImplementedErrorospathexists TypeErrorFileNotFoundErrorgroupsr[_is_metadata_of _v_pathnameselectKeyErrorrroclose)rrtrvrr\rrrrrkwargsrrrrZcandidate_only_groupZgroup_to_checkrErErFread_hdfsjS           rr>)group parent_groupreturncCsF|j|jkrdS|}|jdkrB|j}||kr:|jdkr:dS|j}qdS)zDCheck if a given group is a metadata group for a given parent_group.FrSmetaT)Z_v_depthZ _v_parent_v_name)rrcurrentparentrErErFrs  rc@sbeZdZUdZeded<eed<eed<eed<d~eeeed d d Z d dZ e ddZ e ddZ edddZedddZedddZedddZeedddZed d!d"Zed d#d$Zd%d&Zd'd(Zdeeed*d+d,Zd-d.Zd/d0ZeZded1d2d3Zd4d5Ze ed d6d7Zded8d9d:Zedd;d<Z deed=d>d?Z!deeeeed@dAdBZ"deeeeeedCdDdEZ#dedFdGdHZ$dee%eeee&ee'eeffeeeeeedKdLdMZ(deddNdOZ)dee%eeee&ee'eeffeeeeeedPdQdRZ*de'dSdTdUZ+deeeeedVdWdXZ,dYdZZ-dd\d]Z.eed^dd_d`Z/ee&daddbdcZ0deeeededfdgZ1ed dhdiZ2djdkZ3eedldmdnZ4dee%eee&dadpdqdrZ5dee%eeee&ee'eeffeedsdtduZ6d^dvdwdxZ7eed^dydzd{Z8ed^dd|d}Z9dS)raa Dict-like IO interface for storing pandas objects in PyTables. Either Fixed or Table format. .. warning:: Pandas uses PyTables for reading and writing HDF5 files, which allows serializing object-dtype data with pickle when using the "fixed" format. Loading pickled data received from untrusted sources can be unsafe. See: https://docs.python.org/3/library/pickle.html for more. Parameters ---------- path : str File path to HDF5 file. mode : {'a', 'w', 'r', 'r+'}, default 'a' ``'r'`` Read-only; no data can be modified. ``'w'`` Write; a new file is created (an existing file with the same name would be deleted). ``'a'`` Append; an existing file is opened for reading and writing, and if the file does not exist it is created. ``'r+'`` It is similar to ``'a'``, but the file must already exist. complevel : int, 0-9, default None Specifies a compression level for data. A value of 0 or None disables compression. complib : {'zlib', 'lzo', 'bzip2', 'blosc'}, default 'zlib' Specifies the compression library to be used. As of v0.20.2 these additional compressors for Blosc are supported (default if no compressor specified: 'blosc:blosclz'): {'blosc:blosclz', 'blosc:lz4', 'blosc:lz4hc', 'blosc:snappy', 'blosc:zlib', 'blosc:zstd'}. Specifying a compression library which is not available issues a ValueError. fletcher32 : bool, default False If applying compression use the fletcher32 checksum. **kwargs These parameters will be passed to the PyTables open_file method. Examples -------- >>> bar = pd.DataFrame(np.random.randn(10, 4)) >>> store = pd.HDFStore('test.h5') >>> store['foo'] = bar # write to HDF5 >>> bar = store['foo'] # retrieve >>> store.close() **Create or load HDF5 file in-memory** When passing the `driver` option to the PyTables open_file method through **kwargs, the HDF5 file is loaded or created in-memory and will only be written when closed: >>> bar = pd.DataFrame(np.random.randn(10, 4)) >>> store = pd.HDFStore('test.h5', driver='H5FD_CORE') >>> store['foo'] = bar >>> store.close() # only now, data is written to disk r=_handle_mode _complevel _fletcher32rsNF)rvrw fletcher32cKsd|krtdtd}|dk r@||jjkr@td|jjd|dkrX|dk rX|jj}t||_|dkrnd}||_d|_|r|nd|_ ||_ ||_ d|_ |j fd|i|dS) Nrzz-format is not a defined argument for HDFStorernzcomplib only supports z compression.rsrrv)rrfiltersZ all_complibsZdefault_complibr9_pathrrr_complibr_filtersopen)selfrrvrwrxrrrnrErErF__init__s&  zHDFStore.__init__cCs|jSrHrrrErErF __fspath__3szHDFStore.__fspath__cCs||jdk st|jjS)z return the root node N)_check_if_openrAssertionErrorrootrrErErFr6sz HDFStore.rootcCs|jSrHrrrErErFfilename=szHDFStore.filenamertcCs ||SrH)getrrtrErErF __getitem__AszHDFStore.__getitem__cCs|||dSrHr)rrtrurErErF __setitem__DszHDFStore.__setitem__cCs ||SrH)removerrErErF __delitem__GszHDFStore.__delitem__rNc CsFz ||WSttfk r$YnXtdt|jd|ddS)z& allow attribute access to get stores 'z' object has no attribute 'N)rrrcrotyper`)rrOrErErF __getattr__Js zHDFStore.__getattr__rtrcCs8||}|dk r4|j}||ks0|dd|kr4dSdS)zx check for existence of this key can match the exact pathname or the pathnm w/o the leading '/' NrSTF)get_noder)rrtnoderOrErErF __contains__Ts  zHDFStore.__contains__rcCs t|SrH)r[rrrErErF__len__`szHDFStore.__len__cCst|j}t|d|dS)N File path:  )r;rr)rpstrrErErF__repr__cs zHDFStore.__repr__cCs|SrHrErrErErF __enter__gszHDFStore.__enter__cCs |dSrH)r)rexc_type exc_value tracebackrErErF__exit__jszHDFStore.__exit__pandas)includercCs^|dkrdd|DS|dkrJ|jdk s0tdd|jjddd DStd |d dS) a# Return a list of keys corresponding to objects stored in HDFStore. Parameters ---------- include : str, default 'pandas' When kind equals 'pandas' return pandas objects. When kind equals 'native' return native HDF5 Table objects. .. versionadded:: 1.1.0 Returns ------- list List of ABSOLUTE path-names (e.g. have the leading '/'). Raises ------ raises ValueError if kind has an illegal value rcSsg|] }|jqSrErrUnrErErFrXsz!HDFStore.keys..ZnativeNcSsg|] }|jqSrErrrErErFrXs/Table) classnamez8`include` should be either 'pandas' or 'native' but is 'r)rrrZ walk_nodesr)rrrErErFkeysms z HDFStore.keyscCs t|SrH)iterrrrErErF__iter__szHDFStore.__iter__ccs|D]}|j|fVqdS)z' iterate on key->group N)rr)rgrErErFitemss zHDFStore.items)rvcKst}|j|krR|jdkr$|dkr$n(|dkrL|jrLtd|jd|jd||_|jr`||jr|jdkrtj|j|j|j d|_ t r|jrd }t ||j |j|jf||_d S) a9 Open the file in the specified mode Parameters ---------- mode : {'a', 'w', 'r', 'r+'}, default 'a' See HDFStore docstring or tables.open_file for info about modes **kwargs These parameters will be passed to the PyTables open_file method. )rsw)rr)rzRe-opening the file [z ] with mode [z] will delete the current file!r)rzGCannot open HDF5 file, which is already opened, even in read-only mode.N)rrrrr^rrrFiltersrrrrqr open_filer)rrvrrnmsgrErErFrs.   z HDFStore.opencCs|jdk r|jd|_dS)z0 Close the PyTables file handle N)rrrrErErFrs  zHDFStore.closecCs|jdkrdSt|jjS)zF return a boolean indicating whether the file is open NF)rboolZisopenrrErErFrs zHDFStore.is_open)fsyncc Cs@|jdk r<|j|r.funcr\nrowsrrrrr)rrr]_create_storer infer_axes TableIteratorr get_result) rrtr\rrrrrrrritrErrFrs(.    zHDFStore.selectrtrrcCs8t|dd}||}t|ts(td|j|||dS)a return the selection as an Index .. warning:: Pandas uses PyTables for reading and writing HDF5 files, which allows serializing object-dtype data with pickle when using the "fixed" format. Loading pickled data received from untrusted sources can be unsafe. See: https://docs.python.org/3/library/pickle.html for more. Parameters ---------- key : str where : list of Term (or convertible) objects, optional start : integer (defaults to None), row number to start selection stop : integer (defaults to None), row number to stop selection rSrQz&can only read_coordinates with a tabler\rr)r] get_storerr@rrread_coordinates)rrtr\rrtblrErErFselect_as_coordinatesOs    zHDFStore.select_as_coordinates)rtcolumnrrcCs,||}t|tstd|j|||dS)a~ return a single column from the table. This is generally only useful to select an indexable .. warning:: Pandas uses PyTables for reading and writing HDF5 files, which allows serializing object-dtype data with pickle when using the "fixed" format. Loading pickled data received from untrusted sources can be unsafe. See: https://docs.python.org/3/library/pickle.html for more. Parameters ---------- key : str column : str The column of interest. start : int or None, default None stop : int or None, default None Raises ------ raises KeyError if the column is not found (or key is not a valid store) raises ValueError if the column can not be extracted individually (it is part of a data block) z!can only read_column with a tablerrr)rr@rr read_column)rrtrrrrrErErF select_columnos#  zHDFStore.select_column)rc  szt|dd}t|ttfr.t|dkr.|d}t|trRj||||||| dSt|ttfshtdt|sxtd|dkr|d}fdd |D |} d} t | |fgt |D]\\} } | dkrt d | d | jstd | jd | dkr | j} q| j| krtdqdd D}tdd|Ddfdd}t| ||| ||||| d }|jddS)a Retrieve pandas objects from multiple tables. .. warning:: Pandas uses PyTables for reading and writing HDF5 files, which allows serializing object-dtype data with pickle when using the "fixed" format. Loading pickled data received from untrusted sources can be unsafe. See: https://docs.python.org/3/library/pickle.html for more. Parameters ---------- keys : a list of the tables selector : the table to apply the where criteria (defaults to keys[0] if not supplied) columns : the columns I want back start : integer (defaults to None), row number to start selection stop : integer (defaults to None), row number to stop selection iterator : boolean, return an iterator, default False chunksize : nrows to include in iteration, return an iterator auto_close : bool, default False Should automatically close the store when finished. Raises ------ raises KeyError if keys or selector is not found or keys is empty raises TypeError if keys is not a list or tuple raises ValueError if the tables are not ALL THE SAME DIMENSIONS rSrQr)rtr\rrrrrrzkeys must be a list/tuplez keys must have a non-zero lengthNcsg|]}|qSrE)rrUkrrErFrXsz/HDFStore.select_as_multiple..zInvalid table []zobject [z>] is not a table, and cannot be used in all select as multiplez,all tables must have exactly the same nrows!cSsg|]}t|tr|qSrE)r@rrUxrErErFrXs cSsh|]}|jddqSr)non_index_axesrUrjrErErF sz.HDFStore.select_as_multiple..cs*fddD}t|ddS)Ncsg|]}|jdqS)r\rrrrr)rrrrrErFrXsz=HDFStore.select_as_multiple..func..F)axisverify_integrity)r0 _consolidate)rrrobjs)rrtblsrrFrsz)HDFStore.select_as_multiple..funcrT coordinates)r]r@rYrZr[rMrrrr itertoolschainzipris_tablepathnamerrr)rrr\selectorrrrrrrrDrrjr Z_tblsrrrE)rrrrrFselect_as_multiplesd+           zHDFStore.select_as_multipleTrl)rtrurwr|r~r track_timesr}cCsH|dkrtdpd}||}|j||||||||| | | | | |ddS)a< Store object in HDFStore. Parameters ---------- key : str value : {Series, DataFrame} format : 'fixed(f)|table(t)', default is 'fixed' Format to use when storing object in HDFStore. Value can be one of: ``'fixed'`` Fixed format. Fast writing/reading. Not-appendable, nor searchable. ``'table'`` Table format. Write as a PyTables Table structure which may perform worse but allow more flexible operations like searching / selecting subsets of the data. append : bool, default False This will force Table format, append the input data to the existing. data_columns : list, default None List of columns to create as data columns, or True to use all columns. See `here `__. encoding : str, default None Provide an encoding for strings. track_times : bool, default True Parameter is propagated to 'create_table' method of 'PyTables'. If set to False it enables to have the same h5 files (same hashes) independent on creation time. .. versionadded:: 1.1.0 Nio.hdf.default_formatrg) rzr{ryrxrwr|rr~rKrr!r})r_validate_format_write_to_group)rrtrurzr{ryrxrwr|rr~rKrr!r}rErErFrs&0  z HDFStore.putc Cst|dd}z||}Wntk r0Ynptk rDYn\tk r}z>|dk rftd|||}|dk r|jddWYdSW5d}~XYnXt |||r|j jddn|j std|j |||dSdS) a= Remove pandas object partially by specifying the where condition Parameters ---------- key : string Node to remove or delete rows from where : list of Term (or convertible) objects, optional start : integer (defaults to None), row number to start selection stop : integer (defaults to None), row number to stop selection Returns ------- number of rows removed (or None if not a Table) Raises ------ raises KeyError if key is not a valid store rSrQNz5trying to remove a node with a non-None where clause!T recursivez7can only remove with where on objects written as tablesr) r]rrr ExceptionrrZ _f_removecomall_nonerrdelete)rrtr\rrrDerrrrErErFrUs2   zHDFStore.remove)rtrurwr|r}r~rcCsl| dk rtd|dkr td}|dkr4tdp2d}||}|j||||||||| | | | ||||ddS)a6 Append to Table in file. Node must already exist and be Table format. Parameters ---------- key : str value : {Series, DataFrame} format : 'table' is the default Format to use when storing object in HDFStore. Value can be one of: ``'table'`` Table format. Write as a PyTables Table structure which may perform worse but allow more flexible operations like searching / selecting subsets of the data. append : bool, default True Append the input data to the existing. data_columns : list of columns, or True, default None List of columns to create as indexed data columns for on-disk queries, or True to use all columns. By default only the axes of the object are indexed. See `here `__. min_itemsize : dict of columns that specify minimum str sizes nan_rep : str to use as str nan representation chunksize : size to chunk the writing expectedrows : expected TOTAL row size of this table encoding : default None, provide an encoding for str dropna : bool, default False Do not write an ALL nan row to the store settable by the option 'io.hdf.dropna_table'. Notes ----- Does *not* check if data being appended overlaps with existing data in the table, so be careful Nz>columns is not a supported keyword in append, try data_columnszio.hdf.dropna_tabler"rh)rzaxesr{ryrxrwr|rr expectedrowsr}r~rKr)rrr#r$)rrtrurzr,r{ryrxrwrr|rrr-r}r~rKrrErErFrys68  zHDFStore.append)dc s|dk rtdt|ts"td||kr2tdtttjttt d}d} g} | D]0\} dkr| dk rtd| } qh| qh| dk rֈj |} | t| } t| | } | | || <|dkr||}|r*fdd|D}t|}|D]}||}qj||d d}| D]h\} | |krT|nd}j|d }|dk rfd d | Dnd}|j| |f||d |q>dS)a Append to multiple tables Parameters ---------- d : a dict of table_name to table_columns, None is acceptable as the values of one node (this will get all the remaining columns) value : a pandas object selector : a string that designates the indexable table; all of its columns will be designed as data_columns, unless data_columns is passed, in which case these are used data_columns : list of columns to create as data columns, or True to use all columns dropna : if evaluates to True, drop rows from all tables if any single row in each table has all NaN. Default False. Notes ----- axes parameter is currently not accepted Nztaxes is currently not accepted as a parameter to append_to_multiple; you can create the tables independently insteadzQappend_to_multiple must have a dictionary specified as the way to split the valuez=append_to_multiple requires a selector that is in passed dictrz.sz.HDFStore.append_to_multiple..r|rcsi|]\}}|kr||qSrErErUrtru)vrErF >sz/HDFStore.append_to_multiple..)r~r|)rr@dictrrYsetrangendim _AXES_MAPrrextendr, differencer*sorted get_indexertakevaluesnext intersectionlocpopreindexry)rr.rurr~r,r}rrZ remain_keyZ remain_valuesr orderedZorddZidxsZ valid_indexr{r|dcvalfilteredrE)r5rurFappend_to_multiplesZ &    zHDFStore.append_to_multiple)rtoptlevelkindcCsBt||}|dkrdSt|ts.td|j|||ddS)a Create a pytables index on the table. Parameters ---------- key : str columns : None, bool, or listlike[str] Indicate which columns to create an index on. * False : Do not create any indexes. * True : Create indexes on all columns. * None : Create indexes on all columns. * listlike : Create indexes on the given columns. optlevel : int or None, default None Optimization level, if None, pytables defaults to 6. kind : str or None, default None Kind of index, if None, pytables defaults to "medium". Raises ------ TypeError: raises if the node is not a table Nz1cannot create table index on a Fixed format store)rrLrM)rrrr@rr create_index)rrtrrLrMrDrErErFcreate_table_indexDs  zHDFStore.create_table_indexcCs<t||jdk sttdk s(tdd|jDS)z Return a list of all the top-level nodes. Each node returned is not a pandas storage object. Returns ------- list List of objects. NcSsPg|]H}t|tjjst|jddsHt|ddsHt|tjjr|jdkr|qS) pandas_typeNrh) r@rmlinkLinkgetattr_v_attrsrhrr)rUrrErErFrX{s  z#HDFStore.groups..)rrrrrrm walk_groupsrrErErFrls  zHDFStore.groupsrccst||jdk sttdk s(t|j|D]}t|jdddk rLq4g}g}|j D]B}t|jdd}|dkrt |tj j r| |jq^| |jq^|jd||fVq4dS)au Walk the pytables group hierarchy for pandas objects. This generator will yield the group path, subgroups and pandas object names for each group. Any non-pandas PyTables objects that are not a group will be ignored. The `where` group itself is listed first (preorder), then each of its child groups (following an alphanumerical order) is also traversed, following the same procedure. .. versionadded:: 0.24.0 Parameters ---------- where : str, default "/" Group where to start walking. Yields ------ path : str Full path to a group (without trailing '/'). groups : list Names (strings) of the groups contained in `path`. leaves : list Names (strings) of the pandas objects contained in `path`. NrPr)rrrrrrmrUrSrTZ _v_childrenrAr@rGroupryrrrstrip)rr\rrZleaveschildrPrErErFwalks  z HDFStore.walkr>cCs||dsd|}|jdk s(ttdk s4tz|j|j|}Wntjjk rbYdSXt |tj s|tt ||S)z; return the node with the key or None if it does not exist rN) r startswithrrrmrr exceptionsZNoSuchNodeErrorr@r>r)rrtrrErErFrs  zHDFStore.get_node GenericFixedrcCs8||}|dkr"td|d||}||S)z> return the storer object for a key, raise if not in the file Nrr)rrrr)rrtrrDrErErFrs   zHDFStore.get_storerr) propindexesrwrc  Cst|||||d} |dkr&t|}t|ttfs:|g}|D]} || } | dk r>| | krj|rj| | || } t| trd} |rdd| j D} | j | | | t | dd| j dq>| j | | | j dq>| S) a; Copy the existing store to a new file, updating in place. Parameters ---------- propindexes : bool, default True Restore indexes in copied file. keys : list, optional List of keys to include in the copy (defaults to all). overwrite : bool, default True Whether to overwrite (remove and replace) existing nodes in the new store. mode, complib, complevel, fletcher32 same as in HDFStore.__init__ Returns ------- open file handle of the new store )rvrxrwrNFcSsg|]}|jr|jqSrE) is_indexedrOrUrsrErErFrXsz!HDFStore.copy..r~)r{r~rKrJ)rrYrr@rZrrrrr,ryrSrKr)rrprvr^rrxrwr overwriteZ new_storer rDdatar{rErErFcopys>      z HDFStore.copyc Cs t|j}t|d|d}|jrt|}t|rg}g}|D]}z<||}|dk r|t|j pj||t|p|dWqDt k rYqDt k r}z(||t|} |d| dW5d}~XYqDXqD|t d||7}n|d7}n|d 7}|S) zg Print detailed information on the store. Returns ------- str rrNzinvalid_HDFStore nodez[invalid_HDFStore node: r  EmptyzFile is CLOSED) r;rrrr>rr[rryrrr'r:) rroutputZlkeysrrAr rDZdetailZdstrrErErFinfos.    & z HDFStore.infocCs|jst|jddS)Nz file is not open!)rrcrrrErErFr;szHDFStore._check_if_open)rzrc CsJzt|}Wn4tk rD}ztd|d|W5d}~XYnX|S)z validate / deprecate formats z#invalid HDFStore format specified [r N) _FORMAT_MAPlowerrr)rrzr+rErErFr#?s $zHDFStore._validate_formatr?)rurKrrc s"dk rtttfstdfdd}ttjdd}ttjdd}|dkrʈdkrttdk stt tddsttj j rd}d }qtd n(td td i} | t }dkr|d 7}d|kr*t td} z | |} Wn.tk r} z|d| W5d} ~ XYnX| |||dS|dkrĈdk r|dkrtdd} | dk r| jdkrpd}n| jdkrd}nB|dkrtdd} | dk r| jdkrd}n| jdkrd}ttttttd}z ||} Wn.tk r} z|d| W5d} ~ XYnX| |||dS)z$ return a suitable class to operate Nz(value must be None, Series, or DataFramec s$td|ddtdS)Nz(cannot properly create the storer for: [z ] [group->,value->z ,format->)rr)rjrzrrurErFerrorWsz&HDFStore._create_storer..errorrP table_typerh frame_table generic_tablezKcannot create a storer if the object is not existing nor a value are passedseriesframeZ_table)rprq _STORER_MAPrKr series_tabler{rSappendable_seriesappendable_multiseriesappendable_frameappendable_multiframe)rorurvrwrxworm _TABLE_MAP)r@r.r(rrGrSrTrrrmrrhrr SeriesFixed FrameFixedrnlevels GenericTableAppendableSeriesTableAppendableMultiSeriesTableAppendableFrameTableAppendableMultiFrameTable WORMTable)rrrzrurKrrlptttZ _TYPE_MAPrrclsr+r{rzrErkrFrIsr                    zHDFStore._create_storer)rtrurwr|rr!cCst|ddr|dks|rdS|||}|j|||||d}|rr|jrZ|jrb|dkrb|jrbtd|jsz|n||js|rtd|j|||||| | | | | |||d t|t r|r|j |ddS) NemptyrhrsrgzCan only append to Tablesz0Compression not supported on Fixed format stores) objr,ryrxrwrr|rr-r}rr~r!)r) rS_identify_grouprr is_existsrset_object_infowriter@rrN)rrtrurzr,r{ryrxrwrr|rr-r}rr~rKrr!rrDrErErFr$s:   zHDFStore._write_to_grouprcCs||}||SrH)rrr)rrrDrErErFrs zHDFStore._read_group)rtryrcCsN||}|jdk st|dk r8|s8|jj|ddd}|dkrJ||}|S)z@Identify HDF5 group based on key, delete/create group if needed.NTr%)rrr remove_node_create_nodes_and_group)rrtryrrErErFrs   zHDFStore._identify_groupcCsv|jdk st|d}d}|D]P}t|s.q |}|dsD|d7}||7}||}|dkrl|j||}|}q |S)z,Create nodes from key and return group name.Nr)rrsplitr[endswithrZ create_group)rrtpathsrpnew_pathrrErErFrs   z HDFStore._create_nodes_and_group)rsNNF)r)rs)F)NNNNFNF)NNN)NN)NNNNNFNF) NTFNNNNNNrlTF)NNN)NNTTNNNNNNNNNNrl)NNF)NNN)r)rTNNNFT)NNr?rl)NTFNNNNNNFNNNrlT):r`rarb__doc__r __annotations__rMintrrrpropertyrrrrrrrrrrrr rrr iteritemsrrrrrrrrr rrrrrryrKrOrrYrrrcrgrr#rr$rrrrErErErFrs A  "    "- N $ +  ~  D=  Z d ( 2  =*  _ >rc@sreZdZUdZeeed<eed<eded<deede eee dd d Z d d Z d dZ de dddZ dS)raa Define the iteration interface on a table Parameters ---------- store : HDFStore s : the referred storer func : the function to execute the query where : the where of the query nrows : the rows to iterate on start : the passed start value (default is None) stop : the passed stop value (default is None) iterator : bool, default False Whether to use the default iterator. chunksize : the passed chunking value (default is 100000) auto_close : bool, default False Whether to automatically close the store at the end of iteration. rrr\rDNF)rrDrrrc Cs||_||_||_||_|jjrN|dkr,d}|dkr8d}|dkrD|}t||}||_||_||_d|_ |sr| dk r| dkr~d} t | |_ nd|_ | |_ dS)Nr順) rrDrr\rminrrrrrrr) rrrDrr\rrrrrrrErErFr&s,    zTableIterator.__init__ccsv|j}|jdkrtd||jkrjt||j|j}|dd|j||}|}|dkst|sbq|Vq|dS)Nz*Cannot iterate until get_result is called.) rrrrrrrr[r)rrrrurErErFrPs  zTableIterator.__iter__cCs|jr|jdSrH)rrrrrErErFr`szTableIterator.closercCs|jdk r4t|jtstd|jj|jd|_|S|rft|jtsLtd|jj|j|j|j d}n|j}| |j|j |}| |S)Nz0can only use an iterator or chunksize on a table)r\z$can only read_coordinates on a tabler) rr@rDrrrr\rrrrr)rrr\resultsrErErFrds"   zTableIterator.get_result)NNFNF)F)r`rarbrr rrrrrrrrrrErErErFrs&    *rc @s|eZdZUdZdZdZdddgZeed<eed<dDee ed d d Z e e d d dZ e ed ddZe dddZed ddZeedddZed ddZe ed ddZejeedddZd d!Ze d"d#Ze d$d%Ze d&d'Ze d(d)Zd*d+ZdEd,d-Zd.d/Z d0ed1d2d3Z!dFd4d5Z"ed6d7d8Z#d9d:Z$d;d<Z%d=d>Z&d0d?d@dAZ'd0d?dBdCZ(dS)GIndexCola an index column description class Parameters ---------- axis : axis which I reference values : the ndarray like converted values kind : a string description of this type typ : the pytables type pos : the position in the pytables Tfreqtz index_namerOcnameN)rOrcCst|tstd||_||_||_||_|p0||_||_||_ ||_ | |_ | |_ | |_ | |_| |_||_|dk r|||t|jtstt|jtstdS)Nz`name` must be a str.)r@rMrrArMtyprOrrposrrrrGrhrmetadataset_posr)rrOrArMrrrrrrrrGrhrrrErErFrs(   zIndexCol.__init__rcCs|jjSrH)ritemsizerrErErFrszIndexCol.itemsizecCs |jdS)N_kindrNrrErErF kind_attrszIndexCol.kind_attr)rcCs$||_|dk r |jdk r ||j_dS)z. set the position of this column in the Table N)rrZ_v_pos)rrrErErFrszIndexCol.set_posc CsFttt|j|j|j|j|jf}dddt dddddg|DS) N,css |]\}}|d|VqdSz->NrEr4rErErFr2sz$IndexCol.__repr__..rOrrrrM) rZmapr;rOrrrrMjoinrrtemprErErFrszIndexCol.__repr__otherrcstfdddDS) compare 2 col items c3s&|]}t|dt|dkVqdSrHrSr`rrrErFr2sz"IndexCol.__eq__..)rOrrrr/rrrErrF__eq__szIndexCol.__eq__cCs || SrH)rrrErErF__ne__szIndexCol.__ne__cCs"t|jdsdSt|jj|jjS)z' return whether I am an indexed column r1F)hasattrrhrSr1rr_rrErErFr_s zIndexCol.is_indexedrArKrcCst|tjstt||jjdk r.||j}t|j }t ||||}i}t|j |d<|j dk rpt|j |d<zt |f|}Wn0tk rd|krd|d<t |f|}YnXt||j}||fS)zV Convert the data from this selection to the appropriate pandas type. NrOr)r@rAndarrayrrdtypefieldsrrGrM_maybe_convertrrr*r_set_tzr)rrArrKrval_kindrZ new_pd_indexrErErFconverts"     zIndexCol.convertcCs|jS)z return the valuesrArrErErF take_data szIndexCol.take_datacCs|jjSrH)rhrTrrErErFattrsszIndexCol.attrscCs|jjSrHrh descriptionrrErErFrszIndexCol.descriptioncCst|j|jdS)z# return my current col description N)rSrrrrErErFcolsz IndexCol.colcCs|jSz return my cython values rrrErErFcvaluesszIndexCol.cvaluescCs t|jSrH)rrArrErErFr!szIndexCol.__iter__cCsPt|jdkrLt|tr$||j}|dk rL|jj|krLtj ||j d|_dS)z maybe set a string col itemsize: min_itemsize can be an integer or a dict with this columns name with an integer size stringN)rr) rGrMr@r7rrOrrrr StringColr)rr|rErErFmaybe_set_size$s   zIndexCol.maybe_set_sizecCsdSrHrErrErErFvalidate_names1szIndexCol.validate_namesAppendableTable)handlerrycCs:|j|_||||||||dSrH)rh validate_col validate_attrvalidate_metadatawrite_metadataset_attr)rrryrErErFvalidate_and_set4s    zIndexCol.validate_and_setcCs^t|jdkrZ|j}|dk rZ|dkr*|j}|j|krTtd|d|jd|jd|jSdS)z< validate this column: return the compared against itemsize rNz#Trying to store a string with len [z] in [z)] column but this column has a limit of [zC]! Consider using min_itemsize to preset the sizes on these columns)rGrMrrrr)rrcrErErFr<s zIndexCol.validate_colrcCsB|r>t|j|jd}|dk r>||jkr>td|d|jddS)Nzincompatible kind in col [ - r )rSrrrMr)rryZ existing_kindrErErFrOs zIndexCol.validate_attrc Cs|jD]}t||d}||ji}||}||kr|dk r||kr|dkrt|||f}tj|tddd||<t ||dqt d|jd|d|d|d q|dk s|dk r|||<qdS) z set/update the info for this indexable with the key/value if there is a conflict raise/warn as needed N)rr stacklevelzinvalid info for [z] for [z], existing_value [z] conflicts with new value [r ) _info_fieldsrS setdefaultrOrattribute_conflict_docwarningswarnresetattrr)rrgrtruidxZexisting_valuewsrErErF update_infoXs   zIndexCol.update_infocCs$||j}|dk r |j|dS)z# set my state from the passed info N)rrO__dict__update)rrgrrErErFset_infows zIndexCol.set_infocCst|j|j|jdS)z set the kind for this column N)rrrrMrrErErFr}szIndexCol.set_attr)rcCsB|jdkr>|j}||j}|dk r>|dk r>t||s>tddS)z< validate that kind=category does not change the categories categoryNzEcannot append a categorical with different categories to the existing)rr read_metadatarr'r)rrZ new_metadataZ cur_metadatarErErFrs  zIndexCol.validate_metadatacCs|jdk r||j|jdS)z set the meta data N)rrr)rrrErErFrs zIndexCol.write_metadata) NNNNNNNNNNNNN)N)N))r`rarbris_an_indexableis_data_indexablerrMrr rrrrrrrrrrrr_rArrrrrrrrrrrrrrrrrrrErErErFr~sl    ,         rc@s>eZdZdZeedddZeje e dddZ dd Z d S) GenericIndexColz< an index which is not represented in the data of the table rcCsdSNFrErrErErFr_szGenericIndexCol.is_indexedrcCs2t|tjstt|ttt|}||fS)z Convert the data from this selection to the appropriate pandas type. Parameters ---------- values : np.ndarray nan_rep : str encoding : str errors : str )r@rArrrr+aranger[)rrArrKrrErErFrs zGenericIndexCol.convertcCsdSrHrErrErErFrszGenericIndexCol.set_attrN) r`rarbrrrr_rArrMrrrErErErFrs rc s0eZdZdZdZdZddgZd2edfdd Ze ed d d Z e ed d d Z ed ddZ e edddZedddZddZeeddddZeddZeeedddd Zeeddd!d"Zed#d$Zed%d&Ze d'd(Ze d)d*Zd+d,Zejeed-d.d/Z d0d1Z!Z"S)3DataCola3 a data holding column, by definition this is not indexable Parameters ---------- data : the actual data cname : the column name in the table to hold the data (typically values) meta : a string description of the metadata metadata : the actual metadata FrrGNrNc s2tj||||||||| | | d | |_| |_dS)N) rOrArMrrrrrGrhrr)superrrrb)rrOrArMrrrrrGrhrrrrb __class__rErFrs zDataCol.__init__rcCs |jdS)N_dtyperNrrErErF dtype_attrszDataCol.dtype_attrcCs |jdS)N_metarNrrErErF meta_attrszDataCol.meta_attrc CsFttt|j|j|j|j|jf}dddt dddddg|DS) Nrcss |]\}}|d|VqdSrrEr4rErErFr2sz#DataCol.__repr__..rOrrrMshape) rZrr;rOrrrMrrrrrErErFrszDataCol.__repr__rcstfdddDS)rc3s&|]}t|dt|dkVqdSrHrr`rrErFr2sz!DataCol.__eq__..)rOrrrrrrErrFrszDataCol.__eq__rbcCs@|dk s t|jdkstt|\}}||_||_t||_dSrH)rr_get_data_and_dtype_namerb_dtype_to_kindrM)rrb dtype_namerErErFset_datas   zDataCol.set_datacCs|jS)z return the data rrrErErFr szDataCol.take_datar<)rArcCs|j}|j}|j}|jdkr&d|jf}t|trJ|j}|j||jj d}ntt |sZt |rf| |}nXt |rz||}nDt|rtj||dd}n&t|r|||}n|j||j d}|S)zW Get an appropriately typed and shaped pytables.Col object for values. rSrMrrr)rrrr:sizer@r2codes get_atom_datarOr r!get_atom_datetime64r%get_atom_timedelta64rrrZ ComplexColr$get_atom_string)rrArrrratomrErErF _get_atom s$     zDataCol._get_atomcCstj||ddS)NrrrrrrrrrErErFr+ szDataCol.get_atom_stringrMrcCsR|dr$|dd}d|d}n"|dr4d}n|}|d}tt|S)z2 return the PyTables column class for this column uintNZUIntr<periodInt64Col)rZ capitalizerSrr)rrMZk4Zcol_nameZkcaprErErFget_atom_coltype/ s    zDataCol.get_atom_coltypecCs|j|d|ddS)Nrrrr rrrMrErErFr> szDataCol.get_atom_datacCstj|ddSNrrrrr rrrErErFrB szDataCol.get_atom_datetime64cCstj|ddSrrrrErErFrF szDataCol.get_atom_timedelta64cCst|jddS)Nr)rSrbrrErErFrJ sz DataCol.shapecCs|jSrrrrErErFrN szDataCol.cvaluescCs`|r\t|j|jd}|dk r2|t|jkr2tdt|j|jd}|dk r\||jkr\tddS)zAvalidate that we have the same order as the existing & same dtypeNz4appended items do not match existing items in table!z@appended items dtype do not match existing items dtype in table!)rSrrrYrArrr)rryZexisting_fieldsZexisting_dtyperErErFrS szDataCol.validate_attrrcCst|tjstt||jjdk r.||j}|jdk s.cSsg|]}t|qSrEr fromtimestamprrErErFrX sr) categoriesrGFrcOrrrKr)#r@rArrrrrrrrrrMrGrrrGrrasarrayobjectrravelr*float64r1anyastyperZcumsum_valuesr2Z from_codesr_unconvert_string_arrayrA)rrArrKr convertedrrMrrrGrrr rmaskrErErFr` st                zDataCol.convertcCsHt|j|j|jt|j|j|j|jdk s2tt|j|j|jdS)z set the data for this column N) rrrrArrrrrrrErErFr szDataCol.set_attr) NNNNNNNNNNNN)#r`rarbrrrrrMrrrrrrrrrrr classmethodrrr r rrrrrrrArrr __classcell__rErErrFrs\          erc@sTeZdZdZdZddZeddZeeddd d Z ed d Z ed dZ dS)DataIndexableColz- represent a data column that can be indexed TcCst|jstddS)N-cannot have non-object label DataIndexableCol)r*rA is_objectrrrErErFr szDataIndexableCol.validate_namescCstj|dS)N)rrrrErErFr sz DataIndexableCol.get_atom_stringr<rcCs|j|dS)NrrrrErErFr szDataIndexableCol.get_atom_datacCs tSrHrrrErErFr sz$DataIndexableCol.get_atom_datetime64cCs tSrHrrrErErFr sz%DataIndexableCol.get_atom_timedelta64N) r`rarbrrrr.rrMrrrrErErErFr0 s  r0c@seZdZdZdS)GenericDataIndexableColz* represent a generic pytables data column N)r`rarbrrErErErFr3 sr3c@seZdZUdZeed<dZeed<eeed<e ed<eed<e ed<d ed <eed <d Z dEe d eedddZ e edddZe ee e e fdddZe ddZedddZddZddZe dd Ze d!d"Ze d#d$Ze d%d&Ze e dd'd(Ze edd)d*Ze d+d,Zd-d.Zd/d0Ze d1d2Ze edd3d4Z e d5d6Z!d7d8Z"dFd:d;Z#dd?d@Z&dAdBZ'dHe%e e%e d>dCdDZ(d9S)IFixedz represent an object in my store facilitate read/write of various types of objects this is an abstract base class Parameters ---------- parent : HDFStore group : Node The group node where the table resides. pandas_kindrg format_typeobj_typer:rKrr>rrFr?rl)rrrKrcCsZt|tstt|tdk s"tt|tjs:tt|||_||_t||_ ||_ dSrH) r@rrrrmr>rrrLrKr)rrrrKrrErErFr s  zFixed.__init__rcCs*|jddko(|jddko(|jddkS)NrrS )versionrrErErFis_old_version szFixed.is_old_versioncCsbtt|jjdd}z0tdd|dD}t|dkrB|d}Wntk r\d}YnX|S) z compute and set our version pandas_versionNcss|]}t|VqdSrHrr rErErFr2 sz Fixed.version...r9r)rrr)rGrSrrTrZrr[ro)rr:rErErFr: s   z Fixed.versioncCstt|jjddS)NrP)rGrSrrTrrErErFrP$ szFixed.pandas_typecCs^||j}|dk rXt|ttfrDddd|D}d|d}|jdd|d S|jS) * return a pretty representation of myself Nrcss|]}t|VqdSrHr;r rErErFr2. sz!Fixed.__repr__..[r 12.12z (shape->))rrr@rYrZrrP)rrDZjshaperErErFr( s zFixed.__repr__cCst|j|j_tt|j_dS)z set my pandas type & version N)rMr5rrP_versionr<rrErErFr3 szFixed.set_object_infocCst|}|SrHr!)rZnew_selfrErErFrc8 s z Fixed.copycCs|jSrH)rrrErErFr< sz Fixed.shapecCs|jjSrHrrrrErErFr@ szFixed.pathnamecCs|jjSrH)rrrrErErFrD sz Fixed._handlecCs|jjSrH)rrrrErErFrH szFixed._filterscCs|jjSrH)rrrrErErFrL szFixed._complevelcCs|jjSrH)rrrrErErFrP szFixed._fletcher32cCs|jjSrH)rrTrrErErFrT sz Fixed.attrscCsdSz set our object attributes NrErrErErF set_attrsX szFixed.set_attrscCsdS)z get our object attributes NrErrErErF get_attrs\ szFixed.get_attrscCs|jS)z return my storable rrrErErFstorable` szFixed.storablecCsdSrrErrErErFre szFixed.is_existscCst|jddS)Nr)rSrIrrErErFri sz Fixed.nrowscCs|dkr dSdS)z' validate against an existing storable NTrErrErErFvalidatem szFixed.validateNcCsdS)- are we trying to operate on an old version? TrE)rr\rErErFvalidate_versions szFixed.validate_versioncCs|j}|dkrdS|dS)zr infer the axes of my storer return a boolean indicating if we have a valid storer or not NFT)rIrH)rrDrErErFrw s zFixed.infer_axesrrcCs tddS)Nz>cannot read on an abstract storer: subclasses should implementrrr\rrrrErErFr sz Fixed.readcKs tddS)Nz?cannot write on an abstract storer: subclasses should implementrNrrrErErFr sz Fixed.writecCs0t|||r$|jj|jdddStddS)zs support fully deleting the node in its entirety (only) - where specification must be None Tr%Nz#cannot delete on an abstract storer)r(r)rrrr)rr\rrrErErFr* sz Fixed.delete)r?rl)N)NNNN)NNN))r`rarbrrMrr6r rrrrrrrr;r r:rPrrrcrrrrrrrrGrHrIrrrJrLrr rrr*rErErErFr4 s                  r4c@sNeZdZUdZedediZddeDZgZ e e e d<e ddd Z d d Zd d ZddZeedddZddZddZddZd0e eeeedddZd1e eeeeedddZe edd d!Ze edd"d#Zd2e eeeeedd$d%Zd3d&eeeeed'd(d)Z e e!d*d+d,Z"d4e e#eed-d.d/Z$dS)5r]z a generified fixed version datetimer cCsi|]\}}||qSrErE)rUr r5rErErFr6 szGenericFixed. attributesrcCs|j|dS)N)_index_type_mapr)rrrErErF_class_to_alias szGenericFixed._class_to_aliascCst|tr|S|j|tSrH)r@r_reverse_index_maprr*)raliasrErErF_alias_to_class s zGenericFixed._alias_to_classcCs0|tkrddd}|S|tkr,ddd}|S|S)NcSs:tj|j|d}tj|dd}|dk r6|d|}|S)NrrNUTC)r3 _simple_newrAr) tz_localize tz_convert)rArrZdtaresultrErErFri s z*GenericFixed._get_index_factory..fcSstj||d}tj|ddS)NrYrN)r4r[r-)rArrZparrrErErFri s)NN)NN)r)r-)rklassrirErErF_get_index_factory s  zGenericFixed._get_index_factorycCs$|dk rtd|dk r tddS)zE raise if any keywords are passed which are not-None Nzqcannot pass a column specification when reading a Fixed format store. this store must be selected in its entiretyzucannot pass a where specification when reading from a Fixed format store. this store must be selected in its entirety)r)rrr\rErErF validate_read szGenericFixed.validate_readcCsdS)NTrErrErErFr szGenericFixed.is_existscCs|j|j_|j|j_dSrF)rKrrrrErErFrG s zGenericFixed.set_attrsc CsRtt|jdd|_tt|jdd|_|jD]}t||tt|j|dq.dS) retrieve our attributes rKNrrl)rLrSrrKrGrrRr)rrrErErFrH s zGenericFixed.get_attrscKs |dSrH)rGrrrrErErFr szGenericFixed.writeNrc Csddl}t|j|}|j}t|dd}t||jrD|d||}nztt|dd} t|dd} | dk rxtj| | d}n |||}| dkrt|d d} t || d d }n| d krtj |d d}|r|j S|SdS)z4 read an array for the specified node (off of group rN transposedF value_typerrrrTrrr) rnrSrrTr@ZVLArrayrGrArrr$T) rrtrrrnrrrdretrrrrErErF read_array s&      zGenericFixed.read_array)rtrrrcCshtt|j|d}|dkr.|j|||dS|dkrVt|j|}|j|||d}|Std|dS)N_varietymultirMregularzunrecognized index variety: )rGrSrread_multi_indexrread_index_noder)rrtrrZvarietyrr{rErErF read_index s zGenericFixed.read_index)rtr{cCst|tr,t|j|dd|||nt|j|ddtd||j|j}|||j t |j |}|j |j _ |j|j _t|ttfr|t||j _t|tttfr|j|j _t|tr|jdk rt|j|j _dS)Nrirjrkr{)r@r,rrwrite_multi_index_convert_indexrKr write_arrayrArSrrMrTrOr)r-rUr index_classr/rr_get_tz)rrtr{r,rrErErF write_index s     zGenericFixed.write_indexc Cst|j|d|jtt|j|j|jD]\}\}}}t|rJt d|d|}t |||j |j }| ||jt|j|} |j| j_|| j_t| j|d|||d|} | | |q,dS)N_nlevelsz=Saving a MultiIndex with an extension dtype is not supported._level_name_label)rrr} enumeraterlevelsrnamesr"rrprKrrqrArSrrMrTrO) rrtr{ilev level_codesrO level_keyZ conv_levelr label_keyrErErFro2 s"  zGenericFixed.write_multi_indexcCst|j|d}g}g}g}t|D]l}|d|} t|j| } |j| ||d} || || j|d|} |j| ||d} || q&t|||ddS)NrurvrMrxT)rzrr{r) rSrr9rrmryrOrhr,)rrtrrr}rzrr{r|rrr}rr~rErErFrlK s&     zGenericFixed.read_multi_indexr>)rrrrc Cs>|||}d|jkr>t|jjdkr>tj|jj|jjd}t|jj}d}d|jkrlt|jj }t|}| tt |jdd}| |}i} d|jkr|jd| d<d|jkrt |jdtr|jdd | d<n|jd| d<|d kr|t|||j|jd fd ti| } n|t|||j|jd f| } || _ | S) NrrrrOrrrSrrzutf-8rrsr)rTrAprodrrrerGrMrPrOrXrSr`r@bytesrC_unconvert_indexrKrr%) rrrrrbrMrOrrfactoryrr{rErErFrmb sX        zGenericFixed.read_index_node)rtrucCsJtd|j}|j|j||t|j|}t|j|j _ |j |j _ dS)z write a 0-len array rSN) rArr:r create_arrayrrSrMrrTrer)rrtruZarrrrErErFwrite_array_empty s  zGenericFixed.write_array_empty)rtrrc Cs2t|dd}||jkr&|j|j||jdk}d}t|jrFtd|s^t|dr^|j }d}d}|j dk rt t t j|j}W5QRX|dk r|s|jj|j|||j|j d}||dd<n |||nH|jjtjkrHtj|dd} |rn*| d krnt| ||f} tj| td d |j|j|t } | |nt|jr~|j |j||!d d t"|j|j#_$nt%|jr|j |j||j&t"|j|} t'|j(| j#_(d | j#_$n\t)|jr|j |j||!d dt"|j|j#_$n&|r |||n|j |j|||t"|j|j#_*dS)NT)Z extract_numpyrFz]Cannot store a category dtype in a HDF5 dataset that uses format="fixed". Use format="table".rf)rZskipnarri8rr)+r7rrrrrrrrrfrrrrrZAtomZ from_dtypeZ create_carrayrrrrAZobject_r infer_dtypeperformance_docrrrZcreate_vlarray ObjectAtomryr rviewrSrTrer!asi8rsrr%rd) rrtrrruZ empty_arrayrdrca inferred_typerZvlarrrrErErFrq sj              zGenericFixed.write_array)NN)NN)NN)NN)N)%r`rarbrr)r-rTrrVrRr rMrrUrXr`rarrrrGrHrr rrhr*rnrtr,rorlrmrrrrqrErErErFr] s`   %   4 r]csVeZdZUdZdgZeed<eddZd e e e e dddZ fd d Z Z S) r{rprOc Cs0zt|jjfWSttfk r*YdSXdSrH)r[rrArrorrErErFr szSeriesFixed.shapeNrMcCs<||||jd||d}|jd||d}t|||jdS)Nr{rMrA)r{rO)rarnrhr.rO)rr\rrrr{rArErErFr s zSeriesFixed.readc s8tj|f||d|j|d||j|j_dS)Nr{rA)rrrtr{rqrOrrcrrErFr s zSeriesFixed.write)NNNN)r`rarbr5rRrrrrr rrrr/rErErrFr{ s   r{cs^eZdZUddgZeed<eeedddZ d eeeeddd Z fd d Z Z S) BlockManagerFixedr:nblocksrcCsz|j}d}t|jD]8}t|jd|d}t|dd}|dk r||d7}q|jj}t|dd}|dk rt|d|d}ng}|||WStk rYdSXdS)Nrblock_itemsrrS) r:r9rrSrZ block0_valuesrYryro)rr:rr|rrrErErFr s"   zBlockManagerFixed.shapeNrMcCs||||d}g}t|jD]<}||kr<||fnd\}} |jd||| d} || q(|d} g} t|jD]Z}|d|d} |jd|d|| d}| | | }t |j ||dd }| |q|t | dkrt | dd }|j| d d }|St |d|dd S) Nr)NNrrMrrr*rSrr{r3F)rrc)rar7Z_get_block_manager_axisr9r:rnryrrhr?r(rfr[r0rF)rr\rrrZ select_axisr,r|rraxrdfs blk_itemsrAdfoutrErErFr1 s(    zBlockManagerFixed.readc stj|f||j}|s&|}|j|j_t|jD]0\}}|dkrX|j sXt d| d||q:t |j |j_t|j D]D\}}|j|j}|jd|d|j|d| d|d|qdS)Nrz/Columns index has to be unique for fixed formatrrr*)rr)rr_mgrZis_consolidatedZ consolidater:rryr,Z is_uniquerrtr[blocksrrr@mgr_locsrqrA)rrrrbr|rblkrrrErFrV s zBlockManagerFixed.write)NNNN) r`rarbrRrrrr rrrrr/rErErrFr s  %rc@seZdZdZeZdS)r|rqN)r`rarbr5r(r7rErErErFr|k sr|cseZdZUdZdZdZeed<eed<dZe e e e fed<dZ e eed <e ee efed <e eed <e ed <e ed <eed<dnededfdd ZeedddZedddZedddZddZeeddd Zeeee e fd!d"d#Zee dd$d%Zeedd&d'Z ed(d)Z!ed*d+Z"ed,d-Z#ed.d/Z$ed0d1Z%ee dd2d3Z&eedd4d5Z'ed6d7Z(eeefdd8d9Z)d:d;Z*e eddd?d@Z,ee-j.dAdBdCZ/edDdEdFZ0dGdHZ1dIdJZ2dodKdLZ3dMdNZ4e5dOdPZ6dpe7edQdRdSZ8dqe7e e7e e ee9e9fdTdUdVZ:e;edWdXdYZe?d_d`Z@dsdadbdcddZAe7e ee7e eeefdedfdgZBdte7e e7e dhdidjZCduee7e e7e dkdldmZDZES)vra represent a table: facilitate read/write of various types of tables Attrs in Table Node ------------------- These are attributes that are store in the main table node, they are necessary to recreate these tables when read back in. index_axes : a list of tuples of the (original indexing axis and index column) non_index_axes: a list of tuples of the (original index axis and columns on a non-indexing axis) values_axes : a list of the columns which comprise the data of this table data_columns : a list of the columns that we are allowing indexing (these become single columns in values_axes), or True to force all columns nan_rep : the string to use for nan representations for string objects levels : the names of levels metadata : the names of the metadata columns Z wide_tablerhr6rmrSrzT index_axesr values_axesr~rrgNrlr>)rrrc sPtj||||d|pg|_|p$g|_|p.g|_|p8g|_| pBi|_| |_dS)Nrs)rrrrrr~rgr) rrrrKrrrrr~rgrrrErFr s      zTable.__init__rcCs|jddS)N_r)rmrrrErErFtable_type_short szTable.table_type_shortc Cs|t|jrd|jnd}d|d}d}|jrZddd|jD}d|d}dd d|jD}|jd |d |jd |j d |j d|d|d S)r?rrSz,dc->[r r>css|]}t|VqdSrHrMr rErErFr2 sz!Table.__repr__..rAcss|] }|jVqdSrHrNr`rErErFr2 srBz (typ->z,nrows->z,ncols->z ,indexers->[rC) rr[r~rr;r:rrPrrncols)rZjdcrHverZjverZ jindex_axesrErErFr s  4zTable.__repr__)rcCs"|jD]}||jkr|SqdS)z return the axis for c N)r,rO)rrrsrErErFr s   zTable.__getitem__c Cs|dkr dS|j|jkr2td|jd|jddD]~}t||d}t||d}||kr6t|D]4\}}||}||krbtd|d|d|dqbtd|d|d|dq6dS) z$ validate against an existing table Nz'incompatible table_type with existing [rr )rrrzinvalid combination of [z] on appending data [z] vs current table [)rmrrSryrr')rrrsvovr|saxZoaxrErErFrJ s&   zTable.validatecCs t|jtS)z@the levels attribute is 1 or a list in the case of a multi-index)r@rzrYrrErErFis_multi_index szTable.is_multi_index)rrc Csfddt|jjD}z |}Wn,tk rN}ztd|W5d}~XYnXt|ts^t||fS)ze validate that we can store the multi-index; reset and return the new object cSs&g|]\}}|dk r|nd|qS)NZlevel_rE)rUr|lrErErFrX sz-Table.validate_multiindex..zBduplicate names/columns in the multi-index when storing as a tableN)ryr{r{Z reset_indexrr@r(r)rrrzZ reset_objr+rErErFvalidate_multiindex s  zTable.validate_multiindexcCstdd|jDS)z/ based on our axes, compute the expected nrows cSsg|]}|jjdqSr)rrrUr|rErErFrX sz(Table.nrows_expected..)rArrrrErErFnrows_expected szTable.nrows_expectedcCs d|jkS)z has this table been created rhrrrErErFr szTable.is_existscCst|jddSNrhrSrrrErErFrI szTable.storablecCs|jS)z. return the table group (this is my storable) )rIrrErErFrh sz Table.tablecCs|jjSrH)rhrrrErErFr sz Table.dtypecCs|jjSrHrrrErErFr szTable.descriptioncCst|j|jSrH)rrrrrrErErFr, sz Table.axescCstdd|jDS)z0 the number of total columns in the values axes css|]}t|jVqdSrH)r[rAr`rErErFr2! szTable.ncols..)sumrrrErErFr sz Table.ncolscCsdSrrErrErErF is_transposed# szTable.is_transposedcCs(ttdd|jDdd|jDS)z@return a tuple of my permutated axes, non_indexable at the frontcSsg|]}t|dqSrr=r`rErErFrX, sz*Table.data_orientation..cSsg|]}t|jqSrE)rrr`rErErFrX- s)rZrrrrrrErErFdata_orientation' s zTable.data_orientationcsRdddddjD}fddjD}fddjD}t|||S)z> return a dict of the kinds allowable columns for this object r{rrrScSsg|]}|j|fqSrErr`rErErFrX7 sz$Table.queryables..csg|]\}}|dfqSrHrE)rUrrA) axis_namesrErFrX8 scs&g|]}|jtjkr|j|fqSrE)rOr8r~rrrrErFrX9 s)rrrr7)rd1Zd2Zd3rE)rrrF queryables1 s  zTable.queryablescCsdd|jDS)z return a list of my index cols cSsg|]}|j|jfqSrE)rrrrErErFrXD sz$Table.index_cols..rrrErErF index_colsA szTable.index_colscCsdd|jDS)z! return a list of my values cols cSsg|] }|jqSrErrrErErFrXH sz%Table.values_cols..)rrrErErF values_colsF szTable.values_colsrcCs|jj}|d|dS)z+ return the metadata pathname for this key z/meta/z/metarErrErErF_get_metadata_pathJ szTable._get_metadata_path)rtrAcCs0t|}|jj|||d|j|j|jddS)z Write out a metadata array to the key as a fixed-format Series. Parameters ---------- key : str values : ndarray rh)rzrKrrN)r.rrrrKrr)rrtrArErErFrO s zTable.write_metadatarcCs0tt|jdd|ddk r,|j||SdS)z) return the meta data array for this key rN)rSrrrrrrErErFrb szTable.read_metadatacCspt|j|j_||j_||j_|j|j_|j|j_|j|j_|j|j_|j |j_ |j |j_ |j |j_ dS)z! set our table type & indexables N) rMrmrrrrr~rrKrrzrgrrErErFrGh s        zTable.set_attrscCst|jddpg|_t|jddp$g|_t|jddp8i|_t|jdd|_tt|jdd|_tt|jdd|_ t|jd dpg|_ d d |j D|_ d d |j D|_ dS) rbrNr~rgrrKrrlrzcSsg|]}|jr|qSrErr`rErErFrX~ sz#Table.get_attrs..cSsg|]}|js|qSrErr`rErErFrX s)rSrrr~rgrrLrKrGrrz indexablesrrrrErErFrHu szTable.get_attrscCs\|dk rX|jddkrX|jddkrX|jddkrXtddd|jD}t|tdS) rKNrrSr8r9r>cSsg|] }t|qSrErr rErErFrX sz*Table.validate_version..)r:incompatibility_docrrrrd)rr\rrErErFrL s*zTable.validate_versioncCsZ|dkr dSt|tsdS|}|D]*\}}|dkr.fcsg|]\}}||qSrErE)rUr|r)rirErFrX sz$Table.indexables..)rrhrryrrSrrryr8r~r[r<r) r _indexablesr|rrOrrrrrM index_colrE)rrHrrirrrFr s2      # zTable.indexablesrc CsR|s dS|dkrdS|dks(|dkr8dd|jD}t|ttfsL|g}i}|dk r`||d<|dk rp||d<|j}|D]}t|j|d}|dk r|jr|j }|j } |j } |dk r| |kr| n| |d<|dk r| |kr| n| |d<|jsL|j drtd |jf|qz||jd d krztd |d |d|dqzdS)aZ Create a pytables index on the specified columns. Parameters ---------- columns : None, bool, or listlike[str] Indicate which columns to create an index on. * False : Do not create any indexes. * True : Create indexes on all columns. * None : Create indexes on all columns. * listlike : Create indexes on the given columns. optlevel : int or None, default None Optimization level, if None, pytables defaults to 6. kind : str or None, default None Kind of index, if None, pytables defaults to "medium". Raises ------ TypeError if trying to create an index on a complex-type column. Notes ----- Cannot index Time64Col or ComplexCol. Pytables must be >= 3.0. NFTcSsg|]}|jr|jqSrE)rrr`rErErFrX sz&Table.create_index..rLrMcomplexzColumns containing complex values can be stored but cannot be indexed when using table format. Either use fixed format, set index=False, or do not include the columns containing complex values to data_columns when initializing the table.rrSzcolumn z/ is not a data_column. In order to read column z: you must reload the dataframe into HDFStore and include z with the data_columns argument.)rr,r@rZrYrhrSr1r_r{rLrMZ remove_indexrrZrrNrro) rrrLrMkwrhrr5r{Z cur_optlevelZcur_kindrErErFrN sJ   zTable.create_indexrrrc CsZt||||d}|}g}|jD]2}||j|j||j|j|jd}| |q"|S)a Create the axes sniffed from the table. Parameters ---------- where : ??? start : int or None, default None stop : int or None, default None Returns ------- List[Tuple[index_values, column_values]] rr#) Selectionrr,rrgrrrKrry) rr\rr selectionrArrsresrErErF _read_axes<s   zTable._read_axesrdcCs|S)z return the data for this obj rErrrdrErErF get_object^szTable.get_objectcst|s gS|d\}|j|i}|ddkrL|rLtd|d||dkr^t}n |dkrjg}t|trt|t|}|fdd | Dfd d |DS) zd take the input data_columns and min_itemize and create a data columns spec rrr,z"cannot use a multi-index on axis [z] with data_columns TNcs g|]}|dkr|kr|qSrrEr )existing_data_columnsrErFrXsz/Table.validate_data_columns..csg|]}|kr|qSrErE)rUr) axis_labelsrErFrXs) r[rgrrrYr@r7r8r<r)rr~r|rrrgrE)rrrFvalidate_data_columnscs*     zTable.validate_data_columns)rrJc0stts,|jj}td|dtddkr:dgfddD|rzd}d d|jDt|j }|j }nd }|j } |j d kst t|j d krtd g} |dkrd}fdddDd} j| } t| } |r|j|j|j|| ||.| |d }/t6|d rh|j?|/_?|/@||r|r|/A||/S)!a0 Create and return the axes. Parameters ---------- axes: list or None The names or numbers of the axes to create. obj : DataFrame The object to create axes on. validate: bool, default True Whether to validate the obj against an existing object already written. nan_rep : A value to use for string column nan_rep. data_columns : List[str], True, or None, default None Specify the columns that we want to create to allow indexing on. * True : Use all available columns. * None : Use no columns. * List[str] : Use the specified columns. min_itemsize: Dict[str, int] or None, default None The min itemsize for a column in bytes. z/cannot properly create the storer for: [group->rjr Nrcsg|]}|qSrE)_get_axis_numberr`)rrErFrXsz&Table._create_axes..TcSsg|] }|jqSrEr3r`rErErFrXsFr9rSz.get_blk_items..rErrrErrF get_blk_itemssz)Table._create_axes..get_blk_itemsr1zIncompatible appended table [z]with existing table [Z values_block_) existing_colr|rrKrrrr!) rOrrArrrMrrGrrrrbcSsg|]}|jr|jqSrE)rrO)rUrrErErFrXZs) rrrKrrrrr~rgrrz)Br@r(rrrrrrrYr~rrgr:rr[rr,rr'rAarrayr>rr{r`ryZ_get_axis_namerprKrrrrr _reindex_axisrrr_get_blocks_and_itemsrryrrr0rM IndexErrorr_maybe_convert_for_string_atomrr:rrrrOrrsrrrGr r&rrrzrrJ)0rr,rrJrr~r|r table_existsZnew_infonew_non_index_axesrrsZ append_axisindexerZ exist_axisrg axis_nameZ new_indexZnew_index_axesjrrd block_objrrZvaxesr|bb_itemsr_rOrr+new_namedata_convertedrrrMrrrrGrbrrZdcsZ new_tablerE)r,rrF _create_axess           "              zTable._create_axesc Cs\dd}|jj}||j|}t|r|d\}} t| t|} |j| |dj} t| j}|| |}|D]4} |j| g|dj} || j||| | jqj|rTddt||D} g}g}|D]}t |j }z&| |\}}| || |Wqt tfk rH}z*ddd |D}td |d |W5d}~XYqXq|}|}||fS) Ncsfdd|DS)Ncsg|]}j|jqSrErrrrErFrXzszFTable._get_blocks_and_items..get_blk_items..rErrErrFrysz2Table._get_blocks_and_items..get_blk_itemsrr3cSs"i|]\}}t|||fqSrE)rZtolist)rUrrrErErFr6s z/Table._get_blocks_and_items..rcss|]}t|VqdSrHr@)rUitemrErErFr2sz.Table._get_blocks_and_items..z+cannot match existing table structure for [z] on appending data)rrr[r*r=rFrYr<rrZrArEryrrrr)rrrrr~rrrrrZ new_labelsrrZby_itemsZ new_blocksZ new_blk_itemsZearrrr+ZjitemsrErErFrssF        zTable._get_blocks_and_itemsr)rc s|dk rt|}|dk rNjrNtjts.tjD]}||kr4|d|q4jD]\}}t|||qT|jdk r|j D]$\}}fdd} | ||qS)z process axes filters NrcsjD]}|}|}|dk s*t||krfjrH|tj}||}j|d|S||krt t |j }t |}t t rd|}||}j|d|Sqtd|ddS)Nr3rSzcannot find the field [z] for filtering!)Z _AXIS_ORDERSr _get_axisrrunionr*rzrDr8rSrAr@r(r)fieldfiltrZ axis_numberZ axis_valuesZtakersrAroprrErFprocess_filters"       z*Table.process_axes..process_filter) rYrr@rzrinsertrrfilterrz) rrrrrrlabelsrrrrErrF process_axess  ! zTable.process_axes)rwrr-rcCs|dkrt|jd}d|d}dd|jD|d<|rj|dkrH|jpFd}tj|||pZ|jd }||d <n|jdk r~|j|d <|S) z< create the description of the table from the axes & values Ni'rh)rOr-cSsi|]}|j|jqSrE)rrr`rErErFr6sz,Table.create_description..r )rwrxrr)maxrr,rrrrrr)rrxrwrr-r.rrErErFcreate_descriptions       zTable.create_descriptionrMc Cs|||sdSt||||d}|}|jdk r|jD]D\}}}|j|||dd} ||| j |||j }qBt |S)zf select coordinates (row numbers) from a table; return the coordinates object FrNrSrM) rLrr select_coordsrrzrrrilocrAr*) rr\rrrZcoordsrrrrbrErErFrs    zTable.read_coordinatesrcCs||sdS|dk r$td|jD]z}||jkr*|jsNtd|dt|jj |}| |j |j ||||j |j|jd}tt|d|j|dSq*td|d dS) zj return a single column from the table, generally only indexables are interesting FNz4read_column does not currently accept a where clausezcolumn [z=] can not be extracted individually; it is not data indexabler#rSrNz] not found in the table)rLrrr,rOrrrSrhr1rrgrrrKrr.rrr)rrr\rrrsrZ col_valuesrErErFrs*      zTable.read_column)NrlNNNNNN)N)NNN)NN)TNNN)N)NNN)NNN)Fr`rarbrr5r6rMrrzrrr rrrr rrrrrrrrrrJrrrr(rrrrIrhrrr,rrrrrrrrArrrrGrHrLrrrr rNrrr.rrr staticmethodrrrrrr/rErErrFrp s              IU "* k 0=  rc@s6eZdZdZdZd eeeedddZddZdS) rz a write-once read-many table: this format DOES NOT ALLOW appending to a table. writing is a one-time operation the data are stored in a format that allows for searching the data on disk ryNrMcCs tddS)z[ read the indices and the indexing array, calculate offset rows and return z!WORMTable needs to implement readNrNrOrErErFrLs zWORMTable.readcKs tddS)z write in a format that we can search later on (but cannot append to): write out the indices and the values using _write_array (e.g. a CArray) create an indexing table so that we can search z"WORMTable needs to implement writeNrNrPrErErFrXszWORMTable.write)NNNN) r`rarbrrmr rrrrErErErFrCs rc @sveZdZdZdZdddZdeeedd d Z e j e e j ee j e e j d d d Z deeeedddZdS)r* support the new appendable table formats Z appendableNFTcCs|s|jr|j|jd|j||||| | d}|jD] }|q6|js~|j|||| d}|| |d<|jj |jf||j |j _ |jD]}| ||q|j || ddS)Nrh)r,rrJr|rr~)rxrwrr-r!)r})rrrrrr,rrrGZ create_tablergrr write_data)rrr,ryrxrwrr|rr-r}rr~r!rhrsoptionsrErErFrfs4     zAppendableTable.write)rr}cs|jj}|j}g}|rT|jD]6}t|jjdd}t|tj r| |j dddqt |r|d}|ddD] }||@}qp| }nd}dd |jD} t | } | dkst| d d |jD} d d | D} g} t| D]6\} }|f|j|| | j}| | | |q|dkr$d }tjt|||jd }||d}t|D]x} | |t| d||kr|q|j|fdd | D|dk r|ndfdd | DdqNdS)z` we form the data into a 2-d including indexes,values,mask write chunk-by-chunk rr3u1Fr!rSNcSsg|] }|jqSrE)rr`rErErFrXsz.AppendableTable.write_data..cSsg|] }|qSrE)rr`rErErFrXsc Ss,g|]$}|tt|j|jdqSr)Z transposerAZrollrr:rrErErFrXsrrcsg|]}|qSrErEr`Zend_iZstart_irErFrXscsg|]}|qSrErErrrErFrXs)indexesr-rA)rr{rrr1rbr/r@rArryr)r[r&rrryrreshaperrr9write_data_chunk)rrr}r{rmasksrsr-mrnindexesrAZbvaluesr|r5Z new_shaperowschunksrErrFrsL        zAppendableTable.write_data)rrr-rAc Cs|D]}t|jsdSq|djd}|t|krFtj||jd}|jj}t|}t|D]\} } | ||| <q^t|D]\} }|||| |<q||dk r|j t dd} | s|| }t|r|j ||j dS)z Parameters ---------- rows : an empty memory space where we are putting the chunk indexes : an array of the indexes mask : an array of the masks values : an array of the values NrrFr!)rArrr[rrr{ryr&r)rr/rhryr) rrrr-rAr5rr{r r|rr rErErFr s&   z AppendableTable.write_data_chunkrMcCsb|dkst|sf|dkr:|dkr:|j}|jj|jddn(|dkrH|j}|jj||d}|j|S|srdS|j}t ||||d}| }t | }t|} | r^| } t| | dkj} t| sdg} | d| kr| | | ddkr| dd| } t| D]@} |t| | }|j||jd||jddd| } q|j| S)NTr%rMrSrr)r[rrrrrhZ remove_rowsrrrrr.Z sort_valuesdiffrYr{ryrrEreversedr@r9)rr\rrrrhrrAZ sorted_serieslnrrZpgrrrErErFr* sF        zAppendableTable.delete) NFNNNNNNFNNT)F)NNN)r`rarbrrmrr rrrrArr r r*rErErErFras8 <= -rc@sleZdZUdZdZdZdZeZe e e d<e e dddZee d d d Zdeeeed ddZd S)rrrnrwr9r7rcCs|jdjdkS)NrrS)rrrrErErFrPsz"AppendableFrameTable.is_transposedrcCs|r |j}|S)z these are written transposed )rfrrErErFrTszAppendableFrameTable.get_objectNrMcs.|sdSj|||d}tjrHjjddini}fddtjD}t|dkstt |d}||d} g} tjD]L\} } | j krq|| \} }|ddkrt | }nt | }|d}|dk r|j|d d jr |}|}t | t| d dd }n|j}t | t| d dd }|}|jdkrlt|tjrl|d|jdf}t|tjrt|j||d }n,t|t rt|||d }nt|g||d }|j|jkst |j|jf| |qt| dkr| d}n t| dd}t|||d}j|||d}|S)Nrrcs"g|]\}}|jdkr|qSrr)rUr|rrrErFrXrsz-AppendableFrameTable.read..rSrr,r{TZinplacerOrNrr3)rr) rLrrr[rrgrryr,rrr, from_tuplesr* set_namesrrSrfr:r@rArr rr(Zdtypesrr/ryr0rr)rr\rrrr^rgZindsindr{framesr|rsZ index_valsrr1r{rAZindex_Zcols_rrrErrFr[sZ       "   zAppendableFrameTable.read)NNNN)r`rarbrr5rmr:r(r7r rrrrrr.rr rrrErErErFrHs" rcsveZdZdZdZdZdZeZe e dddZ e e dd d Z dfd d Zdeeeeedfdd ZZS)rrrtrur9rcCsdSrrErrErErFrsz#AppendableSeriesTable.is_transposedrcCs|SrHrErrErErFrsz AppendableSeriesTable.get_objectNc s<t|ts|jpd}||}tjf||jd|S)- we are going to write this as a frame table rArr~)r@r(rOZto_framerrrr)rrr~rrOrrErFrs   zAppendableSeriesTable.writercs|j}|dk rB|rBt|jts"t|jD]}||kr(|d|q(tj||||d}|rj|j|jdd|j dddf}|j dkrd|_ |S)NrrTrrA) rr@rzrYrrrr set_indexrrO)rr\rrrrrrDrrErFrs   zAppendableSeriesTable.read)N)NNNN)r`rarbrr5rmr:r.r7rrrr.rrr rrr/rErErrFrs& rcs(eZdZdZdZdZfddZZS)rrrtrvc s^|jpd}||\}|_t|jts*tt|j}||t||_t j fd|i|S)rrAr) rOrrzr@rYrryr*rrr)rrrrOZnewobjr1rrErFrs    z AppendableMultiSeriesTable.write)r`rarbrr5rmrr/rErErrFrsrc@sheZdZUdZdZdZdZeZe e e d<e e dddZe d d Zd d Zed dZddZdS)r~z< a table that read/writes the generic pytables table format rnror9rzrcCs|jSrH)r5rrErErFrPszGenericTable.pandas_typecCst|jddp|jSrrrrErErFrIszGenericTable.storablecCsLg|_d|_g|_dd|jD|_dd|jD|_dd|jD|_dS)rbNcSsg|]}|jr|qSrErr`rErErFrXsz*GenericTable.get_attrs..cSsg|]}|js|qSrErr`rErErFrXscSsg|] }|jqSrErNr`rErErFrXs)rrrzrrrr~rrErErFrHs zGenericTable.get_attrsc Cs|j}|d}|dk rdnd}tdd|j||d}|g}t|jD]^\}}t|tsZtt ||}||}|dk rzdnd}t |||g||j||d} | | qD|S)z2 create the indexables from the table description r{Nrr)rOrrhrr)rOrrArrhrr) rrrrhryZ_v_namesr@rMrrSr3ry) rr.rrrrr|rrrHrErErFr s6    zGenericTable.indexablescKs tddS)Nz cannot write on an generic tablerNrPrErErFr,szGenericTable.writeN)r`rarbrr5rmr:r(r7r rrrrMrPrIrHrrrrErErErFr~s     "r~csheZdZdZdZeZdZe dZ e e dddZ dfd d Zdeeeed fd d ZZS)rz a frame with a multi-index rxr9z ^level_\d+$rcCsdS)NZappendable_multirErrErErFr8sz*AppendableMultiFrameTable.table_type_shortNc sx|dkrg}n|dkr |j}||\}|_t|jts@t|jD]}||krF|d|qFtj f||d|S)NTrr) rrrrzr@rYrrrr)rrr~rrrrErFr<s  zAppendableMultiFrameTable.writerMcsDtj||||d}|j}|jfdd|jjD|_|S)Nrcs g|]}j|rdn|qSrH) _re_levelssearch)rUrOrrErFrXUsz2AppendableMultiFrameTable.read..)rrrrzr{rr{)rr\rrrrrrrFrHs  zAppendableMultiFrameTable.read)N)NNNN)r`rarbrrmr(r7r:recompilerrrMrrr rrr/rErErrFr0s  r)rrrrcCs||}t|}|dk r"t|}|dks4||rB||rB|St|}|dk rlt|j|dd}||stddg|j}|||<|jt|}|S)NF)sort) rr8equalsuniquerCslicer:rDrZ)rrrrrZslicerrErErFr[s   r)rrcCst|}|S)z- for a tz-aware type, return an encoded zone )rZ get_timezone)rzonerErErFrsss rs)rArrrcCst|tr"|jdks"|j|ks"t|dk rtt|trB|j}|j}n d}|}t|}t||d}|d |}n|rt j |dd}|S)a coerce the values to a DatetimeIndex if tz is set preserve the input shape if possible Parameters ---------- values : ndarray or Index tz : str or tzinfo coerce : if we do not have a passed timezone, coerce to M8[ns] ndarray NrNrZM8[ns]r) r@r)rrrOrr&rGr\r]rAr$)rArrrOrErErFrys   r)rOr{rKrrc Cstt|tst|j}t|\}}t|}t|}t|tsFt |j rlt ||||t |ddt |dd|dSt|t r~tdtj|dd} t|} | dkrtjdd | Dtjd }t ||dt|d S| d krt| ||}|j j} t ||d t| |d S| d kr$t |||||dSt|tjr>|j tksBt|dksTt|t}t |||||d SdS)Nrr)rArMrrrrzMultiIndex not supported here!FrrcSsg|] }|qSrE) toordinalrrErErFrXsz"_convert_index..r)rr)integerZfloating)rArMrrr%)r@rMrrOrrr0rr+r&rrrSr,rrrrAr$int32rrZ Time32Col_convert_string_arrayrrrr%r) rOr{rKrrr,rrMrrrArrErErFrpsd           rp)rMrKrrcCs|dkrt|}n|dkr$t|}n|dkrxztjdd|Dtd}Wqtk rttjdd|Dtd}YqXnT|dkrt|}n@|d krt|d||d }n&|d krt|d }ntd ||S)NrrrcSsg|]}t|qSrErrrErErFrXsz$_unconvert_index..rcSsg|]}t|qSrErrrErErFrXs)r&floatrr#r%rzunrecognized index type )r)r/rAr$r%rr+)rbrMrKrr{rErErFrs,    rrNcCs|js |jS|jj}tj|jdd}|dkr6tdn(|dkrHtdn|dks^|dks^|jS|j|dd }t|t r~|d }|j} tj| dd}|dkrt t |j d D]F} | | } tj| dd}|dkr|jj| } td | d |d qt| ||| j } | j |j ks&t| j |j f| j}t|trXt||pT|dpTd }t|pbd |}|dk r||}||kr|}| jd|dd} | S)NFrrz+[date] is not implemented as a table columnrQz>too many timezones in this block, create separate data columnsrr%)ZdowncastrzCannot serialize the column [z!] because its data contents are [z] object dtyperAz|Sr!)r2rArrOrrrZfillnar@rYr9r[rigetrrr(r rrr7rrrrr))rOrrr|rrKrrrrbr|rrrrZecirErErFrsL         r)rbrKrrcCs\t|r(t|j||j|j}t|}t dt |}t j |d|d}|S)a Take a string-like that is object dtype and coerce to a fixed size string type. Parameters ---------- data : np.ndarray[object] encoding : str errors : str Handler for encoding errors. Returns ------- np.ndarray[fixed-length-string] rSSr)r[r.r&rMencoder*r rrr libwritersmax_len_string_arrayrAr$)rbrKrensuredrrErErFr(5s r(cCs|j}tj|td}t|rvtt|}d|}t |dt r^t |j j ||dj}n|j|ddjtdd}|dkrd}t||}||S) a* Inverse of _convert_string_array. Parameters ---------- data : np.ndarray[fixed-length-string] nan_rep : the storage repr of NaN encoding : str errors : str Handler for encoding errors. Returns ------- np.ndarray[object] Decoded data. rUr)rFr!Nr)rrAr$r&r%r[r-r.rr@rr.rMrCr*r)Z!string_array_replace_from_nan_repr )rbrrKrrrrrErErFr+Ts  r+)rArrKrcCs6t|tstt|t|r2t|||}||}|SrH)r@rMrr _need_convert_get_converter)rArrKrconvrErErFr{s  rrMrKrcs8|dkrddS|dkr&fddStd|dS)NrcSstj|ddS)Nr$r)rAr$r rErErFrz _get_converter..rcst|ddS)Nr#)r+r5rsrErFrs z invalid kind )rr4rErsrFr2s r2rcCs|dkr dSdS)N)rrTFrErrErErFr1sr1)rOr:rcCslt|tst|dkrtd|ddkrh|ddkrh|ddkrhtd|}|rh|d}d|}|S) z Prior to 0.10.1, we named values blocks like: values_block_0 an the name values_0, adjust the given name if necessary. Parameters ---------- name : str version : Tuple[int, int, int] Returns ------- str z6Version is incorrect, expected sequence of 3 integers.rrSr8r9zvalues_block_(\d+)Zvalues_)r@rMr[rrrr)rOr:r grprErErFrs$   r) dtype_strrcCst|}|ds|dr"d}n|dr2d}n|drBd}n|dsV|dr\d}nn|drld}n^|d r|d }nN|d rd }n>|d rd }n.|d rd}n|dkrd}ntd|d|S)zA Find the "kind" string describing the given dtype name. rrr)rrrr&r timedeltarrrr r%zcannot interpret dtype of [r )rGrZr)r9rMrErErFrs.       rrcCsbt|tr|j}|jjdd}|jjdkr@t| d}nt|t rP|j }t|}||fS)zJ Convert the passed data into a storable form and a dtype string. rAr)r Mr) r@r2rrrOrrMrAr$rr-r)rbrrErErFrs    rc@sDeZdZdZd eeeeedddZddZdd Z d d Z dS) rz Carries out a selection operation on a tables.Table object. Parameters ---------- table : a Table object where : list of Terms (or convertible to) start, stop: indices to start and/or stop selection N)rhrrc CsR||_||_||_||_d|_d|_d|_d|_t|rt t t j |dd}|dksd|dkrt |}|jt jkr|j|j}}|dkrd}|dkr|jj}t ||||_nVt|jjt jr|jdk r||jks|jdk r||jkrt d||_W5QRX|jdkrN|||_|jdk rN|j\|_|_dS)NFrr&booleanrz3where must have index locations >= start and < stop)rhr\rr conditionrZtermsrr#rrrrrAr$rZbool_rr issubclassrr&r(generateevaluate)rrhr\rrinferredrErErFrsD        zSelection.__init__c Cs|dkr dS|j}zt|||jjdWStk rz}z2d|}td|d|d}t||W5d}~XYnXdS)z) where can be a : dict,list,tuple,string N)rrKrz- The passed where expression: a* contains an invalid variable reference all of the variable references must be a reference to an axis (e.g. 'index' or 'columns'), or a data_column The currently defined references are: z ) rhrr5rK NameErrorrrrr)rr\rr+ZqkeysrrErErFr?s  zSelection.generatecCsX|jdk r(|jjj|j|j|jdS|jdk rB|jj|jS|jjj|j|jdS)( generate the selection NrM) r=rhZ read_whererzrrrrrrrErErFr6s  zSelection.selectcCs|j|j}}|jj}|dkr$d}n|dkr4||7}|dkrB|}n|dkrR||7}|jdk rx|jjj|j||ddS|jdk r|jSt ||S)rCNrT)rrr) rrrhrr=Zget_where_listrzrrAr)rrrrrErErFrBs(  zSelection.select_coords)NNN) r`rarbrrr rrr?rrrErErErFrs / r) rsNNFNTNNNNrlr?) NrrlNNNNFN)N)F)r contextlibrrcrQrrrrrtextwraprtypingrrrr r r r r rrnumpyrAZpandas._configrrZ pandas._libsrrr-Zpandas._libs.tslibsrZpandas._typingrrrrrZpandas.compat._optionalrZpandas.compat.pickle_compatrZ pandas.errorsrZpandas.util._decoratorsrZpandas.core.dtypes.commonrrrr r!r"r#r$r%r&Zpandas.core.dtypes.missingr'rr(r)r*r+r,r-r.r/r0r1Zpandas.core.arraysr2r3r4Zpandas.core.commoncorecommonr(Z pandas.core.computation.pytablesr5r6Zpandas.core.constructionr7Zpandas.core.indexes.apir8Zpandas.io.commonr9Zpandas.io.formats.printingr:r;rnr<r=r>rDrIrGrLrPrTrr]r'r^rcWarningrdrrerrfZ duplicate_docrrhr;Z dropna_docZ format_docZ config_prefixZregister_optionZis_boolZis_one_of_factoryrmrqrrrMrrrrrrrrrr0r3r4r]r{rr|rrrrrrr~rrrsrrrprrr(r+rr2r1rrrrrErErErFsj  ,      0 0        :  Np3S ZZhd1B+   &<  @  ' !