
added pd as namespace #12268


Closed · wants to merge 4 commits

32 changes: 24 additions & 8 deletions doc/source/io.rst
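
The ``pd.`` prefix added throughout this diff presupposes the standard import aliases used elsewhere in the pandas docs; a minimal sketch of that assumed setup (not part of the diff itself):

.. code-block:: python

   # aliases assumed by the examples below (convention, not shown in this diff)
   import numpy as np
   import pandas as pd
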
@@ -1399,7 +1399,7 @@ Writing to a file, with a date index and a date column
dfj2['date'] = Timestamp('20130101')
dfj2['ints'] = list(range(5))
dfj2['bools'] = True
dfj2.index = date_range('20130101', periods=5)
dfj2.index = pd.date_range('20130101', periods=5)
dfj2.to_json('test.json')
open('test.json').read()
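
Not part of the diff, but a quick round-trip check of the file written just above; the variable name is illustrative:

.. code-block:: python

   # read the JSON written above back into a DataFrame (illustrative sketch)
   dfj2_roundtrip = pd.read_json('test.json')
   dfj2_roundtrip.head()
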

@@ -2553,7 +2553,7 @@ for some advanced strategies

.. ipython:: python

store = HDFStore('store.h5')
store = pd.HDFStore('store.h5')
print(store)

Objects can be written to the file just like adding key-value pairs to a
@@ -2562,7 +2562,7 @@ dict:
.. ipython:: python

np.random.seed(1234)
index = date_range('1/1/2000', periods=8)
index = pd.date_range('1/1/2000', periods=8)
s = Series(randn(5), index=['a', 'b', 'c', 'd', 'e'])
df = DataFrame(randn(8, 3), index=index,
columns=['A', 'B', 'C'])
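
The dict-style writes that the sentence above refers to sit below the fold of this hunk; a minimal sketch of what they look like with the objects defined here (an illustration, not the hidden diff lines themselves):

.. code-block:: python

   # dict-style assignment into the store, equivalent to store.put(key, value)
   store['s'] = s
   store['df'] = df
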
@@ -2611,7 +2611,7 @@ Closing a Store, Context Manager

# Working with, and automatically closing the store with the context
# manager
with HDFStore('store.h5') as store:
with pd.HDFStore('store.h5') as store:
store.keys()

.. ipython:: python
@@ -2754,7 +2754,7 @@ enable ``put/append/to_hdf`` to by default store in the ``table`` format.

.. ipython:: python

store = HDFStore('store.h5')
store = pd.HDFStore('store.h5')
df1 = df[0:4]
df2 = df[4:]
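
On the hunk-header context above about defaulting to the ``table`` format: a hedged sketch of the option it refers to, with the option name recalled from the pandas docs rather than taken from this diff:

.. code-block:: python

   # make put/append/to_hdf write 'table' format by default
   # (option name assumed: io.hdf.default_format)
   pd.set_option('io.hdf.default_format', 'table')
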

@@ -2801,6 +2801,22 @@ everything in the sub-store and BELOW, so be *careful*.

.. _io.hdf5-types:

.. warning:: Hierarchical keys cannot be retrieved with the dotted (attribute) access described above for items stored under the root node.

Member:
This belongs in a separate PR --- or I would consider renaming this one to make it obvious that this is the big change here.

Contributor Author:
Either way, what would you prefer?

Member:
I'd opt for another PR; it just keeps things easier if PRs are smaller and about one thing only.


.. ipython:: python

store.foo.bar.bah

Contributor:
This section needs to be in a separate PR.

Contributor Author:
I thought this was already merged. So here you go.

AttributeError: 'HDFStore' object has no attribute 'foo'

store.root.foo.bar.bah
/foo/bar/bah (Group) ''
children := ['block0_items' (Array), 'axis1' (Array), 'axis0' (Array), 'block0_values' (Array)]
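
For readers of the thread above: even though attribute access raises, hierarchical keys are still reachable with explicit string paths; a minimal sketch, not part of the diff:

.. code-block:: python

   # explicit string keys work for hierarchical paths (sketch, not in the diff)
   store['foo/bar/bah']
   # equivalent spelling
   store.get('foo/bar/bah')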






Storing Types
'''''''''''''

@@ -3364,7 +3380,7 @@ Compression for all objects within the file

.. code-block:: python

store_compressed = HDFStore('store_compressed.h5', complevel=9, complib='blosc')
store_compressed = pd.HDFStore('store_compressed.h5', complevel=9, complib='blosc')

Or on-the-fly compression (this only applies to tables). You can turn
off file compression for a specific table by passing ``complevel=0``
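
A hedged sketch of the per-table override mentioned just above, assuming a store opened with compression such as ``store_compressed``; the key and frame names are illustrative:

.. code-block:: python

   # write this one table uncompressed, overriding the store-wide complevel
   store_compressed.append('df_uncompressed', df, complevel=0)
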
@@ -3573,7 +3589,7 @@ It is possible to write an ``HDFStore`` object that can easily be imported into
index=range(100))
df_for_r.head()

store_export = HDFStore('export.h5')
store_export = pd.HDFStore('export.h5')
store_export.append('df_for_r', df_for_r, data_columns=df_dc.columns)
store_export

@@ -3662,7 +3678,7 @@ number of options, please see the docstring.
.. ipython:: python

# a legacy store
legacy_store = HDFStore(legacy_file_path,'r')
legacy_store = pd.HDFStore(legacy_file_path,'r')
legacy_store

# copy (and return the new handle)
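
The call itself is below the fold here; a sketch of what such a copy typically looks like, with the target filename as a placeholder (not taken from the hidden diff lines):

.. code-block:: python

   # copy the legacy store into a new file and return the new handle
   # (target filename is a placeholder)
   new_store = legacy_store.copy('store_new.h5')
   new_store.close()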