
Fix ValueError when reading a Dataframe with HDFStore in Python 3 fro… #24510


Merged: 8 commits, merged on Jan 1, 2019
2 changes: 1 addition & 1 deletion doc/source/whatsnew/v0.24.0.rst
@@ -1515,7 +1515,6 @@ MultiIndex
I/O
^^^


.. _whatsnew_0240.bug_fixes.nan_with_str_dtype:

Proper handling of `np.NaN` in a string data-typed column with the Python engine
@@ -1587,6 +1586,7 @@ Notice how we now instead output ``np.nan`` itself instead of a stringified form
- Bug in :func:`pandas.io.json.json_normalize` that caused it to raise ``TypeError`` when two consecutive elements of ``record_path`` are dicts (:issue:`22706`)
- Bug in :meth:`DataFrame.to_stata`, :class:`pandas.io.stata.StataWriter` and :class:`pandas.io.stata.StataWriter117` where an exception would leave a partially written and invalid dta file (:issue:`23573`)
- Bug in :meth:`DataFrame.to_stata` and :class:`pandas.io.stata.StataWriter117` that produced invalid files when using strLs with non-ASCII characters (:issue:`23573`)
- Bug in :class:`HDFStore` that caused it to raise ``ValueError`` when reading a ``DataFrame`` in Python 3 from fixed format written in Python 2 (:issue:`24510`)

Plotting
^^^^^^^^
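As a minimal sketch of the scenario this entry describes: reading a fixed-format store written under Python 2 used to raise ValueError under Python 3, and with this fix it reads back cleanly. The path assumes you run from the pandas repository root; the key 'df' and expected contents follow the test fixture added below.

import pandas as pd

# legacy_table_fixed_py2.h5 is the fixture added in this PR, written by pandas
# under Python 2 in fixed format. Before this fix, reading it under Python 3
# raised ValueError because string attributes came back as undecoded bytes.
df = pd.read_hdf('pandas/tests/io/data/legacy_hdf/legacy_table_fixed_py2.h5', 'df')
print(df)
# expected: one row [1, 2, 3, 'D'] with columns A/B/C/D and index name INDEX_NAME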
3 changes: 2 additions & 1 deletion pandas/io/pytables.py
@@ -2501,7 +2501,7 @@ def set_attrs(self):
def get_attrs(self):
""" retrieve our attributes """
self.encoding = _ensure_encoding(getattr(self.attrs, 'encoding', None))
self.errors = getattr(self.attrs, 'errors', 'strict')
self.errors = _ensure_decoded(getattr(self.attrs, 'errors', 'strict'))
for n in self.attributes:
setattr(self, n, _ensure_decoded(getattr(self.attrs, n, None)))

@@ -2661,6 +2661,7 @@ def read_index_node(self, node, start=None, stop=None):

if 'name' in node._v_attrs:
name = _ensure_str(node._v_attrs.name)
Contributor:

you can just call this directly, no if is needed

Contributor Author:

Done

name = _ensure_decoded(name)

index_class = self._alias_to_class(_ensure_decoded(
getattr(node._v_attrs, 'index_class', '')))
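The change above routes the stored attribute values through _ensure_decoded so that byte strings written by a Python 2 process come back as str under Python 3. A minimal sketch of that kind of decode step, assuming utf-8 encoded attributes (illustrative only, not pandas' actual helper):

def ensure_decoded_sketch(value, encoding='utf-8'):
    # Attribute values written by Python 2 can come back as bytes;
    # decode them so Python 3 code sees ordinary strings.
    if isinstance(value, bytes):
        return value.decode(encoding)
    return value

# e.g. the 'errors' attribute stored as b'strict' by Python 2:
assert ensure_decoded_sketch(b'strict') == 'strict'
assert ensure_decoded_sketch('strict') == 'strict'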
Binary file pandas/tests/io/data/legacy_hdf/legacy_table_fixed_py2.h5 not shown.
14 changes: 14 additions & 0 deletions pandas/tests/io/test_pytables.py
@@ -4540,6 +4540,20 @@ def test_pytables_native2_read(self, datapath):
d1 = store['detector']
assert isinstance(d1, DataFrame)

def test_legacy_table_fixed_format_read_py2(self, datapath):
# GH 24510
# legacy table with fixed format written in Python 2
Contributor:

add the issue number here as a comment

Contributor Author:

Added

with ensure_clean_store(
datapath('io', 'data', 'legacy_hdf',
'legacy_table_fixed_py2.h5'),
mode='r') as store:
result = store.select('df')
expected = pd.DataFrame([[1, 2, 3, 'D']],
columns=['A', 'B', 'C', 'D'],
index=pd.Index(['ABC'],
name='INDEX_NAME'))
assert_frame_equal(expected, result)

def test_legacy_table_read(self, datapath):
# legacy table types
with ensure_clean_store(
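The binary fixture referenced by this test cannot be shown in the diff; the following is a sketch of how a file like it would have been produced, run under Python 2 with an older pandas (the output path is an assumption, the frame matches the expected value in the test):

import pandas as pd

# Write the frame the test expects, in fixed (non-table) format, under Python 2.
df = pd.DataFrame([[1, 2, 3, 'D']],
                  columns=['A', 'B', 'C', 'D'],
                  index=pd.Index(['ABC'], name='INDEX_NAME'))
df.to_hdf('legacy_table_fixed_py2.h5', key='df', format='fixed')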