BUG: Write compliant Parquet with pyarrow (#43690)
Use `check_round_trip(df, pa)`, as this will assert the results.
That isn't what I'm testing, though... I'm testing that the Parquet that is written is correct, i.e. `list<element: int64>`, not `list<item: int64>`. The serializer and deserializer always worked.
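For illustration, here is a minimal sketch of that kind of check, using pyarrow directly (the file name is illustrative, and `use_compliant_nested_type` requires pyarrow >= 4.0):

```python
import pyarrow as pa
import pyarrow.parquet as pq

table = pa.table({"col": pa.array([[1, 2], [3]], type=pa.list_(pa.int64()))})
pq.write_table(table, "out.parquet", use_compliant_nested_type=True)

# The round trip works either way; the test has to look at the Parquet
# schema that was actually written to the file.
schema_text = str(pq.ParquetFile("out.parquet").schema)
assert "element" in schema_text   # spec-compliant element name
assert "item" not in schema_text  # legacy pyarrow element name
```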
https://github.com/apache/parquet-format/blob/master/LogicalTypes.md#nested-types
apache/arrow#9489
Well, if the serialization and deserialization work, then I am not sure I understand the problem. You can certainly assert the metadata if you want. The most important point is that you need this option to get correct results, yes?
The problem is that `to_parquet` doesn't write compliant Parquet data. Parquet is a 'standard', after all, and Pandas doesn't follow the standard. What if I want to load data from Pandas into a different language?
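For example, a sketch of the current workaround (assuming the pyarrow engine; pandas forwards extra keyword arguments through to the writer):

```python
import pandas as pd

df = pd.DataFrame({"col": [[1, 2], [3]]})

# Extra kwargs are passed through to pyarrow's writer, so compliant
# output can be requested explicitly; this PR would make it the default.
df.to_parquet("compliant.parquet", engine="pyarrow",
              use_compliant_nested_type=True)
```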
Here is an example of a place where this is an issue: googleapis/python-bigquery#980
And the `use_compliant_nested_type` option has to be passed manually. Really, `use_compliant_nested_type` should be the default in PyArrow, but it isn't, for compatibility reasons. It's all in the Arrow PR I linked.
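At the pyarrow level, that manual opt-in looks like this sketch (at the time of this PR the flag defaults to `False`, so every caller has to remember it):

```python
import pyarrow as pa
import pyarrow.parquet as pq

table = pa.table({"values": [[1, 2], [3]]})

# Without the flag, pyarrow writes the legacy "item" element name;
# compliant output must be opted into on every call.
pq.write_table(table, "out.parquet", use_compliant_nested_type=True)
```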
In summary: yes, `use_compliant_nested_type` is needed in order to output the correct, compliant Parquet format. But then that comes back around to changing the output format, which is my point about calling users' attention to it in What's New. If they're using the data elsewhere (not Pandas), it could cause them problems.