Series.unique() dies with many NaNs #714
Comments
Interesting. Guessing because the hash back-end doesn't realize nan != nan
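For context, a minimal pure-Python illustration of why equality-based hashing misbehaves here (an illustrative sketch only, not pandas' actual khash-backed implementation): NaN never compares equal to anything, including itself, so a dedup loop that probes by equality treats every NaN as a brand-new value.

```python
import numpy as np

nan = float("nan")

# IEEE 754: NaN never compares equal to anything, including itself.
print(nan == nan)        # False
print(np.nan != np.nan)  # True

# A dedup routine that relies purely on equality probing therefore treats
# every NaN it sees as a new key, so a Series with many NaNs produces
# many "unique" entries instead of one.
def naive_unique(values):
    uniques = []
    for v in values:
        if not any(v == u for u in uniques):  # NaN == NaN is always False
            uniques.append(v)
    return uniques

print(len(naive_unique([nan] * 5)))  # 5, not 1
```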
went a different route, added a float64 hash table with NA handling. getting this result now:
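As a rough illustration of the NA handling (a hypothetical pure-Python sketch; the real implementation is a Cython/khash float64 table inside pandas), the dedup tracks NaN with an explicit isnan check instead of relying on equality, so all missing values collapse to a single entry:

```python
import numpy as np

def unique_float64_with_na(values):
    """Illustrative NaN-aware dedup: all NaNs collapse into one entry."""
    seen = set()
    seen_na = False
    uniques = []
    for v in np.asarray(values, dtype=np.float64):
        if np.isnan(v):        # NaN can't be found by equality lookup,
            if not seen_na:    # so track it with a dedicated flag.
                seen_na = True
                uniques.append(np.nan)
        elif v not in seen:
            seen.add(v)
            uniques.append(v)
    return np.array(uniques)

print(unique_float64_with_na([1.0, np.nan, 2.0, np.nan, np.nan]))
# [ 1. nan  2.]
```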
Thanks for the quick response.
reopened the issue and will take a look
fixed this in master, let me know if you have any more issues
* commit 'v0.7.0rc1-94-ge3df4e2':
  * DOC: added info on encoding parameter for csv i/o
  * TST: renamed io b/c module conflict, made suite check for config
  * added vbench for write csv
  * BUG: made encoding optional on csv read/write, addresses pandas-dev#717
  * BUG: float64 hash table for handling NAs in Series.unique, close pandas-dev#714
  * TST: add bench_unique.py
  * TST: added better testing for pandas-dev#709
  * BUG: closes pandas-dev#709, bug in ix + multiindex use case
  * DOC: release notes
  * BUG: don't assume that each object contains every unique block type in concat, GH pandas-dev#708
  * BUG: inconsistency in .ix with integer label and float index
  * Fix test that assumed py2.
  * Don't use unnecessary UnicodeReader on Python 3.
  * BUG: remove poor man's breakpoint
  * BUG: closes pandas-dev#705, csv is encoded utf-8 and then decoded on the read side
  * updated support contact info
  * DOC: note EWMA adjustment, closes pandas-dev#703
  * ENH: close pandas-dev#694, pandas-dev#693, pandas-dev#692
  * BUG: Bar plot fails if axis parameter supplied, closes pandas-dev#702
Do not write to changes collection (fixes pandas-dev#714): based on discussions, and given that this collection is not exposed via any API, it should be safe to remove all usage of it.
Series.unique() dies with many NaNs:
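A minimal sketch of the kind of input described in the report, a Series containing a large number of NaNs; on a fixed pandas, unique() returns a single NaN rather than dying:

```python
import numpy as np
import pandas as pd

# A Series dominated by NaNs. With the fix (a float64 hash table with
# NA handling), unique() collapses all of them into a single NaN entry.
s = pd.Series(np.concatenate([np.random.randn(10), np.full(100000, np.nan)]))

print(s.unique())       # ten random floats followed by a single nan
print(len(s.unique()))  # 11
```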