
BUG: pandas.cut incorrectly raises a ValueError due to an overflow #26045


Closed
jschendel opened this issue Apr 10, 2019 · 3 comments · Fixed by #26063
Labels: Bug, Reshaping (Concat, Merge/Join, Stack/Unstack, Explode)
Milestone: Contributions Welcome

Comments

@jschendel (Member)

Code Sample, a copy-pastable example if possible

In [1]: import numpy as np; import pandas as pd; pd.__version__
Out[1]: '0.25.0.dev0+389.g6d9b702a66'

In [2]: bins = [pd.Timestamp.min, pd.Timestamp('2018-01-01'), pd.Timestamp.max]

In [3]: values = pd.date_range('2017-12-31', periods=3)

In [4]: pd.cut(values, bins=bins)
---------------------------------------------------------------------------
ValueError: bins must increase monotonically.

The issue appears to be caused by an int64 overflow in np.diff, which makes the bins look like they are not monotonically increasing. Essentially the following:

In [5]: bins_numeric = [b.value for b in bins]

In [6]: bins_numeric
Out[6]: [-9223372036854775000, 1514764800000000000, 9223372036854775807]

In [7]: np.diff(bins_numeric)
Out[7]: array([-7708607236854776616,  7708607236854775807], dtype=int64)
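For reference (this sketch is not from the original report), the wrap-around can be reproduced directly: the true gap between the first two bin edges does not fit in int64, so NumPy's fixed-width subtraction silently wraps to a negative value.

import numpy as np

# Timestamp.min.value and Timestamp('2018-01-01').value from the example above
lo, mid = -9223372036854775000, 1514764800000000000
true_diff = mid - lo                       # Python ints are arbitrary precision: 10738136836854775000
print(true_diff > np.iinfo(np.int64).max)  # True -> the gap cannot be represented as int64
print(np.int64(mid) - np.int64(lo))        # wraps around to -7708607236854776616 (may emit a RuntimeWarning)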

Problem description

The bins are monotonically increasing but a ValueError is raised indicating that they aren't.

Expected Output

I'd expect In [4] to not raise a ValueError.

Output of pd.show_versions()

INSTALLED VERSIONS

commit: 6d9b702
python: 3.6.8.final.0
python-bits: 64
OS: Windows
OS-release: 10
machine: AMD64
processor: Intel64 Family 6 Model 78 Stepping 3, GenuineIntel
byteorder: little
LC_ALL: None
LANG: None
LOCALE: None.None

pandas: 0.25.0.dev0+389.g6d9b702a66
pytest: 4.2.0
pip: 19.0.1
setuptools: 40.6.3
Cython: 0.28.2
numpy: 1.14.6
scipy: 1.0.0
pyarrow: 0.6.0
xarray: 0.9.6
IPython: 7.2.0
sphinx: 1.8.2
patsy: 0.4.1
dateutil: 2.6.0
pytz: 2017.2
blosc: None
bottleneck: 1.2.1
tables: 3.4.2
numexpr: 2.6.4
feather: 0.4.0
matplotlib: 2.0.2
openpyxl: 2.4.8
xlrd: 1.1.0
xlwt: 1.3.0
xlsxwriter: 0.9.8
lxml.etree: 3.8.0
bs4: None
html5lib: 0.999
sqlalchemy: 1.1.13
pymysql: None
psycopg2: None
jinja2: 2.9.6
s3fs: None
fastparquet: 0.1.5
pandas_gbq: None
pandas_datareader: None
gcsfs: None

@jschendel added the Bug and Reshaping (Concat, Merge/Join, Stack/Unstack, Explode) labels on Apr 10, 2019
@jschendel added this to the Contributions Welcome milestone on Apr 10, 2019
@jschendel (Member, Author)

It looks like a potential fix is to cast the bins to a float dtype during the np.diff:

diff --git a/pandas/core/reshape/tile.py b/pandas/core/reshape/tile.py
index f99fd9004b..a9271404be 100644
--- a/pandas/core/reshape/tile.py
+++ b/pandas/core/reshape/tile.py
@@ -230,7 +230,7 @@ def cut(x, bins, right=True, labels=None, retbins=False, precision=3,
         else:
             bins = np.asarray(bins)
         bins = _convert_bin_to_numeric_type(bins, dtype)
-        if (np.diff(bins) < 0).any():
+        if (np.diff(bins.astype('float64')) < 0).any():
             raise ValueError('bins must increase monotonically.')

     fac, bins = _bins_to_cuts(x, bins, right=right, labels=labels,

This appears to provide the expected output for the example data. I'm not sure if this is the best solution, and I haven't run the tests to check whether it causes any unintended issues.
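
As an aside (purely hypothetical, not part of the patch above): the overflow could also be avoided by comparing adjacent edges instead of subtracting them, since an elementwise comparison never overflows. A minimal sketch:

import numpy as np

def bins_monotonic(bins):
    # Hypothetical helper: equivalent to `not (np.diff(bins) < 0).any()`,
    # but safe for int64 values near the dtype limits because no
    # subtraction is performed.
    bins = np.asarray(bins)
    return bool((bins[1:] >= bins[:-1]).all())

# Usage inside cut() would then be:
#     if not bins_monotonic(bins):
#         raise ValueError('bins must increase monotonically.')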

@Batalex (Contributor) commented Apr 11, 2019

I see that you welcome contributions on this issue; I'd like to give it a try.

@jschendel (Member, Author)

@Batalex : sure, go for it!
