date_range() with closed=left and sub-second granularity returns wrong number of elements #24110
Comments
Thanks for the report! Your second example is indeed bizarre behavior. Investigations and PRs are always welcome.
gabrielreid pushed a commit to gabrielreid/pandas that referenced this issue on Dec 6, 2018:
Improves (but doesn't completely resolve) pandas-dev#24110, to avoid rounding issues with sub-second granularity timestamps when creating a date range.
This appears to be due to the limited integer resolution of doubles in Python (a 64-bit float can't represent all integers above 2**53 exactly), brought on by the use of numpy.linspace. I've added a PR with a fix in #24129 which will largely reduce (but not completely resolve) the occurrence of this issue.
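The precision limit described above can be seen directly: nanosecond epoch timestamps for recent dates are around 1.5e18, well past the 2**53 threshold where doubles stop representing every integer. A minimal sketch of the effect (the timestamp value here is an illustrative assumption, not taken from the original report):

```python
import numpy as np

# Nanosecond epoch timestamp near 2018-01-01 (illustrative value, ~1.5e18).
start = 1_514_764_800_010_000_000
end = start + 10_000_000  # 10 ms later
print(start > 2**53)      # doubles lose exact integer precision above 2**53

# linspace works in float64, so the endpoints are silently rounded to the
# nearest representable double before the evenly spaced steps are computed.
approx = np.linspace(start, end, 11).astype("int64")
exact = np.arange(start, end + 1, 1_000_000)
print((approx != exact).any())  # some nanosecond values come back wrong
```

This is why switching the range generation from float-based linspace to integer arithmetic (as the PR does) removes most of the rounding error.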
gabrielreid pushed a commit to gabrielreid/pandas that referenced this issue on Dec 7, 2018:
Fixes pandas-dev#24110 by avoiding floating-point rounding issues with sub-second granularity timestamps when creating a date range.
gabrielreid pushed a commit to gabrielreid/pandas that referenced this issue on Dec 9, 2018:
Fixes pandas-dev#24110 by avoiding floating-point rounding issues with millisecond resolution or higher timestamps when creating a date range.
mroeschke pushed a commit that referenced this issue on Dec 9, 2018:
Fixes #24110 by avoiding floating-point rounding issues with millisecond resolution or higher timestamps when creating a date range.
Pingviinituutti pushed a commit to Pingviinituutti/pandas that referenced this issue on Feb 28, 2019:
Fixes pandas-dev#24110 by avoiding floating-point rounding issues with millisecond resolution or higher timestamps when creating a date range.
Code Sample, a copy-pastable example if possible
Problem description
As far as I understand it, calling date_range with two absolute endpoints, a number of periods, and closed='left' should return a DatetimeIndex with periods - 1 entries. This appears to work as expected in most cases, but supplying more recent dates with sub-second granularity (e.g. 2018-01-01T00:00:00.010Z) appears to trigger an issue which causes periods entries to be contained in the returned DatetimeIndex instead of periods - 1.

I've verified this on a number of older (pre-0.24) versions, as well as on the current HEAD of the master branch, and the issue appears to be present in all cases.
Expected Output
Output of pd.show_versions():
INSTALLED VERSIONS
commit: d7e96d8
python: 3.7.1.final.0
python-bits: 64
OS: Darwin
OS-release: 17.7.0
machine: x86_64
processor: i386
byteorder: little
LC_ALL: en_US.UTF-8
LANG: en_US.UTF-8
LOCALE: en_US.UTF-8
pandas: 0.24.0.dev0+1208.gd7e96d830
pytest: 4.0.1
pip: 18.1
setuptools: 40.6.2
Cython: 0.29
numpy: 1.15.4
scipy: 1.1.0
pyarrow: 0.11.1
xarray: 0.11.0
IPython: 7.2.0
sphinx: 1.8.2
patsy: 0.5.1
dateutil: 2.7.5
pytz: 2018.7
blosc: None
bottleneck: 1.2.1
tables: 3.4.4
numexpr: 2.6.8
feather: None
matplotlib: 3.0.1
openpyxl: 2.5.11
xlrd: 1.1.0
xlwt: 1.3.0
xlsxwriter: 1.1.2
lxml.etree: 4.2.5
bs4: 4.6.3
html5lib: 1.0.1
sqlalchemy: 1.2.14
pymysql: None
psycopg2: None
jinja2: 2.10
s3fs: None
fastparquet: 0.1.6
pandas_gbq: None
pandas_datareader: None
gcsfs: None