BUG: Series.cummin/cummax fails with datetime64[ns, tz] dtype #15553


Closed
3 of 4 tasks
jreback opened this issue Mar 2, 2017 · 4 comments · Fixed by #30460
Labels: Bug · Indexing (Related to indexing on series/frames, not to indexes themselves) · Timezones (Timezone data dtype)
Comments

jreback (Contributor) commented Mar 2, 2017

  • .where
  • .combine_first
  • .update
  • .cummin/cummax

Works with a naive datetime64 dtype; should also work when tz-aware:

In [40]: s = Series(pd.date_range('20130101', periods=2))

In [41]: s.where(s>s,s)
Out[41]: 
0   2013-01-01
1   2013-01-02
dtype: datetime64[ns]

In [42]: s = Series(pd.date_range('20130101', periods=2, tz='US/Eastern'))

In [43]: s.where(s>s,s)
TypeError: invalid type promotion
TypeError: Could not operate [array(['2013-01-01T05:00:00.000000000', '2013-01-02T05:00:00.000000000'], dtype='datetime64[ns]')] with block values [invalid type promotion]
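A workaround sketch from the era of this bug (the operation itself works on naive datetimes, as shown above): drop the timezone, run `where` on the naive values, then re-localize. This assumes a pandas version where `.dt.tz_localize(None)` is available; on current pandas the tz-aware call works directly.

```python
import pandas as pd

# Hedged workaround sketch: strip the timezone, operate, re-localize.
s = pd.Series(pd.date_range('20130101', periods=2, tz='US/Eastern'))
naive = s.dt.tz_localize(None)  # wall times, timezone dropped
result = naive.where(naive > naive, naive).dt.tz_localize('US/Eastern')
print(result.dtype)  # datetime64[ns, US/Eastern]
```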
@jreback jreback added Bug Difficulty Intermediate Indexing Related to indexing on series/frames, not to indexes themselves Timezones Timezone data dtype labels Mar 2, 2017
@jreback jreback added this to the Next Major Release milestone Mar 2, 2017
adbull (Contributor) commented Mar 3, 2017

Note there's also odd behaviour on tz-aware series with update or combine_first:

>>> import pandas as pd
>>> utc = pd.Series(pd.to_datetime(['now'], utc=True))
>>> utc.dtype
datetime64[ns, UTC]
>>> utc.combine_first(utc).dtype
dtype('<M8[ns]')
>>> utc.update(utc)

~/anaconda3/lib/python3.5/site-packages/pandas/core/internals.py:1517: VisibleDeprecationWarning: using a non-integer number instead of an integer will result in an error in the future
  new = new[mask]
Traceback (most recent call last):
  File "bug.py", line 1, in <module>
    utc.update(utc)
  File "~/anaconda3/lib/python3.5/site-packages/pandas/core/series.py", line 1718, in update
    self._data = self._data.putmask(mask=mask, new=other, inplace=True)
  File "~/anaconda3/lib/python3.5/site-packages/pandas/core/internals.py", line 3171, in putmask
    return self.apply('putmask', **kwargs)
  File "~/anaconda3/lib/python3.5/site-packages/pandas/core/internals.py", line 3056, in apply
    applied = getattr(b, f)(**kwargs)
  File "~/anaconda3/lib/python3.5/site-packages/pandas/core/internals.py", line 1517, in putmask
    new = new[mask]
IndexError: index 1 is out of bounds for axis 0 with size 1
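A sketch of a way around the `update` failure above: do the masked assignment by hand with `.loc`, which keeps the tz-aware dtype. This is an assumed workaround, not what pandas does internally.

```python
import pandas as pd

# Hedged workaround sketch: manual masked assignment instead of update().
utc = pd.Series(pd.to_datetime(['now'], utc=True))
other = utc.copy()
mask = other.notna()          # update() only overwrites non-NA positions
utc.loc[mask] = other[mask]
print(utc.dtype)  # datetime64[ns, UTC]
```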

adbull (Contributor) commented Mar 3, 2017

And cummax or cummin:

>>> utc.cummax().dtype
dtype('<M8[ns]')
>>> utc.cummin().dtype
dtype('<M8[ns]')
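Since the thread reports that only the Series path drops the timezone, one hedged workaround sketch is to route through the DataFrame method and pull the column back out (the column name 'ts' is arbitrary):

```python
import pandas as pd

# Hedged workaround sketch: use the DataFrame cummax, which keeps the tz.
utc = pd.Series(pd.to_datetime(['now'], utc=True))
fixed = utc.to_frame('ts').cummax()['ts']
print(fixed.dtype)  # datetime64[ns, UTC]
```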

mroeschke (Member) commented
#21660 fixed where (and update, since it uses where under the hood) and combine_first

@mroeschke mroeschke changed the title BUG: datetime w/tz comparisions BUG: Series.cumin/cummax fails with datetime64[ns, tz] dtype Jul 28, 2018
jbrockmendel (Member) commented Jun 18, 2019

Looks like the DataFrame cummin/cummax methods are OK; only the Series ones lose the timezone.

I think this is because np.array(s.to_frame()) converts to object dtype, whereas np.array(s) converts to datetime64[ns].
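The conversion difference described above can be observed directly. This is a sketch; the Series-side dtype depends on the pandas/numpy versions in play (later pandas versions return object dtype there too).

```python
import numpy as np
import pandas as pd

# Demonstrating the two conversion paths mentioned in the comment above.
s = pd.Series(pd.date_range('20130101', periods=2, tz='US/Eastern'))
print(np.asarray(s.to_frame()).dtype)  # object: tz-aware Timestamps survive
print(np.asarray(s).dtype)             # historically datetime64[ns], tz lost
```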

@jbrockmendel jbrockmendel reopened this Jun 18, 2019
@jbrockmendel jbrockmendel self-assigned this Oct 16, 2019
@jreback jreback modified the milestones: Contributions Welcome, 1.0 Dec 25, 2019