TYP: pd.isna #46222
Changes from all commits: 632eed5, 4496f42, b77f716, 24aec6f, c1204b4, a0d8123, b412060, 17c9779, a4d0dfa, 2d7f86a, eb16bc3, b9288cb, c77248f, 70d5d60, 1d2afd5, 5f8b856, 7e65d89, a0cf860, a29a9e0, d1957f4, c319692, f1d0309
Diff: pandas/core/dtypes/missing.py
@@ -5,6 +5,10 @@
 from decimal import Decimal
 from functools import partial
+from typing import (
+    TYPE_CHECKING,
+    overload,
+)

 import numpy as np
@@ -16,11 +20,6 @@
     NaT,
     iNaT,
 )
-from pandas._typing import (
-    ArrayLike,
-    DtypeObj,
-    npt,
-)

 from pandas.core.dtypes.common import (
     DT64NS_DTYPE,
@@ -54,6 +53,19 @@
 )
 from pandas.core.dtypes.inference import is_list_like

+if TYPE_CHECKING:
+    from pandas._typing import (
+        ArrayLike,
+        DtypeObj,
+        NDFrame,
+        NDFrameT,
+        Scalar,
+        npt,
+    )
Inline review discussion on this block:

Reviewer: Thanks @twoertwein for the PR. I didn't want to comment on this PR earlier to avoid "too many cooks", and in general, if mypy is happy, so am I. (In some respects, because we have mypy, we don't need to review certain aspects of typing: if it is not 100% correct, it will create issues down the line. I see it like a jigsaw puzzle that won't be complete until the last piece is in place, i.e. the codebase is 100% typed.) However, I think I've seen that your preference is to import from pandas._typing inside the TYPE_CHECKING block elsewhere as well? If so, can you explain the reasoning, so that it helps when reviewing other contributors' PRs? AFAIK, we ensure all imports in pandas._typing are guarded so that they can be imported at the top level; for instance, npt was added so that we could import it without needing a TYPE_CHECKING block everywhere it was needed, otherwise we would just import from numpy directly and not include it in pandas._typing.

Author: Good point! I imported it within the TYPE_CHECKING block because it is only needed for type checking. I will import it outside the TYPE_CHECKING block in future PRs. Probably one reason I prefer to put imports in this block is that there are some import cycles that prevent even mypy from functioning correctly: I think having …

Reviewer: Yes, for import cycles it is definitely needed. My preference is also to add imports inside the TYPE_CHECKING block when they are added specifically for type annotations in a typing PR, but this is difficult to enforce: say a refactor removes the need for a top-level import, yet the import stays because it is still used in type annotations; in theory the import should then be moved into the TYPE_CHECKING block for consistency.
Reviewer: my preference here is to use …

Author: I just tried replacing it with …
+
+    from pandas.core.indexes.base import Index
+
+
 isposinf_scalar = libmissing.isposinf_scalar
 isneginf_scalar = libmissing.isneginf_scalar
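An aside for readers following the discussion above: below is a minimal, self-contained sketch of the TYPE_CHECKING import pattern being debated (illustrative only, not code from this PR; `decimal` stands in for any module that could otherwise cause an import cycle). Imports inside the block are seen only by type checkers, and with `from __future__ import annotations` the annotations are stored as strings, so the guarded names are never needed at runtime.

```python
# Sketch of the TYPE_CHECKING pattern discussed above (illustrative only).
from __future__ import annotations  # annotations become strings, evaluated lazily

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Executed only by type checkers (they treat TYPE_CHECKING as True);
    # skipped at runtime, so it cannot create a runtime import cycle.
    from decimal import Decimal


def is_missing(value: Decimal | None) -> bool:
    # Safe at runtime: the annotation is never evaluated.
    return value is None


print(is_missing(None))  # True
```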
@@ -63,7 +75,35 @@
 _dtype_str = np.dtype(str)


-def isna(obj):
+@overload
+def isna(obj: Scalar) -> bool:
+    ...
+
+
+@overload
+def isna(
+    obj: ArrayLike | Index | list,
+) -> npt.NDArray[np.bool_]:
+    ...
+
+
+@overload
+def isna(obj: NDFrameT) -> NDFrameT:
+    ...
+
+
+# handle unions
+@overload
+def isna(obj: NDFrameT | ArrayLike | Index | list) -> NDFrameT | npt.NDArray[np.bool_]:
+    ...
+
+
+@overload
+def isna(obj: object) -> bool | npt.NDArray[np.bool_] | NDFrame:
+    ...
+
+
+def isna(obj: object) -> bool | npt.NDArray[np.bool_] | NDFrame:
     """
     Detect missing values for an array-like object.
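To illustrate what these overloads are meant to buy callers, here is a hypothetical usage sketch; the commented types are the overload matches the signatures above intend, not verified mypy output.

```python
import numpy as np
import pandas as pd

# Intended overload matches (per the signatures above):
r1 = pd.isna(1.0)                       # Scalar overload -> bool
r2 = pd.isna(np.array([1.0, np.nan]))   # ArrayLike overload -> npt.NDArray[np.bool_]
r3 = pd.isna(pd.Series([1, None]))      # NDFrameT overload -> Series (bool dtype)
r4 = pd.isna(pd.DataFrame({"a": [1]}))  # NDFrameT overload -> DataFrame

print(type(r1), r2, list(r3), r4.values.tolist())
```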
@@ -284,7 +324,35 @@ def _isna_string_dtype(values: np.ndarray, inf_as_na: bool) -> npt.NDArray[np.bool_]
     return result


-def notna(obj):
+@overload
+def notna(obj: Scalar) -> bool:
+    ...
+
+
+@overload
+def notna(
+    obj: ArrayLike | Index | list,
+) -> npt.NDArray[np.bool_]:
+    ...
+
+
+@overload
+def notna(obj: NDFrameT) -> NDFrameT:
+    ...
+
+
+# handle unions
+@overload
+def notna(obj: NDFrameT | ArrayLike | Index | list) -> NDFrameT | npt.NDArray[np.bool_]:
+    ...
+
+
+@overload
+def notna(obj: object) -> bool | npt.NDArray[np.bool_] | NDFrame:
+    ...
+
+
+def notna(obj: object) -> bool | npt.NDArray[np.bool_] | NDFrame:
     """
     Detect non-missing values for an array-like object.
@@ -362,7 +430,7 @@ def notna(obj):
     Name: 1, dtype: bool
     """
     res = isna(obj)
-    if is_scalar(res):
+    if isinstance(res, bool):
         return not res
     return ~res
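A note on this final change: `isinstance` is one of the narrowing constructs mypy understands, whereas it cannot know that `is_scalar(res)` implies `res` is a `bool`. A standalone sketch of the same control flow (my own illustration, not code from the PR):

```python
import numpy as np


def notna_sketch(res: bool | np.ndarray) -> bool | np.ndarray:
    # mypy narrows res to bool in this branch, so `not res` type-checks as bool.
    if isinstance(res, bool):
        return not res
    # Here res is narrowed to np.ndarray, so elementwise inversion is valid.
    return ~res


print(notna_sketch(True))                     # False
print(notna_sketch(np.array([True, False])))  # [False  True]
```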