
Fix pd.concat to accept None values as input. #858


Merged 4 commits on Feb 21, 2024

Changes from 3 commits
27 changes: 23 additions & 4 deletions pandas-stubs/core/reshape/concat.pyi
@@ -12,8 +12,10 @@ from pandas import (
DataFrame,
Series,
)
from typing_extensions import Never

from pandas._typing import (
Axis,
AxisColumn,
AxisIndex,
HashableT1,
@@ -24,9 +26,23 @@ from pandas._typing import (

@overload
def concat(
objs: Iterable[DataFrame] | Mapping[HashableT1, DataFrame],
objs: Iterable[None] | Mapping[HashableT1, None],
*,
axis: AxisIndex = ...,
axis: Axis = ...,
join: Literal["inner", "outer"] = ...,
ignore_index: bool = ...,
keys: Iterable[HashableT2] = ...,
levels: Sequence[list[HashableT3] | tuple[HashableT3, ...]] = ...,
names: list[HashableT4] = ...,
verify_integrity: bool = ...,
sort: bool = ...,
copy: bool = ...,
) -> Never: ...
@overload
def concat( # type: ignore[overload-overlap] # pyright: ignore[reportOverlappingOverload]
Member:
overlaps because of Iterable[None]

objs: Iterable[DataFrame | None] | Mapping[HashableT1, DataFrame | None],
*,
axis: Axis = ...,
join: Literal["inner", "outer"] = ...,
ignore_index: bool = ...,
keys: Iterable[HashableT2] = ...,
@@ -38,7 +54,7 @@
) -> DataFrame: ...
@overload
def concat(
objs: Iterable[Series] | Mapping[HashableT1, Series],
objs: Iterable[Series | None] | Mapping[HashableT1, Series | None],
*,
axis: AxisIndex = ...,
join: Literal["inner", "outer"] = ...,
@@ -52,7 +68,10 @@
) -> Series: ...
@overload
def concat(
objs: Iterable[Series | DataFrame] | Mapping[HashableT1, Series | DataFrame],
objs: (
Iterable[Series | DataFrame | None]
| Mapping[HashableT1, Series | DataFrame | None]
),
*,
axis: AxisColumn,
join: Literal["inner", "outer"] = ...,
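
Taken together, the diff above adds a first overload that types all-None inputs as Never (pandas raises at runtime in that case) and widens the remaining overloads so None entries can be mixed with DataFrame/Series values. The "overlaps because of Iterable[None]" note refers to a list[None] argument matching both the Never overload and the DataFrame | None overload while their return types differ, hence the overload-overlap suppression. Below is a minimal caller-side sketch, not part of the PR, assuming pandas plus these stubs are installed; the variable names are made up for illustration.

    # Illustrative only; not part of the PR. Assumes pandas and these stubs.
    import pandas as pd

    df = pd.DataFrame({"a": [7, -5, 10]})
    s = pd.Series([7, -5, 10])

    # None entries are dropped by pandas at runtime, and the widened overloads
    # should let type checkers still infer DataFrame / Series for the result.
    res_df = pd.concat([None, df])  # expected to be inferred as DataFrame
    res_s = pd.concat([None, s])    # expected to be inferred as Series

    # An all-None input matches the new first overload and is typed as Never,
    # mirroring the ValueError pandas raises at runtime for this case.
    try:
        pd.concat([None])
    except ValueError:
        pass
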
28 changes: 27 additions & 1 deletion tests/test_pandas.py
@@ -17,7 +17,10 @@

# TODO: github.com/pandas-dev/pandas/issues/55023
import pytest
from typing_extensions import assert_type
from typing_extensions import (
Never,
assert_type,
)

from pandas._libs.missing import NAType
from pandas._libs.tslibs import NaTType
@@ -49,6 +52,29 @@ def test_types_to_datetime() -> None:
)


def test_types_concat_none() -> None:
"""Test concatenation with None values."""
series = pd.Series([7, -5, 10])
df = pd.DataFrame({"a": [7, -5, 10]})

check(assert_type(pd.concat([None, series]), pd.Series), pd.Series)
check(assert_type(pd.concat([None, df]), pd.DataFrame), pd.DataFrame)
check(
assert_type(pd.concat([None, series, df], axis=1), pd.DataFrame), pd.DataFrame
)

check(assert_type(pd.concat({"a": None, "b": series}), pd.Series), pd.Series)
check(assert_type(pd.concat({"a": None, "b": df}), pd.DataFrame), pd.DataFrame)
check(
assert_type(pd.concat({"a": None, "b": series, "c": df}, axis=1), pd.DataFrame),
pd.DataFrame,
)

if TYPE_CHECKING_INVALID_USAGE:
Contributor Author:
@twoertwein can you explain what this block means and why this is needed? Trying to understand what was missing from my tests.

Collaborator:
This is testing that the type checkers see the code as invalid. Although I don't think the test is constructed correctly....

assert_type(pd.concat({"a": None}), Never)
assert_type(pd.concat([None]), Never)
Comment on lines +75 to +76
Collaborator:
I think these tests should be of the form:

    pd.concat({"a": None})  # type: ignore[some_mypy_error] # pyright: ignore[some_pyright_error]
    pd.concat([None])  # type: ignore[some_mypy_error] # pyright: ignore[some_pyright_error]

so we are checking that the type checkers see that invalid code as an error.

Member:
I believe I tried that first - the issue is that the second call will not be checked as it cannot be reached (by the type checkers). Would need to split it into two functions.

Collaborator:
That makes sense. I think for consistency's sake, we should do that, although I could be convinced otherwise. If we want to use your pattern here, then add a comment to indicate why we can't just check for a specific type checker error based on your comment here.

Member:
I think we need the assert_type only when testing at least two invalid function calls where the first one "returns" NoReturn/Never. I think we also have many cases where we do not explicitly return NoReturn/Never. In that case, we might not need the assert_type.

I would be inclined to use the assert_type only when we have to.
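
For context on the exchange above, here is a hedged sketch of the reachability issue and the proposed split: once a call is typed as returning Never, type checkers treat the rest of that function as unreachable, so a second invalid call bundled into the same function is never actually checked, whereas putting each invalid call in its own function keeps both reachable. The helper names below are made up, and a plain TYPE_CHECKING guard stands in for the test suite's TYPE_CHECKING_INVALID_USAGE, which presumably serves the same purpose of keeping the invalid calls away from runtime.

    # Illustrative only; not code from the PR. Function names are made up.
    from typing import TYPE_CHECKING

    import pandas as pd
    from typing_extensions import Never, assert_type

    if TYPE_CHECKING:

        def _two_invalid_calls_in_one_function() -> None:
            # The first call is typed as Never, so the second assertion is
            # unreachable for type checkers and effectively goes unchecked.
            assert_type(pd.concat([None]), Never)
            assert_type(pd.concat({"a": None}), Never)

        def _first_invalid_call() -> None:
            assert_type(pd.concat([None]), Never)

        def _second_invalid_call() -> None:
            assert_type(pd.concat({"a": None}), Never)
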



def test_types_concat() -> None:
s: pd.Series = pd.Series([0, 1, -10])
s2: pd.Series = pd.Series([7, -5, 10])