
GH456 First attempt GroupBy.transform improved typing #1242


Merged: 8 commits merged on Jun 13, 2025

Changes from 2 commits
38 changes: 38 additions & 0 deletions pandas-stubs/_typing.pyi
@@ -925,6 +925,44 @@ GroupByObjectNonScalar: TypeAlias = (
    | list[Grouper]
)
GroupByObject: TypeAlias = Scalar | Index | GroupByObjectNonScalar | Series
GroupByFuncStrs: TypeAlias = Literal[
    # Reduction/aggregation functions
    "all",
    "any",
    "corrwith",
    "count",
    "first",
    "idxmax",
    "idxmin",
    "last",
    "max",
    "mean",
    "median",
    "min",
    "nunique",
    "prod",
    "quantile",
    "sem",
    "size",
    "skew",
    "std",
    "sum",
    "var",
    # Transformation functions
    "bfill",
    "cumcount",
    "cummax",
    "cummin",
    "cumprod",
    "cumsum",
    "diff",
    "ffill",
    "fillna",
    "ngroup",
    "pct_change",
    "rank",
    "shift",
]

StataDateFormat: TypeAlias = Literal[
"tc",
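For orientation (not part of the diff): a minimal sketch of how a string belonging to GroupByFuncStrs is meant to be used with the new string-based transform overload; the Series and grouping below are illustrative only.

    import pandas as pd

    s = pd.Series([4, 2, 1, 8], index=["a", "b", "a", "b"])

    # "cumsum" is a member of GroupByFuncStrs, so the string-based transform
    # overload accepts it; a misspelled method name would be flagged at
    # type-check time (and would fail at runtime anyway).
    out = s.groupby(level=0).transform("cumsum")
    print(out)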
58 changes: 49 additions & 9 deletions pandas-stubs/core/groupby/generic.pyi
@@ -7,6 +7,7 @@ from collections.abc import (
)
from typing import (
    Any,
    Concatenate,
    Generic,
    Literal,
    NamedTuple,
@@ -22,7 +23,10 @@ from pandas.core.groupby.groupby import (
    GroupBy,
    GroupByPlot,
)
from pandas.core.series import Series
from pandas.core.series import (
    Series,
    UnknownSeries,
)
from typing_extensions import (
    Self,
    TypeAlias,
@@ -31,15 +35,18 @@ from typing_extensions import (
from pandas._libs.tslibs.timestamps import Timestamp
from pandas._typing import (
    S1,
    S2,
    AggFuncTypeBase,
    AggFuncTypeFrame,
    ByT,
    CorrelationMethod,
    Dtype,
    GroupByFuncStrs,
    IndexLabel,
    Level,
    ListLike,
    NsmallestNlargestKeep,
    P,
    Scalar,
    TakeIndexer,
    WindowingEngine,
@@ -53,10 +60,21 @@ class NamedAgg(NamedTuple):
    aggfunc: AggScalar

class SeriesGroupBy(GroupBy[Series[S1]], Generic[S1, ByT]):
    @overload
    def aggregate(
        self,
        func: Callable[Concatenate[Series[S1], P], S2],
        /,
        *args,
        engine: WindowingEngine = ...,
        engine_kwargs: WindowingEngineKwargs = ...,
        **kwargs,
    ) -> Series[S2]: ...
    @overload
    def aggregate(
        self,
        func: list[AggFuncTypeBase],
        /,
        *args,
        engine: WindowingEngine = ...,
        engine_kwargs: WindowingEngineKwargs = ...,
@@ -66,20 +84,32 @@ class SeriesGroupBy(GroupBy[Series[S1]], Generic[S1, ByT]):
    def aggregate(
        self,
        func: AggFuncTypeBase | None = ...,
        /,
        *args,
Comment on lines 93 to 97
Collaborator:

Before this overload, you could add this overload:

    @overload
    def aggregate(
        self,
        func: Callable[[Series], S2],
        *args,
        engine: WindowingEngine = ...,
        engine_kwargs: WindowingEngineKwargs = ...,
        **kwargs,
    ) -> Series[S2]: ...

Then, if you start with a Series with a known type, the return type would be inferred from the callable. And it works with a lambda function, e.g.:

    s = pd.Series([1, 2, 3, 4])
    q = s.groupby([1, 1, 2, 2]).agg(lambda x: x.astype(float).min())

In this case, q would have type Series[float], which is what you want.

Member Author:

(screenshot omitted) Even with the new overload, mypy still complains (pyright does not). I think it does not recognize the lambda as returning S2 (float).

Collaborator:

I think that's because the type of new_func isn't clear.

But I think it would work if you did check(assert_type(s.groupby([1,1,2,2]).agg(lambda x: x.astype(float).min()), "pd.Series[int]"), pd.Series, int)

Because then it can know that x is a Series[int] and that the lambda becomes Series[int]

Can you try that?

Member Author:

I tried that for the last push, see

check(assert_type(s.groupby([1,1,2,2]).agg(lambda x: x.astype(float).min()), "pd.Series[float]"), pd.Series, int)

It fails in all CI:

===========================================
Beginning: 'Run mypy on 'tests' (using the local stubs) and on the local stubs'
===========================================

tests/test_series.py:1167: error: Expression is of type "Series[Any]", not "Series[float]"  [assert-type]
Found 1 error in 1 file (checked 224 source files)

Member Author:

When I look at how mypy reads the type of the lambda, it has no idea about the type of x:

tests/test_series.py:1168: note: Revealed type is "def (x: Any) -> Any"

so that may explain why it fails on lambda expressions altogether.

Collaborator:

OK - so we can leave the lambda test in, but just have it assert_type() against Series instead of Series[float]

        engine: WindowingEngine = ...,
        engine_kwargs: WindowingEngineKwargs = ...,
        **kwargs,
    ) -> Series: ...
    ) -> UnknownSeries: ...
    agg = aggregate
    @overload
    def transform(
        self,
        func: Callable | str,
        *args,
        func: Callable[Concatenate[Series[S1], P], Series[S2]],
        /,
        *args: Any,
        engine: WindowingEngine = ...,
        engine_kwargs: WindowingEngineKwargs = ...,
        **kwargs,
    ) -> Series: ...
        **kwargs: Any,
    ) -> Series[S2]: ...
    @overload
    def transform(
        self,
        func: Callable,
        *args: Any,
        **kwargs: Any,
    ) -> UnknownSeries: ...
    @overload
    def transform(self, func: GroupByFuncStrs, *args, **kwargs) -> UnknownSeries: ...
    def filter(
        self, func: Callable | str, dropna: bool = ..., *args, **kwargs
    ) -> Series: ...
@@ -206,14 +236,24 @@ class DataFrameGroupBy(GroupBy[DataFrame], Generic[ByT, _TT]):
        **kwargs,
    ) -> DataFrame: ...
    agg = aggregate
    @overload
    def transform(
        self,
        func: Callable | str,
        *args,
        func: Callable[Concatenate[DataFrame, P], DataFrame],
        *args: Any,
        engine: WindowingEngine = ...,
        engine_kwargs: WindowingEngineKwargs = ...,
        **kwargs,
        **kwargs: Any,
    ) -> DataFrame: ...
    @overload
    def transform(
        self,
        func: Callable,
        *args: Any,
        **kwargs: Any,
    ) -> DataFrame: ...
    @overload
    def transform(self, func: GroupByFuncStrs, *args, **kwargs) -> DataFrame: ...
    def filter(
        self, func: Callable, dropna: bool = ..., *args, **kwargs
    ) -> DataFrame: ...
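A minimal usage sketch (not from the PR) of how the DataFrameGroupBy.transform overloads above are meant to be exercised; the frame, column names, and helper function are illustrative assumptions.

    import pandas as pd

    df = pd.DataFrame({"key": [1, 1, 2, 2], "val": [4.0, 2.0, 1.0, 8.0]})

    def demean(group: pd.DataFrame, scale: float) -> pd.DataFrame:
        # The extra positional argument is captured by the ParamSpec P in the
        # Callable[Concatenate[DataFrame, P], DataFrame] overload.
        return (group - group.mean()) * scale

    out1 = df.groupby("key").transform(demean, 2.0)  # callable overload -> DataFrame
    out2 = df.groupby("key").transform("cumsum")     # GroupByFuncStrs overload -> DataFrame
    print(out1, out2, sep="\n")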
69 changes: 67 additions & 2 deletions tests/test_series.py
@@ -1078,25 +1078,90 @@ def test_types_groupby_agg() -> None:
r"The provided callable <built-in function (min|sum)> is currently using",
upper="2.2.99",
):
check(assert_type(s.groupby(level=0).agg(sum), pd.Series), pd.Series)

def sum_sr(s: pd.Series[int]) -> int:
# type of `sum` not well inferred by mypy
return sum(s)
Collaborator:

why not use s.sum()?

Member Author:

The issue was when I passed the builtin sum to the aggregate method directly; here it does not change anything.


    check(
        assert_type(s.groupby(level=0).agg(sum_sr), "pd.Series[int]"),
        pd.Series,
        np.integer,
    )
    check(
        assert_type(s.groupby(level=0).agg([min, sum]), pd.DataFrame), pd.DataFrame
    )


def test_types_groupby_transform() -> None:
Collaborator:

I think you should add tests for two of the string transform arguments (e.g., "mean", "first")

    s: pd.Series[int] = pd.Series([4, 2, 1, 8], index=["a", "b", "a", "b"])

    def transform_func(
        x: pd.Series[int], pos_arg: bool, kw_arg: str
    ) -> pd.Series[float]:
        return x / (2.0 if pos_arg else 1.0)

    check(
        assert_type(
            s.groupby(lambda x: x).transform(transform_func, True, kw_arg="foo"),
            "pd.Series[float]",
        ),
        pd.Series,
        float,
    )
    check(
        assert_type(
            s.groupby(lambda x: x).transform(
                transform_func, True, engine="cython", kw_arg="foo"
            ),
            "pd.Series[float]",
        ),
        pd.Series,
        float,
    )


def test_types_groupby_aggregate() -> None:
    s = pd.Series([4, 2, 1, 8], index=["a", "b", "a", "b"])
    check(assert_type(s.groupby(level=0).aggregate("sum"), pd.Series), pd.Series)
    check(
        assert_type(s.groupby(level=0).aggregate(["min", "sum"]), pd.DataFrame),
        pd.DataFrame,
    )

    def func(s: pd.Series[int]) -> float:
        return s.astype(float).min()

    s = pd.Series([1, 2, 3, 4])
    s.groupby([1, 1, 2, 2]).agg(lambda x: x.astype(float).min())
Collaborator:

don't you want a check(assert_type(... here?

Member Author:

Correct, my mistake.

Member Author:

It turns out that on-the-fly type inference for lambdas is not reliable, so you need to define the function separately to get the right types.

Collaborator:

Yes, that is an issue with lambda functions.

Collaborator:

Actually, I think you can have a test of

check(assert_type(s.groupby([1, 1, 2, 2]).agg(lambda x: x.astype(float).min()), pd.Series), pd.Series)

which would be worthwhile

    check(
        assert_type(s.groupby(level=0).aggregate(func), "pd.Series[float]"),
        pd.Series,
        np.floating,
    )
    check(
        assert_type(
            s.groupby(level=0).aggregate(func, engine="cython"), "pd.Series[float]"
        ),
        pd.Series,
        np.floating,
    )

    with pytest_warns_bounded(
        FutureWarning,
        r"The provided callable <built-in function (min|sum)> is currently using",
        upper="2.2.99",
    ):
        check(assert_type(s.groupby(level=0).aggregate(sum), pd.Series), pd.Series)

    def sum_sr(s: pd.Series[int]) -> int:
        # type of `sum` not well inferred by mypy
        return sum(s)
Collaborator:

use s.sum()


    check(
        assert_type(s.groupby(level=0).aggregate(sum_sr), "pd.Series[int]"),
        pd.Series,
        np.integer,
    )
    check(
        assert_type(s.groupby(level=0).aggregate([min, sum]), pd.DataFrame),
        pd.DataFrame,
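Tying the review discussion together, a minimal sketch (not part of the PR) of the inference difference between a lambda and a separately defined function when aggregating: under the proposed overloads the named function yields Series[float], while the lambda typically stays an untyped Series because checkers cannot infer its parameter type.

    import pandas as pd

    s = pd.Series([1, 2, 3, 4])

    def float_min(x: pd.Series[int]) -> float:
        return x.astype(float).min()

    # The annotated function binds S2 to float, so checkers can infer Series[float].
    typed = s.groupby([1, 1, 2, 2]).agg(float_min)
    # The lambda's parameter type is unknown to mypy, so the result stays Series[Any].
    untyped = s.groupby([1, 1, 2, 2]).agg(lambda x: x.astype(float).min())
    print(typed, untyped, sep="\n")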