wrap returns out annotation #93


Merged
2 commits merged into main on Mar 31, 2023

Conversation

@ev-br (Collaborator) commented Mar 24, 2023:

Use a return annotation for functions with an out= argument. Other returns are wrapped generically.

Detect the out argument in signatures and implement the out= semantics.

Supersedes and closes gh-83.

@ev-br ev-br requested a review from lezcano March 24, 2023 23:21
@ev-br ev-br changed the title from "Mangle returns out annotation" to "wrap returns out annotation" on Mar 24, 2023
Comment on lines 181 to 187
result = func(*args, **kwds)
result = wrap_tensors(result)

out = None
if sig.return_annotation == OutArray:
result, out = result

result = wrap_tensors(result, out, promote_scalar_result)
@lezcano (Collaborator) commented Mar 27, 2023:

I really don't like having to return both the result and the out. It feels very unnatural and counterintuitive to me: to implement the function, you effectively have to change its signature (e.g. in C++ you'd go from returning a Tensor to returning a std::tuple<Tensor, Tensor&>?!). Worse, the wrap_tensors function now does too many things: it wraps result and it implements the out behaviour. These are two independent concerns.

I believe a better way to do this would be:

  1. Make functions always return just the result and never annotate the return, i.e. leave things as they were before.
  2. Leave wrap_tensors as it was.
  3. Rewrite this bit as
def maybe_copy_to(out, result, promote_scalar_result=False):
    if out is None:
        return result
    elif isinstance(result, torch.Tensor):
        if result.shape != out.shape:
            can_fit = result.numel() == 1 and out.ndim == 0
            if promote_scalar_result and can_fit:
                result = result.squeeze()
            else:
                raise ValueError(
                    f"Bad size of the out array: out.shape = {out.shape}"
                    f" while result.shape = {result.shape}."
                )
        out.copy_(result)
        return out
    elif isinstance(result, (tuple, list)):
        return type(result)(
            maybe_copy_to(o, r, promote_scalar_result) for o, r in zip(out, result)
        )
    else:
        assert False  # We should never hit this path

...
if "out" in params:
    out = sig.bind(*args, **kwds).arguments.get("out")
    result = maybe_copy_to(out, result, promote_scalar_result)
result = wrap_tensors(result)
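For illustration only, the copy-out behaviour proposed above can be exercised in a self-contained sketch. This is not the PR's code: numpy stands in for torch here (so `result.size`/`np.copyto` replace `result.numel()`/`out.copy_(result)`), but the control flow follows the review suggestion.

```python
import numpy as np

def maybe_copy_to(out, result, promote_scalar_result=False):
    # Implement out= semantics: copy the computed result into the
    # user-supplied out array (or tuple of arrays), if one was given.
    if out is None:
        return result
    elif isinstance(result, np.ndarray):
        if result.shape != out.shape:
            # A 1-element result may be squeezed into a 0-d out array.
            can_fit = result.size == 1 and out.ndim == 0
            if promote_scalar_result and can_fit:
                result = result.squeeze()
            else:
                raise ValueError(
                    f"Bad size of the out array: out.shape = {out.shape}"
                    f" while result.shape = {result.shape}."
                )
        np.copyto(out, result)
        return out
    elif isinstance(result, (tuple, list)):
        # Multi-output functions: recurse element-wise.
        return type(result)(
            maybe_copy_to(o, r, promote_scalar_result) for o, r in zip(out, result)
        )
    else:
        raise AssertionError("unreachable")

out = np.empty(3)
res = maybe_copy_to(out, np.array([1.0, 2.0, 3.0]))
```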

Collaborator:
Also, the annotation for out doesn't seem correct, as it may be a List[NDArray] or Tuple[NDArray] as well, right?

Collaborator Author:
I'm going to prepare a version without return annotations.

Base automatically changed from mangle_returns to main March 27, 2023 18:44
@ev-br ev-br mentioned this pull request Mar 30, 2023
@ev-br ev-br force-pushed the mangle_returns_out_annotation branch from ec2a7be to a126f71 on March 31, 2023 10:33
@ev-br ev-br force-pushed the mangle_returns_out_annotation branch from a126f71 to c81f677 on March 31, 2023 10:35
@ev-br (Collaborator, Author) commented Mar 31, 2023:

Force pushed a version without type annotations, almost identical to #93 (comment).
I wanted to avoid dispatching on argument names ("out"), but the alternatives seem clunky, so let's roll with this and see if we want to improve it at leisure.

One possibly semi-missing bit is divmod (again!), which has several ways of accepting out arrays. Working out all the details of what numpy really does requires taking over numpy/core/tests/test_umath.py. That file is rather large and rather hairy, so best not to conflate it with the cleanup that is this PR.
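To make the divmod wrinkle concrete, here is a small sketch (plain numpy, not the wrapper under review) of two of the spellings numpy accepts for this two-output ufunc:

```python
import numpy as np

x = np.array([7, 8, 9])
y = np.array([2, 3, 4])

# Spelling 1: out= takes a tuple of two arrays, one per output.
q = np.empty_like(x)
r = np.empty_like(x)
q1, r1 = np.divmod(x, y, out=(q, r))

# Spelling 2: the two out arrays can also be passed positionally,
# after the inputs, as with any multi-output ufunc.
q2 = np.empty_like(x)
r2 = np.empty_like(x)
np.divmod(x, y, q2, r2)
```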

@lezcano (Collaborator) left a comment:

I agree that dispatching on the name is not the best. Another way of doing this would be to have an annotation Output = Union[NDArray, Tuple[NDArray...], Tuple[Any...]] (for example; it's not like we're really using them for typechecking) and then dispatch on that. Either works for me.
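A minimal sketch of that annotation-dispatch idea; the `Output`, `NDArray`, and `has_out_param` names here are hypothetical, not part of the PR:

```python
import inspect
from typing import Union

class NDArray:
    """Stand-in for the wrapper's ndarray class (assumption)."""

# Hypothetical marker annotation, per the suggestion above: not used
# for typechecking, only as a tag to dispatch on.
Output = Union[NDArray, tuple]

def has_out_param(func):
    # Look for a parameter annotated with Output instead of
    # matching on the name "out".
    sig = inspect.signature(func)
    return any(p.annotation is Output for p in sig.parameters.values())

def add(x, y, out: Output = None):
    ...

def negative(x, where=None):
    ...
```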

About divmod... my my. Yeah, your one-off solution LGTM.

/,
out=None,
out: Optional[tuple[NDArray]] = (None, None),
Collaborator:
nit: tuple[Optional[NDArray], Optional[NDArray]].

Collaborator Author:

Why? It's either (None, None) or (NDArray, NDArray); you can't have (out1, None) (I think).

Collaborator:

tuple, as opposed to list, takes explicitly the number of elements it accepts. As such, you are declaring there a tuple of length 1 holding one NDArray.
Then, Optional[tuple[NDArray, NDArray]] accepts either a tuple of ndarrays (fine) or None (not (None, None)).
What you want is to have a (partial) function that accepts tuple[Optional[NDArray], Optional[NDArray]] and errors out when just one of the elements of the tuple is None.
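The distinction can be checked with typing introspection; a quick sketch, with `NDArray` as a stand-in class and `check_outs` as a hypothetical name for the all-or-nothing validation described above:

```python
from typing import Optional, get_args

class NDArray:
    """Stand-in for the wrapper's ndarray class (assumption)."""

# tuple[NDArray] declares a 1-tuple; a pair of outputs needs
# two type arguments.
assert len(get_args(tuple[NDArray])) == 1
assert len(get_args(tuple[Optional[NDArray], Optional[NDArray]])) == 2

def check_outs(out):
    # Runtime check for the all-or-nothing rule: either both
    # entries are None, or neither is.
    if sum(o is None for o in out) == 1:
        raise ValueError("either both out arrays or neither must be None")
    return out
```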

Collaborator:

Feel free to patch this small correction in any other PR.

Collaborator Author:

Ouch. Python's typing meta-language is seriously weird.

Collaborator:

Note that this is not specific to Python; the same would be true in C++. This would be an std::tuple<std::optional<array>, std::optional<array>> rather than an std::optional<std::tuple<array, array>> :)

signature=signature,
extobj=extobj,
tensors = _helpers.ufunc_preprocess(
(x1, x2), out, True, casting, order, dtype, subok, signature, extobj
Collaborator:

As discussed, ufunc_preprocess should not take an out parameter, since setting out= should not affect the type of the computation. Or is this fixed in some other PR and we'll just need to update this one afterwards?

Collaborator Author:

It's still in the TODO pipeline. Will do after the current "stack" is done.

@ev-br (Collaborator, Author) commented Mar 31, 2023:

Thanks, Mario!

@ev-br ev-br merged commit f267249 into main Mar 31, 2023
@ev-br ev-br deleted the mangle_returns_out_annotation branch March 31, 2023 11:49