wrap returns out annotation #93
Conversation
torch_np/_normalizations.py
```python
result = func(*args, **kwds)

out = None
if sig.return_annotation == OutArray:
    result, out = result

result = wrap_tensors(result, out, promote_scalar_result)
```
I really don't like having to return both the result and the out. It feels very unnatural and counterintuitive to me, as, to implement the function, you have to effectively change its signature (e.g. in C++ you'd go from returning a `Tensor` to returning a `std::tuple<Tensor, Tensor&>`?!). Even more, the `wrap_tensors` function now does too many things: it wraps the result and it implements the out behaviour. These are two independent things.
I believe a better way to do this would be:
- Make functions always return just the result and never annotate the result, i.e. leave things how they were before.
- Leave `wrap_tensor` as it was.
- Rewrite this bit as:
```python
def maybe_copy_to(out, result, promote_scalar_result=False):
    if out is None:
        return result
    elif isinstance(result, torch.Tensor):
        if result.shape != out.shape:
            can_fit = result.numel() == 1 and out.ndim == 0
            if promote_scalar_result and can_fit:
                result = result.squeeze()
            else:
                raise ValueError(
                    f"Bad size of the out array: out.shape = {out.shape}"
                    f" while result.shape = {result.shape}."
                )
        out.copy_(result)
        return out
    elif isinstance(result, (tuple, list)):
        # Recurse elementwise, pairing each sub-result with its out array.
        return type(result)(
            maybe_copy_to(o, r, promote_scalar_result) for r, o in zip(result, out)
        )
    else:
        assert False  # We should never hit this path

...
if "out" in params:
    out = sig.bind(*args, **kwargs).arguments.get("out")
    result = maybe_copy_to(out, result, promote_scalar_result)
result = wrap_tensors(result)
```
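As a self-contained sketch of the flow proposed above, assuming NumPy arrays standing in for torch Tensors, and with `normalizer`, `maybe_copy_to`, and `double` as hypothetical names: the decorator detects an `out` parameter in the wrapped function's signature and applies the out= semantics generically after the call.

```python
import functools
import inspect

import numpy as np

def maybe_copy_to(out, result):
    # Minimal copy-to-out helper: if an out array was supplied,
    # copy the computed result into it and return it.
    if out is None:
        return result
    if result.shape != out.shape:
        raise ValueError(
            f"Bad size of the out array: out.shape = {out.shape}"
            f" while result.shape = {result.shape}."
        )
    np.copyto(out, result)
    return out

def normalizer(func):
    # If the wrapped function's signature declares an `out` parameter,
    # implement the out= semantics generically after calling it.
    sig = inspect.signature(func)

    @functools.wraps(func)
    def wrapped(*args, **kwargs):
        result = func(*args, **kwargs)
        if "out" in sig.parameters:
            out = sig.bind(*args, **kwargs).arguments.get("out")
            result = maybe_copy_to(out, result)
        return result

    return wrapped

@normalizer
def double(x, out=None):
    return x * 2

buf = np.zeros(3)
res = double(np.array([1.0, 2.0, 3.0]), out=buf)
assert res is buf and buf.tolist() == [2.0, 4.0, 6.0]
```

Note that `sig.bind(...).arguments.get("out")` returns `None` when the caller does not pass `out`, so the no-out path is unaffected.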
Also, the annotation for out doesn't seem correct, as it may be a `List[NDArray]` or `Tuple[NDArray]` as well, right?
I'm going to prepare a version without return annotations.
Force pushed a version without type annotations, almost identical to #93 (comment). One possibly semi-missing bit is divmod (again!), which has several ways of accepting out arrays. Working out all details of what numpy really does requires taking over
I agree that dispatching on name is not the best. Another way of doing this would be to have an annotation `Output = Union[NDArray, Tuple[NDArray, ...], Tuple[Any, ...]]` (for example; it's not like we're using them for typechecking really) and then dispatch on that. Either works for me.
About `divmod`... my my. Yeah, your one-off solution LGTM.
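The dispatch-on-annotation idea can be sketched as follows; this is a hedged illustration, with `NDArray` as a stand-in class and `takes_out` a hypothetical helper name, not the project's actual implementation:

```python
import inspect
from typing import Any, Tuple, Union

class NDArray:  # stand-in for torch_np's ndarray wrapper
    pass

# Marker annotation suggested above; not used for real typechecking.
Output = Union[NDArray, Tuple[NDArray, ...], Tuple[Any, ...]]

def takes_out(func):
    # Dispatch on the annotation of the `out` parameter
    # rather than on the parameter name alone.
    param = inspect.signature(func).parameters.get("out")
    return param is not None and param.annotation is Output

def add(x, y, out: Output = None):
    return x + y

def plain(x, y):
    return x + y

assert takes_out(add)
assert not takes_out(plain)
```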
```diff
 /,
-out=None,
+out: Optional[tuple[NDArray]] = (None, None),
```
nit: `tuple[Optional[NDArray], Optional[NDArray]]`.
Why? It's either `(None, None)` or `(NDArray, NDArray)`; can't have `(out1, None)` (I think).
`tuple`, as opposed to `list`, takes the number of arguments that it accepts explicitly. As such, you are declaring there a tuple of length 1 containing one NDArray.
Then, `Optional[tuple[NDArray, NDArray]]` accepts either a tuple of ndarrays (fine) or `None` (not `(None, None)`).
What you want is a (partial) function that accepts `tuple[Optional[NDArray], Optional[NDArray]]` and errors out when just one of the elements of the tuple is `None`.
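The distinction can be made concrete with a small runtime validator; `NDArray` here is a stand-in class and `check_out` is a hypothetical name for the partial function described above, which enforces the "both or neither" constraint that the type alone cannot express:

```python
from typing import Optional, Tuple

class NDArray:  # stand-in for the real wrapper class
    pass

def check_out(out: Tuple[Optional[NDArray], Optional[NDArray]]):
    # Accept (None, None) or (NDArray, NDArray); reject a mixed pair,
    # which the annotation alone would still allow.
    if sum(x is None for x in out) == 1:
        raise ValueError("either both or neither out array may be None")
    return out

check_out((None, None))            # ok
check_out((NDArray(), NDArray()))  # ok
try:
    check_out((NDArray(), None))   # mixed pair: rejected at runtime
except ValueError:
    pass
```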
Feel free to patch this small correction in any other PR.
Ouch. Python's typing meta-language is seriously weird.
Note that this is not specific to Python; the same would be true in C++. This would be an `std::tuple<std::optional<array>, std::optional<array>>` rather than an `std::optional<std::tuple<array, array>>` :)
```python
tensors = _helpers.ufunc_preprocess(
    (x1, x2), out, True, casting, order, dtype, subok, signature, extobj
)
```
As discussed, `ufunc_preprocess` should not take an `out` parameter, as setting an `out=` parameter should not affect the type of the computation. Or is this fixed in some other PR and we'll just need to update this one afterwards?
It's still in the TODO pipeline. Will do after the current "stack" is done.
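The point that `out=` must not influence the computation dtype can be sketched as follows, using NumPy and a hypothetical `add` for illustration: the dtype is derived from the inputs alone, and `out` only receives the finished result (with a cast, if any, happening at copy time).

```python
import numpy as np

def add(x1, x2, out=None):
    # Computation dtype comes from the inputs only; `out` plays no
    # part in it and merely receives the result afterwards.
    dtype = np.result_type(x1, x2)       # unaffected by `out`
    result = (x1 + x2).astype(dtype)
    if out is not None:
        np.copyto(out, result, casting="unsafe")  # cast only at the copy
        return out
    return result

a = np.array([1, 2], dtype=np.int32)
b = np.array([3, 4], dtype=np.int64)
assert add(a, b).dtype == np.int64        # from the inputs, not from out
out = np.zeros(2, dtype=np.float32)
assert add(a, b, out=out).dtype == np.float32
```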
Thanks Mario
Use a return annotation for functions with an `out=` argument. Other returns are wrapped generically. Detect the `out` argument in signatures and implement the `out=` semantics.

Supersedes and closes gh-83