Split implementations and normalizations #105
Conversation
torch_np/_ufuncs.py (Outdated)

```python
decorated.__qualname__ = name  # XXX: is this really correct?
decorated.__name__ = name
```
---
According to @lysnikolaou, this shouldn't be necessary, as it is done automagically by `wraps` (which we use internally in `normalizer`, right?). He pointed out, though, that `wraps` also sets the `__module__` attribute to the module the wrapped function previously had, so we probably want to set it to be this module here.

Ditto for all the other places where we use this technique.
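For reference, a minimal sketch of the interplay being described; `normalizer` below is a stand-in for the real decorator in `_normalizations.py`, not the actual code:

```python
import functools

def normalizer(func):
    # functools.wraps copies __name__, __qualname__, __doc__ and, notably,
    # __module__ from `func` onto the wrapper.
    @functools.wraps(func)
    def wrapped(*args, **kwds):
        return func(*args, **kwds)
    # To make the public function report *this* module rather than the one
    # the implementation came from, override __module__ explicitly.
    wrapped.__module__ = __name__
    return wrapped
```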
---
We're not using `@wraps` here. This decorator actually defines the signature and wraps a pytorch function which does not have numpy ufunc arguments. The `__module__` attribute is set correctly automagically somehow, though:

```python
In [3]: tnp.add.__module__
Out[3]: 'torch_np._ufuncs'
```

What I'm not sure about is what the `__qualname__` is and what it should be. E.g., with NumPy itself:

```python
In [4]: import numpy as np

In [5]: np.add.__name__
Out[5]: 'add'

In [6]: np.add.__qualname__
---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
Input In [6], in <cell line: 1>()
----> 1 np.add.__qualname__

AttributeError: 'numpy.ufunc' object has no attribute '__qualname__'

In [7]: np.array.__qualname__
Out[7]: 'array'
```
---
We are using `wraps` at:

numpy_pytorch_interop/torch_np/_normalizations.py, lines 141 to 142 in 3ec9dda:

```python
@functools.wraps(func)
def wrapped(*args, **kwds):
```

`__qualname__` stands for "qualified name": it is the fully qualified name of a given function. For example, for a method `foo` of a class `A` it is `A.foo`, while for a free function `foo` it is just `foo`.
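A quick illustration (the names here are made up):

```python
class A:
    def foo(self):
        pass

def foo():
    pass

print(A.foo.__qualname__)  # 'A.foo'
print(foo.__qualname__)    # 'foo'
```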
This reminds me: we should probably set this variable to its correct value when registering the methods of `torch_np.ndarray`. Same for their name, which is not correctly set, as we're just assigning to them. Now, this is fairly low prio, so I wouldn't worry too much about it for now.
---
Removing these assignments gives:

```python
In [1]: import torch_np as tnp

In [2]: tnp.sin.__qualname__
Out[2]: 'deco_unary_ufunc.<locals>.wrapped'

In [3]: tnp.sin.__name__
Out[3]: 'wrapped'
```

which is precisely why they were added in the first place. It's entirely possible that this is because I'm doing something stupid somewhere in the decorator chain, but I'm not seeing what it is.
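For what it's worth, this is the generic behavior of any nested wrapper that lacks `functools.wraps`; a standalone toy reproduction (not the actual torch_np code):

```python
import functools, math

def deco_without_wraps(torch_func):
    def wrapped(*args, **kwds):
        return torch_func(*args, **kwds)
    return wrapped

def deco_with_wraps(torch_func):
    @functools.wraps(torch_func)
    def wrapped(*args, **kwds):
        return torch_func(*args, **kwds)
    return wrapped

print(deco_without_wraps(math.sin).__qualname__)  # 'deco_without_wraps.<locals>.wrapped'
print(deco_with_wraps(math.sin).__qualname__)     # 'sin'
```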
---
You are missing a `functools.wraps` in:

numpy_pytorch_interop/torch_np/_unary_ufuncs.py, lines 14 to 21 in 3ec9dda:

```python
def deco_unary_ufunc(torch_func):
    """Common infra for unary ufuncs.

    Normalize arguments, sort out type casting, broadcasting and delegate to
    the pytorch functions for the actual work.
    """

    def wrapped(
```

Probably also for `deco_binary_ufunc`; I didn't check.
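A sketch of the suggested fix, assuming the decorator keeps the shape shown above (the `wrapped` signature and body here are schematic, not the real ones):

```python
import functools

def deco_unary_ufunc(torch_func):
    """Common infra for unary ufuncs.

    Normalize arguments, sort out type casting, broadcasting and delegate to
    the pytorch functions for the actual work.
    """
    @functools.wraps(torch_func)  # carries over __name__, __qualname__, etc.
    def wrapped(*args, **kwds):
        # ... normalize arguments, handle casting/broadcasting ...
        return torch_func(*args, **kwds)
    return wrapped
```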
---
The point of doing it in the decorator is that the decorator should return a function that is as correct as possible. What's the point of a decorator that returns a half-correct function and then requires manual post-processing every time you call it?
---
OK, done in 14273c8.

This does have the side effect that implementation details leak into the names, but if that's wanted, fine :-).

```python
In [1]: import torch_np as tnp

In [2]: tnp.invert.__name__
Out[2]: 'bitwise_not'
```
---
That was also an issue with the previous implementation. The way to fix it is, rather than doing `from torch import bitwise_not as invert`, to have a function that registers a given function under the correct name. See e.g.
https://github.com/pytorch/pytorch/blob/37b9143206dbe53aa27405a0a889b2fd9c76ec8b/torch/_refs/__init__.py#L416-L426
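Such a registration helper could look roughly like this (a hypothetical sketch; `register` and `ns` are made-up names, not the torch._refs API):

```python
import torch

def register(torch_func, name, namespace):
    # Wrap (or alias) the torch function, then fix up the metadata so the
    # public object reports the numpy-facing name, not the torch one.
    def wrapped(*args, **kwds):
        return torch_func(*args, **kwds)
    wrapped.__name__ = name
    wrapped.__qualname__ = name
    namespace[name] = wrapped

ns = {}
register(torch.bitwise_not, "invert", ns)
assert ns["invert"].__name__ == "invert"
```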
---
It wasn't, because the previous implementation never used `torch_func.__name__`. It used `decorated.__name__ = name`, where `name` came from `dir(_ufunc_impl)`, so it picked up the alias from `from torch import bitwise_not as invert`.

Taking a step back, I do not think it's a big deal and the status quo is good enough. WDYT?
---
Not a big deal, nope. Whatever works.

---
The former fills with repeated copies of input, while the latter fills with zeros.
Co-authored-by: Mario Lezcano Casado <[email protected]>
Force-pushed from 539024c to a11c041.
Force-pushed from 14273c8 to 215498c.
I think all review comments are addressed.

Thanks Mario for the review.

---
(Finally) split the torch-tensor-in, torch-tensor-out implementations from their normalization decoration. Type annotations stay with the implementations, so that e.g. an implementer function with an argument annotated as `ArrayLike` accepts a `torch.Tensor`. These implementer functions are then bulk-decorated with `@normalizer` and exported to the top-level namespace. As discussed offline with @lezcano.

While at it, merge `{_binary,_unary}_ufuncs` into a single `_ufuncs` module. There is some room for deduplication of the ufunc decorators, which I'm going to deal with after gh-97 is fixed (quite likely, this will mean `_helpers.py` is gone and we'll see how to reorganize various decorators and other meta stuff).
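Schematically, the split looks something like this (a simplified sketch; the real `normalizer` lives in `torch_np/_normalizations.py` and does more than shown here, and the `ArrayLike` handling is stubbed out):

```python
import functools
import torch

ArrayLike = torch.Tensor  # stand-in for the real annotation

def normalizer(func):
    """Convert array-like arguments to torch.Tensor (per the annotations)
    before calling the tensor-in/tensor-out implementer function."""
    @functools.wraps(func)
    def wrapped(*args, **kwds):
        args = tuple(torch.as_tensor(a) for a in args)  # greatly simplified
        return func(*args, **kwds)
    return wrapped

# Implementer: torch.Tensor in, torch.Tensor out.
def add(x1: ArrayLike, x2: ArrayLike):
    return torch.add(x1, x2)

# Bulk-decorate and export to the top-level namespace.
add = normalizer(add)
```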