
Merge wrapper and implementations #94


Merged
merged 10 commits from merge_detail into main on Mar 31, 2023

Conversation

Collaborator

@ev-br ev-br commented Mar 25, 2023

Undo the split between decorated functions and torch implementations: the former are nothing but fall-throughs to the latter, now that all argument pre/post-processing is contained in the normalization decorators.
While at it, merge _funcs.py and _wrapper.py; the split was purely technical.
The diff looks scary, but it is mainly just shuffling things around and trimming a small amount of dead weight.
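The layering the description relies on can be sketched in a few lines. This is a hedged, torch-free stand-in (normalizer and vector_add are illustrative names, not this repo's actual API): once a decorator owns all argument pre/post-processing, the decorated body is a pure fall-through to the backend call, so a separate wrapper layer on top adds nothing.

```python
import functools

def normalizer(func):
    """Coerce array-like arguments to a canonical form (here: lists).

    Stand-in for the PR's decorator, which converts arguments to tensors.
    """
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        args = tuple(
            list(a) if isinstance(a, (list, tuple)) else [a]
            for a in args
        )
        return func(*args, **kwargs)
    return wrapper

@normalizer
def vector_add(a, b):
    # Nothing but the backend call remains in the body, which is why
    # keeping a separate wrapper layer on top of this is redundant.
    return [x + y for x, y in zip(a, b)]
```

With the conversion handled by the decorator, both `vector_add((1, 2), (3, 4))` and `vector_add(1, 2)` go straight through the one-line body.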

@ev-br ev-br requested a review from lezcano March 25, 2023 12:22
Collaborator

@lezcano lezcano left a comment


The merge LGTM. We still need to merge the other files within _detail, but this first step looks good.

I took this chance to go through the implementations here and flag a few issues with them; just minor things throughout.

Style nit: very often you name a variable result only to return it on the next line. I think it's cleaner and more concise to simply return the torch.fn(...) call directly.

There are a number of places where we use tensor methods rather than their torch.fn counterparts. It'd be good to be consistent throughout. I flagged a few of these, but there are more (roll, squeeze, reshape, ...)
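A tiny before/after for the result-variable nit, using a plain-Python stand-in (the PR's functions call into torch, but the shape is the same; clip_before/clip_after are made-up names for illustration):

```python
# Before: `result` is bound only to be returned on the next line.
def clip_before(values, lo, hi):
    result = [min(max(v, lo), hi) for v in values]
    return result

# After: return the expression directly; shorter and no less clear.
def clip_after(values, lo, hi):
    return [min(max(v, lo), hi) for v in values]
```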

Comment on lines 1870 to 1872
# XXX: this is a stub
if a is False:
return a
Collaborator


What does this construction mean?

Collaborator Author


if a is False: return False?

Collaborator


Yeah, but a is an ArrayLike; when is it going to be the literal False?

Collaborator Author


OK, fair enough. Thanks for spotting this. Removed.

Collaborator


This uncovered a latent bug:

    @normalizer(return_on_failure=False)
    def isscalar(a: ArrayLike):
        # XXX: this is a stub
>       return a.numel() == 1
E       AttributeError: 'bool' object has no attribute 'numel'

It may be better to just implement this function without the normalizer and remove the return_on_failure flag, as you initially had?

Collaborator Author


All right, here's what's going on: isscalar(np.int) (np.int is not a scalar!) fails to convert its argument to a tensor, and the return_on_failure code path feeds False into the body of the function. This is in fact requested by the normalizer argument three lines above, via normalizer(return_on_failure=False). So we can either

  • put back the if a is False: return False clause, plus a code comment so as not to confuse future readers, or
  • rip out the return_on_failure code path and normalize manually (and possibly other functions from the not-yet-implemented numpy APIs will need manual normalization too, who knows)
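To make the mechanism concrete, here is a minimal sketch (illustrative names and a list-based conversion, not the actual torch_np code) of how a return_on_failure path feeds the fallback value into the function body, which is what makes the `if a is False` guard necessary:

```python
import functools

def normalizer(return_on_failure=None):
    """Illustrative decorator: convert the argument, or fall back."""
    def deco(func):
        @functools.wraps(func)
        def wrapper(a):
            try:
                converted = list(a)  # stand-in for to-tensor conversion
            except TypeError:
                converted = return_on_failure  # body now sees e.g. False
            return func(converted)
        return wrapper
    return deco

@normalizer(return_on_failure=False)
def isscalar(a):
    if a is False:  # the guard under discussion: conversion failed
        return False
    return len(a) == 1
```

Here isscalar(3) hits the conversion failure and returns False via the guard, while list inputs go through the normal body.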

Your preference @lezcano ?

Collaborator


Probably the simplest option is the second one, as this function can be implemented as

    def isscalar(a):
        try:
            return asarray(a).size == 1
        except Exception:
            return False

and it's very much one of a kind.

Collaborator Author


OK, return_on_failure is gone.

@ev-br ev-br force-pushed the merge_detail branch 3 times, most recently from e75772d to f9f9ba4 Compare March 27, 2023 20:36
@ev-br ev-br mentioned this pull request Mar 30, 2023

@lezcano lezcano left a comment


Two minor things are still missing; approving this contingent on fixing those two.

This one got us -600 LOC, not too bad!

@ev-br ev-br force-pushed the mangle_returns_out_annotation branch 2 times, most recently from a126f71 to c81f677 Compare March 31, 2023 10:35
Base automatically changed from mangle_returns_out_annotation to main March 31, 2023 11:49
Collaborator Author

ev-br commented Mar 31, 2023

CI is green and the last two comments are addressed; merging.

@ev-br ev-br merged commit faf2350 into main Mar 31, 2023
@ev-br ev-br deleted the merge_detail branch March 31, 2023 12:43