
More torch fixes #184


Merged
merged 10 commits into from May 23, 2023

Conversation

honno
Member

@honno honno commented Apr 27, 2023

@honno honno force-pushed the fix-complex-stype branch from 4ad8996 to 7a1664a on April 27, 2023 12:16
@honno honno force-pushed the fix-complex-stype branch from 7a1664a to 9bffaea on May 1, 2023 12:37
@honno honno marked this pull request as ready for review May 1, 2023 12:38
@honno honno requested a review from asmeurer May 1, 2023 12:40
@asmeurer
Member

asmeurer commented May 1, 2023

Here's another error:

>       return torch.full(shape, fill_value, dtype=dtype, device=device, **kwargs)
E       RuntimeError: value cannot be converted to type c10::complex<float> without overflow
E       Falsifying example: test_full(
E           shape=(),
E           fill_value=complex(0.0, 3.402823466385289e+38),
E           kw={},
E       )

full uses the default complex dtype by default, which is complex64 for torch, so it shouldn't generate fill values outside of that dtype unless dtype=complex128 explicitly.
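For context on why this value overflows: the imaginary part of the falsifying example sits just above the float32 maximum, so it cannot be represented in a complex64 (a pair of float32s), hence torch's `c10::complex<float>` error. A minimal stdlib-only check, with the float32 maximum computed by hand rather than taken from numpy:

```python
# float32 max is (2 - 2**-23) * 2**127; computed here without numpy
f32_max = (2.0 - 2.0**-23) * 2.0**127  # 3.4028234663852886e+38

# the fill value from the falsifying example above
fill_value = complex(0.0, 3.402823466385289e+38)

# the imaginary part exceeds float32 max, so it can't fit in a complex64
print(fill_value.imag > f32_max)  # True
```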

@asmeurer
Member

asmeurer commented May 1, 2023

There's also this error. I can't tell if it's caused by this or something else:

_______________________________________________________________________________ test_prod _______________________________________________________________________________

    @given(
>       x=xps.arrays(
            dtype=xps.numeric_dtypes(),
            shape=hh.shapes(min_side=1),
            elements={"allow_nan": False},
        ),
        data=st.data(),
    )

array_api_tests/test_statistical_functions.py:112:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
array_api_tests/test_statistical_functions.py:143: in test_prod
    d_m, d_M = dh.dtype_ranges[default_dtype]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = EqualityMapping({torch.int8: MinMax(min=-128, max=127), torch.int16: MinMax(min=-32768, max=32767), torch.int32: MinMa...86e+38, max=3.4028234663852886e+38), torch.float64: MinMax(min=-1.7976931348623157e+308, max=1.7976931348623157e+308)})
key = None

    def __getitem__(self, key):
        for k, v in self._key_value_pairs:
            if key == k:
                return v
        else:
>           raise KeyError(f"{key!r} not found")
E           KeyError: 'None not found'
E           Falsifying example: test_prod(
E               x=tensor(0, dtype=torch.uint8),
E               data=data(...),
E           )
E           Draw 1 (kw): {}

array_api_tests/dtype_helpers.py:83: KeyError
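The KeyError comes from the dtype-ranges lookup being handed `None` as the key, i.e. the test resolved no default dtype for the uint8 input before indexing `dh.dtype_ranges`. A minimal reproduction, using a sketch of `EqualityMapping` modelled on the traceback above (the real class in `array_api_tests/dtype_helpers.py` may differ) and placeholder string keys instead of torch dtypes:

```python
class EqualityMapping:
    """Sketch of the helper seen in the traceback: keys are found by
    == comparison in a linear scan rather than by hashing."""

    def __init__(self, pairs):
        self._key_value_pairs = list(pairs)

    def __getitem__(self, key):
        for k, v in self._key_value_pairs:
            if key == k:
                return v
        raise KeyError(f"{key!r} not found")


# stand-in keys for the demo; the real mapping uses torch dtypes
ranges = EqualityMapping([("int8", (-128, 127))])

try:
    ranges[None]  # default_dtype resolved to None for the uint8 input
except KeyError as e:
    print(e)  # 'None not found'
```

So the failure is presumably in how `default_dtype` is resolved in `test_prod`, not in the mapping itself: a proper default dtype needs to be picked (or the case skipped) before indexing `dtype_ranges`.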

@asmeurer
Member

asmeurer commented May 1, 2023

I think that's it. Every other test failure seems to be either an issue with torch or an unrelated test suite problem which I've reported elsewhere.

@asmeurer asmeurer closed this May 1, 2023
@asmeurer asmeurer reopened this May 1, 2023
@asmeurer
Member

asmeurer commented May 1, 2023

Not that it's that important, but can we address #181 (comment) here?

@honno
Member Author

honno commented May 2, 2023

full uses the default complex dtype by default, which is complex64 for torch, so it shouldn't generate fill values outside of that dtype unless dtype=complex128 explicitly.

There's also this error. I can't tell if it's caused by this or something else:

Fixed and fixed, thanks for catching these. I folded in #182, as it addresses the former issue. While it's nice to keep PRs atomic, we're not really tracking multiple PRs at once, so a single PR with some general cohesion is probably better.

Not that it's that important, but can we address #181 (comment) here?

Already addressed!

@honno honno changed the title from "Fix inferring complex builtin for complex dtypes in get_scalar_type" to "More torch fixes" May 2, 2023
@honno
Member Author

honno commented May 18, 2023

(I'll merge this soon if no review)

@honno honno merged commit fb49802 into data-apis:master May 23, 2023
@honno honno deleted the fix-complex-stype branch February 28, 2024 13:19