Clean Up C Warnings #31935


Merged: 7 commits, Feb 17, 2020
6 changes: 3 additions & 3 deletions pandas/_libs/algos.pyx
@@ -1173,12 +1173,12 @@ ctypedef fused out_t:

 @cython.boundscheck(False)
 @cython.wraparound(False)
-def diff_2d(ndarray[diff_t, ndim=2] arr,
-            ndarray[out_t, ndim=2] out,
+def diff_2d(diff_t[:, :] arr,
Member: const?

Member Author: Hmm, tried that, but it seems to crash the compiler. Traceback below:

[1/1] Cythonizing pandas/_libs/algos.pyx

Error compiling Cython file:
------------------------------------------------------------
...
    float64_t


@cython.boundscheck(False)
@cython.wraparound(False)
def diff_2d(const diff_t[:, :] arr,
                        ^
------------------------------------------------------------

pandas/_libs/algos.pyx:1178:25: Compiler crash in AnalyseDeclarationsTransform

File 'ModuleNode.py', line 124, in analyse_declarations: ModuleNode(algos.pyx:1:0,
    full_module_name = 'pandas._libs.algos')
File 'Nodes.py', line 431, in analyse_declarations: StatListNode(algos.pyx:1:0)
File 'Nodes.py', line 431, in analyse_declarations: StatListNode(algos.pyx:1178:0)
File 'Nodes.py', line 375, in analyse_declarations: CompilerDirectivesNode(algos.pyx:1178:0)
File 'Nodes.py', line 431, in analyse_declarations: StatListNode(algos.pyx:1178:0)
File 'Nodes.py', line 2887, in analyse_declarations: DefNode(algos.pyx:1178:0,
    modifiers = [...]/0,
    name = 'diff_2d',
    num_required_args = 4,
    outer_attrs = [...]/2,
    py_wrapper_required = True,
    reqd_kw_flags_cname = '0')
File 'Nodes.py', line 2925, in analyse_argument_types: DefNode(algos.pyx:1178:0,
    modifiers = [...]/0,
    name = 'diff_2d',
    num_required_args = 4,
    outer_attrs = [...]/2,
    py_wrapper_required = True,
    reqd_kw_flags_cname = '0')
File 'Nodes.py', line 1093, in analyse: MemoryViewSliceTypeNode(algos.pyx:1178:25,
    name = 'memoryview')

Compiler crash traceback from this point on:
  File "/Users/williamayd/miniconda3/envs/pandas-dev/lib/python3.7/site-packages/Cython/Compiler/Nodes.py", line 1093, in analyse
    self.type = PyrexTypes.MemoryViewSliceType(base_type, axes_specs)
  File "/Users/williamayd/miniconda3/envs/pandas-dev/lib/python3.7/site-packages/Cython/Compiler/PyrexTypes.py", line 625, in __init__
    self.dtype_name = Buffer.mangle_dtype_name(self.dtype)
  File "/Users/williamayd/miniconda3/envs/pandas-dev/lib/python3.7/site-packages/Cython/Compiler/Buffer.py", line 631, in mangle_dtype_name
    return prefix + dtype.specialization_name()
  File "/Users/williamayd/miniconda3/envs/pandas-dev/lib/python3.7/site-packages/Cython/Compiler/PyrexTypes.py", line 57, in specialization_name
    common_subs = (self.empty_declaration_code()
  File "/Users/williamayd/miniconda3/envs/pandas-dev/lib/python3.7/site-packages/Cython/Compiler/PyrexTypes.py", line 51, in empty_declaration_code
    self._empty_declaration = self.declaration_code('')
  File "/Users/williamayd/miniconda3/envs/pandas-dev/lib/python3.7/site-packages/Cython/Compiler/PyrexTypes.py", line 1582, in declaration_code
    return self.const_base_type.declaration_code("const %s" % entity_code, for_display, dll_linkage, pyrex)
  File "/Users/williamayd/miniconda3/envs/pandas-dev/lib/python3.7/site-packages/Cython/Compiler/PyrexTypes.py", line 1649, in declaration_code
    raise Exception("This may never happen, please report a bug")
Exception: This may never happen, please report a bug
Traceback (most recent call last):
  File "setup.py", line 784, in <module>
    setup_package()
  File "setup.py", line 754, in setup_package
    ext_modules=maybe_cythonize(extensions, compiler_directives=directives),
  File "setup.py", line 527, in maybe_cythonize
    return cythonize(extensions, *args, **kwargs)
  File "/Users/williamayd/miniconda3/envs/pandas-dev/lib/python3.7/site-packages/Cython/Build/Dependencies.py", line 1096, in cythonize
    cythonize_one(*args)
  File "/Users/williamayd/miniconda3/envs/pandas-dev/lib/python3.7/site-packages/Cython/Build/Dependencies.py", line 1219, in cythonize_one
    raise CompileError(None, pyx_file)
Cython.Compiler.Errors.CompileError: pandas/_libs/algos.pyx

Member: Huh, possibly a Cython issue.

+            out_t[:, :] out,
             Py_ssize_t periods, int axis):
     cdef:
         Py_ssize_t i, j, sx, sy, start, stop
-        bint f_contig = arr.flags.f_contiguous
+        bint f_contig = arr.is_f_contig()

     # Disable for unsupported dtype combinations,
     # see https://github.com/cython/cython/issues/2646
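For reference, a minimal sketch of the construct from the thread above, assuming the crash reproduces outside pandas with any fused dtype (the module and names here are illustrative, not taken from algos.pyx):

# const_fused_demo.pyx -- hypothetical standalone module
cimport cython
from numpy cimport float64_t, int64_t

ctypedef fused diff_t:
    float64_t
    int64_t

@cython.boundscheck(False)
@cython.wraparound(False)
def diff_demo(const diff_t[:, :] arr):
    # Combining `const` with a fused-type memoryview argument appears to be
    # what sends Cython into AnalyseDeclarationsTransform and crashes;
    # dropping the `const`, as this PR does, cythonizes cleanly.
    return arr.shape[0]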
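The other change in this hunk follows from the new signature: an ndarray exposes contiguity through the Python-level .flags attribute, while a typed memoryview exposes the C-level is_f_contig() helper. A small sketch, assuming a standalone .pyx module with illustrative names:

# contig_demo.pyx -- hypothetical standalone module
import numpy as np
cimport numpy as cnp

cnp.import_array()

def check_f_contig(cnp.float64_t[:, :] mv, cnp.ndarray[cnp.float64_t, ndim=2] nd):
    cdef:
        bint mv_f = mv.is_f_contig()        # memoryview: C-level helper
        bint nd_f = nd.flags.f_contiguous   # ndarray: Python attribute lookup
    return mv_f, nd_f

Passing np.asfortranarray(np.zeros((3, 3))) for both arguments would give (True, True); a default C-ordered 2-D array gives (False, False).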
4 changes: 2 additions & 2 deletions pandas/_libs/hashing.pyx
@@ -5,7 +5,7 @@ import cython
 from libc.stdlib cimport malloc, free

 import numpy as np
-from numpy cimport uint8_t, uint32_t, uint64_t, import_array
+from numpy cimport ndarray, uint8_t, uint32_t, uint64_t, import_array
 import_array()

 from pandas._libs.util cimport is_nan
@@ -15,7 +15,7 @@ DEF dROUNDS = 4


 @cython.boundscheck(False)
-def hash_object_array(object[:] arr, object key, object encoding='utf8'):
+def hash_object_array(ndarray[object] arr, object key, object encoding='utf8'):
Member Author: I get the impression that memory views used in tandem with object will throw warnings. cc @jbrockmendel

Member: That makes sense. This would also raise at the moment if we were to pass a read-only ndarray[object], so it's a worthwhile change.

"""
Parameters
----------
Expand Down
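A hedged illustration of the read-only point in the thread above, assuming the built extension is importable as pandas._libs.hashing and using an arbitrary 16-character key (the function expects a 16-byte key once encoded):

import numpy as np
from pandas._libs.hashing import hash_object_array

arr = np.array(["a", "b", "c"], dtype=object)
arr.setflags(write=False)  # make the underlying buffer read-only

# With the old object[:] signature this call raises
# "ValueError: buffer source array is read-only"; with ndarray[object] it succeeds.
hashed = hash_object_array(arr, "0123456789abcdef")
print(hashed.dtype)  # uint64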
2 changes: 1 addition & 1 deletion pandas/_libs/tslibs/conversion.pyx
@@ -152,7 +152,7 @@ def ensure_timedelta64ns(arr: ndarray, copy: bool=True):

 @cython.boundscheck(False)
 @cython.wraparound(False)
-def datetime_to_datetime64(object[:] values):
+def datetime_to_datetime64(ndarray[object] values):
     """
     Convert ndarray of datetime-like objects to int64 array representing
     nanosecond timestamps.
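A brief usage sketch of the function being touched here; it is an internal API, and the two-value return shown (converted values plus an inferred timezone, None for naive inputs) is an assumption rather than something quoted from this diff:

from datetime import datetime
import numpy as np
from pandas._libs.tslibs.conversion import datetime_to_datetime64

values = np.array([datetime(2020, 2, 17), datetime(2020, 2, 18)], dtype=object)
result, inferred_tz = datetime_to_datetime64(values)  # assumed return shape; see note above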
2 changes: 1 addition & 1 deletion pandas/_libs/window/aggregations.pyx
@@ -56,7 +56,7 @@ cdef:
 cdef inline int int_max(int a, int b): return a if a >= b else b
 cdef inline int int_min(int a, int b): return a if a <= b else b

-cdef inline bint is_monotonic_start_end_bounds(
+cdef bint is_monotonic_start_end_bounds(
Contributor: Hmm, why can't this also be inline?

Member Author: This was the cause of the "pandas/_libs/window/aggregations.pyx:60:4: Buffer unpacking not optimized away." warning. @mroeschke suggested a potential inline approach that kept memory views and got rid of the warning, but it caused test failures. We could still adjust a few things to make that work, but without a larger effort I think this is the best option for now.

Contributor: This is a highly optimized path. If you can show this doesn't have any perf regressions, then OK.

     ndarray[int64_t, ndim=1] start, ndarray[int64_t, ndim=1] end
 ):
     return is_monotonic(start, False)[0] and is_monotonic(end, False)[0]
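On the perf question above, a quick illustrative smoke test; the project's asv benchmarks (something like asv continuous -f 1.05 upstream/master HEAD -b rolling from asv_bench) would be the authoritative comparison, and the snippet below only exercises the path casually:

import timeit

import numpy as np
import pandas as pd

s = pd.Series(np.random.randn(1_000_000))

# Rolling aggregations go through is_monotonic_start_end_bounds on each call,
# so a material regression from de-inlining it should be visible here.
elapsed = timeit.timeit(lambda: s.rolling(10).mean(), number=20)
print(f"rolling(10).mean(), 1e6 rows x 20 runs: {elapsed:.3f}s")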