From yury at shurup.com Tue Jul 2 13:54:41 2013 From: yury at shurup.com (Yury V. Zaytsev) Date: Tue, 02 Jul 2013 13:54:41 +0200 Subject: [Cython] Memory views: dereferencing pointer does break strict-aliasing rules Message-ID: <1372766081.2659.14.camel@newpride> Hi, The simplest possible program using memory views compiles with a large number of warnings for me, even for a rather outdated version of gcc: def hello(int [:] a): print(a, "world") If I translate it with the latest released version of Cython like this: cython cpp.pyx cython --cplus cpp.pyx and compile like this: gcc -O3 -march=native -Wall -fPIC -I/opt/ActivePython-2.7/include/python2.7 -c ./cpp.c -o cpp.o g++ -O3 -march=native -Wall -fPIC -I/opt/ActivePython-2.7/include/python2.7 -c ./cpp.cpp -o cpp.o I get lots of warnings (see attached). It doesn't seem to be related to C++ as such, but rather it seems that the memory views code indeed somehow violates strict-aliasing rules. I'm not sure of how severe it is, but the documentation seems to suggest that this might even lead to incorrect results. Can this possibly be fixed in Cython and how important is that? Shall I create a bug report on the Trac? Is my only resort to test whether the compiler supports -fno-strict-aliasing and use that? Thanks! -- Sincerely yours, Yury V. Zaytsev -------------- next part -------------- ./cpp.c: In function ?get_memview_MemoryView_5array_7memview___get__?: ./cpp.c:2565: warning: dereferencing type-punned pointer will break strict-aliasing rules ./cpp.c:2565: warning: dereferencing type-punned pointer will break strict-aliasing rules ./cpp.c: In function ?__pyx_array_new?: ./cpp.c:2873: warning: dereferencing type-punned pointer will break strict-aliasing rules ./cpp.c:2873: warning: dereferencing type-punned pointer will break strict-aliasing rules ./cpp.c: In function ?__pyx_memoryview_is_slice?: ./cpp.c:4045: warning: dereferencing type-punned pointer will break strict-aliasing rules ./cpp.c:4045: warning: dereferencing type-punned pointer will break strict-aliasing rules ./cpp.c: In function ?__pyx_memoryview_MemoryView_10memoryview_16is_c_contig?: ./cpp.c:6196: warning: dereferencing type-punned pointer will break strict-aliasing rules ./cpp.c:6196: warning: dereferencing type-punned pointer will break strict-aliasing rules ./cpp.c: In function ?__pyx_memoryview_MemoryView_10memoryview_18is_f_contig?: ./cpp.c:6261: warning: dereferencing type-punned pointer will break strict-aliasing rules ./cpp.c:6261: warning: dereferencing type-punned pointer will break strict-aliasing rules ./cpp.c: In function ?__pyx_memoryview_new?: ./cpp.c:6479: warning: dereferencing type-punned pointer will break strict-aliasing rules ./cpp.c:6479: warning: dereferencing type-punned pointer will break strict-aliasing rules ./cpp.c: In function ?_unellipsify?: ./cpp.c:6660: warning: dereferencing type-punned pointer will break strict-aliasing rules ./cpp.c:6660: warning: dereferencing type-punned pointer will break strict-aliasing rules ./cpp.c:6816: warning: dereferencing type-punned pointer will break strict-aliasing rules ./cpp.c:6816: warning: dereferencing type-punned pointer will break strict-aliasing rules ./cpp.c:6876: warning: dereferencing type-punned pointer will break strict-aliasing rules ./cpp.c:6876: warning: dereferencing type-punned pointer will break strict-aliasing rules ./cpp.c: In function ?__pyx_memoryview_fromslice?: ./cpp.c:8920: warning: dereferencing type-punned pointer will break strict-aliasing rules ./cpp.c:8920: warning: dereferencing 
type-punned pointer will break strict-aliasing rules ./cpp.c: In function ?__pyx_getprop___pyx_array_memview?: ./cpp.c:2565: warning: dereferencing pointer ?_Py_TrueStruct.131? does break strict-aliasing rules ./cpp.c:2565: warning: dereferencing pointer ?_Py_TrueStruct.131? does break strict-aliasing rules ./cpp.c:2565: note: initialized from here ./cpp.c:2565: warning: dereferencing pointer ?_Py_ZeroStruct.132? does break strict-aliasing rules ./cpp.c:2565: warning: dereferencing pointer ?_Py_ZeroStruct.132? does break strict-aliasing rules ./cpp.c:2565: note: initialized from here ./cpp.c: In function ?__pyx_memoryview_new?: ./cpp.c:6479: warning: dereferencing pointer ?_Py_TrueStruct.440? does break strict-aliasing rules ./cpp.c:6479: warning: dereferencing pointer ?_Py_TrueStruct.440? does break strict-aliasing rules ./cpp.c:6479: note: initialized from here ./cpp.c:6479: warning: dereferencing pointer ?_Py_ZeroStruct.441? does break strict-aliasing rules ./cpp.c:6479: warning: dereferencing pointer ?_Py_ZeroStruct.441? does break strict-aliasing rules ./cpp.c:6479: note: initialized from here ./cpp.c: In function ?__pyx_memoryview_is_f_contig?: ./cpp.c:6261: warning: dereferencing pointer ?_Py_TrueStruct.430? does break strict-aliasing rules ./cpp.c:6261: warning: dereferencing pointer ?_Py_TrueStruct.430? does break strict-aliasing rules ./cpp.c:6261: note: initialized from here ./cpp.c:6261: warning: dereferencing pointer ?_Py_ZeroStruct.431? does break strict-aliasing rules ./cpp.c:6261: warning: dereferencing pointer ?_Py_ZeroStruct.431? does break strict-aliasing rules ./cpp.c:6261: note: initialized from here ./cpp.c:6270: warning: dereferencing pointer ?__pyx_r? does break strict-aliasing rules ./cpp.c:6270: warning: dereferencing pointer ?__pyx_r? does break strict-aliasing rules ./cpp.c:6270: warning: dereferencing pointer ?__pyx_r? does break strict-aliasing rules ./cpp.c:6270: warning: dereferencing pointer ?__pyx_r? does break strict-aliasing rules ./cpp.c:6261: note: initialized from here ./cpp.c:6261: note: initialized from here ./cpp.c: In function ?__pyx_memoryview_is_c_contig?: ./cpp.c:6196: warning: dereferencing pointer ?_Py_TrueStruct.424? does break strict-aliasing rules ./cpp.c:6196: warning: dereferencing pointer ?_Py_TrueStruct.424? does break strict-aliasing rules ./cpp.c:6196: note: initialized from here ./cpp.c:6196: warning: dereferencing pointer ?_Py_ZeroStruct.425? does break strict-aliasing rules ./cpp.c:6196: warning: dereferencing pointer ?_Py_ZeroStruct.425? does break strict-aliasing rules ./cpp.c:6196: note: initialized from here ./cpp.c:6205: warning: dereferencing pointer ?__pyx_r? does break strict-aliasing rules ./cpp.c:6205: warning: dereferencing pointer ?__pyx_r? does break strict-aliasing rules ./cpp.c:6205: warning: dereferencing pointer ?__pyx_r? does break strict-aliasing rules ./cpp.c:6205: warning: dereferencing pointer ?__pyx_r? does break strict-aliasing rules ./cpp.c:6196: note: initialized from here ./cpp.c:6196: note: initialized from here ./cpp.c: In function ?__pyx_memoryview_is_slice?: ./cpp.c:4045: warning: dereferencing pointer ?_Py_TrueStruct.251? does break strict-aliasing rules ./cpp.c:4045: warning: dereferencing pointer ?_Py_TrueStruct.251? does break strict-aliasing rules ./cpp.c:4045: note: initialized from here ./cpp.c:4045: warning: dereferencing pointer ?_Py_ZeroStruct.252? does break strict-aliasing rules ./cpp.c:4045: warning: dereferencing pointer ?_Py_ZeroStruct.252? 
does break strict-aliasing rules ./cpp.c:4045: note: initialized from here ./cpp.c: In function ?_unellipsify?: ./cpp.c:6660: warning: dereferencing pointer ?_Py_ZeroStruct.453? does break strict-aliasing rules ./cpp.c:6660: warning: dereferencing pointer ?_Py_ZeroStruct.453? does break strict-aliasing rules ./cpp.c:6660: note: initialized from here ./cpp.c:6816: warning: dereferencing pointer ?_Py_TrueStruct.482? does break strict-aliasing rules ./cpp.c:6816: warning: dereferencing pointer ?_Py_TrueStruct.482? does break strict-aliasing rules ./cpp.c:6816: note: initialized from here ./cpp.c:6973: warning: dereferencing pointer ?__pyx_v_have_slices? does break strict-aliasing rules ./cpp.c:6973: warning: dereferencing pointer ?__pyx_v_have_slices? does break strict-aliasing rules ./cpp.c:6884: warning: dereferencing pointer ?__pyx_v_have_slices? does break strict-aliasing rules ./cpp.c:6884: warning: dereferencing pointer ?__pyx_v_have_slices? does break strict-aliasing rules ./cpp.c:6884: warning: dereferencing pointer ?__pyx_v_have_slices? does break strict-aliasing rules ./cpp.c:6884: warning: dereferencing pointer ?__pyx_v_have_slices? does break strict-aliasing rules ./cpp.c:6881: warning: dereferencing pointer ?__pyx_v_have_slices? does break strict-aliasing rules ./cpp.c:6881: warning: dereferencing pointer ?__pyx_v_have_slices? does break strict-aliasing rules ./cpp.c:6818: warning: dereferencing pointer ?__pyx_v_have_slices? does break strict-aliasing rules ./cpp.c:6818: warning: dereferencing pointer ?__pyx_v_have_slices? does break strict-aliasing rules ./cpp.c:6818: warning: dereferencing pointer ?__pyx_v_have_slices? does break strict-aliasing rules ./cpp.c:6818: warning: dereferencing pointer ?__pyx_v_have_slices? does break strict-aliasing rules ./cpp.c:6662: note: initialized from here ./cpp.c:6819: note: initialized from here ./cpp.c:6876: note: initialized from here ./cpp.c:6876: note: initialized from here ./cpp.c:6876: warning: dereferencing pointer ?_Py_TrueStruct.482? does break strict-aliasing rules ./cpp.c:6876: warning: dereferencing pointer ?_Py_TrueStruct.482? does break strict-aliasing rules ./cpp.c:6876: note: initialized from here ./cpp.c:6876: warning: dereferencing pointer ?_Py_ZeroStruct.453? does break strict-aliasing rules ./cpp.c:6876: warning: dereferencing pointer ?_Py_ZeroStruct.453? does break strict-aliasing rules ./cpp.c:6876: note: initialized from here cc1: warning: dereferencing pointer ?__pyx_v_have_slices? does break strict-aliasing rules ./cpp.c:6816: note: initialized from here ./cpp.c:15681: note: initialized from here ./cpp.c:6660: note: initialized from here ./cpp.c: In function ?__pyx_memoryview_fromslice?: ./cpp.c:8920: warning: dereferencing pointer ?_Py_TrueStruct.595? does break strict-aliasing rules ./cpp.c:8920: warning: dereferencing pointer ?_Py_TrueStruct.595? does break strict-aliasing rules ./cpp.c:8920: note: initialized from here ./cpp.c:8920: warning: dereferencing pointer ?_Py_ZeroStruct.596? does break strict-aliasing rules ./cpp.c:8920: warning: dereferencing pointer ?_Py_ZeroStruct.596? does break strict-aliasing rules ./cpp.c:8920: note: initialized from here ./cpp.c:9128: warning: dereferencing pointer ?__pyx_t_2? does break strict-aliasing rules ./cpp.c:9128: warning: dereferencing pointer ?__pyx_t_2? does break strict-aliasing rules ./cpp.c:9128: warning: dereferencing pointer ?__pyx_t_2? 
does break strict-aliasing rules ./cpp.c:8920: note: initialized from here ./cpp.c:8920: note: initialized from here From robertwb at gmail.com Tue Jul 2 18:07:49 2013 From: robertwb at gmail.com (Robert Bradshaw) Date: Tue, 2 Jul 2013 09:07:49 -0700 Subject: [Cython] Memory views: dereferencing pointer does break strict-aliasing rules In-Reply-To: <1372766081.2659.14.camel@newpride> References: <1372766081.2659.14.camel@newpride> Message-ID: On Tue, Jul 2, 2013 at 4:54 AM, Yury V. Zaytsev wrote: > Hi, > > The simplest possible program using memory views compiles with a large > number of warnings for me, even for a rather outdated version of gcc: > > def hello(int [:] a): > print(a, "world") > > If I translate it with the latest released version of Cython like this: > > cython cpp.pyx > cython --cplus cpp.pyx > > and compile like this: > > gcc -O3 -march=native -Wall -fPIC -I/opt/ActivePython-2.7/include/python2.7 -c ./cpp.c -o cpp.o > g++ -O3 -march=native -Wall -fPIC -I/opt/ActivePython-2.7/include/python2.7 -c ./cpp.cpp -o cpp.o > > I get lots of warnings (see attached). > > It doesn't seem to be related to C++ as such, but rather it seems that > the memory views code indeed somehow violates strict-aliasing rules. > > I'm not sure of how severe it is, but the documentation seems to suggest > that this might even lead to incorrect results. > > Can this possibly be fixed in Cython and how important is that? Shall I > create a bug report on the Trac? Is my only resort to test whether the > compiler supports -fno-strict-aliasing and use that? You should compile with -fno-strict-aliasing--if you were using distutils rather than gcc directly it should add all the necessary flags for you. Aliasing different pointer types is necessary for Cython--it's how it implements inheritance (in plain C, a PyObject* could be a pointer to a list or dict or your own cdef class--pointer aliasing right there). Also with memory views (and numpy arrays), the underlying data is allocated as a char* and interpreted as a float* or int* according to the metadata in the array. - Robert From yury at shurup.com Thu Jul 4 17:34:45 2013 From: yury at shurup.com (Yury V. Zaytsev) Date: Thu, 04 Jul 2013 17:34:45 +0200 Subject: [Cython] Memory views: dereferencing pointer does break strict-aliasing rules In-Reply-To: References: <1372766081.2659.14.camel@newpride> Message-ID: <1372952085.7180.23.camel@newpride> Hi Robert, On Tue, 2013-07-02 at 09:07 -0700, Robert Bradshaw wrote: > > You should compile with -fno-strict-aliasing--if you were using > distutils rather than gcc directly it should add all the necessary > flags for you. Indeed, I'm using autotools to compile the module, so now I've added the AX_CHECK_COMPILE_FLAG macro from the Autoconf Archive to check whether -fno-strict-aliasing is supported by the compiler and append it to CXXFLAGS if necessary. > Aliasing different pointer types is necessary for Cython--it's how it > implements inheritance (in plain C, a PyObject* could be a pointer to > a list or dict or your own cdef class--pointer aliasing right there). > Also with memory views (and numpy arrays), the underlying data is > allocated as a char* and interpreted as a float* or int* according > to the metadata in the array. Thank you very much for this explanation, I just wanted to make sure that this is unavoidable! -- Sincerely yours, Yury V. Zaytsev From yury at shurup.com Fri Jul 5 14:40:51 2013 From: yury at shurup.com (Yury V. 
Zaytsev) Date: Fri, 05 Jul 2013 14:40:51 +0200 Subject: [Cython] Cython and Py2.6/Py2.7 format() differences: '{}' vs. '{0}' Message-ID: <1373028051.2827.32.camel@newpride> Hi, I have just discovered that if one uses Python 2.7 constructs in the code like print("Object: {}".format(obj)) the resulting C++ module compiles just fine with both Python 2.6 and Python 2.7 headers, however, in the former case, ValueError is raised at runtime: ValueError: zero length field name in format Is this an intended behavior or Cython should have abstracted this difference for me, and it can be considered a bug? Thanks! -- Sincerely yours, Yury V. Zaytsev From stefan_ml at behnel.de Fri Jul 5 15:52:42 2013 From: stefan_ml at behnel.de (Stefan Behnel) Date: Fri, 05 Jul 2013 15:52:42 +0200 Subject: [Cython] Cython and Py2.6/Py2.7 format() differences: '{}' vs. '{0}' In-Reply-To: <1373028051.2827.32.camel@newpride> References: <1373028051.2827.32.camel@newpride> Message-ID: <51D6CFAA.2040509@behnel.de> Yury V. Zaytsev, 05.07.2013 14:40: > I have just discovered that if one uses Python 2.7 constructs in the > code like > > print("Object: {}".format(obj)) > > the resulting C++ module compiles just fine with both Python 2.6 and > Python 2.7 headers, however, in the former case, ValueError is raised at > runtime: > > ValueError: zero length field name in format > > Is this an intended behavior or Cython should have abstracted this > difference for me, and it can be considered a bug? Cython has nothing to do with this. Stefan From stefan_ml at behnel.de Fri Jul 5 20:49:59 2013 From: stefan_ml at behnel.de (Stefan Behnel) Date: Fri, 05 Jul 2013 20:49:59 +0200 Subject: [Cython] Sage build broken on Jenkins Message-ID: <51D71557.2080901@behnel.de> Hi, the Sage build has been broken for a while now. Might be due to changes in Sage, or due to one of the (minor?) cythonize() changes in master. https://sage.math.washington.edu:8091/hudson/job/sage-build/ Build error message is: """ IOError: could not find dependency sage/calculus/numpy.pxd included in sage/calculus/riemann.pyx. """ https://sage.math.washington.edu:8091/hudson/job/sage-build/1819/console Stefan From yury at shurup.com Fri Jul 5 23:01:38 2013 From: yury at shurup.com (Yury V. Zaytsev) Date: Fri, 05 Jul 2013 23:01:38 +0200 Subject: [Cython] Cython and Py2.6/Py2.7 format() differences: '{}' vs. '{0}' In-Reply-To: <51D6CFAA.2040509@behnel.de> References: <1373028051.2827.32.camel@newpride> <51D6CFAA.2040509@behnel.de> Message-ID: <1373058098.2645.5.camel@newpride> On Fri, 2013-07-05 at 15:52 +0200, Stefan Behnel wrote: > > Cython has nothing to do with this. Well, Cython does abstract differences for instance in terms of string handling between Py2.x and Py3.x, so that in fact one can enjoy it as a "porting" instrument. But thank you for the verdict and your time, point taken, it's not a bug. -- Sincerely yours, Yury V. Zaytsev From stefan_ml at behnel.de Sat Jul 6 07:39:25 2013 From: stefan_ml at behnel.de (Stefan Behnel) Date: Sat, 06 Jul 2013 07:39:25 +0200 Subject: [Cython] Cython and Py2.6/Py2.7 format() differences: '{}' vs. '{0}' In-Reply-To: <1373058098.2645.5.camel@newpride> References: <1373028051.2827.32.camel@newpride> <51D6CFAA.2040509@behnel.de> <1373058098.2645.5.camel@newpride> Message-ID: <51D7AD8D.10601@behnel.de> Yury V. Zaytsev, 05.07.2013 23:01: > On Fri, 2013-07-05 at 15:52 +0200, Stefan Behnel wrote: >> Cython has nothing to do with this. 
> > Well, Cython does abstract differences for instance in terms of string > handling between Py2.x and Py3.x, so that in fact one can enjoy it as a > "porting" instrument. To a certain extent, yes. However, hiding this specific difference would mean that a) we'd have to backport the format string parser from CPython and b) it would only work for literal format strings, i.e. this won't work: def func(fstr): return fstr.format(1,2) func('{} abc {}') Meaning that even just a slight refactoring might then break your code. This kind of partial "make it look as if it worked" is a very bad idea. Stefan From scopatz at gmail.com Sat Jul 6 23:26:00 2013 From: scopatz at gmail.com (Anthony Scopatz) Date: Sat, 6 Jul 2013 16:26:00 -0500 Subject: [Cython] Update xdress link Message-ID: Hello All, I was wondering if it would be possible to update the xdress link on the AutoPxd wiki page [1] to the new url: http://xdress.org/ Thanks in advance! Be Well Anthony 1. http://wiki.cython.org/AutoPxd -------------- next part -------------- An HTML attachment was scrubbed... URL: From robertwb at gmail.com Sun Jul 7 00:10:39 2013 From: robertwb at gmail.com (Robert Bradshaw) Date: Sat, 6 Jul 2013 15:10:39 -0700 Subject: [Cython] Update xdress link In-Reply-To: References: Message-ID: On Sat, Jul 6, 2013 at 2:26 PM, Anthony Scopatz wrote: > Hello All, > > I was wondering if it would be possible to update the xdress link on the Of course. Done. - Robert From scopatz at gmail.com Sun Jul 7 01:04:01 2013 From: scopatz at gmail.com (Anthony Scopatz) Date: Sat, 6 Jul 2013 18:04:01 -0500 Subject: [Cython] Update xdress link In-Reply-To: References: Message-ID: Thanks Robert! On Sat, Jul 6, 2013 at 5:10 PM, Robert Bradshaw wrote: > On Sat, Jul 6, 2013 at 2:26 PM, Anthony Scopatz wrote: > > Hello All, > > > > I was wondering if it would be possible to update the xdress link on the > > Of course. Done. > > - Robert > _______________________________________________ > cython-devel mailing list > cython-devel at python.org > http://mail.python.org/mailman/listinfo/cython-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: From mrjbq7 at gmail.com Tue Jul 9 16:36:07 2013 From: mrjbq7 at gmail.com (John Benediktsson) Date: Tue, 9 Jul 2013 07:36:07 -0700 Subject: [Cython] Bug report: compiler traceback instead of error message Message-ID: Hi, I had some trouble debugging a traceback that I got from Cython, which ended up being a simple typo in a .pyx file. Below is a reduced example of this problem. 
If you have a foo.pxd file (specifying a variable "bar"): $ cat foo.pxd cdef class Tree: cpdef build(self, int bar=*) And a foo.pyx file (accidentally calling the variable "baz"): $ cat foo.pyx cdef class Tree: cpdef build(self, int baz=None): print baz You get this nasty traceback when you try to cython it, which makes you think there is a bug in Cython instead of a bug in your code: $ cython foo.pyx Traceback (most recent call last): File "/usr/local/bin/cython", line 9, in load_entry_point('Cython==0.19.1', 'console_scripts', 'cython')() File "/Library/Python/2.7/site-packages/Cython -0.19.1-py2.7-macosx-10.8-intel.egg/Cython/Compiler/Main.py", line 618, in setuptools_main return main(command_line = 1) File "/Library/Python/2.7/site-packages/Cython -0.19.1-py2.7-macosx-10.8-intel.egg/Cython/Compiler/Main.py", line 635, in main result = compile(sources, options) File "/Library/Python/2.7/site-packages/Cython -0.19.1-py2.7-macosx-10.8-intel.egg/Cython/Compiler/Main.py", line 610, in compile return compile_multiple(source, options) File "/Library/Python/2.7/site-packages/Cython -0.19.1-py2.7-macosx-10.8-intel.egg/Cython/Compiler/Main.py", line 579, in compile_multiple result = run_pipeline(source, options, context=context) File "/Library/Python/2.7/site-packages/Cython -0.19.1-py2.7-macosx-10.8-intel.egg/Cython/Compiler/Main.py", line 425, in run_pipeline err, enddata = Pipeline.run_pipeline(pipeline, source) File "/Library/Python/2.7/site-packages/Cython -0.19.1-py2.7-macosx-10.8-intel.egg/Cython/Compiler/Pipeline.py", line 322, in run_pipeline data = phase(data) File "/Library/Python/2.7/site-packages/Cython -0.19.1-py2.7-macosx-10.8-intel.egg/Cython/Compiler/Pipeline.py", line 51, in generate_pyx_code_stage module_node.process_implementation(options, result) File "/Library/Python/2.7/site-packages/Cython -0.19.1-py2.7-macosx-10.8-intel.egg/Cython/Compiler/ModuleNode.py", line 111, in process_implementation self.generate_c_code(env, options, result) File "/Library/Python/2.7/site-packages/Cython -0.19.1-py2.7-macosx-10.8-intel.egg/Cython/Compiler/ModuleNode.py", line 342, in generate_c_code self.body.generate_function_definitions(env, code) File "/Library/Python/2.7/site-packages/Cython -0.19.1-py2.7-macosx-10.8-intel.egg/Cython/Compiler/Nodes.py", line 393, in generate_function_definitions stat.generate_function_definitions(env, code) File "/Library/Python/2.7/site-packages/Cython -0.19.1-py2.7-macosx-10.8-intel.egg/Cython/Compiler/Nodes.py", line 4267, in generate_function_definitions self.body.generate_function_definitions(self.scope, code) File "/Library/Python/2.7/site-packages/Cython -0.19.1-py2.7-macosx-10.8-intel.egg/Cython/Compiler/Nodes.py", line 393, in generate_function_definitions stat.generate_function_definitions(env, code) File "/Library/Python/2.7/site-packages/Cython -0.19.1-py2.7-macosx-10.8-intel.egg/Cython/Compiler/Nodes.py", line 1704, in generate_function_definitions self.generate_argument_parsing_code(env, code) File "/Library/Python/2.7/site-packages/Cython -0.19.1-py2.7-macosx-10.8-intel.egg/Cython/Compiler/Nodes.py", line 2297, in generate_argument_parsing_code self.type.opt_arg_cname(declarator.name))) File "/Library/Python/2.7/site-packages/Cython -0.19.1-py2.7-macosx-10.8-intel.egg/Cython/Compiler/PyrexTypes.py", line 2684, in opt_arg_cname return self.op_arg_struct.base_type.scope.lookup(arg_name).cname AttributeError: 'NoneType' object has no attribute 'cname' Thanks, John. -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From robertwb at gmail.com Tue Jul 9 22:23:27 2013 From: robertwb at gmail.com (Robert Bradshaw) Date: Tue, 9 Jul 2013 13:23:27 -0700 Subject: [Cython] Bug report: compiler traceback instead of error message In-Reply-To: References: Message-ID: Yes, we should be giving an error message here rather than crashing; thanks for the report. On Tue, Jul 9, 2013 at 7:36 AM, John Benediktsson wrote: > Hi, > > I had some trouble debugging a traceback that I got from Cython, which ended > up being a simple typo in a .pyx file. Below is a reduced example of this > problem. > > If you have a foo.pxd file (specifying a variable "bar"): > > $ cat foo.pxd > cdef class Tree: > cpdef build(self, int bar=*) > > And a foo.pyx file (accidentally calling the variable "baz"): > > $ cat foo.pyx > cdef class Tree: > cpdef build(self, int baz=None): > print baz > > You get this nasty traceback when you try to cython it, which makes you > think there is a bug in Cython instead of a bug in your code: > > $ cython foo.pyx > Traceback (most recent call last): > File "/usr/local/bin/cython", line 9, in > load_entry_point('Cython==0.19.1', 'console_scripts', 'cython')() > File > "/Library/Python/2.7/site-packages/Cython-0.19.1-py2.7-macosx-10.8-intel.egg/Cython/Compiler/Main.py", > line 618, in setuptools_main > return main(command_line = 1) > File > "/Library/Python/2.7/site-packages/Cython-0.19.1-py2.7-macosx-10.8-intel.egg/Cython/Compiler/Main.py", > line 635, in main > result = compile(sources, options) > File > "/Library/Python/2.7/site-packages/Cython-0.19.1-py2.7-macosx-10.8-intel.egg/Cython/Compiler/Main.py", > line 610, in compile > return compile_multiple(source, options) > File > "/Library/Python/2.7/site-packages/Cython-0.19.1-py2.7-macosx-10.8-intel.egg/Cython/Compiler/Main.py", > line 579, in compile_multiple > result = run_pipeline(source, options, context=context) > File > "/Library/Python/2.7/site-packages/Cython-0.19.1-py2.7-macosx-10.8-intel.egg/Cython/Compiler/Main.py", > line 425, in run_pipeline > err, enddata = Pipeline.run_pipeline(pipeline, source) > File > "/Library/Python/2.7/site-packages/Cython-0.19.1-py2.7-macosx-10.8-intel.egg/Cython/Compiler/Pipeline.py", > line 322, in run_pipeline > data = phase(data) > File > "/Library/Python/2.7/site-packages/Cython-0.19.1-py2.7-macosx-10.8-intel.egg/Cython/Compiler/Pipeline.py", > line 51, in generate_pyx_code_stage > module_node.process_implementation(options, result) > File > "/Library/Python/2.7/site-packages/Cython-0.19.1-py2.7-macosx-10.8-intel.egg/Cython/Compiler/ModuleNode.py", > line 111, in process_implementation > self.generate_c_code(env, options, result) > File > "/Library/Python/2.7/site-packages/Cython-0.19.1-py2.7-macosx-10.8-intel.egg/Cython/Compiler/ModuleNode.py", > line 342, in generate_c_code > self.body.generate_function_definitions(env, code) > File > "/Library/Python/2.7/site-packages/Cython-0.19.1-py2.7-macosx-10.8-intel.egg/Cython/Compiler/Nodes.py", > line 393, in generate_function_definitions > stat.generate_function_definitions(env, code) > File > "/Library/Python/2.7/site-packages/Cython-0.19.1-py2.7-macosx-10.8-intel.egg/Cython/Compiler/Nodes.py", > line 4267, in generate_function_definitions > self.body.generate_function_definitions(self.scope, code) > File > "/Library/Python/2.7/site-packages/Cython-0.19.1-py2.7-macosx-10.8-intel.egg/Cython/Compiler/Nodes.py", > line 393, in generate_function_definitions > stat.generate_function_definitions(env, code) > File > 
"/Library/Python/2.7/site-packages/Cython-0.19.1-py2.7-macosx-10.8-intel.egg/Cython/Compiler/Nodes.py", > line 1704, in generate_function_definitions > self.generate_argument_parsing_code(env, code) > File > "/Library/Python/2.7/site-packages/Cython-0.19.1-py2.7-macosx-10.8-intel.egg/Cython/Compiler/Nodes.py", > line 2297, in generate_argument_parsing_code > self.type.opt_arg_cname(declarator.name))) > File > "/Library/Python/2.7/site-packages/Cython-0.19.1-py2.7-macosx-10.8-intel.egg/Cython/Compiler/PyrexTypes.py", > line 2684, in opt_arg_cname > return self.op_arg_struct.base_type.scope.lookup(arg_name).cname > AttributeError: 'NoneType' object has no attribute 'cname' > > Thanks, > John. > > _______________________________________________ > cython-devel mailing list > cython-devel at python.org > http://mail.python.org/mailman/listinfo/cython-devel > From torsten.landschoff at dynamore.de Thu Jul 11 00:10:02 2013 From: torsten.landschoff at dynamore.de (Torsten Landschoff) Date: Thu, 11 Jul 2013 00:10:02 +0200 Subject: [Cython] Surprising behaviour wrt. generated tp_clear and tp_dealloc functions In-Reply-To: <51752CD1.6070408@behnel.de> References: <517460BB.5040307@dynamore.de> <5174CB1F.50202@behnel.de> <517519FF.1010206@dynamore.de> <51752407.3070405@behnel.de> <51752560.4080304@dynamore.de> <51752CD1.6070408@behnel.de> Message-ID: Hi Stefan, sorry for the delay, I was in the US for parental leave. I had hoped to find some time to work on the promised patch over there, but I fell short. On 04/22/2013 02:28 PM, Stefan Behnel wrote: > Please do. Just ask back on this list if there's anything that's not > clear to you. I attached my current (trivial) patch. Currently I only support a decorator @cython.noclear cdef class ... to inhibit generation of tp_clear. Before I continue with this approach I am wondering about the API. Is noclear usable as a name? I think not because nobody will know what it is talking about. But I do not know how to press the information "do not generate the tp_clear slot which will clear references to break reference cycles during GC" into a short name. Perhaps something along the lines of "@cython.gc_keep_references" or "@cython.exempt_from_gc_cycle_breaker"!? How should the decorator to completely disable GC for a class be called? @cython.nogc? @cython.refcounted (because the class will only support cleanup via reference counting)? Any input appreciated. Greetings, Torsten -- DYNAmore Gesellschaft fuer Ingenieurdienstleistungen mbH Torsten Landschoff Office Dresden Tel: +49-(0)351-4519587 Fax: +49-(0)351-4519561 mailto:torsten.landschoff at dynamore.de http://www.dynamore.de DYNAmore Gesellschaft f?r FEM Ingenieurdienstleistungen mbH Registration court: Stuttgart, HRB 733694 Managing director: Prof. Dr. Karl Schweizerhof, Dipl.-Math. Ulrich Franz -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... 
Name: 0001-Inhibit-tp_clear-generation-via-class-decorator.patch Type: text/x-patch Size: 7607 bytes Desc: not available URL: From stefan_ml at behnel.de Thu Jul 11 19:52:38 2013 From: stefan_ml at behnel.de (Stefan Behnel) Date: Thu, 11 Jul 2013 19:52:38 +0200 Subject: [Cython] Py3 annotation syntax for static typing Message-ID: <51DEF0E6.9040907@behnel.de> [CC-ing cython-dev] Hi Jukka, I stumbled over your blog post at http://mypy-lang.blogspot.de/2013/07/mypy-switches-to-python-compatible.html It says that you are starting to adopt the Py3 function annotation syntax for mypy. I think we should try to keep that in sync for both mypy and Cython. It would be bad to have two ways to do it. When we discussed this for Cython (see the trac ticket [1] and Wiki entry [2] below), also on the mailing list (back in 2008 I guess), the main problem was that it's not immediately clear what should be considered type annotations by the compiler and what's arbitrary "other stuff" that users put into declarations. There's also no obvious way to allow for multiple independent annotations. Should they be in a tuple? In a dict? What would be dict key be? How can we prevent collisions and conflicts when multiple tools base their features on annotations? It's mainly a namespace problem. The annotation namespace is essentially flat, and I think that only practical usage can eventually establish suitable conventions. I'd like to discuss this, maybe we can come up with a suitable Best Practice. What's your opinion on the issue so far? There's also our current Pure Python syntax mode for everything that cannot be expressed with function annotations: http://docs.cython.org/src/tutorial/pure.html I would guess that mypy will eventually need something similar. Stefan [1] http://trac.cython.org/cython_trac/ticket/752 [2] http://wiki.cython.org/enhancements/pure From stefan_ml at behnel.de Sat Jul 13 07:32:38 2013 From: stefan_ml at behnel.de (Stefan Behnel) Date: Sat, 13 Jul 2013 07:32:38 +0200 Subject: [Cython] Py3 annotation syntax for static typing In-Reply-To: References: <51DEF0E6.9040907@behnel.de> Message-ID: <51E0E676.1010703@behnel.de> Jukka Lehtosalo, 12.07.2013 14:03: >> On Thu, Jul 11, 2013 at 6:52 PM, Stefan Behnel wrote: > http://mypy-lang.blogspot.de/2013/07/mypy-switches-to-python-compatible.html > >> It says that you are starting to adopt the Py3 function annotation syntax >> for mypy. I think we should try to keep that in sync for both mypy and >> Cython. It would be bad to have two ways to do it. > > Agreed. > >> When we discussed this for Cython (see the trac ticket [1] and Wiki entry >> [2] below), also on the mailing list (back in 2008 I guess), the main >> problem was that it's not immediately clear what should be considered type >> annotations by the compiler and what's arbitrary "other stuff" that users >> put into declarations. There's also no obvious way to allow for multiple >> independent annotations. Should they be in a tuple? In a dict? What would >> be dict key be? How can we prevent collisions and conflicts when multiple >> tools base their features on annotations? > > I think that this general problem can approached from several > directions: > > (1) Allow different annotation styles to be used in different modules, > but expect a uniform annotation style within a module. Each tool > should be able to recognize modules that use its annotation > syntax and ignore annotations in other modules. That would still conflict with the case of more than one kind of annotation, e.g. 
a docstring and (one or potentially more) static types. > (2) Support a fallback annotation syntax in each tool that doesn't use > Python 3 annotations, such as function decorators. This would let > users continue to use arbitrary annotations, at the cost of an > uglier type annotation syntax. Also, the syntax must support > stacking multiple annotations. Function decorators are nice in > this respect, but annotations in comments may be more troublesome. Cython has that in its Pure Python Mode syntax. I think it's ok to use tool specific decorators approach here, although it would be nice if we could keep at least the names similar for both Cython and MyPy. So, please take a look through the Pure Mode page if you want to go this route, and let's discuss anything that doesn't fit one of the tools for some reason. > (3) Support mixing multiple independent annotation styles within a > single function, using the Python 3 annotation syntax (as you > mentioned, if I understood your point correctly). Yes, I think there should be a way to do that. I haven't really seen any tools out in the wild that use the annotation syntax for any of their features, so this still seems to be uncovered ground. I think using a dict would be a good idea. Or a multi-level approach, where the annotation value can be a) a single value, which one tool would recognise and other tools would reject (or ignore? sounds fragile to ignore unknown values...). Maybe just ignore plain (doc-)strings and reject all other values that a tool cannot recognise. b) a 2-tuple containing one (doc-)string and one tool specific annotation value, as in a). Maybe require the docstring to appear after the annotation value, as the docstring is likely to be longer, and the annotation value might also be in a string. So, a 2-tuple (type, docstring). c) a dict that maps tool specific keys to annotation values. Each tool would only care about its own keys and ignore all others. I think this also provides for an easy upgrade path: start with a single value, optionally add a docstring to it in a tuple. If you need more tools, switch to an explicit dict. People will usually know which tool the original annotations belong to (and if not, they can just rip them out, right?), so the transition towards a more verbose syntax should be quite straight forward, and the multi-level approach helps in keeping things simple in the simple case. The dict keys should then be either an identifier that multiple tools understand (e.g. "type": list), or use a tool specific prefix, e.g. "cy:", "my:", "jy:", etc. Note that "type": int might be ambiguous, as Cython would consider it a C int. But I guess "ctype": int would solve that, and still be usable by multiple tools. An alternative would be to allow for (or require?) prefixed string values as simple/2-tuple annotation values, e.g. "ctype:int" or "cy:type:int". That would also allow to distinguish simple docstrings from simple annotation values in strings. Or, well, rely on module namespaces and imports to distinguish between annotation values. I see that mypy has a "typing" module which you can import. http://mypy-lang.org/tutorial.html#typing In Cython, there's the special "cython" module namespace. Personally, I find "typing" a bit broad as a top-level module name. There are also the ABCs in Py3, spread over the modules "abc", "collections.abc" (in Py3.3, previously in "collections") and "types" in the standard library. Some of your annotations seem to overlap with those. 
If the goal is to avoid duplication, it would be good to integrate what's there anyway. > (4) Agree on a single common type annotation style that can be used by > multiple projects. > > I think (1) is essential, (2) is important, and I'm undecied about > (3). I'm not sure whether (4) is practically feasible, as different > projects have different goals and features they want to support. See above. I think there can and should be a mix, as long as different projects keep an eye on each other. Keep common what we can, diverge where we it makes sense. >> It's mainly a namespace problem. The annotation namespace is essentially >> flat, and I think that only practical usage can eventually establish >> suitable conventions. > >> I'd like to discuss this, maybe we can come up with a suitable Best >> Practice. What's your opinion on the issue so far? > > I think we should first focus on (1) above, as it should be easy to > solve. As for (3), we should come up with some real-world use cases > or examples. For example, here are a few potential scenarios: > > (A) A programmer uses Python 3 annotations for internal purposes, > using an ad-hoc syntax. The programmer would like to speed up the > code by using Cython, but she wants the code to remain > Python-compatible, and she wants to continue using Python 3 > annotations for internal purposes. > > (B) A programmer uses mypy for type checking his program. He'd like to > speed up a critical function using Cython, but he also wants to > retain mypy type checking. I think both cases are covered by the approach above. Although users might end up having (or wanting?) to use both or all three syntaxes in the same module, in order to cover all cases. This might have an impact on code readability. My guess is that users who start using the dict based annotation style in a module at some point are best advised to convert the entire module at that point, in order to keep it simpler to read. An automated conversion tool based on 2to3 could then do the trick. >> There's also our current Pure Python syntax mode for everything that >> cannot be expressed with function annotations: > >> http://docs.cython.org/src/tutorial/pure.html > > I'm already somewhat familiar with the Pure Python syntax, though I > haven't tried it in practice. > >> I would guess that mypy will eventually need something similar. > > Mypy already (only) has a pure Python syntax, as all mypy constructs > are now syntactically valid Python, and valid mypy programs are > basically also runnable Python programs. In fact, currently a Python > 3.2 or later VM is the only supported way of running mypy programs. Ok, so you basically have a way to statically type variables during assignments. http://mypy-lang.org/tutorial.html#collectiontypes Cython has a slightly different approach here in that it requires either an external decorator (@cython.locals()) or an explicit function call during assignments (cython.declare()). I must say, I find neither of the three approaches really pretty, but I guess there simply is no perfect way to do these things, once you accept that you have to put static type annotations *somewhere* in your code. 
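For concreteness, here is a minimal sketch of the two pure-mode spellings mentioned above (the function, variable names and types are made up for illustration; as long as the cython shim module is importable, the code also runs unchanged as plain Python, and the declarations only take effect when the module is compiled):

    import cython

    @cython.locals(total=cython.double, i=cython.int)
    def mean(values):
        total = 0.0
        for i in range(len(values)):
            total += values[i]
        return total / len(values)

    def scale(values, factor):
        # explicit declaration at the point of assignment
        f = cython.declare(cython.double, factor)
        return [v * f for v in values]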
Stefan From stefan_ml at behnel.de Sun Jul 14 08:47:46 2013 From: stefan_ml at behnel.de (Stefan Behnel) Date: Sun, 14 Jul 2013 08:47:46 +0200 Subject: [Cython] getting rid of redundancy in the docs Message-ID: <51E24992.70500@behnel.de> Hi all, I recently added a couple of things to the documentation and constantly failed to find the right place to add it at first (or sometimes even second) guess. The problem is that the documentation is highly redundant. We discussed this before, but it's really getting in the way now. What I would like to see eventually (and let's discuss this openly on the cython-users list), is just a tutorial part to explain the major features, and a reference part that explains all the rest in detail. The user guide is highly redundant with both. However, since the current user guide carries a whole load of text, I think it's easier to make it the new reference part and merge the (few) current reference pages back into the user guide. And then extract the sections that actually belong into the tutorial part, i.e. delete the Overview, move the Basic Tutorial over, and most likely some other sections, too. Does everyone agree about that approach? Is anyone able to help with this? Stefan From stefan_ml at behnel.de Sun Jul 14 10:39:47 2013 From: stefan_ml at behnel.de (Stefan Behnel) Date: Sun, 14 Jul 2013 10:39:47 +0200 Subject: [Cython] Surprising behaviour wrt. generated tp_clear and tp_dealloc functions In-Reply-To: References: <517460BB.5040307@dynamore.de> <5174CB1F.50202@behnel.de> <517519FF.1010206@dynamore.de> <51752407.3070405@behnel.de> <51752560.4080304@dynamore.de> <51752CD1.6070408@behnel.de> Message-ID: <51E263D3.3020408@behnel.de> Torsten Landschoff, 11.07.2013 00:10: > I attached my current (trivial) patch. Currently I only support a decorator > > @cython.noclear > cdef class ... > > to inhibit generation of tp_clear. Thanks, looks ok to me. Please open a pull request on github for it. > Before I continue with this approach I am wondering about the API. Is > noclear usable as a name? I think not because nobody will know what it > is talking about. I think that name is as good as any. It could be "no_gc_clear", if you find that clearer. It hints at GC, at least. I think that should be part of the name somehow. Please also document it somewhere in the extension types doc section (in the user guide): https://github.com/cython/cython/blob/master/docs/src/userguide/extension_types.rst > But I do not know how to press the information "do not generate the > tp_clear slot which will clear references to break reference cycles > during GC" into a short name. It doesn't have to say it all, especially since it's a very special purpose thing. Eventually, Cython should be able to reduce the need even further, so users will only have to apply it when they already know that they need it. The docs should answer the question "is there a way to do this?" > How should the decorator to completely disable GC for a class be called? > @cython.nogc? +1 People who apply it will already have to know the difference between reference counting and cyclic garbage collection, otherwise, they wouldn't even know why to use it. > @cython.refcounted (because the class will only support > cleanup via reference counting)? No, all Python objects are ref-counted. 
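To illustrate the use site with the name floated above (this is only a sketch against the proposed feature, not a released Cython at this point; the class, its attributes and the close() call are made up):

    cimport cython

    @cython.no_gc_clear
    cdef class Session:
        # tp_clear will leave these references alone, so __dealloc__ can
        # still reach them to shut the connection down cleanly
        cdef object connection
        cdef object handle

        def __dealloc__(self):
            if self.connection is not None:
                self.connection.close(self.handle)

A class that is known to never participate in reference cycles could additionally be marked with the proposed "nogc" decorator to skip cyclic GC support entirely and rely on reference counting alone.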
Stefan From stefan_ml at behnel.de Sun Jul 14 13:19:21 2013 From: stefan_ml at behnel.de (Stefan Behnel) Date: Sun, 14 Jul 2013 13:19:21 +0200 Subject: [Cython] next releases Message-ID: <51E28939.1080706@behnel.de> Hi, I'm preparing another bug-fix point release and very conservatively merged over a few change sets into 0.19.x that seemed to be safe fixes. Please take a look and also propose or pick more changes if you feel that I left something out. Regarding the next master release, the only larger new feature I can see is Vitja's updated type inference engine. I think it should go in, but it needs some testing against "real" code (both Python and Cython, I guess). https://github.com/cython/cython/pull/233 The exttype noclear and nogc options should also go in, preferably with some more enhancements that make Cython smarter internally. Reducing the GC overhead by avoiding unnecessary support functions sounds valuable. Is there anything else that you have pending? I would also like to clean up the documentation for the next major release, see my e-mail to cython-users. Stefan From scopatz at gmail.com Sun Jul 14 19:07:10 2013 From: scopatz at gmail.com (Anthony Scopatz) Date: Sun, 14 Jul 2013 12:07:10 -0500 Subject: [Cython] getting rid of redundancy in the docs In-Reply-To: <51E24992.70500@behnel.de> References: <51E24992.70500@behnel.de> Message-ID: Hello Stefan, I think that this is a good idea and I just thought that I'd throw the following out there. We got rid of a lot of redundancy by eliminating our wiki for PyTables. It was a little bit of a hassle to move all of the moin pages that we had to sphinx, but ultimately I think that it was the right decision. It made it way easier on developers and -- hopefully -- users as well to just have a single place where the documentation kept. (As a Cython user I have definitely forgotten whether I read something on the sphinx doc or the wiki.) So +1 to this idea, but unfortunately, I can't volunteer any time to actually work on it :(. Be Well Anthony On Sun, Jul 14, 2013 at 1:47 AM, Stefan Behnel wrote: > Hi all, > > I recently added a couple of things to the documentation and constantly > failed to find the right place to add it at first (or sometimes even > second) guess. The problem is that the documentation is highly redundant. > We discussed this before, but it's really getting in the way now. > > What I would like to see eventually (and let's discuss this openly on the > cython-users list), is just a tutorial part to explain the major features, > and a reference part that explains all the rest in detail. The user guide > is highly redundant with both. > > However, since the current user guide carries a whole load of text, I think > it's easier to make it the new reference part and merge the (few) current > reference pages back into the user guide. And then extract the sections > that actually belong into the tutorial part, i.e. delete the Overview, move > the Basic Tutorial over, and most likely some other sections, too. > > Does everyone agree about that approach? Is anyone able to help with this? > > Stefan > _______________________________________________ > cython-devel mailing list > cython-devel at python.org > http://mail.python.org/mailman/listinfo/cython-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: From yury at shurup.com Mon Jul 15 17:03:24 2013 From: yury at shurup.com (Yury V. 
Zaytsev) Date: Mon, 15 Jul 2013 17:03:24 +0200 Subject: [Cython] Difference between Cython / Python isinstance() vs. NumPy scalars, whose bug is this? Message-ID: <1373900604.2814.14.camel@newpride> Hi, I've debugged my problem with passing NumPy scalars via Cython into C++ down to the fact that inside Cython code `isinstance(x, float)` returns True on floating point NumPy scalars, but `isinstance(x, (int, long))` returns False on integer NumPy scalars. However, when I run the same code from within Python, both checks work just fine. So is this a genuine bug in Cython, or, rather, NumPy is doing some black magic behind the scenes, that prevents Cython-compiled modules from working correctly? Here is a minimal reproducer for your convenience (tested with Cython 0.19.1, Python 2.7.1 and NumPy 1.7.1): # bt.pyx def foo(x): if isinstance(x, (int, long)): print("I'm an integer number") elif isinstance(x, float): print("I'm a floating point number") else: print("I don't know who I am") # IPython transcript In [1]: import numpy as np In [2]: import bt In [3]: bt.foo(np.array( (1., 2., 3.), dtype=np.float )[2]) I'm a floating point number In [4]: bt.foo(np.array( (1., 2., 3.), dtype=np.int )[2]) I don't know who I am In [5]: %paste def bar(x): if isinstance(x, (int, long)): print("I'm an integer number") elif isinstance(x, float): print("I'm a floating point number") else: print("I don't know who I am") ## -- End pasted text -- In [6]: bar(np.array( (1., 2., 3.), dtype=np.float )[2]) I'm a floating point number In [7]: bar(np.array( (1., 2., 3.), dtype=np.int )[2]) I'm an integer number Thanks in advance for pointing me in the right direction! -- Sincerely yours, Yury V. Zaytsev From yury at shurup.com Mon Jul 15 17:54:04 2013 From: yury at shurup.com (Yury V. Zaytsev) Date: Mon, 15 Jul 2013 17:54:04 +0200 Subject: [Cython] Difference between Cython / Python isinstance() vs. NumPy scalars, Python bug? In-Reply-To: <1373900604.2814.14.camel@newpride> References: <1373900604.2814.14.camel@newpride> Message-ID: <1373903644.2814.28.camel@newpride> On Mon, 2013-07-15 at 17:03 +0200, Yury V. Zaytsev wrote: > However, when I run the same code from within Python, both checks work > just fine. So is this a genuine bug in Cython, or, rather, NumPy is > doing some black magic behind the scenes, that prevents > Cython-compiled modules from working correctly? I think I've found the problem, but still, it's not clear to me how do go about solving it, so your guidance would be very much appreciated. As it appears, this could be then a bug in Python, and C-API specifically? Cython optimizer in _handle_simple_function_isinstance() replaces `isinstance(x, (int, long, float))` with corresponding C-API function calls (which are actually macros) like this (this is perfectly legal!): #define PyInt_Check(op) \ PyType_FastSubclass((op)->ob_type, Py_TPFLAGS_INT_SUBCLASS) #define PyLong_Check(op) \ PyType_FastSubclass(Py_TYPE(op), Py_TPFLAGS_LONG_SUBCLASS) #define PyFloat_Check(op) PyObject_TypeCheck(op, &PyFloat_Type) However, it seems that PyType_FastSubclass doesn't really work correctly, because when I replace the following in the generated code everything works perfectly: (a) __pyx_t_4 = PyInt_Check(__pyx_v_x, &PyInt_Type); <-> (b) __pyx_t_4 = PyObject_TypeCheck(__pyx_v_x, &PyInt_Type); So, I guess, my question can be now really split into two questions, a practical one and a philosophical one: 1) What's the best way to get Cython to generate (b) instead of (a)? 
2) Shall I open a bug against Python and suggest to investigate why PyType_FastSubclass is not really equivalent to PyObject_TypeCheck as it should be for the built-in types? Many thanks, -- Sincerely yours, Yury V. Zaytsev From robertwb at gmail.com Tue Jul 16 06:29:29 2013 From: robertwb at gmail.com (Robert Bradshaw) Date: Mon, 15 Jul 2013 21:29:29 -0700 Subject: [Cython] getting rid of redundancy in the docs In-Reply-To: References: <51E24992.70500@behnel.de> Message-ID: Great idea re wiki, especially as github makes doc edits and pull requests nearly as easy. I agree the docs need cleaning, I'll see what I can do once I get to real internet. On Jul 14, 2013 10:07 AM, "Anthony Scopatz" wrote: > Hello Stefan, > > I think that this is a good idea and I just thought that I'd throw the > following out there. We got rid of a lot of redundancy by eliminating our > wiki for PyTables. It was a little bit of a hassle to move all of the moin > pages that we had to sphinx, but ultimately I think that it was the right > decision. It made it way easier on developers and -- hopefully -- users as > well to just have a single place where the documentation kept. (As a > Cython user I have definitely forgotten whether I read something on the > sphinx doc or the wiki.) > > So +1 to this idea, but unfortunately, I can't volunteer any time to > actually work on it :(. > > Be Well > Anthony > > > > > On Sun, Jul 14, 2013 at 1:47 AM, Stefan Behnel wrote: > >> Hi all, >> >> I recently added a couple of things to the documentation and constantly >> failed to find the right place to add it at first (or sometimes even >> second) guess. The problem is that the documentation is highly redundant. >> We discussed this before, but it's really getting in the way now. >> >> What I would like to see eventually (and let's discuss this openly on the >> cython-users list), is just a tutorial part to explain the major features, >> and a reference part that explains all the rest in detail. The user guide >> is highly redundant with both. >> >> However, since the current user guide carries a whole load of text, I >> think >> it's easier to make it the new reference part and merge the (few) current >> reference pages back into the user guide. And then extract the sections >> that actually belong into the tutorial part, i.e. delete the Overview, >> move >> the Basic Tutorial over, and most likely some other sections, too. >> >> Does everyone agree about that approach? Is anyone able to help with this? >> >> Stefan >> _______________________________________________ >> cython-devel mailing list >> cython-devel at python.org >> http://mail.python.org/mailman/listinfo/cython-devel >> > > > _______________________________________________ > cython-devel mailing list > cython-devel at python.org > http://mail.python.org/mailman/listinfo/cython-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From torsten.landschoff at dynamore.de Tue Jul 16 11:15:06 2013 From: torsten.landschoff at dynamore.de (Torsten Landschoff) Date: Tue, 16 Jul 2013 11:15:06 +0200 Subject: [Cython] Surprising behaviour wrt. 
generated tp_clear and tp_dealloc functions In-Reply-To: <51E263D3.3020408@behnel.de> References: <517460BB.5040307@dynamore.de> <5174CB1F.50202@behnel.de> <517519FF.1010206@dynamore.de> <51752407.3070405@behnel.de> <51752560.4080304@dynamore.de> <51752CD1.6070408@behnel.de> <51DDDBBA.3040702@dynamore.de> <51E263D3.3020408@behnel.de> Message-ID: Hi Stefan, On 07/14/2013 10:39 AM, Stefan Behnel wrote: > Torsten Landschoff, 11.07.2013 00:10: >> I attached my current (trivial) patch. Currently I only support a decorator >> >> @cython.noclear >> cdef class ... >> >> to inhibit generation of tp_clear. > Thanks, looks ok to me. Please open a pull request on github for it. I'd like to remove some code duplication I introduced in my changes and add the feature to exclude specific attributes from clearing. >> Before I continue with this approach I am wondering about the API. Is >> noclear usable as a name? I think not because nobody will know what it >> is talking about. > I think that name is as good as any. It could be "no_gc_clear", if you find > that clearer. It hints at GC, at least. I think that should be part of the > name somehow. Agreed. So there will be the following decorator: @cython.no_gc_clear class ... to disable generation of the clear method and @cython.no_gc class ... to disable GC entirely. Okay for you? I'd also like to support @cython.no_gc_clear("keepthis", "keepthat") to omit specific attributes from clear handling. I am not sure though how to declare that decorator in directive_types. Would something like this work: directive_types = { 'final' : bool, # final cdef classes and methods # ... 'no_gc_clear': one_of('bool', 'list'), # ... } ?? How would I tell the parser that it should accept the strings as *args of the decorator (if you know what I mean)? > Please also document it somewhere in the extension types doc section (in > the user guide): > > https://github.com/cython/cython/blob/master/docs/src/userguide/extension_types.rst Thanks for the pointer, I was wondering about the right documentation set to change. Greetings, Torsten -- DYNAmore Gesellschaft fuer Ingenieurdienstleistungen mbH Torsten Landschoff Office Dresden Tel: +49-(0)351-4519587 Fax: +49-(0)351-4519561 mailto:torsten.landschoff at dynamore.de http://www.dynamore.de DYNAmore Gesellschaft f?r FEM Ingenieurdienstleistungen mbH Registration court: Stuttgart, HRB 733694 Managing director: Prof. Dr. Karl Schweizerhof, Dipl.-Math. Ulrich Franz -------------- next part -------------- An HTML attachment was scrubbed... URL: From yury at shurup.com Tue Jul 16 21:12:15 2013 From: yury at shurup.com (Yury V. Zaytsev) Date: Tue, 16 Jul 2013 21:12:15 +0200 Subject: [Cython] Difference between Cython / Python isinstance() vs. NumPy scalars, NumPy bug! In-Reply-To: <1373903644.2814.28.camel@newpride> References: <1373900604.2814.14.camel@newpride> <1373903644.2814.28.camel@newpride> Message-ID: <1374001935.2705.4.camel@newpride> On Mon, 2013-07-15 at 17:54 +0200, Yury V. Zaytsev wrote: > > 1) What's the best way to get Cython to generate (b) instead of (a)? As it appears, it doesn't really matter, because I've realized that I can import numpy without actually introducing a compile-time dependency on numpy as long as I don't cimport numpy. NumPy objects have a very handy item() method, which "copies an element of an array to a standard Python scalar and returns it", so at this point I can use the standard isinstance() checks against built-ins! 
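For example, reusing the bt.foo() reproducer from earlier in this thread (the behaviour shown in the comments is what was reported above with the NumPy version in question):

    import numpy as np
    import bt

    x = np.array((1., 2., 3.), dtype=np.int)[2]
    bt.foo(x)         # "I don't know who I am" -- the fast subclass check fails for the NumPy scalar
    bt.foo(x.item())  # "I'm an integer number" -- .item() returns a plain Python int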
> 2) Shall I open a bug against Python and suggest to investigate why > PyType_FastSubclass is not really equivalent to PyObject_TypeCheck as > it should be for the built-in types? FYI, this seems to be a problem with NumPy: https://github.com/numpy/numpy/pull/3526 Hope this helps, -- Sincerely yours, Yury V. Zaytsev From yury at shurup.com Wed Jul 17 13:59:48 2013 From: yury at shurup.com (Yury V. Zaytsev) Date: Wed, 17 Jul 2013 13:59:48 +0200 Subject: [Cython] Bug: C++ strings defined in a PXD file are empty Message-ID: <1374062388.2806.53.camel@newpride> Hi, Please find a minimal reproducer attached: In [1]: import pxdbug This should show 'bad value': This should show 'good value': good value Is this a real bug or I'm missing something fundamental about the PXD files? If it's the former, shall I create an issue on the Trac or GitHub? Thanks! -- Sincerely yours, Yury V. Zaytsev -------------- next part -------------- from libcpp.string cimport string cdef string STR_BAD = "bad value" -------------- next part -------------- # distutils: language = c++ from libcpp.string cimport string cdef string STR_GOOD = "good value" print("This should show 'bad value': {0}".format(STR_BAD)) print("This should show 'good value': {0}".format(STR_GOOD)) From stefan_ml at behnel.de Wed Jul 17 14:17:50 2013 From: stefan_ml at behnel.de (Stefan Behnel) Date: Wed, 17 Jul 2013 14:17:50 +0200 Subject: [Cython] Bug: C++ strings defined in a PXD file are empty In-Reply-To: <1374062388.2806.53.camel@newpride> References: <1374062388.2806.53.camel@newpride> Message-ID: <51E68B6E.1040402@behnel.de> Yury V. Zaytsev, 17.07.2013 13:59: > Please find a minimal reproducer attached: > > In [1]: import pxdbug > This should show 'bad value': > This should show 'good value': good value > > Is this a real bug or I'm missing something fundamental about the PXD > files? There is only one case where we allow code in .pxd files, and that's for inlined functions/methods. Your code should result in a compile error. > If it's the former, shall I create an issue on the Trac or > GitHub? Trac for bugs, github for fixes. If you don't have a trac account, please send an htaccess encrypted password to Robert. Stefan From hroest_nospam2333 at quantentunnel.de Wed Jul 17 14:45:28 2013 From: hroest_nospam2333 at quantentunnel.de (=?UTF-8?B?SGFubmVzIFLDtnN0?=) Date: Wed, 17 Jul 2013 14:45:28 +0200 Subject: [Cython] Cython compiler directives: c_string_encoding Message-ID: Dear mailing list I am trying to compile a program with Cython using the compiler directives and I am running into some trouble. Specifically, I am trying to port a Cython 0.18 program to Cython 0.19 and due to changes how char * and Python str are handled, I need to set the c_string_encoding directive. Unfortunately, this fails for my project and also in the following testcase when I try to do it locally: cimport cython cdef class TestClass: def foo(self): with cython.c_string_encoding("ascii"): return If I replace the directive with "with cython.boundscheck(True):" the program compiles fine. Also adding "#cython: c_string_encoding=ascii" as the first line of the file works fine. However, adding a decorator '@cython.c_string_encoding("ascii")' to foo also crashes the compiler. Finally, compiling with "-X c_string_encoding=ascii" also works. I was following the documentation provided here http://docs.cython.org/src/reference/compilation.html . The error message that I get is attached at the end. Did I do something wrong or can somebody point me in the right direction? 
Best regards Hannes Roest Error message: inimalcase.pyx:6:13: Compiler crash in InterpretCompilerDirectives ModuleNode.body = StatListNode(minimalcase.pyx:1:0) StatListNode.stats[1] = StatListNode(minimalcase.pyx:3:5) StatListNode.stats[0] = CClassDefNode(minimalcase.pyx:3:5, as_name = u'TestClass', class_name = u'TestClass', module_name = '', visibility = 'private') CClassDefNode.body = StatListNode(minimalcase.pyx:5:4) StatListNode.stats[0] = DefNode(minimalcase.pyx:5:4, modifiers = [...]/0, name = u'foo', num_required_args = 1, py_wrapper_required = True, reqd_kw_flags_cname = '0') DefNode.body = StatListNode(minimalcase.pyx:6:13) StatListNode.stats[0] = WithStatNode(minimalcase.pyx:6:13) Compiler crash traceback from this point on: File "/home/hr/lib/Cython-0.19.1/Cython/Compiler/Visitor.py", line 168, in _visit return handler_method(obj) File "/home/hr/lib/Cython-0.19.1/Cython/Compiler/ParseTreeTransforms.py", line 1002, in visit_WithStatNode for directive in self.try_to_parse_directives(node.manager) or []: File "/home/hr/lib/Cython-0.19.1/Cython/Compiler/ParseTreeTransforms.py", line 861, in try_to_parse_directives directives.append(self.try_to_parse_directive(optname, args, kwds, node.function.pos)) File "/home/hr/lib/Cython-0.19.1/Cython/Compiler/ParseTreeTransforms.py", line 913, in try_to_parse_directive assert False AssertionError: From stefan_ml at behnel.de Wed Jul 17 18:24:29 2013 From: stefan_ml at behnel.de (Stefan Behnel) Date: Wed, 17 Jul 2013 18:24:29 +0200 Subject: [Cython] Cython compiler directives: c_string_encoding In-Reply-To: References: Message-ID: <51E6C53D.2010806@behnel.de> Hannes R?st, 17.07.2013 14:45: > I am trying to compile a program with Cython using the compiler directives and > I am running into some trouble. Specifically, I am trying to port a Cython 0.18 > program to Cython 0.19 and due to changes how char * and Python str are > handled, I need to set the c_string_encoding directive. This may or may not be so. The fact that it gives an error in 0.19 might also hint at problems in your code. There's some documentation available: http://docs.cython.org/src/tutorial/strings.html If you need further help, then the cython-users mailing list is the right place to ask. > Unfortunately, this > fails for my project and also in the following testcase when I try to do it > locally: > > cimport cython > cdef class TestClass: > def foo(self): > with cython.c_string_encoding("ascii"): > return This can't work. The string encoding is a module global option. > The error message that > I get is attached at the end. Did I do something wrong or can somebody point me > in the right direction? > > Error message: > > inimalcase.pyx:6:13: Compiler crash in InterpretCompilerDirectives > [...] > File "/home/hr/lib/Cython-0.19.1/Cython/Compiler/ParseTreeTransforms.py", > line 913, in try_to_parse_directive > assert False > AssertionError: That's a bug. It shouldn't crash and instead give a compile error. Thanks for the report. Stefan From lev at columbia.edu Thu Jul 18 23:07:00 2013 From: lev at columbia.edu (Lev Givon) Date: Thu, 18 Jul 2013 17:07:00 -0400 Subject: [Cython] typo in buffers.pxd? Message-ID: <20130718210700.GT2810@avicenna.ee.columbia.edu> The PyBuffer_FillInfo() function definition in the latest revision of Cython/Includes/cpython/buffer.pxd appears to be missing a parameter. Shouldn't it be something like the following? 
int PyBuffer_FillInfo(Py_buffer *view, object obj, void *buf, Py_ssize_t len, int readonly, int flags) except -1 -- Lev Givon http://www.columbia.edu/~lev/ http://lebedov.github.com/ From hroest_nospam2333 at quantentunnel.de Fri Jul 19 10:38:39 2013 From: hroest_nospam2333 at quantentunnel.de (=?UTF-8?B?SGFubmVzIFLDtnN0?=) Date: Fri, 19 Jul 2013 10:38:39 +0200 Subject: [Cython] Cython compiler directives: c_string_encoding In-Reply-To: <51E6C53D.2010806@behnel.de> References: <51E6C53D.2010806@behnel.de> Message-ID: Hi Stefan Thank for your fast and detailed answer, this is helping us a lot. On 17 July 2013 18:24, Stefan Behnel wrote: > Hannes R?st, 17.07.2013 14:45: >> I am trying to compile a program with Cython using the compiler directives and >> I am running into some trouble. Specifically, I am trying to port a Cython 0.18 >> program to Cython 0.19 and due to changes how char * and Python str are >> handled, I need to set the c_string_encoding directive. > > This may or may not be so. The fact that it gives an error in 0.19 might > also hint at problems in your code. There's some documentation available: > > http://docs.cython.org/src/tutorial/strings.html > > If you need further help, then the cython-users mailing list is the right > place to ask. > > Thanks, I will have a look at it. Currently we have strong guarantees that our strings only contain ascii since they are part of a controlled vocabulary which does not contain non-ASCII. Also there is no other option to convert since our code is talking to external C code. However, we might want to test for the assumptions that we make here. >> Unfortunately, this >> fails for my project and also in the following testcase when I try to do it >> locally: >> >> cimport cython >> cdef class TestClass: >> def foo(self): >> with cython.c_string_encoding("ascii"): >> return > > This can't work. The string encoding is a module global option. > Thank you, I wasn?t aware of that. Maybe the documentation could be more clear on this point and specifically state the that the "Locally" method at the bottom of the page will only work with certain flags (and maybe list which ones work). > >> The error message that >> I get is attached at the end. Did I do something wrong or can somebody point me >> in the right direction? >> >> Error message: >> >> inimalcase.pyx:6:13: Compiler crash in InterpretCompilerDirectives >> [...] >> File "/home/hr/lib/Cython-0.19.1/Cython/Compiler/ParseTreeTransforms.py", >> line 913, in try_to_parse_directive >> assert False >> AssertionError: > > That's a bug. It shouldn't crash and instead give a compile error. Thanks > for the report. > > Stefan > > _______________________________________________ > cython-devel mailing list > cython-devel at python.org > http://mail.python.org/mailman/listinfo/cython-devel From tds333 at gmail.com Thu Jul 18 15:24:21 2013 From: tds333 at gmail.com (Wolfgang) Date: Thu, 18 Jul 2013 15:24:21 +0200 Subject: [Cython] Windows Debug build improvement Message-ID: <51E7EC85.9020209@gmail.com> Hi, I tried to submit a improvement for the Windows build but the tracker is not accessible without a login. On Windows if someone does a Debug build of an extension the flag _DEBUG is set and so the Python Interpreter sets Py_DEBUG and for all extension modules "_d" is appended to load the debug version of a module. This is not really practical because then all modules and the Python Interpreter must be build in Debug mode. For some modules this is even not possible for Windows. 
:-( To do a debug build for a Cython generated extension with a normal Python Interpreter (none Debug) I have to patch the pyconfig.h file and undef _DEBUG or I must patch the generated c file from Cython to undef _DEBUG before pyconfig.h or Python.h is included. (and enable it afterwards) Is it possible to add a flag to Cython to generate code that does this ? Something like described in Boost.Python: http://hepunx.rl.ac.uk/BFROOT/dist/releases/26.0.0/boost/libs/python/doc/building.html It is enough to have a new Preprocessor Flag, if set, then surround the Python.h inclusion with a disabled _DEBUG. My workarround is to disable it before pyconfig.h (Python.h) include: #ifdef _DEBUG #undef _DEBUG #define _RESTORE #endif and enable it afterwards #ifdef _RESTORE #define _DEBUG #undef _RESTORE #endif Regards, Wolfgang From yury at shurup.com Mon Jul 22 16:45:30 2013 From: yury at shurup.com (Yury V. Zaytsev) Date: Mon, 22 Jul 2013 16:45:30 +0200 Subject: [Cython] [Fwd: [cython-users] Conditional compilation of Cython code / preprocessor / external conditions] Message-ID: <1374504330.2887.46.camel@newpride> Hi, After a bit more research, I found two options that I initially overlooked: 1) (Apparently undocumented) provisions to provide external attributes in cython_compile_time_env 2) Generation of the PXI file from the external build system I don't like (2) very much, so I've got an idea to extend (1) with the ability to pass -D options to cython compiler, like: cython -DHAVE_LIBFOO=1 Will any such pull request be accepted? -- Sincerely yours, Yury V. Zaytsev -------- Forwarded Message -------- From: Yury V. Zaytsev Reply-to: cython-users at googlegroups.com To: cython-users at googlegroups.com Subject: [cython-users] Conditional compilation of Cython code / preprocessor / external conditions Date: Mon, 22 Jul 2013 16:17:58 +0200 Hi folks, Is it possible to conditionally compile Cython code depending on some externally defined value? >From reading the documentation, I've got an impression that presently Cython doesn't have any kind of preprocessor, and I hope that I'm just missing something here: http://docs.cython.org/src/userguide/language_basics.html My use case is as follows: The application that I'm wrapping links against a number of libraries, and some might be unavailable at compile time, so the corresponding parts of the Cython code shouldn't be compiled either. During the build, a preprocessor constant is defined, such as HAVE_LIBFOO that I can get by including config.h or pass via the autotools build system to the external tools. However, I can't figure out how to use that from Cython... So far, I was able to come up with an idea of replicating the API of the missing libraries with dummies on the C++ level, such that the Cython code compiles either way. Or else, I can make a poor man's preprocessor based on sed and just strip parts marked for conditional compilation from the PYX files. Both solutions look really disgusting and unmaintainable. I would greatly appreciate better ideas! To me it sounds like a rather common problem, so somebody must have come up with a solution already... Thanks, -- Sincerely yours, Yury V. Zaytsev -- --- You received this message because you are subscribed to the Google Groups "cython-users" group. To unsubscribe from this group and stop receiving emails from it, send an email to cython-users+unsubscribe at googlegroups.com. For more options, visit https://groups.google.com/groups/opt_out. 
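To make the conditional-compilation idea above concrete, here is a sketch of what the .pyx side could look like with Cython's compile-time IF/ELSE statements; HAVE_LIBFOO, foo.h and the function names are placeholders for this example, and the flag is assumed to be supplied externally (e.g. through the cython_compile_time_env mechanism mentioned above, or a -D option if such a pull request is accepted):

    # wrapper.pyx (sketch)
    IF HAVE_LIBFOO:
        cdef extern from "foo.h":
            int foo_version()

        def libfoo_version():
            # real binding, only compiled when libfoo was found at build time
            return foo_version()
    ELSE:
        def libfoo_version():
            # stub compiled in when libfoo is unavailable
            raise RuntimeError("built without libfoo support")

This keeps the missing-library case out of the generated C code entirely, without dummy C++ shims or sed-based preprocessing of the PYX files.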
From felix at salfelder.org Tue Jul 23 11:25:15 2013 From: felix at salfelder.org (Felix Salfelder) Date: Tue, 23 Jul 2013 11:25:15 +0200 Subject: [Cython] patch for #655 In-Reply-To: <51CD1391.3070606@behnel.de> References: <20130625083406.GM11552@bin.d-labs.de> <51CBE294.50803@behnel.de> <20130627082610.GL3756@bin.d-labs.de> <20130627172556.GD3356@bin.d-labs.de> <20130627191832.GE3356@bin.d-labs.de> <20130627210602.GH3356@bin.d-labs.de> <51CD1391.3070606@behnel.de> Message-ID: <20130723092515.GG3696@bin.d-labs.de> On Fri, Jun 28, 2013 at 06:39:45AM +0200, Stefan Behnel wrote: > > the only thing that will affect the user (and which i should mention) > > is: make uses file extensions to determine file types. particularly, i > > have not found a portable hack that allows the use of .pyx for both .c > > and .cpp within the scope of one makefile yet... > > That's unfortunate, but not too serious either. its bearable, and it's just automake. manually created makefiles are not affected. why does cython use .pyx for two different things? what would be the canonical extension for C++ cython files? i've chosen to use .pyxx, any better options? > Hmm, does that mean you either have to create them manually before the > first run, and/or you have to manually collect all dependency files for the > "include" list? And then keep track of them yourself when you add new files? my goal is to use autotools for that sort of stuff. but yes, you can always do everything manually. > I suppose you could also use a wildcard file search to build the list of > include files? wildcards are seen as bad [1]. might be a matter of taste, of course. with manually created makefiles, you can do what you like best, of course. > > (automake will call the linker seperately, to increase portability or > > something) > If you want this feature to go in, I think you should write up > documentation for it, so that other people can actually use it as well. i have. type cython --help. this should be copied into the manpage, but we need to agree on the choice of option names first (and some other details, see the push request on github). > Even writing a correct makefile requires a huge amount of digging into > distutils. makefiles are documented in the make manual of your favourite make implementation. I don't know why you would want to dig into distutils. regards felix [1] http://www.gnu.org/software/automake/manual/html_node/Wildcards.html From nikita at nemkin.ru Wed Jul 24 17:15:49 2013 From: nikita at nemkin.ru (Nikita Nemkin) Date: Wed, 24 Jul 2013 21:15:49 +0600 Subject: [Cython] Windows Debug build improvement In-Reply-To: <51E7EC85.9020209@gmail.com> References: <51E7EC85.9020209@gmail.com> Message-ID: On Thu, 18 Jul 2013 19:24:21 +0600, Wolfgang wrote: > Hi, > > I tried to submit a improvement for the Windows build but the tracker is > not > accessible without a login. > > On Windows if someone does a Debug build of an extension the flag _DEBUG > is > set and so the Python Interpreter sets Py_DEBUG and for all extension > modules > "_d" is appended to load the debug version of a module. > This is not really practical because then all modules and the Python > Interpreter must be build in Debug mode. For some modules this is even > not > possible for Windows. :-( To debug my extensions on Windows (in Visual Studio), I just add the appropriate compiler flags: extension = Extension( ... 
extra_compile_args=['/Zi', '/Od'], # generate PDB, disable optimization extra_link_args=['/DEBUG']) # preserve debug info Add to that symbol files for the Python release you are using ("program database" links on this page http://www.python.org/getit/releases/2.7.5/) and you will have a comfortable debugging environment. Best regards, Nikita Nemkin From tds333 at gmail.com Thu Jul 25 17:46:15 2013 From: tds333 at gmail.com (Wolfgang) Date: Thu, 25 Jul 2013 17:46:15 +0200 Subject: [Cython] Windows Debug build improvement In-Reply-To: References: Message-ID: <51F14847.8080201@gmail.com> Hi, thank you. But I don't use setup.py/distutils for the build. I use cython as a code generator and include the generated files into a Visual Studio project. But that's no problem, VS has the ability of custom build steps. A simple build with distutils is not possible. Because I use not the VC version used to build the python executable. The external library has other dependencies. Also the project uses over 15 other Python libraries and none of them is available as a Windows debug version with "_d" extension. Basically this is a flaw in Python itself. For a Linux build of Python there is the "Py_DEBUG" flag used as indicator, set during config. For Windows this is not explicit, it is set if "_DEBUG" is set. And I think this implicit setting is not good. (because this flag is widely used elsewhere) But I have to workaround it, even if Python 3.4 or later changes it I still use Python 2.7. Boost Python does this in its configuration, see documentation: --- copied from boost documentation On unix-variant platforms, the debugging versions of Python's data structures will only be used if the symbol Py_DEBUG is defined. On many windows compilers, when extension modules are built with the preprocessor symbol _DEBUG, Python defaults to force linking with a special debugging version of the Python DLL. Since that symbol is very commonly used even when Python is not present, Boost.Python temporarily undefines _DEBUG when Python.h is #included from boost/python/detail/wrap_python.hpp - unless BOOST_DEBUG_PYTHON is defined. The upshot is that if you want ?python debugging?and you aren't using Boost.Build, you should make sure BOOST_DEBUG_PYTHON is defined, or python debugging will be suppressed. --- Regards, Wolfgang > Message: 1 > Date: Wed, 24 Jul 2013 21:15:49 +0600 > From: "Nikita Nemkin" > To: "Core developer mailing list of the Cython compiler" > > Subject: Re: [Cython] Windows Debug build improvement > Message-ID: > Content-Type: text/plain; charset=windows-1251; format=flowed; > delsp=yes > > On Thu, 18 Jul 2013 19:24:21 +0600, Wolfgang wrote: > >> Hi, >> >> I tried to submit a improvement for the Windows build but the tracker is >> not >> accessible without a login. >> >> On Windows if someone does a Debug build of an extension the flag _DEBUG >> is >> set and so the Python Interpreter sets Py_DEBUG and for all extension >> modules >> "_d" is appended to load the debug version of a module. >> This is not really practical because then all modules and the Python >> Interpreter must be build in Debug mode. For some modules this is even >> not >> possible for Windows. :-( > > To debug my extensions on Windows (in Visual Studio), I just add > the appropriate compiler flags: > > extension = Extension( > ... 
> extra_compile_args=['/Zi', '/Od'], # generate PDB, disable > optimization > extra_link_args=['/DEBUG']) # preserve debug info > > Add to that symbol files for the Python release you are using > ("program database" links on this page > http://www.python.org/getit/releases/2.7.5/) > and you will have a comfortable debugging environment. > > > Best regards, > Nikita Nemkin > > > ------------------------------ > > Subject: Digest Footer > > _______________________________________________ > cython-devel mailing list > cython-devel at python.org > http://mail.python.org/mailman/listinfo/cython-devel > > > ------------------------------ > > End of cython-devel Digest, Vol 30, Issue 17 > ******************************************** > From robertwb at gmail.com Thu Jul 25 18:01:21 2013 From: robertwb at gmail.com (Robert Bradshaw) Date: Thu, 25 Jul 2013 09:01:21 -0700 Subject: [Cython] Windows Debug build improvement In-Reply-To: <51E7EC85.9020209@gmail.com> References: <51E7EC85.9020209@gmail.com> Message-ID: On Thu, Jul 18, 2013 at 6:24 AM, Wolfgang wrote: > Hi, > > I tried to submit a improvement for the Windows build but the tracker is not > accessible without a login. This is to prevent spam. > On Windows if someone does a Debug build of an extension the flag _DEBUG is > set and so the Python Interpreter sets Py_DEBUG and for all extension modules > "_d" is appended to load the debug version of a module. > This is not really practical because then all modules and the Python > Interpreter must be build in Debug mode. For some modules this is even not > possible for Windows. :-( > > To do a debug build for a Cython generated extension with a normal Python > Interpreter (none Debug) I have to patch the pyconfig.h file and undef _DEBUG > or I must patch the generated c file from Cython to undef _DEBUG before > pyconfig.h or Python.h is included. (and enable it afterwards) > > Is it possible to add a flag to Cython to generate code that does this ? > > Something like described in Boost.Python: > http://hepunx.rl.ac.uk/BFROOT/dist/releases/26.0.0/boost/libs/python/doc/building.html > > It is enough to have a new Preprocessor Flag, if set, then surround the > Python.h inclusion with a disabled _DEBUG. > > My workarround is to disable it before pyconfig.h (Python.h) include: > > #ifdef _DEBUG > #undef _DEBUG > #define _RESTORE > #endif > > and enable it afterwards > > #ifdef _RESTORE > #define _DEBUG > #undef _RESTORE > #endif Seems like a fairly global change. At the very least it should be guarded with an #if [windows/msvc/?], and _RESTORE probably named __PYX_RESTORE_DEBUG or something less likely to clash. But it'd be really helpful for someone who uses and knows windows well to comment on the possible implications of this, as I don't even have a way to try it out. From tds333 at gmail.com Thu Jul 25 21:11:00 2013 From: tds333 at gmail.com (WL) Date: Thu, 25 Jul 2013 21:11:00 +0200 Subject: [Cython] Windows Debug build improvement In-Reply-To: References: <51E7EC85.9020209@gmail.com> Message-ID: <51F17844.7010108@gmail.com> On 25.07.2013 18:01, Robert Bradshaw wrote: > On Thu, Jul 18, 2013 at 6:24 AM, Wolfgang wrote: >> Hi, >> >> I tried to submit a improvement for the Windows build but the tracker is not >> accessible without a login. > This is to prevent spam. > >> On Windows if someone does a Debug build of an extension the flag _DEBUG is >> set and so the Python Interpreter sets Py_DEBUG and for all extension modules >> "_d" is appended to load the debug version of a module. 
>> This is not really practical because then all modules and the Python >> Interpreter must be build in Debug mode. For some modules this is even not >> possible for Windows. :-( >> >> To do a debug build for a Cython generated extension with a normal Python >> Interpreter (none Debug) I have to patch the pyconfig.h file and undef _DEBUG >> or I must patch the generated c file from Cython to undef _DEBUG before >> pyconfig.h or Python.h is included. (and enable it afterwards) >> >> Is it possible to add a flag to Cython to generate code that does this ? >> >> Something like described in Boost.Python: >> http://hepunx.rl.ac.uk/BFROOT/dist/releases/26.0.0/boost/libs/python/doc/building.html >> >> It is enough to have a new Preprocessor Flag, if set, then surround the >> Python.h inclusion with a disabled _DEBUG. >> >> My workarround is to disable it before pyconfig.h (Python.h) include: >> >> #ifdef _DEBUG >> #undef _DEBUG >> #define _RESTORE >> #endif >> >> and enable it afterwards >> >> #ifdef _RESTORE >> #define _DEBUG >> #undef _RESTORE >> #endif > Seems like a fairly global change. At the very least it should be > guarded with an #if [windows/msvc/?], and _RESTORE probably named > __PYX_RESTORE_DEBUG or something less likely to clash. But it'd be > really helpful for someone who uses and knows windows well to comment > on the possible implications of this, as I don't even have a way to > try it out. Yes it should be guarded with special naming and only enabled if a new special option is set. If this new flag is not set everything is as now. A check for Windows is not needed, but don't bother. Something like: #ifdef __PYX_WIN_DEBUG # ifdef _DEBUG # undef _DEBUG # define __PYX_DEBUG_RESTORE # endif #endif ... Or test at code generation time for the flag __PYX_WIN_DEBUG and only then generate this surrounding code. Regards, Wolfgang From robertwb at gmail.com Fri Jul 26 20:34:37 2013 From: robertwb at gmail.com (Robert Bradshaw) Date: Fri, 26 Jul 2013 11:34:37 -0700 Subject: [Cython] Windows Debug build improvement In-Reply-To: <51F17844.7010108@gmail.com> References: <51E7EC85.9020209@gmail.com> <51F17844.7010108@gmail.com> Message-ID: Who sets _DEBUG? Wouldn't manually undeffing _DEBUG be just as easy as manually setting __PYX_WIN_DEBUG? Disclaimer: I've never developed Python on Windows, but it seems that Nikita's solution is simpler. On Thu, Jul 25, 2013 at 12:11 PM, WL wrote: > On 25.07.2013 18:01, Robert Bradshaw wrote: >> >> On Thu, Jul 18, 2013 at 6:24 AM, Wolfgang wrote: >>> >>> Hi, >>> >>> I tried to submit a improvement for the Windows build but the tracker is >>> not >>> accessible without a login. >> >> This is to prevent spam. >> >>> On Windows if someone does a Debug build of an extension the flag _DEBUG >>> is >>> set and so the Python Interpreter sets Py_DEBUG and for all extension >>> modules >>> "_d" is appended to load the debug version of a module. >>> This is not really practical because then all modules and the Python >>> Interpreter must be build in Debug mode. For some modules this is even >>> not >>> possible for Windows. :-( >>> >>> To do a debug build for a Cython generated extension with a normal Python >>> Interpreter (none Debug) I have to patch the pyconfig.h file and undef >>> _DEBUG >>> or I must patch the generated c file from Cython to undef _DEBUG before >>> pyconfig.h or Python.h is included. (and enable it afterwards) >>> >>> Is it possible to add a flag to Cython to generate code that does this ? 
>>> >>> Something like described in Boost.Python: >>> >>> http://hepunx.rl.ac.uk/BFROOT/dist/releases/26.0.0/boost/libs/python/doc/building.html >>> >>> It is enough to have a new Preprocessor Flag, if set, then surround the >>> Python.h inclusion with a disabled _DEBUG. >>> >>> My workarround is to disable it before pyconfig.h (Python.h) include: >>> >>> #ifdef _DEBUG >>> #undef _DEBUG >>> #define _RESTORE >>> #endif >>> >>> and enable it afterwards >>> >>> #ifdef _RESTORE >>> #define _DEBUG >>> #undef _RESTORE >>> #endif >> >> Seems like a fairly global change. At the very least it should be >> guarded with an #if [windows/msvc/?], and _RESTORE probably named >> __PYX_RESTORE_DEBUG or something less likely to clash. But it'd be >> really helpful for someone who uses and knows windows well to comment >> on the possible implications of this, as I don't even have a way to >> try it out. > > > Yes it should be guarded with special naming and only enabled if a new > special option is set. > If this new flag is not set everything is as now. A check for Windows is not > needed, but don't bother. > > Something like: > > #ifdef __PYX_WIN_DEBUG > # ifdef _DEBUG > # undef _DEBUG > # define __PYX_DEBUG_RESTORE > # endif > #endif > > ... > > Or test at code generation time for the flag __PYX_WIN_DEBUG and only then > generate this surrounding code. > > > > Regards, > > Wolfgang > _______________________________________________ > cython-devel mailing list > cython-devel at python.org > http://mail.python.org/mailman/listinfo/cython-devel From stefan_ml at behnel.de Sun Jul 28 13:33:51 2013 From: stefan_ml at behnel.de (Stefan Behnel) Date: Sun, 28 Jul 2013 13:33:51 +0200 Subject: [Cython] patch for #655 In-Reply-To: <20130723092515.GG3696@bin.d-labs.de> References: <20130625083406.GM11552@bin.d-labs.de> <51CBE294.50803@behnel.de> <20130627082610.GL3756@bin.d-labs.de> <20130627172556.GD3356@bin.d-labs.de> <20130627191832.GE3356@bin.d-labs.de> <20130627210602.GH3356@bin.d-labs.de> <51CD1391.3070606@behnel.de> <20130723092515.GG3696@bin.d-labs.de> Message-ID: <51F5019F.7000001@behnel.de> Felix Salfelder, 23.07.2013 11:25: > On Fri, Jun 28, 2013 at 06:39:45AM +0200, Stefan Behnel wrote: >> Even writing a correct makefile requires a huge amount of digging into >> distutils. > > makefiles are documented in the make manual of your favourite make > implementation. I don't know why you would want to dig into distutils. In order to figure out how to build C extensions in a portable way, e.g. https://github.com/cython/cython/blob/master/Demos/embed/Makefile The sysconfig module seems to be amongst the even more underdocumented parts of distutils. Stefan
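As a small illustration of that last point, much of what a hand-written Makefile needs can be queried from sysconfig directly; this is only a sketch, and several of these config variables can be empty or None on some platforms (notably Windows):

    import sysconfig

    # include directory that holds Python.h
    print("PY_INCLUDE := -I" + sysconfig.get_paths()["include"])
    # compiler and linker settings recorded when Python itself was built
    print("PY_CFLAGS  := " + (sysconfig.get_config_var("CFLAGS") or ""))
    print("PY_LDFLAGS := " + (sysconfig.get_config_var("LDFLAGS") or "")
          + " " + (sysconfig.get_config_var("LIBS") or ""))

The python-config script exposes much of the same information on the command line.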