From robertwb at gmail.com Tue Apr 5 02:18:58 2016 From: robertwb at gmail.com (Robert Bradshaw) Date: Mon, 4 Apr 2016 23:18:58 -0700 Subject: [Cython] Cython 0.24 released Message-ID: I'm happy to announce the release of Cython 0.24. Downloads can be found at http://cython.org/release/Cython-0.24.tar.gz http://cython.org/release/Cython-0.24.zip and of course on pypi. This release contains numerous features and bugfixes, most notably keeping up with several enhancements coming in Python 3.6 and deprecation of the old-style property syntax now that @property is fully supported. Some other significant changes are enumerated at https://github.com/cython/cython/blob/8b592122f23e0ce458521e5a828431c95a0d84e8/CHANGES.rst Thanks to all those who contributed code, ideas, and testing! - Robert -------------- next part -------------- An HTML attachment was scrubbed... URL: From erik.m.bray at gmail.com Fri Apr 8 07:01:33 2016 From: erik.m.bray at gmail.com (Erik Bray) Date: Fri, 8 Apr 2016 13:01:33 +0200 Subject: [Cython] Fwd: Question about how best require compiler options for C sources In-Reply-To: References: Message-ID: Hi all, I'd like to call attention to an issue I've been looking into for the past couple days: https://github.com/cython/cython/pull/360 To summarize the discussion, when building DLLs for Windows, functions that should be exported by that DLL must be marked __declspec(dllexport) in their declaration. However, when using the same header file in a project that links to that DLL, the same function must be declared __declspec(dllimport). It's common practice to have a macro for this, whose value is controlled by whether or not a macro (often called something like "DLL_EXPORT") is defined. When compiling the DLL we would define -DDLL_EXPORT so that the macro outputs __declspec(dllexport); otherwise it outputs __declspec(dllimport). Cython currently handles this with such a macro, called DL_IMPORT, which comes from Python. 
However, this macro was deprecated some time ago, and is removed in current Python 3 versions. So Cython must replace it with its own. I'm working on a patch for this--to reduce confusion the macro is named specifically for the Cython module it's associated with. For example, for a Cython module named "foo.bar" there are two macros DLL_EXPORT__foo__bar and EXPORT__foo__bar. If the latter is defined then the former outputs dllexport, otherwise it outputs dllimport. I've attached the patch in progress. I'm open to comment on this, but where I'm stuck now is that in order for the "foo.bar" module to be compiled correctly it needs EXPORT__foo__bar to be defined at compile time. And it's not clear to me what the best way is for the Cython compiler to pass down options that are passed to the C/C++ compiler. In general the best way would be to attach this to the define_macros attribute of the associated distutils/setuptools Extension object and let the distutils compiler class generate the right compiler options. But it's not clear to me from Cython's internals where the best place to do that would be. Thanks, Erik -------------- next part -------------- A non-text attachment was scrubbed... Name: dll_export.patch Type: application/octet-stream Size: 4161 bytes Desc: not available URL: From njs at vorpus.org Fri Apr 8 11:49:26 2016 From: njs at vorpus.org (Nathaniel Smith) Date: Fri, 8 Apr 2016 08:49:26 -0700 Subject: [Cython] Fwd: Question about how best require compiler options for C sources In-Reply-To: References: Message-ID: Can you give a tiny concrete example? My questions are basic enough that I feel like I'm missing something fundamental :-) My first question is why you even need this, since AFAIK there are no cases where it is correct to have a cython module dllexporting symbols that appear in header files. This is what cimport is for, right? 
My second question is why you would want to do this via the command line, when compiling the dll means that you are compiling some cython-generated .c, which means that you can put the #define directly in the source code, no? -n On Apr 8, 2016 5:35 AM, "Erik Bray" wrote: > Hi all, > > I'd like to call attention to an issue I've been looking into for the > past couple days: > > https://github.com/cython/cython/pull/360 > > To summarize the discussion, when building DLLs for Windows, functions > that should be exported by that DLL must be marked > __declspec(dllexport) in their declaration. However, when using the > same header file in a project that links to that DLL the same function > must be declared __declspec(dllimport). > > It's common practice to have a macro for this, whose value is > controlled by whether or not a macro (often called something like > "DLL_EXPORT"). When compiling the DLL we would define -DDLL_EXPORT to > output __declspec(dllexport). Otherwise it outputs > __declspec(dllimport). > > Cython currently handles this with such a macro called DL_IMPORT which > comes from Python. However, this macro was deprecated some time ago, > and is removed in current Python 3 versions. So Cython must replace > it with its own. > > I'm working on a patch for this--to reduce confusion the macro is > named specifically for the Cython module it's associated with. For > example, for a Cython module named "foo.bar" there are two macros > DLL_EXPORT__foo__bar and EXPORT__foo__bar. If the latter is defined > then the former outputs dllexport, otherwise it outputs dllimport. > I've attached the patch in progress. > > I'm open to comment on this, but where I'm stuck now is that in order > for the "foo.bar" module to be compiled correctly it needs > EXPORT__foo__bar to be defined at compile time. And it's not clear to > me what the best way is for the Cython compiler to pass down options > that are passed to the C/C++ compiler. 
In general the best way would > be to attach this to the define_macros attribute of the associated > distutils/setuptools Extension object and let the distutils compiler > class generate the right compiler options. But it's not clear to me > from Cython's internals where the best place to do that would be. > > Thanks, > Erik > > _______________________________________________ > cython-devel mailing list > cython-devel at python.org > https://mail.python.org/mailman/listinfo/cython-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From erik.m.bray at gmail.com Mon Apr 11 07:18:04 2016 From: erik.m.bray at gmail.com (Erik Bray) Date: Mon, 11 Apr 2016 13:18:04 +0200 Subject: [Cython] Fwd: Question about how best require compiler options for C sources In-Reply-To: References: Message-ID: On Fri, Apr 8, 2016 at 5:49 PM, Nathaniel Smith wrote: > Can you give a tiny concrete example? My questions are basic enough that I > feel like I'm missing something fundamental :-) Yes, I think you might be missing something, but I'm not sure exactly where. In the issue I provided a tiny example which you can see here: https://gist.github.com/embray/12a67edb82b213217e31f408007898e6 The C code generated by this example currently does not compile on Windows, because of how Cython uses DL_IMPORT incorrectly. Regardless of what it *does* do, Cython should not be using the DL_IMPORT macro at all actually since that no longer even exists in Python. > My first question is why you even need this, since AFAIK there are no cases > where it is correct to have a cython module dllexporting symbols that appear > in header files. This is what cimport is for, right? I don't think this has anything to do with cimport. Could you explain what you mean? 
> My second question is why you would want to do this via the command line, > when compiling the dll means that you are compiling some cython-generated > .c, which means that you can put the #define directly in the source code, > no? Not necessarily. As you can see in my example the file fooutil.c is hand-written C code that was not generated by Cython, but which uses a function in the Cython-generated code. It includes "foo.h". In principle you're right--the hand-written C code could set the proper #defines but it would have to do so *before* including "foo.h". It's not very obvious that this would be needed. Most other code I've seen that addresses this issue--including cpython itself--do so by passing an appropriate define to the compiler via the command-line, and that seems the clearest to me. Best, Erik > On Apr 8, 2016 5:35 AM, "Erik Bray" wrote: >> >> Hi all, >> >> I'd like to call attention to an issue I've been looking into for the >> past couple days: >> >> https://github.com/cython/cython/pull/360 >> >> To summarize the discussion, when building DLLs for Windows, functions >> that should be exported by that DLL must be marked >> __declspec(dllexport) in their declaration. However, when using the >> same header file in a project that links to that DLL the same function >> must be declared __declspec(dllimport). >> >> It's common practice to have a macro for this, whose value is >> controlled by whether or not a macro (often called something like >> "DLL_EXPORT"). When compiling the DLL we would define -DDLL_EXPORT to >> output __declspec(dllexport). Otherwise it outputs >> __declspec(dllimport). >> >> Cython currently handles this with such a macro called DL_IMPORT which >> comes from Python. However, this macro was deprecated some time ago, >> and is removed in current Python 3 versions. So Cython must replace >> it with its own. 
>> >> I'm working on a patch for this--to reduce confusion the macro is >> named specifically for the Cython module it's associated with. For >> example, for a Cython module named "foo.bar" there are two macros >> DLL_EXPORT__foo__bar and EXPORT__foo__bar. If the latter is defined >> then the former outputs dllexport, otherwise it outputs dllimport. >> I've attached the patch in progress. >> >> I'm open to comment on this, but where I'm stuck now is that in order >> for the "foo.bar" module to be compiled correctly it needs >> EXPORT__foo__bar to be defined at compile time. And it's not clear to >> me what the best way is for the Cython compiler to pass down options >> that are passed to the C/C++ compiler. In general the best way would >> be to attach this to the define_macros attribute of the associated >> distutils/setuptools Extension object and let the distutils compiler >> class generate the right compiler options. But it's not clear to me >> from Cython's internals where the best place to do that would be. >> >> Thanks, >> Erik >> >> _______________________________________________ >> cython-devel mailing list >> cython-devel at python.org >> https://mail.python.org/mailman/listinfo/cython-devel >> > > _______________________________________________ > cython-devel mailing list > cython-devel at python.org > https://mail.python.org/mailman/listinfo/cython-devel > From njs at vorpus.org Mon Apr 11 08:51:34 2016 From: njs at vorpus.org (Nathaniel Smith) Date: Mon, 11 Apr 2016 05:51:34 -0700 Subject: [Cython] Fwd: Question about how best require compiler options for C sources In-Reply-To: References: Message-ID: On Apr 11, 2016 04:18, "Erik Bray" wrote: > > On Fri, Apr 8, 2016 at 5:49 PM, Nathaniel Smith wrote: > > Can you give a tiny concrete example? My questions are basic enough that I > > feel like I'm missing something fundamental :-) > > Yes, I think you might be missing something, but I'm not sure exactly > where. 
In the issue I provided a tiny example which you can see here: > > https://gist.github.com/embray/12a67edb82b213217e31f408007898e6 > > The C code generated by this example currently does not compile on > Windows, because of how Cython uses DL_IMPORT incorrectly. Regardless > of what it *does* do, Cython should not be using the DL_IMPORT macro > at all actually since that no longer even exists in Python. Sure. For that example, the correct thing to do is to *not* export the function. Backing up, to make sure we're on the same page: There are three levels of symbol visibility in C: file-internal, shared library internal (different .c files that make up the library can see it, but users of the shared library can't see it), and shared library exported (everyone can see it; can also carry other consequences, e.g. on Linux then internal calls will become noticeably slower, and it becomes possible for weird symbol interposition issues to occur). So the rule of thumb is to make everything as private as you can get away with. Making this more interesting: - vanilla C only includes 2 ways to mark symbol visibility, which is not enough to make a 3-way distinction. Hence the need for an extra attribute thingummy. - everyone agrees that symbols marked 'static' should be file-internal, but different platforms disagree about what should happen if the extra attribute thingummy is missing. So on Windows, the convention is: 'static' -> file internal no marking -> shared library internal 'dllexport' -> public And on Linux it's: 'static' -> file internal 'visibility (hidden)' -> shared library internal no marking -> public It's generally agreed that Linux got this wrong and that you should always use '-fvisibility=hidden' to switch it to the windows style, but cython doesn't control compiler options and thus should probably generate code that works correctly regardless of such compiler settings. 
Fortunately, Linux does provide some markings to explicitly make things public: you can mark a symbol 'visibility (default)' (which means public), or you can use the dllexport syntax, just like Windows, because gcc is helpful like that. OTOH, windows is annoying because of this dllimport thing that started this whole thread: on other systems just marking symbols as extern is enough to handle both shared-object-internal and shared-library-exported symbols, and the linker will sort it out. On Windows, you have to explicitly distinguish between these. (And annoyingly, if you accidentally leave out the dllimport marking on functions then it will use some fallback hack that works but silently degrades performance; on other symbols it just doesn't work, and ditto if you use it when it isn't needed.) So final conclusion: for non-static symbols, cython should first decide whether they are supposed to be shared-library-internal or actually exported from the shared library. For shared-library-internal symbols: their definition should be marked 'visibility(hidden)' on Linux, and unmarked on Windows. This is easy using some preprocessor gunk. (Or maybe simplest is: mark that everywhere except when using msvc, because I think everyone else will understand 'visibility (hidden)' even if it's a no-op.) Their declaration in the header file should just be 'extern' everywhere. For shared-library-exported symbols: I am dubious about whether cython should even support these at all. But if it does, then the definitions should be marked 'dllexport' (no macro trickery needed, because everyone understands this syntax), and their declaration in the header file needs some extra hack that is the subject of this thread. Now, back to your example: Here the caller and callee are both compiled into the same shared library, so you don't want dllexport/dllimport at all, you just want a shared-library-internal symbol, which as we see is much easier. 
NumPy also ran into some problems with this in our experiments with using cython internally. Our temporary solution was to use the preprocessor to monkeypatch DL_IMPORT into expanding to the appropriate shared-library-internal thing :-). > > My first question is why you even need this, since AFAIK there are no cases > > where it is correct to have a cython module dllexporting symbols that appear > > in header files. This is what cimport is for, right? > > I don't think this has anything to do with cimport. Could you explain > what you mean? We only need to solve the dllimport issue if we want to support shared-library-exported symbols from cython extensions. This is only useful if you have different extensions that are directly linking to each other using the platform linker. But this simply doesn't work in almost any cases, because platform linkers have all kinds of quirks that don't play well with python packages -- to start with, your runtime linker probably does not follow Python's rules for which directories to search to find a shared library... To solve these problems, the Cython devs invented the cimport mechanism, which is basically a portable, python-centric way of doing shared-library-exports while avoiding all the problems caused by using the platform linker. So my question was, what's your use case that's better served by linking directly against an extension module and using the platform's shared-library-export functionality, instead of cython's? > > My second question is why you would want to do this via the command line, > > when compiling the dll means that you are compiling some cython-generated > > .c, which means that you can put the #define directly in the source code, > > no? > > Not necessarily. As you can see in my example the file fooutil.c is > hand-written C code that was not generated by Cython, but which uses a > function in the Cython-generated code. It includes "foo.h". 
In > principle you're right--the hand-written C code could set the proper > #defines but it would have to do so *before* including "foo.h". It's > not very obvious that this would be needed. Most other code I've seen > that addresses this issue--including cpython itself--do so by passing > an appropriate define to the compiler via the command-line, and that > seems the clearest to me. I see, right. I guess my suggestion would be that if a symbol really does need to be marked for shared-library-export *and simultaneously* used by different files within the same shared library -- which is the only case where this arises -- then possibly the simplest and most robust thing is to set up the header file so that external users just do #include "foo.h", and the internal users do #define FOO_INTERNAL #include "foo.h" (This is how numpy's include files work, actually, though for reasons unrelated to symbol visibility.) -- bottom line -- I think the obvious thing is that cython should provide a natural way to mark a function as being shared-library-internal; that covers 99% of real use cases *and* just works without any further macros needed. Probably the existing "public" annotation should be changed to mean this. (Obviously it wasn't quite fully thought through in the first place and has few if any users, since it got this very wrong without anyone noticing. So fixing it seems viable to me.) And then *maybe* there should also be a way to make a symbol shared-library-exported, if that really is useful, but as a non-default experts-only kind of thing, and as such it would be OK to require these rare expert users to be a bit more careful about how they #include the resulting header within their own project. -n -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From erik.m.bray at gmail.com Mon Apr 11 09:23:06 2016 From: erik.m.bray at gmail.com (Erik Bray) Date: Mon, 11 Apr 2016 15:23:06 +0200 Subject: [Cython] Fwd: Question about how best require compiler options for C sources In-Reply-To: References: Message-ID: On Mon, Apr 11, 2016 at 2:51 PM, Nathaniel Smith wrote: > Now, back to your example: Here the caller and callee are both compiled into > the same shared library, so you don't want dllexport/dllimport at all, you > just want a shared-library-internal symbol, which as we see is much easier. Sorry, I'll respond more to your (helpfully detailed and precise) message in a second. But something I wanted to point out is that my example is incomplete and I should have arranged a more complete example. In this case I really do want the symbol "hello" to be exported by the DLL, as well as be understood between translation units making up the same library. A more complete example would have shown a separate library which links with "foo" and uses the "hello" function in it. Yes, it's arguable that exporting anything more than the initmodule function from a Python extension module is not best practice, but the possibility should not be ruled out either. So I think later on you hit correctly on the deeper problem, which is that Cython currently doesn't have a great way to distinguish between intra-library visibility and *inter*-library visibility. And if both are needed then some DL_IMPORT-like macro is needed that sets the visibility for a symbol correctly depending on the context in which it's being used. 
(And yes, this is not a problem on *nix, but it is on Windows due to the way __declspec(dllimport) causes name mangling :( Thanks, Erik From erik.m.bray at gmail.com Mon Apr 11 09:44:53 2016 From: erik.m.bray at gmail.com (Erik Bray) Date: Mon, 11 Apr 2016 15:44:53 +0200 Subject: [Cython] Fwd: Question about how best require compiler options for C sources In-Reply-To: References: Message-ID: On Mon, Apr 11, 2016 at 2:51 PM, Nathaniel Smith wrote: > On Apr 11, 2016 04:18, "Erik Bray" wrote: >> >> On Fri, Apr 8, 2016 at 5:49 PM, Nathaniel Smith wrote: >> > Can you give a tiny concrete example? My questions are basic enough that >> > I >> > feel like I'm missing something fundamental :-) >> >> Yes, I think you might be missing something, but I'm not sure exactly >> where. In the issue I provided a tiny example which you can see here: >> >> https://gist.github.com/embray/12a67edb82b213217e31f408007898e6 >> >> The C code generated by this example currently does not compile on >> Windows, because of how Cython uses DL_IMPORT incorrectly. Regardless >> of what it *does* do, Cython should not be using the DL_IMPORT macro >> at all actually since that no longer even exists in Python. > > Sure. > > For that example, the correct thing to do is to *not* export the function. So as I wrote in my previous message, my example was incomplete. But indeed if the intent was *only* to share the declaration between TUs of the same library then I agree the *most* correct thing would be to not use dllexport. Unfortunately there isn't currently a way in Cython to make this distinction. > Backing up, to make sure we're on the same page: Yep, I agree with your analysis that follows, and it's helpful to have all the cases laid out in one place, so thanks for that! 
There are three levels of > symbol visibility in C: file-internal, shared library internal (different .c > files that make up the library can see it, but users of the shared library > can't see it), and shared library exported (everyone can see it; can also > carry other consequences, e.g. on Linux then internal calls will become > noticeably slower, and it becomes possible for weird symbol interposition > issues to occur). So the rule of thumb is to make everything as private as > you can get away with. > > Making this more interesting: > - vanilla C only includes 2 ways to mark symbol visibility, which is not > enough to make a 3-way distinction. Hence the need for an extra attribute > thingummy. > - everyone agrees that symbols marked 'static' should be file-internal, but > different platforms disagree about what should happen if the extra attribute > thingummy is missing. > > So on Windows, the convention is: > 'static' -> file internal > no marking -> shared library internal > 'dllexport' -> public > > And on Linux it's: > 'static' -> file internal > 'visibility (hidden)' -> shared library internal > no marking -> public > > It's generally agreed that Linux got this wrong and that you should always > use '-fvisibility=hidden' to switch it to the windows style, but cython > doesn't control compiler options and thus should probably generate code that > works correctly regardless of such compiler settings. Fortunately, Linux > does provide some markings to explicitly make things public: you can mark a > symbol 'visibility (default)' (which means public), or you can use the > dllexport syntax, just like Windows, because gcc is helpful like that. > > OTOH, windows is annoying because of this dllimport thing that started this > whole thread: on other systems just marking symbols as extern is enough to > handle both shared-object-internal and shared-library-exported symbols, and > the linker will sort it out. 
On Windows, you have to explicitly distinguish > between these. (And annoyingly, if you accidentally leave out the dllimport > making on functions then it will use some fallback hack that works but > silently degrades performance; on other symbols it just doesn't work, and > ditto for if you use it when it isn't needed.) Yep. Agreed with all of the above. > So final conclusion: for non-static symbols, cython should first decide > whether they are supposed to be shared-library-internal or actually exported > from the shared library. > > For shared-library-internal symbols: their definition should be marked > 'visibility(hidden)' on Linux, and unmarked on Windows. This is easy using > some preprocessor gunk. (Or maybe simplest is: marked that everywhere except > if using msvc, because I think everyone else will understand 'visibility > (hidden)' even if it's a no op.) Their declaration in the header file should > just be 'extern' everywhere. > > For shared-library-exported symbols: I am dubious about whether cython > should even support these at all. But if it does, then the definitions > should be marked 'dllexport' (no macro trickery needed, because everyone > understands this syntax), and their declaration in the header file needs > some extra hack that is the subject of this thread. I understand your doubts here, but for the purpose of discussion let's just assume that it should be supported? :) And yes, the issue that the header file needs some hacky context-dependent declarations. A possible alternative, which in fact is exactly the work-around I'm using right now, would be for Cython to generate two headers. In the case of my example they would be "foo.h" and "foo_internal.h". They would be almost exactly the same except that the former declares the function dllimport and the latter declares the function dllexport, and only the _internal.h should be used to share a declaration between TUs. I'm currently doing this by writing the "foo_internal.h" manually. 
> Now, back to your example: Here the caller and callee are both compiled into > the same shared library, so you don't want dllexport/dllimport at all, you > just want a shared-library-internal symbol, which as we see is much easier. Well, again, in this case I do want dllexport/dllimport but I should have been more clear about that. But supposing I didn't want it, then this is true. > NumPy also ran into some problems with this in our experiments with using > cython internally. Our temporary solution was to use the preprocessor to > monkeypatch DL_IMPORT into expanding to the appropriate > shared-library-internal thing :-). Ah, how did you manage that? Do you have to make sure to do it before Python.h is ever included? >> > My first question is why you even need this, since AFAIK there are no >> > cases >> > where it is correct to have a cython module dllexporting symbols that >> > appear >> > in header files. This is what cimport is for, right? >> >> I don't think this has anything to do with cimport. Could you explain >> what you mean? > > We only need to solve the dllimport issue if we want to support > shared-library-exported symbols from cython extensions. This is only useful > if you have different extensions that are directly linking to each other > using the platform linker. But this simply doesn't work in almost any cases, > because platform linkers have all kinds of quirks that don't play well with > python packages -- to start with, your runtime linker probably does not > follow Python's rules for which directories to search to find a shared > library... To solve these problems, the Cython devs invented the cimport > mechanism, which is basically a portable, python-centric way of doing > shared-library-exports while avoiding all the problems caused by using the > platform linker. 
So my question was, what's your use case that's better > served by linking directly against an extension module and using the > platform's shared-library-export functionality, instead of cython's? Well, yes, but you can't use cimport from C code :) As for the use case I'll have to get back to you on that. I'm trying to help the Sage project fix some issues related to that and my understanding is that it is needed for C code to be able to link directly against code that's compiled into an extension module. I could be wrong about that in which case we're free to ignore that case (unless someone does come up with a use case). So yes, I had better double-check on that :) Still, I'm not necessarily talking about linking Cython modules together, which is why I don't think cimport really comes into it. >> > My second question is why you would want to do this via the command >> > line, >> > when compiling the dll means that you are compiling some >> > cython-generated >> > .c, which means that you can put the #define directly in the source >> > code, >> > no? >> >> Not necessarily. As you can see in my example the file fooutil.c is >> hand-written C code that was not generated by Cython, but which uses a >> function in the Cython-generated code. It includes "foo.h". In >> principle you're right--the hand-written C code could set the proper >> #defines but it would have to do so *before* including "foo.h". It's >> not very obvious that this would be needed. Most other code I've seen >> that addresses this issue--including cpython itself--do so by passing >> an appropriate define to the compiler via the command-line, and that >> seems the clearest to me. > > I see, right. 
I guess my suggestion would be that if a symbol really does > need to be marked for shared-library-export *and simultaneously* used by > different files within the same shared library -- which is the only case > where this arises -- then possibly the simplest and most robust thing is to > set up the header file so that external users just do #include "foo.h", and > the internal users do > > #define FOO_INTERNAL > #include "foo.h" Okay, this is similar to my above suggestion of using separate headers. I for one prefer the separate headers better, as I think it's easy to forget to set a #define like that--and to anyone not working on Windows it's not at all clear why that would be needed. In fact on Linux it will "just work" without the "#define FOO_INTERNAL" so without regular testing on Windows it will be too easy to forget. I'm not sure if having a separate _internal.h header is any better. But it *might* be--in particular if it's generated by Cython then it would force the developer to ask what the difference is between "foo.h" and "foo_internal.h". And they can both contain comments explaining when to use them. > I think the obvious thing is that cython should provide a natural way to > mark a function as being shared-library-internal; that covers 99% of real > use cases *and* just works without any further macros needed. Probably the > existing "public" annotation should be changed to mean this. Probably, yeah. Ignoring the DL_IMPORT stuff the other effect of marking a symbol "public" in Cython is to add the appropriate __PYX_EXTERN_C. And if that were *all* it did then its behavior would be consistent between platforms :) > (Obviously it > wasn't quite fully thought through in the first place and has few if any > users, since it got this very wrong without anyone noticing. So fixing it > seems viable to me.) 
+1 > And then *maybe* there should also be a way to make a symbol > shared-library-exported, if that really is useful, but as a non-default > experts-only kind of thing, and as such it would be OK to require these rare > expert users to be a bit more careful about how they #include the resulting > header within their own project. Okay. My belief is that there is a case for this, but I should substantiate it better. Would you be amenable to the generation of a "_internal.h"? The more I think about it the more I'm convinced this would be the simplest way to handle this, and would simplify matters by not requiring Cython to impose any compiler flags (making my original quesition moot). Agreed that it could be non-default. Thanks again, Erik From njs at vorpus.org Mon Apr 11 13:05:49 2016 From: njs at vorpus.org (Nathaniel Smith) Date: Mon, 11 Apr 2016 10:05:49 -0700 Subject: [Cython] Fwd: Question about how best require compiler options for C sources In-Reply-To: References: Message-ID: On Apr 11, 2016 06:23, "Erik Bray" wrote: > > On Mon, Apr 11, 2016 at 2:51 PM, Nathaniel Smith wrote: > > Now, back to your example: Here the caller and callee are both compiled into > > the same shared library, so you don't want dllexport/dllimport at all, you > > just want a shared-library-internal symbol, which as we see is much easier. > > Sorry, I'll respond more to your (helpfully detailed and precise) > message in a second. But something I wanted to point out is that my > example is incomplete and I should have arranged a more complete > example. In this case I really do want the symbol "hello" to be > exported by the DLL, as well as be understood between translation > units making up the same library. A more complete example would have > shown a separate library which links with "foo" and uses the "hello" > function in it. 
> > Yes, it's arguable that exporting anything more than then initmodule > function from a Python extension module is not best practice, but the > possibility should not be ruled out either. > > So I think later on you hit correctly on the deeper problem, which is > that Cython currently doesn't have a great way to distinguish between > intra-library visibility and *inter*-library visibility. > > And if both are needed then some DL_IMPORT-like macro is needed that > sets the visibility for a symbol correctly depending on the context in > which it's being used. (And yes, this is not a problem on *nix, but it > is on Windows due to the way __declspec(dllimport) causes name > mangling :( This is highly tangential to the main conversation, but FYI and in the general interests of demystifying this stuff: the reason for all this rigmarole on Windows is that dllimport doesn't just cause name mangling, it causes *type* mangling. The way windows symbol relocations work, is that you can only import pointers. So code like __declspec(dllimport) extern void f(...); __declspec(dllimport) extern int myvalue; is syntactic sugar for something like: // pointer that will be filled in by loader void (*__imp_f)(...); // desugar direct calls into indirect calls #define f(...) (*__imp_f)(...) // similar int *__imp_myint; #define myint (*__imp_myint) ...except that (a) instead of using the preprocessor to perform the substitution, it happens in the compiler frontend, so it can correctly follow scoping rules and things that the preprocessor can't, and (b) the linker will also automagically generate a function like void f(...) { return (*__imp_f)(...); } So there shouldn't still be any mentions of 'f' in the resulting file -- they should all be replaced by mentions of (*__imp_f) -- but just in case we missed any your code will still work, albeit at the cost of some icache pollution and an extra indirect jump at every call. 
Note in particular that this __imp_ nonsense *isn't* an approximation on my part, there really will be a shared-library-internal symbol called __imp_whatever whose value is a pointer that gets filled in by the loader. (Except possibly I got my underscores wrong -- on phone so too lazy to check.) You could literally write the code I wrote above with the #define's and it would actually work on windows... -n -------------- next part -------------- An HTML attachment was scrubbed... URL: From jdemeyer at cage.ugent.be Mon Apr 11 13:38:35 2016 From: jdemeyer at cage.ugent.be (Jeroen Demeyer) Date: Mon, 11 Apr 2016 19:38:35 +0200 Subject: [Cython] Fwd: Question about how best require compiler options for C sources In-Reply-To: References: Message-ID: <570BE11B.6060500@cage.ugent.be> On 2016-04-11 15:23, Erik Bray wrote: > In this case I really do want the symbol "hello" to be > exported by the DLL, as well as be understood between translation > units making up the same library. Are you really sure that you want this? I doubt that this is supported on OS X: on OS X, there are two kinds of files: shared libraries (with .dylib extension) and loadable modules or bundles (usually with .so extension but officially with .bundle extension). C extensions for Python are compiled as a loadable module and I don't think you can link against these. See http://stackoverflow.com/questions/2339679/what-are-the-differences-between-so-and-dylib-on-osx Some details in the above might be wrong, but I remember running into this issue in some early version of cysignals. Jeroen.
From insertinterestingnamehere at gmail.com Mon Apr 11 13:49:56 2016 From: insertinterestingnamehere at gmail.com (Ian Henriksen) Date: Mon, 11 Apr 2016 17:49:56 +0000 Subject: [Cython] Fwd: Question about how best require compiler options for C sources In-Reply-To: References: Message-ID: On Mon, Apr 11, 2016 at 7:46 AM Erik Bray wrote: > On Mon, Apr 11, 2016 at 2:51 PM, Nathaniel Smith wrote: > > On Apr 11, 2016 04:18, "Erik Bray" wrote: > >> > >> On Fri, Apr 8, 2016 at 5:49 PM, Nathaniel Smith wrote: > >> > Can you give a tiny concrete example? My questions are basic enough > that > >> > I > >> > feel like I'm missing something fundamental :-) > >> > >> Yes, I think you might be missing something, but I'm not sure exactly > >> where. In the issue I provided a tiny example which you can see here: > >> > >> https://gist.github.com/embray/12a67edb82b213217e31f408007898e6 > >> > >> The C code generated by this example currently does not compile on > >> Windows, because of how Cython uses DL_IMPORT incorrectly. Regardless > >> of what it *does* do, Cython should not be using the DL_IMPORT macro > >> at all actually since that no longer even exists in Python. > > > > Sure. > > > > For that example, the correct thing to do is to *not* export the > function. > > So as I wrote in my previous message, my example was incomplete. But > indeed if the intent was *only* to share the declaration between TUs > of the same library then I agree the *most* correct thing would be to > not use dllexport. Unfortunately there isn't currently a way in Cython > to make this distinction. > > > Backing up, to make sure we're on the same page: > > Yep, I agree with your analysis that follows, and it's helpful to have > all the cases laid out in one place, so thanks for that! 
> > There are three levels of > > symbol visibility in C: file-internal, shared library internal > (different .c > > files that make up the library can see it, but users of the shared > library > > can't see it), and shared library exported (everyone can see it; can also > > carry other consequences, e.g. on Linux then internal calls will become > > noticeably slower, and it becomes possible for weird symbol interposition > > issues to occur). So the rule of thumb is to make everything as private > as > > you can get away with. > > > > Making this more interesting: > > - vanilla C only includes 2 ways to mark symbol visibility, which is not > > enough to make a 3-way distinction. Hence the need for an extra attribute > > thingummy. > > - everyone agrees that symbols marked 'static' should be file-internal, > but > > different platforms disagree about what should happen if the extra > attribute > > thingummy is missing. > > > > So on Windows, the convention is: > > 'static' -> file internal > > no marking -> shared library internal > > 'dllexport' -> public > > > > And on Linux it's: > > 'static' -> file internal > > 'visibility (hidden)' -> shared library internal > > no marking -> public > > > > It's generally agreed that Linux got this wrong and that you should > always > > use '-fvisibility=hidden' to switch it to the windows style, but cython > > doesn't control compiler options and thus should probably generate code > that > > works correctly regardless of such compiler settings. Fortunately, Linux > > does provide some markings to explicitly make things public: you can > mark a > > symbol 'visibility (default)' (which means public), or you can use the > > dllexport syntax, just like Windows, because gcc is helpful like that. 
> > > > OTOH, windows is annoying because of this dllimport thing that started > this > > whole thread: on other systems just marking symbols as extern is enough > to > > handle both shared-object-internal and shared-library-exported symbols, > and > > the linker will sort it out. On Windows, you have to explicitly > distinguish > > between these. (And annoyingly, if you accidentally leave out the > dllimport > > making on functions then it will use some fallback hack that works but > > silently degrades performance; on other symbols it just doesn't work, and > > ditto for if you use it when it isn't needed.) > > Yep. Agreed with all of the above. > > > So final conclusion: for non-static symbols, cython should first decide > > whether they are supposed to be shared-library-internal or actually > exported > > from the shared library. > > > > For shared-library-internal symbols: their definition should be marked > > 'visibility(hidden)' on Linux, and unmarked on Windows. This is easy > using > > some preprocessor gunk. (Or maybe simplest is: marked that everywhere > except > > if using msvc, because I think everyone else will understand 'visibility > > (hidden)' even if it's a no op.) Their declaration in the header file > should > > just be 'extern' everywhere. > > > > For shared-library-exported symbols: I am dubious about whether cython > > should even support these at all. But if it does, then the definitions > > should be marked 'dllexport' (no macro trickery needed, because everyone > > understands this syntax), and their declaration in the header file needs > > some extra hack that is the subject of this thread. > > I understand your doubts here, but for the purpose of discussion let's > just assume that it should be supported? :) > > And yes, the issue that the header file needs some hacky > context-dependent declarations. A possible alternative, which in fact > is exactly the work-around I'm using right now, would be for Cython to > generate two headers. 
In the case of my example they would be "foo.h" > and "foo_internal.h". They would be almost exactly the same except > that the former declares the function dllimport and the latter > declares the function dllexport, and only the _internal.h should be > used to share a declaration between TUs. I'm currently doing this by > writing the "foo_internal.h" manually. > > > Now, back to your example: Here the caller and callee are both compiled > into > > the same shared library, so you don't want dllexport/dllimport at all, > you > > just want a shared-library-internal symbol, which as we see is much > easier. > > Well, again, in this case I do want dllexport/dllimport but I should > have been more clear about that. But supposing I didn't want it, then > this is true. > > > NumPy also ran into some problems with this in our experiments with using > > cython internally. Our temporary solution was to use the preprocessor to > > monkeypatch DL_IMPORT into expanding to the appropriate > > shared-library-internal thing :-). > > Ah, how did you manage that? Do you have to make sure to do it before > Python.h is ever included? > > >> > My first question is why you even need this, since AFAIK there are no > >> > cases > >> > where it is correct to have a cython module dllexporting symbols that > >> > appear > >> > in header files. This is what cimport is for, right? > >> > >> I don't think this has anything to do with cimport. Could you explain > >> what you mean? > > > > We only need to solve the dllimport issue if we want to support > > shared-library-exported symbols from cython extensions. This is only > useful > > if you have different extensions that are directly linking to each other > > using the platform linker. 
But this simply doesn't work in almost any > cases, > > because platform linkers have all kinds of quirks that don't play well > with > > python packages -- to start with, your runtime linker probably does not > > follow Python's rules for which directories to search to find a shared > > library... To solve these problems, the Cython devs invented the cimport > > mechanism, which is basically a portable, python-centric way of doing > > shared-library-exports while avoiding all the problems caused by using > the > > platform linker. So my question was, what's your use case that's better > > served by linking directly against an extension module and using the > > platform's shared-library-export functionality, instead of cython's? > > Well, yes, but you can't use cimport from C code :) As for the use > case I'll have to get back to you on that. I'm trying to help the > Sage project fix some issues related to that and my understanding is > that it is needed for C code to be able to link directly against code > that's compiled into an extension module. I could be wrong about that > in which case we're free to ignore that case (unless someone does come > up with a use case). So yes, I had better double-check on that :) > > Still, I'm not necessarily talking about linking Cython modules > together, which is why I don't think cimport really comes into it. > > >> > My second question is why you would want to do this via the command > >> > line, > >> > when compiling the dll means that you are compiling some > >> > cython-generated > >> > .c, which means that you can put the #define directly in the source > >> > code, > >> > no? > >> > >> Not necessarily. As you can see in my example the file fooutil.c is > >> hand-written C code that was not generated by Cython, but which uses a > >> function in the Cython-generated code. It includes "foo.h". 
In > >> principle you're right--the hand-written C code could set the proper > >> #defines but it would have to do so *before* including "foo.h". It's > >> not very obvious that this would be needed. Most other code I've seen > >> that addresses this issue--including cpython itself--do so by passing > >> an appropriate define to the compiler via the command-line, and that > >> seems the clearest to me. > > > > I see, right. I guess my suggestion would be that if a symbol really does > > need to be marked for shared-library-export *and simultaneously* used by > > different files within the same shared library -- which is the only case > > where this arises -- then possibly the simplest and most robust thing is > to > > set up the header file so that external users just do #include "foo.h", > and > > the internal users do > > > > #define FOO_INTERNAL > > #include "foo.h" > > Okay, this is similar to my above suggestion of using separate > headers. I for one prefer the separate headers better, as I think > it's easy to forget to set a #define like that--and to anyone not > working on Windows it's not at all clear why that would be needed. In > fact on Linux it will "just work" without the "#define FOO_INTERNAL" > so without regular testing on Windows it will be too easy to forget. > > I'm not sure if having a separate _internal.h header is any better. > But it *might* be--in particular if it's generated by Cython then it > would force the developer to ask what the difference is between > "foo.h" and "foo_internal.h". And they can both contain comments > explaining when to use them. > > > I think the obvious thing is that cython should provide a natural way to > > mark a function as being shared-library-internal; that covers 99% of real > > use cases *and* just works without any further macros needed. Probably > the > > existing "public" annotation should be changed to mean this. > > Probably, yeah. 
Ignoring the DL_IMPORT stuff the other effect of > marking a symbol "public" in Cython is to add the appropriate > __PYX_EXTERN_C. And if that were *all* it did then its behavior would > be consistent between platforms :) > > > (Obviously it > > wasn't quite fully thought through in the first place and has few if any > > users, since it got this very wrong without anyone noticing. So fixing it > > seems viable to me.) > > +1 > > > And then *maybe* there should also be a way to make a symbol > > shared-library-exported, if that really is useful, but as a non-default > > experts-only kind of thing, and as such it would be OK to require these > rare > > expert users to be a bit more careful about how they #include the > resulting > > header within their own project. > > Okay. My belief is that there is a case for this, but I should > substantiate it better. Would you be amenable to the generation of a > "_internal.h"? The more I think about it the more I'm > convinced this would be the simplest way to handle this, and would > simplify matters by not requiring Cython to impose any compiler flags > (making my original quesition moot). > > Agreed that it could be non-default. > > Thanks again, > > Erik > > To answer the original question about define macros, it appears that the canonical way to pass preprocessor defines through distutils is to use the define_macros keyword when constructing your Extension class. You should also be able to do this within a Cython source file by including a directive like: # distutils: define_macros = MY_DEFINE, MY_DEFINE_2=2 Unfortunately, it looks like there's a bug in that that's making it so that these macros are undef'ed rather than being defined, so, for now, just pass the appropriate flags to your Extension object. That aside, I agree with Nathaniel that exporting public declarations as a part of the shared object interface was a design mistake. 
That aside, however, using an api declaration lets you get equivalent results without exposing anything as a part of the shared object API. Here's how this all works: public declarations: export things to C/C++ through the shared object interface. Provide a header that exports this interface. api declarations: export things to C/C++ through capsule objects. Provide a header for the Python module that exports that interface. cimports: Use capsule objects and parsing of pxd files to share things like external declarations, header includes, inline Cython functions, and Cython functions exported by modules between Cython modules. The public and api use cases are essentially the same most of the time, but api declarations use capsules rather than the OS's linker. There are still some trade-offs between public and api functions. Technically, the api functions require that the initialization routine exported in the api header be called for each translation unit that uses them. The public api just requires that the module already be initialized. In cases where no Python functionality is used in a public function, you may be able to get away with using the function without initializing the module, though I really wouldn't recommend that. There are some more subtle issues here though. The reason api functions need to be initialized on a per-translation unit basis is that things exported as api declarations are exported as translation-unit-local (static) function pointers. They aren't shared by the different translation units within a module built from multiple source files. I think that's a mistake. It'd be ideal if we could have api interfaces (or something like them) provide things with shared object local visibility rather than translation unit local visibility. 
This would require that the API headers have more carefully structured ifdef directives so that a macro could be set in a given translation unit to designate when to emit the actual declarations for the needed pointers rather than just forward declaring them. It would also require that the main generated c/cpp file define the pointers it uses as shared-object-local rather than static. In dynd-python we currently solve this problem by defining shared object local wrappers for the api exported function pointers and then using those instead, but I'm not a huge fan of that approach. It works well, but results in another unnecessary layer of indirection through the source files to connect the C++ code back to its Python bindings. With regards to the dllexporting/dllimporting of things: given that public declarations are already designed to export things through the shared object interface, we may as well fix the current setup to export the right things. It's a bad design that probably ought to be deprecated or at least documented better so people know not to use it unless their case actually requires sidestepping best practices. On the other hand, it's also a supported interface, so there's value in "fixing" it. I think the best way to do that is the following: - mark public symbols as dllimport unless a given (module specific) preprocessor define is set. - people using the public header outside of the module exporting the symbols should not have to set the define at all. - people using the public header to compile other source files that are linked in to the same Python module should set the preprocessor flag for that module. On top of that, at some point we still need to fix our api and public headers so that they still work when included into the translation unit for the main Cython-generated c/cpp file. This use-case should just forward-declare everything since the needed symbols are all defined later on in the Cython module.
Since static variables cannot be forward declared in C, this will require that api declarations use shared object local symbols or that the main generated c/cpp file use some ifdef guards when it initializes the various pointers in question. As far as making an additional header goes, I personally prefer the extra preprocessor define. On the other hand, if people think an additional header is easier to use, then why not make it do something like #define USE_DLLEXPORT_NOT_DLLIMPORT_FOR_PARTICULAR_MODULE #include I think, that'd cover all the use cases better. Anyway, sorry for the length of my remarks on this. There are several issues here that have been bothering me for quite some time. Best, -Ian Henriksen -------------- next part -------------- An HTML attachment was scrubbed... URL: From insertinterestingnamehere at gmail.com Mon Apr 11 14:36:38 2016 From: insertinterestingnamehere at gmail.com (Ian Henriksen) Date: Mon, 11 Apr 2016 18:36:38 +0000 Subject: [Cython] Fwd: Question about how best require compiler options for C sources In-Reply-To: References: Message-ID: On Mon, Apr 11, 2016 at 11:50 AM Ian Henriksen < insertinterestingnamehere at gmail.com> wrote: > To answer the original question about define macros, it appears that the > canonical > way to pass preprocessor defines through distutils is to use the > define_macros > keyword when constructing your Extension class. You should also be able to > do > this within a Cython source file by including a directive like: > > # distutils: define_macros = MY_DEFINE, MY_DEFINE_2=2 > > Unfortunately, it looks like there's a bug in that that's making it so > that these > macros are undef'ed rather than being defined, so, for now, just pass the > appropriate flags to your Extension object. > > Small update on this, it looks like the issue with undef/define only applies when a define is specified this way without a value, so I either haven't gotten the syntax quite right, or that's not supported yet. 
Specifying an actual value for the macro works fine. Best, Ian Henriksen -------------- next part -------------- An HTML attachment was scrubbed... URL: From erik.m.bray at gmail.com Tue Apr 12 04:08:07 2016 From: erik.m.bray at gmail.com (Erik Bray) Date: Tue, 12 Apr 2016 10:08:07 +0200 Subject: [Cython] Fwd: Question about how best require compiler options for C sources In-Reply-To: <570BE11B.6060500@cage.ugent.be> References: <570BE11B.6060500@cage.ugent.be> Message-ID: On Mon, Apr 11, 2016 at 7:38 PM, Jeroen Demeyer wrote: > On 2016-04-11 15:23, Erik Bray wrote: >> >> In this case I really do want the symbol "hello" to be >> exported by the DLL, as well as be understood between translation >> units making up the same library. > > > Are you really sure that you want this? I doubt that this is supported on OS > X: on OS X, there are two kinds of files: shared libraries (with .dylib > extension) and loadable modules or bundles (usually with .so extension but > officially with .bundle extension). > > C extensions for Python are compiled as a loadable module and I don't think > you can link against these. See > http://stackoverflow.com/questions/2339679/what-are-the-differences-between-so-and-dylib-on-osx > > Some details in the above might be wrong, but I remember running into this > issue in some early version of cysignals. OSX issues aside, I was under the impression that this is needed for cysignals in particular. If I'm wrong on that then this simplifies matters a good deal, and per njs Cython could just drop the use of DL_IMPORT and we'd be fine. 
From erik.m.bray at gmail.com Tue Apr 12 04:16:12 2016 From: erik.m.bray at gmail.com (Erik Bray) Date: Tue, 12 Apr 2016 10:16:12 +0200 Subject: [Cython] Fwd: Question about how best require compiler options for C sources In-Reply-To: References: Message-ID: On Mon, Apr 11, 2016 at 8:36 PM, Ian Henriksen wrote: > On Mon, Apr 11, 2016 at 11:50 AM Ian Henriksen > wrote: >> >> To answer the original question about define macros, it appears that the >> canonical >> way to pass preprocessor defines through distutils is to use the >> define_macros >> keyword when constructing your Extension class. You should also be able to >> do >> this within a Cython source file by including a directive like: >> >> # distutils: define_macros = MY_DEFINE, MY_DEFINE_2=2 >> >> Unfortunately, it looks like there's a bug in that that's making it so >> that these >> macros are undef'ed rather than being defined, so, for now, just pass the >> appropriate flags to your Extension object. >> > > Small update on this, it looks like the issue with undef/define only applies > when a > define is specified this way without a value, so I either haven't gotten the > syntax quite > right, or that's not supported yet. Specifying an actual value for the macro > works fine. There is an issue specifically about this that I wanted to bring up, though it's a bit tangential to my original issue (but maybe not entirely?) I know about putting "# distutils: " comments in a Cython source. But I felt that requiring this in order to get dllexport/dllimport working correctly on Windows is too much of a burden for a developer (who normally might not be thinking about this). The content of these comments *do* get written out to the generated C sources under a "distutils" property in the JSON metadata at the top of the file. So I thought one way I might be able to influence the compiler flags would be to modify the code that generates the JSON metadata to include a `"distutils: {"define_macros": [ ... 
]}` entry in that metadata. However, this was wrong because Cython itself never actually does anything with that metadata, and it instead reads the distutils directives (a second time) directly out of the Cython sources during cythonize() in order to update the Extension attributes. I'm thinking it would make more sense if the distutils options were read from the C sources, since those are what will *actually* be handled by distutils, even if in the majority of cases the JSON metadata is being generated directly from the Cython sources. Changing this would have also provided a solution to my original problem, even if we've agreed that it's mostly moot. That said, I think it makes more sense for cythonize() to read the distutils options from the C source instead of the Cython source, though in practice I don't know if it's a worthwhile change or not. Thanks, Erik From jdemeyer at cage.ugent.be Tue Apr 12 04:27:38 2016 From: jdemeyer at cage.ugent.be (Jeroen Demeyer) Date: Tue, 12 Apr 2016 10:27:38 +0200 Subject: [Cython] Fwd: Question about how best require compiler options for C sources In-Reply-To: References: Message-ID: <570CB17A.30702@cage.ugent.be> On 2016-04-12 10:16, Erik Bray wrote: > That said, I > think it makes more sense for cythonize() to read the distutils > options from the C source instead of the Cython source, though in > practice I don't know if it's a worthwhile change or not. I don't quite get what you mean. The C file is only read by distutils (which compiles the .c files to .so), not by Cython (which compiles the .pyx files to .c). 
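Tying the define_macros route back to the per-module macro naming discussed earlier in the thread, a setup-script sketch could look like the following. The export_macro helper is hypothetical, and the EXPORT__foo__bar name merely mirrors the naming scheme Erik describes for his work-in-progress patch:

```python
from setuptools import Extension

def export_macro(module_name):
    # Hypothetical helper mirroring the EXPORT__foo__bar naming scheme
    # from the patch under discussion.
    return "EXPORT__" + module_name.replace(".", "__")

ext = Extension(
    "foo.bar",
    sources=["foo/bar.pyx", "foo/barutil.c"],
    # Defined only while building this module, so its public symbols come
    # out as dllexport; external users, who never define the macro, see
    # dllimport declarations instead.
    define_macros=[(export_macro("foo.bar"), "1")],
)

print(ext.define_macros)  # [('EXPORT__foo__bar', '1')]
```

distutils turns each (name, value) pair into the appropriate -D or /D flag for the compiler in use, which is the "compiler options" mechanism asked about in the original message.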
From jdemeyer at cage.ugent.be Tue Apr 12 04:25:07 2016 From: jdemeyer at cage.ugent.be (Jeroen Demeyer) Date: Tue, 12 Apr 2016 10:25:07 +0200 Subject: [Cython] Fwd: Question about how best require compiler options for C sources In-Reply-To: References: <570BE11B.6060500@cage.ugent.be> Message-ID: <570CB0E3.6010301@cage.ugent.be> On 2016-04-12 10:08, Erik Bray wrote: > OSX issues aside, I was under the impression that this is needed for > cysignals in particular. No. cysignals is complicated, but it doesn't need dynamic linking to Python modules (.so files). It does need "internal" linking: it needs to link a non-Cython-generated .c file together with a Cython-generated .c file into one Python module. It uses "cdef public" for that, so it would run in exactly the problem that this thread is about. So I personally would not mind if Cython would completely drop support for exporting symbols from C extensions (which, AFAIK, doesn't work on OS X anyway). From erik.m.bray at gmail.com Tue Apr 12 04:34:53 2016 From: erik.m.bray at gmail.com (Erik Bray) Date: Tue, 12 Apr 2016 10:34:53 +0200 Subject: [Cython] Fwd: Question about how best require compiler options for C sources In-Reply-To: References: Message-ID: On Mon, Apr 11, 2016 at 7:49 PM, Ian Henriksen wrote: > That aside, I agree with Nathaniel that exporting public declarations as a > part of the > shared object interface was a design mistake. That aside, however, using an > api > declaration lets you get equivalent results without exposing anything as a > part of > the shared object API. Here's how this all works: I don't know if I would outright call it a "mistake", but in retrospect I think we're all in agreement now that maybe the behavior of "public" should be curtailed? Or should we still try to keep its current behavior and "fix" it as I've been trying to do? > public declarations: export things to C/C++ through the shared object > interface. > Provide a header that exports this interface. 
> api declarations: export things to C/C++ through capsule objects. Provide a > header > for the Python module that exports that interface. > cimports: Use capsule objects and parsing of pxd files to share things like > external > declarations, header includes, inline Cython functions, and Cython functions > exported by modules between Cython modules. > > The public and api use cases are essentially the same most of the time, but > api > declarations use capsules rather than the OS's linker. > > There are still some trade-offs between public and api functions. > Technically, the > api functions require that the initialization routine exported in the api > header be > called for each translation unit that uses them. The public api just > requires that the > module already be initialized. In cases where no Python functionality is > used in a > public function, you may be able to get away with using the function without > initializing the module, though I really wouldn't recommend that. Yes, this is the main problem I have with the "api" declaration. Don't get me wrong, it's a very clever approach and I think for a lot of cases it's the right thing to do. But it absolutely requires and assumes initialization of the module. In a general case that would be the right thing to do. But in theory one could have some plain C code that does not use any of the Python API, and that one can be sure doesn't require anything initialized by the module init function. > There are some more subtle issues here though. The reason api functions need > to > be initialized on a per-translation unit basis is that things exported as > api > declarations are exported as translation-unit-local (static) function > pointers. They > aren't shared by the different translation units within a module built from > multiple > source files. I think that's a mistake. 
It'd be ideal if we could have api > interfaces (or > something like them) provide things with shared object local visibility > rather than > translation unit local visibility. This would require that the API headers > have more > carefully structured ifdef directives so that a macro could be set in a > given > translation unit to designate when to emit the actual declarations for the > needed > pointers rather than just forward declaring them. It would also require that > the main > generated c/cpp file define the pointers it uses as shared-object-local > rather rather > than static. Yep. This is a problem cysignals is having if I recall correctly. It would be good to be able to have it both ways. > In dynd-python we currently solve this problem by defining shared object > local > wrappers for the api exported function pointers and then using those > instead, but > I'm not a huge fan of that approach. It works well, but results in another > unnecessary layer of indirection through the source files to connect the C++ > code > back to its Python bindings. Yes, I think some kind of context-dependent declarations using carefully crafted preprocessor directives, or just using separate header files entirely for the two contexts, would be best. > With regards to the dllexporting/dllimporting of things: given that public > declarations > are already designed to export things through the shared object interface, > we may > as well fix the current setup to export the right things. It's a bad design > that > probably ought to be deprecated or at least documented better so people know > not > to use it unless their case actually requires sidestepping best practices. > On the > other hand, it's also a supported interface, so there's value in "fixing" > it. > > I think the best way to do that is the following: > - mark public symbols as dllimport unless a given (module specific) > preprocessor > define is set. 
> - people using the public header outside of the module exporting the symbols > should not have to set the define at all. > - people using the public header to compile other source files that are > linked in to > the same Python module should set the preprocessor flag for that module. That's the approach I was trying to take originally, yes. But I think it's a burden to require developers to set that preprocessor flag--instead it should happen automatically, (or not at all, with preference instead for using a completely separate header for intra-module use). > On top of that, at some point we still need to fix our api and public > headers so that > they still work when included into the translation unit for the main > Cython-generated > c/cpp file. This use-case should just forward-declare everything since the > needed > symbols are all defined later on in the Cython module. Since static > variables > cannot be forward declared in C, this will require that api declarations use > shared > object local symbols or that the main generated c/cpp file use some ifdef > guards > when it initializes the various pointers in question. Sounds good--I see no problem with this. > As far as making an additional header goes, I personally prefer the extra > preprocessor define. On the other hand, if people think an additional header > is > easier to use, then why not make it do something like > > #define USE_DLLEXPORT_NOT_DLLIMPORT_FOR_PARTICULAR_MODULE > #include > > I think, that'd cover all the use cases better. That would be fine too and reduce duplication. I think such a header should still be generated though and it should be documented when to use which header. > Anyway, sorry for the length of my remarks on this. There are several issues > here > that have been bothering me for quite some time. No, not at all. 
I'm glad I brought it up--it's been productive :)

Best,
Erik

From jdemeyer at cage.ugent.be Tue Apr 12 04:36:48 2016
From: jdemeyer at cage.ugent.be (Jeroen Demeyer)
Date: Tue, 12 Apr 2016 10:36:48 +0200
Subject: [Cython] cdef public declarations
Message-ID: <570CB3A0.2030108@cage.ugent.be>

(this thread is related to the thread "Question about how best require compiler options for C sources")

I have a question for Cython users and developers: are you sure that "cdef public" actually works as documented at
http://docs.cython.org/src/userguide/external_C_code.html#public-declarations

I couldn't find it tested in the Cython testsuite, although it's not easy to know what to grep for. I highly doubt that it works on OS X because of the difference between shared libraries (with .dylib extension) and loadable modules/bundles (with .so extension).

From erik.m.bray at gmail.com Tue Apr 12 04:43:56 2016
From: erik.m.bray at gmail.com (Erik Bray)
Date: Tue, 12 Apr 2016 10:43:56 +0200
Subject: [Cython] Fwd: Question about how best require compiler options for C sources
In-Reply-To: <570CB17A.30702@cage.ugent.be>
References: <570CB17A.30702@cage.ugent.be>
Message-ID:

On Tue, Apr 12, 2016 at 10:27 AM, Jeroen Demeyer wrote:
> On 2016-04-12 10:16, Erik Bray wrote:
>> That said, I think it makes more sense for cythonize() to read the
>> distutils options from the C source instead of the Cython source,
>> though in practice I don't know if it's a worthwhile change or not.
>
> I don't quite get what you mean. The C file is only read by distutils
> (which compiles the .c files to .so), not by Cython (which compiles
> the .pyx files to .c).

Right. What I'm saying is that when you call cythonize() the output is a list of Extension objects, which include instructions for how distutils should compile the C sources (in this case, the define_macros attribute in particular).
cythonize(), at some step along the way, reads "# distutils:" directives directly out of the .pyx sources, but then applies them to the Extension object representing the C sources to be compiled. What I'm suggesting is that there should be a clearer separation--the Cython compiler should generate the C sources first--including outputting their JSON metadata which may include some distutils directives, but otherwise remain completely agnostic as to how the C sources will ultimately be compiled (currently this *is* true insofar as how Cython's compile() works--the confusion is in cythonize()). After the Cython sources have been compiled to C sources, the cythonize() function should then inspect the resulting C sources (which are given explicitly in CompilerResult objects, which are currently ignored by cythonize()) and use the metadata in the C sources to tell distutils what to do with them. The Cython sources shouldn't be consulted any further at this point. It's a subtle difference which currently wouldn't affect the end result. The reason I thought of this is that I wanted to be able to set "distutils" directives directly in the C sources without them needing to be manually declared in the Cython sources. But even without that use case it's a clearer separation of concerns. 
Erik

From erik.m.bray at gmail.com Tue Apr 12 04:49:00 2016
From: erik.m.bray at gmail.com (Erik Bray)
Date: Tue, 12 Apr 2016 10:49:00 +0200
Subject: [Cython] cdef public declarations
In-Reply-To: <570CB3A0.2030108@cage.ugent.be>
References: <570CB3A0.2030108@cage.ugent.be>
Message-ID:

On Tue, Apr 12, 2016 at 10:36 AM, Jeroen Demeyer wrote:
> (this thread is related to the thread "Question about how best require
> compiler options for C sources")
>
> I have a question for Cython users and developers: are you sure that
> "cdef public" actually works as documented at
> http://docs.cython.org/src/userguide/external_C_code.html#public-declarations
>
> I couldn't find it tested in the Cython testsuite, although it's not
> easy to know what to grep for. I highly doubt that it works on OS X
> because of the difference between shared libraries (with .dylib
> extension) and loadable modules/bundles (with .so extension).

Do you have a reference about this distinction on hand? I've seen .dylib files on OSX before but never understood the distinction. I tend to irrationally avoid learning anything about OSX for as long as possible, the way most OSS people do with Windows :)

Seems like a good question though.

From jdemeyer at cage.ugent.be Tue Apr 12 05:46:23 2016
From: jdemeyer at cage.ugent.be (Jeroen Demeyer)
Date: Tue, 12 Apr 2016 11:46:23 +0200
Subject: [Cython] cdef public declarations
In-Reply-To:
References: <570CB3A0.2030108@cage.ugent.be>
Message-ID: <570CC3EF.6020901@cage.ugent.be>

On 2016-04-12 10:49, Erik Bray wrote:
> Do you have a reference about this distinction on hand?
I think these two links are useful: [1] http://docstore.mik.ua/orelly/unix3/mac/ch05_03.htm [2] http://stackoverflow.com/questions/2339679/what-are-the-differences-between-so-and-dylib-on-osx From insertinterestingnamehere at gmail.com Tue Apr 12 12:30:50 2016 From: insertinterestingnamehere at gmail.com (Ian Henriksen) Date: Tue, 12 Apr 2016 16:30:50 +0000 Subject: [Cython] Fwd: Question about how best require compiler options for C sources In-Reply-To: References: Message-ID: On Tue, Apr 12, 2016 at 2:35 AM Erik Bray wrote: > On Mon, Apr 11, 2016 at 7:49 PM, Ian Henriksen > wrote: > > That aside, I agree with Nathaniel that exporting public declarations as > a > > part of the > > shared object interface was a design mistake. That aside, however, using > an > > api > > declaration lets you get equivalent results without exposing anything as > a > > part of > > the shared object API. Here's how this all works: > > I don't know if I would outright call it a "mistake", but in > retrospect I think we're all in agreement now that maybe the behavior > of "public" should be curtailed? Or should we still try to keep its > current behavior and "fix" it as I've been trying to do? > Great question, I don't know for sure. Input from others here would be nice. My thoughts are basically just that we should do one or the other. If it's supported and we want to keep supporting it, let's make it work right on platforms other than Linux. If we want to stop supporting it, let's deprecate it. Best, -Ian -------------- next part -------------- An HTML attachment was scrubbed... 
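[For readers following the public-vs-api debate above, here is a minimal illustration of the two export mechanisms being discussed. The module and function names are made up for the example; the generated header names follow the pattern described in the Cython user guide.]

```cython
# bar.pyx -- illustrative only.
# "cdef public" emits add_one as an ordinary C symbol and declares it in
# a generated header (bar.h); external C code binds to it through the
# OS's linker -- the shared-object interface under discussion above.
cdef public int add_one(int x):
    return x + 1

# "cdef api" instead exports add_two through a capsule object: the
# generated bar_api.h declares a function pointer that is filled in by
# the header's import routine at runtime, so nothing extra has to cross
# the shared-object interface.
cdef api int add_two(int x):
    return x + 2
```

Note that, as Ian points out above, the api route requires the importing translation unit to call the generated import routine before using the pointer, while the public route only requires that the module has been initialized.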
URL: From insertinterestingnamehere at gmail.com Tue Apr 12 12:42:34 2016 From: insertinterestingnamehere at gmail.com (Ian Henriksen) Date: Tue, 12 Apr 2016 16:42:34 +0000 Subject: [Cython] Fwd: Question about how best require compiler options for C sources In-Reply-To: References: Message-ID: On Tue, Apr 12, 2016 at 2:35 AM Erik Bray wrote: > On Mon, Apr 11, 2016 at 7:49 PM, Ian Henriksen wrote: > > With regards to the dllexporting/dllimporting of things: given that > public > > declarations > > are already designed to export things through the shared object > interface, > > we may > > as well fix the current setup to export the right things. It's a bad > design > > that > > probably ought to be deprecated or at least documented better so people > know > > not > > to use it unless their case actually requires sidestepping best > practices. > > On the > > other hand, it's also a supported interface, so there's value in "fixing" > > it. > > > > I think the best way to do that is the following: > > - mark public symbols as dllimport unless a given (module specific) > > preprocessor > > define is set. > > - people using the public header outside of the module exporting the > symbols > > should not have to set the define at all. > > - people using the public header to compile other source files that are > > linked in to > > the same Python module should set the preprocessor flag for that module. > > That's the approach I was trying to take originally, yes. But I think > it's a burden to require developers to set that preprocessor > flag--instead it should happen automatically, (or not at all, with > preference instead for using a completely separate header for > intra-module use). > > > On top of that, at some point we still need to fix our api and public > > headers so that > > they still work when included into the translation unit for the main > > Cython-generated > > c/cpp file. 
This use-case should just forward-declare everything since > the > > needed > > symbols are all defined later on in the Cython module. Since static > > variables > > cannot be forward declared in C, this will require that api declarations > use > > shared > > object local symbols or that the main generated c/cpp file use some ifdef > > guards > > when it initializes the various pointers in question. > > Sounds good--I see no problem with this. > > > As far as making an additional header goes, I personally prefer the extra > > preprocessor define. On the other hand, if people think an additional > header > > is > > easier to use, then why not make it do something like > > > > #define USE_DLLEXPORT_NOT_DLLIMPORT_FOR_PARTICULAR_MODULE > > #include > > > > I think, that'd cover all the use cases better. > > That would be fine too and reduce duplication. I think such a header > should still be generated though and it should be documented when to > use which header. > > After thinking this over a bit more, it seems like we're discussing things that really have more to do with the API chosen for a particular module than anything else. Maybe these options should all be made available as Compiler flags to Cython instead. As far as I can tell there are two separate options here: - Whether to provide distinct headers for defining the needed importing functionality, or declaring it as extern - Whether to make the preprocessor define flag for a given module's header work as an opt-in or an opt-out for defining the symbols needed to import and use the module. (numpy's headers use an opt-out system. DyND doesn't currently have this set up all of the way, but we'd like to use an opt-in model for working with our headers). Thoughts? Best, -Ian -------------- next part -------------- An HTML attachment was scrubbed... 
URL:

From insertinterestingnamehere at gmail.com Tue Apr 12 18:12:28 2016
From: insertinterestingnamehere at gmail.com (Ian Henriksen)
Date: Tue, 12 Apr 2016 22:12:28 +0000
Subject: [Cython] Fwd: Question about how best require compiler options for C sources
In-Reply-To:
References:
Message-ID:

On Mon, Apr 11, 2016 at 12:36 PM Ian Henriksen <insertinterestingnamehere at gmail.com> wrote:
> On Mon, Apr 11, 2016 at 11:50 AM Ian Henriksen <insertinterestingnamehere at gmail.com> wrote:
>> To answer the original question about define macros, it appears that
>> the canonical way to pass preprocessor defines through distutils is to
>> use the define_macros keyword when constructing your Extension class.
>> You should also be able to do this within a Cython source file by
>> including a directive like:
>>
>> # distutils: define_macros = MY_DEFINE, MY_DEFINE_2=2
>>
>> Unfortunately, it looks like there's a bug in that that's making it so
>> that these macros are undef'ed rather than being defined, so, for now,
>> just pass the appropriate flags to your Extension object.
>
> Small update on this, it looks like the issue with undef/define only
> applies when a define is specified this way without a value, so I
> either haven't gotten the syntax quite right, or that's not supported
> yet. Specifying an actual value for the macro works fine.
>
> Best,
> Ian Henriksen

Should be fixed in https://github.com/cython/cython/pull/509.

Best,
Ian Henriksen
-------------- next part --------------
An HTML attachment was scrubbed...

From manuel.nuno.melo at gmail.com Wed Apr 13 15:35:28 2016
From: manuel.nuno.melo at gmail.com (Manuel Nuno Melo)
Date: Wed, 13 Apr 2016 21:35:28 +0200
Subject: [Cython] Cannot cythonize subclasses of setuptools.extension._Extension
Message-ID:

Hello devs,

I'm developing the setup.py for a scientific package, MDAnalysis (see PR #799). We depend on distutils and setuptools.
Namely, we use the setuptools.extension.Extension class for our extensions.

Some older versions of setuptools (<18.0) do filename cythonization themselves upon initialization of the Extension object.

Because we want to control name cythonization ourselves I try to directly use distutils.extension.Extension, which has none of setuptools' cythonization. However, this doesn't work because setuptools patches distutils, so that distutils.extension.Extension effectively becomes setuptools.extension.Extension.

setuptools does provide a copy of the unpatched Extension class, via setuptools.extension._Extension. Yet, if I attempt to pass instances of this class to Cython.cythonize it fails because my extension parent no longer conforms to one of the expected classes:

  File "./setup.py", line 283, in cythonize
    new_ext_modules = Cython.cythonize(self.ext_modules)
  File "/scratch/virtualenv/lib/python2.7/site-packages/Cython-0.24-py2.7-linux-x86_64.egg/Cython/Build/Dependencies.py", line 796, in cythonize
    aliases=aliases)
  File "/scratch/virtualenv/lib/python2.7/site-packages/Cython-0.24-py2.7-linux-x86_64.egg/Cython/Build/Dependencies.py", line 686, in create_extension_list
    raise TypeError(msg)
TypeError: pattern is not of type str nor subclass of Extension () but of type and class distutils.extension.Extension

I believe cythonize should be more permissive with the extension parent classes it accepts, or at least have the setuptools _Extension one in the whitelist, since it's the only way to access the original distutils class. What do you think?

In the meantime I'll solve this by subclassing setuptools.extension.Extension and making sure it doesn't cythonize anything.

Cheers,
Manuel
-------------- next part --------------
An HTML attachment was scrubbed...
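[A minimal sketch of the subclass workaround Manuel mentions at the end. The class name, and the idea of simply restoring the caller's source list after the parent constructor runs, are an illustration, not MDAnalysis's actual code.]

```python
# Subclass of the setuptools Extension that undoes the .pyx -> .c
# renaming that setuptools < 18.0 performed in Extension.__init__ when
# Cython/Pyrex was not importable.  On newer setuptools this is a no-op.
from setuptools.extension import Extension


class NoCythonizeExtension(Extension):
    """Extension whose source filenames are kept exactly as given."""

    def __init__(self, name, sources, **kwargs):
        super().__init__(name, sources, **kwargs)
        # The parent constructor may have rewritten .pyx entries to .c;
        # put the caller's original list back so cythonization stays
        # entirely under our control.
        self.sources = list(sources)


ext = NoCythonizeExtension("foo.bar", ["foo/bar.pyx"])
print(ext.sources)  # -> ['foo/bar.pyx']
```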
URL: From erik.m.bray at gmail.com Thu Apr 14 09:08:04 2016 From: erik.m.bray at gmail.com (Erik Bray) Date: Thu, 14 Apr 2016 15:08:04 +0200 Subject: [Cython] Cannot cythonize subclasses of setuptools.extension._Extension In-Reply-To: References: Message-ID: On Wed, Apr 13, 2016 at 9:35 PM, Manuel Nuno Melo wrote: > Hello devs, > > I'm developing the setup.py for a scientific package, MDAnalysis (see PR > #799). We depend on distutils and setuptool. Namely, we use > setuptools.extension.Extension class for our extensions. > > Some older versions of setuptools (<18.0) do filename cythonization > themselves upon initialization of the Extension object. > > Because we want to control name cythonization ourselves I try to directly > use distutils.extension.Extension, which has none of setuptools' > cythonization. However, this doesn't work because setuptools patches > distutils, so that distutils.extension.Extension effectively becomes > setuptools.extension.Extension. I'm wondering what it is specifically you need to do in your subclass--might it still be possible to do with a subclass of the setuptools Extension? Not saying I disagree with the overall idea, but I also wonder if there isn't a better way. Erik From stefan_ml at behnel.de Thu Apr 14 12:15:40 2016 From: stefan_ml at behnel.de (Stefan Behnel) Date: Thu, 14 Apr 2016 18:15:40 +0200 Subject: [Cython] PEP 509: detect dict modification by version tag Message-ID: <570FC22C.6020102@behnel.de> Hi! This new PEP seems interesting for Cython optimisations, too: https://www.python.org/dev/peps/pep-0509/ Essentially, it adds a 64 bit modification counter to dicts that allows detecting unmodified dicts, e.g. during lookups of methods or globals. It's currently proposed for Py3.6. 
Stefan From matthew.brett at gmail.com Thu Apr 14 14:16:30 2016 From: matthew.brett at gmail.com (Matthew Brett) Date: Thu, 14 Apr 2016 11:16:30 -0700 Subject: [Cython] Cannot cythonize subclasses of setuptools.extension._Extension In-Reply-To: References: Message-ID: On Thu, Apr 14, 2016 at 6:08 AM, Erik Bray wrote: > On Wed, Apr 13, 2016 at 9:35 PM, Manuel Nuno Melo > wrote: >> Hello devs, >> >> I'm developing the setup.py for a scientific package, MDAnalysis (see PR >> #799). We depend on distutils and setuptool. Namely, we use >> setuptools.extension.Extension class for our extensions. >> >> Some older versions of setuptools (<18.0) do filename cythonization >> themselves upon initialization of the Extension object. >> >> Because we want to control name cythonization ourselves I try to directly >> use distutils.extension.Extension, which has none of setuptools' >> cythonization. However, this doesn't work because setuptools patches >> distutils, so that distutils.extension.Extension effectively becomes >> setuptools.extension.Extension. > > I'm wondering what it is specifically you need to do in your > subclass--might it still be possible to do with a subclass of the > setuptools Extension? Not saying I disagree with the overall idea, > but I also wonder if there isn't a better way. 
I know this is a terrible and ugly hack, but the projects I work in have a 'fake_pyrex' directory, that fools setuptools into thinking that 'pyrex' is installed, and therefore prevents it from doing the .pyx -> .c filename conversions in the extension: https://github.com/regreg/regreg/blob/master/setup.py#L33 Cheers, Matthew From manuel.nuno.melo at gmail.com Thu Apr 14 15:07:01 2016 From: manuel.nuno.melo at gmail.com (Manuel Nuno Melo) Date: Thu, 14 Apr 2016 21:07:01 +0200 Subject: [Cython] Cannot cythonize subclasses of setuptools.extension._Extension In-Reply-To: References: Message-ID: Our need to control cythonization comes from the fact that we implement cython as a lazy and optional dependency. Lazy in the sense that we delay as much as possible cythonization so that setuptools or pip have time to install cython, if needed. Optional because we distribute both .pyx and cythonized .c files, and decide on which to use based on user flags. We, therefore, need an Extension class that only cythonizes if we decide to. Thanks for the feedback, Manel On Apr 14, 2016 8:17 PM, "Matthew Brett" wrote: > > On Thu, Apr 14, 2016 at 6:08 AM, Erik Bray wrote: > > On Wed, Apr 13, 2016 at 9:35 PM, Manuel Nuno Melo > > wrote: > >> Hello devs, > >> > >> I'm developing the setup.py for a scientific package, MDAnalysis (see PR > >> #799). We depend on distutils and setuptool. Namely, we use > >> setuptools.extension.Extension class for our extensions. > >> > >> Some older versions of setuptools (<18.0) do filename cythonization > >> themselves upon initialization of the Extension object. > >> > >> Because we want to control name cythonization ourselves I try to directly > >> use distutils.extension.Extension, which has none of setuptools' > >> cythonization. However, this doesn't work because setuptools patches > >> distutils, so that distutils.extension.Extension effectively becomes > >> setuptools.extension.Extension. 
> > > > I'm wondering what it is specifically you need to do in your > > subclass--might it still be possible to do with a subclass of the > > setuptools Extension? Not saying I disagree with the overall idea, > > but I also wonder if there isn't a better way. > > I know this is a terrible and ugly hack, but the projects I work in > have a 'fake_pyrex' directory, that fools setuptools into thinking > that 'pyrex' is installed, and therefore prevents it from doing the > .pyx -> .c filename conversions in the extension: > > https://github.com/regreg/regreg/blob/master/setup.py#L33 > > Cheers, > > Matthew > _______________________________________________ > cython-devel mailing list > cython-devel at python.org > https://mail.python.org/mailman/listinfo/cython-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From robertwb at gmail.com Thu Apr 14 19:05:19 2016 From: robertwb at gmail.com (Robert Bradshaw) Date: Thu, 14 Apr 2016 16:05:19 -0700 Subject: [Cython] PEP 509: detect dict modification by version tag In-Reply-To: <570FC22C.6020102@behnel.de> References: <570FC22C.6020102@behnel.de> Message-ID: Cool! This was essentially my answer for "if you were to propose a Python PEP, what would it be" on the podcast. We should get Cython listed as a potential user as well. On Thu, Apr 14, 2016 at 9:15 AM, Stefan Behnel wrote: > Hi! > > This new PEP seems interesting for Cython optimisations, too: > > https://www.python.org/dev/peps/pep-0509/ > > Essentially, it adds a 64 bit modification counter to dicts that allows > detecting unmodified dicts, e.g. during lookups of methods or globals. > > It's currently proposed for Py3.6. > > Stefan > _______________________________________________ > cython-devel mailing list > cython-devel at python.org > https://mail.python.org/mailman/listinfo/cython-devel > -------------- next part -------------- An HTML attachment was scrubbed... 
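[The optimisation PEP 509 enables can be sketched in pure Python. The VersionedDict wrapper below simulates the C-level 64-bit version tag with a Python attribute; it illustrates the caching idea only, not how CPython implements it, and it only tracks __setitem__ for brevity.]

```python
# A lookup cache that stays valid as long as the dict's version tag is
# unchanged -- the pattern PEP 509 is designed to make cheap for
# method/globals lookups.
class VersionedDict(dict):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.version = 0

    def __setitem__(self, key, value):
        super().__setitem__(key, value)
        self.version += 1  # every mutation bumps the tag


def make_cached_lookup(d, key):
    cache = {"version": None, "value": None}

    def lookup():
        if cache["version"] != d.version:  # dict changed: refill cache
            cache["value"] = d[key]
            cache["version"] = d.version
        return cache["value"]

    return lookup


g = VersionedDict(x=1)
get_x = make_cached_lookup(g, "x")
print(get_x())  # 1 (cache miss, fills the cache)
g["x"] = 2      # bumps the version, invalidating the cache
print(get_x())  # 2
```

In CPython the comparison is a single integer check against the dict's internal tag, which is what makes repeated lookups of unmodified globals nearly free.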
URL: From robertwb at gmail.com Fri Apr 15 00:19:21 2016 From: robertwb at gmail.com (Robert Bradshaw) Date: Thu, 14 Apr 2016 21:19:21 -0700 Subject: [Cython] Cannot cythonize subclasses of setuptools.extension._Extension In-Reply-To: References: Message-ID: I agree that we shouldn't disallow distutils' Extensions just because setuptools was imported. https://github.com/cython/cython/commit/d804bd2d8d9aac04b92d4bfb8dbc7f8a4c8079ac On Thu, Apr 14, 2016 at 12:07 PM, Manuel Nuno Melo < manuel.nuno.melo at gmail.com> wrote: > Our need to control cythonization comes from the fact that we implement > cython as a lazy and optional dependency. Lazy in the sense that we delay > as much as possible cythonization so that setuptools or pip have time to > install cython, if needed. Optional because we distribute both .pyx and > cythonized .c files, and decide on which to use based on user flags. > > We, therefore, need an Extension class that only cythonizes if we decide > to. > > Thanks for the feedback, > Manel > > On Apr 14, 2016 8:17 PM, "Matthew Brett" wrote: > > > > On Thu, Apr 14, 2016 at 6:08 AM, Erik Bray > wrote: > > > On Wed, Apr 13, 2016 at 9:35 PM, Manuel Nuno Melo > > > wrote: > > >> Hello devs, > > >> > > >> I'm developing the setup.py for a scientific package, MDAnalysis (see > PR > > >> #799). We depend on distutils and setuptool. Namely, we use > > >> setuptools.extension.Extension class for our extensions. > > >> > > >> Some older versions of setuptools (<18.0) do filename cythonization > > >> themselves upon initialization of the Extension object. > > >> > > >> Because we want to control name cythonization ourselves I try to > directly > > >> use distutils.extension.Extension, which has none of setuptools' > > >> cythonization. However, this doesn't work because setuptools patches > > >> distutils, so that distutils.extension.Extension effectively becomes > > >> setuptools.extension.Extension. 
> > > > > > I'm wondering what it is specifically you need to do in your > > > subclass--might it still be possible to do with a subclass of the > > > setuptools Extension? Not saying I disagree with the overall idea, > > > but I also wonder if there isn't a better way. > > > > I know this is a terrible and ugly hack, but the projects I work in > > have a 'fake_pyrex' directory, that fools setuptools into thinking > > that 'pyrex' is installed, and therefore prevents it from doing the > > .pyx -> .c filename conversions in the extension: > > > > https://github.com/regreg/regreg/blob/master/setup.py#L33 > > > > Cheers, > > > > Matthew > > _______________________________________________ > > cython-devel mailing list > > cython-devel at python.org > > https://mail.python.org/mailman/listinfo/cython-devel > > _______________________________________________ > cython-devel mailing list > cython-devel at python.org > https://mail.python.org/mailman/listinfo/cython-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From erik.m.bray at gmail.com Sat Apr 16 07:10:57 2016 From: erik.m.bray at gmail.com (Erik Bray) Date: Sat, 16 Apr 2016 13:10:57 +0200 Subject: [Cython] Cannot cythonize subclasses of setuptools.extension._Extension In-Reply-To: References: Message-ID: On Apr 14, 2016 21:07, "Manuel Nuno Melo" wrote: > > Our need to control cythonization comes from the fact that we implement cython as a lazy and optional dependency. Lazy in the sense that we delay as much as possible cythonization so that setuptools or pip have time to install cython, if needed. Optional because we distribute both .pyx and cythonized .c files, and decide on which to use based on user flags. > > We, therefore, need an Extension class that only cythonizes if we decide to. 
> > Thanks for the feedback, > Manel I have a solution for exactly this in astropy_helpers which makes it possible, for example, to pull in Cython via setup_requires, but only if it's needed. Remind me on Monday and I can point out how it works. > On Apr 14, 2016 8:17 PM, "Matthew Brett" wrote: > > > > On Thu, Apr 14, 2016 at 6:08 AM, Erik Bray wrote: > > > On Wed, Apr 13, 2016 at 9:35 PM, Manuel Nuno Melo > > > wrote: > > >> Hello devs, > > >> > > >> I'm developing the setup.py for a scientific package, MDAnalysis (see PR > > >> #799). We depend on distutils and setuptool. Namely, we use > > >> setuptools.extension.Extension class for our extensions. > > >> > > >> Some older versions of setuptools (<18.0) do filename cythonization > > >> themselves upon initialization of the Extension object. > > >> > > >> Because we want to control name cythonization ourselves I try to directly > > >> use distutils.extension.Extension, which has none of setuptools' > > >> cythonization. However, this doesn't work because setuptools patches > > >> distutils, so that distutils.extension.Extension effectively becomes > > >> setuptools.extension.Extension. > > > > > > I'm wondering what it is specifically you need to do in your > > > subclass--might it still be possible to do with a subclass of the > > > setuptools Extension? Not saying I disagree with the overall idea, > > > but I also wonder if there isn't a better way. 
> > > > I know this is a terrible and ugly hack, but the projects I work in > > have a 'fake_pyrex' directory, that fools setuptools into thinking > > that 'pyrex' is installed, and therefore prevents it from doing the > > .pyx -> .c filename conversions in the extension: > > > > https://github.com/regreg/regreg/blob/master/setup.py#L33 > > > > Cheers, > > > > Matthew > > _______________________________________________ > > cython-devel mailing list > > cython-devel at python.org > > https://mail.python.org/mailman/listinfo/cython-devel > > > _______________________________________________ > cython-devel mailing list > cython-devel at python.org > https://mail.python.org/mailman/listinfo/cython-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: From manuel.nuno.melo at gmail.com Sat Apr 16 07:29:28 2016 From: manuel.nuno.melo at gmail.com (Manuel Nuno Melo) Date: Sat, 16 Apr 2016 13:29:28 +0200 Subject: [Cython] Cannot cythonize subclasses of setuptools.extension._Extension In-Reply-To: References: Message-ID: Hi Erik, Please post your solution; I'm curious to see it. Currently, we're also using setup_requires but are moving away from it because of two main issues: 1- there's no flexibility to hook into it via setuptools.Distribution subclassing (unless you rewrite the entire __init__); 2- (and more serious) setuptools installs setup_requires dependencies in the local directory, not as a system-wide install. Even worse, it then puts a path link to the current directory under the system-wide easy_install.pth. This means that if you install from the source directory via 'sudo ./setup.py install', you get a Cython egg downloaded into that directory, and its path added to sys.path (!!). Needless to say this can break in many nasty ways... 
On Sat, Apr 16, 2016 at 1:10 PM, Erik Bray wrote: > On Apr 14, 2016 21:07, "Manuel Nuno Melo" > wrote: > > > > Our need to control cythonization comes from the fact that we implement > cython as a lazy and optional dependency. Lazy in the sense that we delay > as much as possible cythonization so that setuptools or pip have time to > install cython, if needed. Optional because we distribute both .pyx and > cythonized .c files, and decide on which to use based on user flags. > > > > We, therefore, need an Extension class that only cythonizes if we decide > to. > > > > Thanks for the feedback, > > Manel > > I have a solution for exactly this in astropy_helpers which makes it > possible, for example, to pull in Cython via setup_requires, but only if > it's needed. > > Remind me on Monday and I can point out how it works. > > > On Apr 14, 2016 8:17 PM, "Matthew Brett" > wrote: > > > > > > On Thu, Apr 14, 2016 at 6:08 AM, Erik Bray > wrote: > > > > On Wed, Apr 13, 2016 at 9:35 PM, Manuel Nuno Melo > > > > wrote: > > > >> Hello devs, > > > >> > > > >> I'm developing the setup.py for a scientific package, MDAnalysis > (see PR > > > >> #799). We depend on distutils and setuptool. Namely, we use > > > >> setuptools.extension.Extension class for our extensions. > > > >> > > > >> Some older versions of setuptools (<18.0) do filename cythonization > > > >> themselves upon initialization of the Extension object. > > > >> > > > >> Because we want to control name cythonization ourselves I try to > directly > > > >> use distutils.extension.Extension, which has none of setuptools' > > > >> cythonization. However, this doesn't work because setuptools patches > > > >> distutils, so that distutils.extension.Extension effectively becomes > > > >> setuptools.extension.Extension. > > > > > > > > I'm wondering what it is specifically you need to do in your > > > > subclass--might it still be possible to do with a subclass of the > > > > setuptools Extension? 
Not saying I disagree with the overall idea, > > > > but I also wonder if there isn't a better way. > > > > > > I know this is a terrible and ugly hack, but the projects I work in > > > have a 'fake_pyrex' directory, that fools setuptools into thinking > > > that 'pyrex' is installed, and therefore prevents it from doing the > > > .pyx -> .c filename conversions in the extension: > > > > > > https://github.com/regreg/regreg/blob/master/setup.py#L33 > > > > > > Cheers, > > > > > > Matthew > > > _______________________________________________ > > > cython-devel mailing list > > > cython-devel at python.org > > > https://mail.python.org/mailman/listinfo/cython-devel > > > > > > _______________________________________________ > > cython-devel mailing list > > cython-devel at python.org > > https://mail.python.org/mailman/listinfo/cython-devel > > > > _______________________________________________ > cython-devel mailing list > cython-devel at python.org > https://mail.python.org/mailman/listinfo/cython-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From erik.m.bray at gmail.com Mon Apr 18 05:16:08 2016 From: erik.m.bray at gmail.com (Erik Bray) Date: Mon, 18 Apr 2016 11:16:08 +0200 Subject: [Cython] Cannot cythonize subclasses of setuptools.extension._Extension In-Reply-To: References: Message-ID: On Sat, Apr 16, 2016 at 1:29 PM, Manuel Nuno Melo wrote: > Hi Erik, > Please post your solution; I'm curious to see it. Will do in a bit. I need to see if I can distill it somewhat from its original context so that it can be better understood. > Currently, we're also using setup_requires but are moving away from it > because of two main issues: > > 1- there's no flexibility to hook into it via setuptools.Distribution > subclassing (unless you rewrite the entire __init__); I don't really think that's a big issue but maybe you've encountered some reason that it is. 
The worse thing about it is the general chicken-egg problem that you need to call setup() to get your setup_requires, but you may need some packages pulled in from setup_requires in order to do anything useful with setup(). d2to1 [1] solves this problem quite well, but unfortunately development on it is stalled for the time being. Another alternative that can work quite well is some version of Daniel Holth's "setup-requires" hack [2]. > 2- (and more serious) setuptools installs setup_requires dependencies in the > local directory, not as a system-wide install. That's a feature, not a bug. Build requirements for a package might be in conflict with the versions of those same requirements you have already installed. It also prevents things from being installed that are *not* required at runtime. It makes it easier for projects to pin a little more strictly to their known-working build requirements, without the constraints imposed by supporting a wider range of runtime requirements. > Even worse, it then puts a > path link to the current directory under the system-wide easy_install.pth. > This means that if you install from the source directory via 'sudo > ./setup.py install', you get a Cython egg downloaded into that directory, > and its path added to sys.path (!!). Needless to say this can break in many > nasty ways... Huh?? I'm not sure we're talking about the same thing now. The local .eggs cache directory created for setup_requires downloads most certainly does not get added to easy_install.pth. Are you sure you're not thinking of `setup.py develop`? [1] https://pypi.python.org/pypi/d2to1 [2] https://bitbucket.org/dholth/setup-requires/src > On Sat, Apr 16, 2016 at 1:10 PM, Erik Bray wrote: >> >> On Apr 14, 2016 21:07, "Manuel Nuno Melo" >> wrote: >> > >> > Our need to control cythonization comes from the fact that we implement >> > cython as a lazy and optional dependency. 
Lazy in the sense that we delay as >> > much as possible cythonization so that setuptools or pip have time to >> > install cython, if needed. Optional because we distribute both .pyx and >> > cythonized .c files, and decide on which to use based on user flags. >> > >> > We, therefore, need an Extension class that only cythonizes if we decide >> > to. >> > >> > Thanks for the feedback, >> > Manel >> >> I have a solution for exactly this in astropy_helpers which makes it >> possible, for example, to pull in Cython via setup_requires, but only if >> it's needed. >> >> Remind me on Monday and I can point out how it works. >> >> > On Apr 14, 2016 8:17 PM, "Matthew Brett" >> > wrote: >> > > >> > > On Thu, Apr 14, 2016 at 6:08 AM, Erik Bray >> > > wrote: >> > > > On Wed, Apr 13, 2016 at 9:35 PM, Manuel Nuno Melo >> > > > wrote: >> > > >> Hello devs, >> > > >> >> > > >> I'm developing the setup.py for a scientific package, MDAnalysis >> > > >> (see PR >> > > >> #799). We depend on distutils and setuptool. Namely, we use >> > > >> setuptools.extension.Extension class for our extensions. >> > > >> >> > > >> Some older versions of setuptools (<18.0) do filename cythonization >> > > >> themselves upon initialization of the Extension object. >> > > >> >> > > >> Because we want to control name cythonization ourselves I try to >> > > >> directly >> > > >> use distutils.extension.Extension, which has none of setuptools' >> > > >> cythonization. However, this doesn't work because setuptools >> > > >> patches >> > > >> distutils, so that distutils.extension.Extension effectively >> > > >> becomes >> > > >> setuptools.extension.Extension. >> > > > >> > > > I'm wondering what it is specifically you need to do in your >> > > > subclass--might it still be possible to do with a subclass of the >> > > > setuptools Extension? Not saying I disagree with the overall idea, >> > > > but I also wonder if there isn't a better way. 
>> > > >> > > I know this is a terrible and ugly hack, but the projects I work in >> > > have a 'fake_pyrex' directory, that fools setuptools into thinking >> > > that 'pyrex' is installed, and therefore prevents it from doing the >> > > .pyx -> .c filename conversions in the extension: >> > > >> > > https://github.com/regreg/regreg/blob/master/setup.py#L33 >> > > >> > > Cheers, >> > > >> > > Matthew >> > > _______________________________________________ >> > > cython-devel mailing list >> > > cython-devel at python.org >> > > https://mail.python.org/mailman/listinfo/cython-devel >> > >> > >> > _______________________________________________ >> > cython-devel mailing list >> > cython-devel at python.org >> > https://mail.python.org/mailman/listinfo/cython-devel >> > >> >> >> _______________________________________________ >> cython-devel mailing list >> cython-devel at python.org >> https://mail.python.org/mailman/listinfo/cython-devel >> > > > _______________________________________________ > cython-devel mailing list > cython-devel at python.org > https://mail.python.org/mailman/listinfo/cython-devel > From manuel.nuno.melo at gmail.com Mon Apr 18 12:56:39 2016 From: manuel.nuno.melo at gmail.com (Manuel Nuno Melo) Date: Mon, 18 Apr 2016 18:56:39 +0200 Subject: [Cython] Cannot cythonize subclasses of setuptools.extension._Extension In-Reply-To: References: Message-ID: Ah, sorry Erik, you're absolutely right. I mixed my results a bit and must elaborate: 'setup_requires' on its own will indeed not generate the behavior I described. However, if you define the same dependency under 'setup_requires' AND 'install_requires', then you get the mess I mentioned. Essentially, when you reach 'install_requires' setuptools decides that the egg it has under .eggs suffices, and just links to it. This isn't an issue when depending on cython because it's really just a setup-time dependency. 
However, in our package we depend on numpy at both setup- and runtime, and therefore makes sense to have it under both 'requires' flags. This is why hooking in at the setup_requires step could be a potential workaround, if we could get it to do a full dependency install instead of local egg. I'm currently finishing up this approach. On Mon, Apr 18, 2016 at 11:16 AM, Erik Bray wrote: > On Sat, Apr 16, 2016 at 1:29 PM, Manuel Nuno Melo > wrote: > > Hi Erik, > > Please post your solution; I'm curious to see it. > > Will do in a bit. I need to see if I can distill it somewhat from its > original context so that it can be better understood. > > > Currently, we're also using setup_requires but are moving away from it > > because of two main issues: > > > > 1- there's no flexibility to hook into it via setuptools.Distribution > > subclassing (unless you rewrite the entire __init__); > > I don't really think that's a big issue but maybe you've encountered > some reason that it is. > The worse thing about it is the general chicken-egg problem that you > need to call setup() to get your setup_requires, but you may need some > packages pulled in from setup_requires in order to do anything useful > with setup(). > > d2to1 [1] solves this problem quite well, but unfortunately > development on it is stalled for the time being. Another alternative > that can work quite well is some version of Daniel Holth's > "setup-requires" hack [2]. > > > 2- (and more serious) setuptools installs setup_requires dependencies in > the > > local directory, not as a system-wide install. > > That's a feature, not a bug. Build requirements for a package might > be in conflict with the versions of those same requirements you have > already installed. It also prevents things from being installed that > are *not* required at runtime. 
It makes it easier for projects to pin > a little more strictly to their known-working build requirements, > without the constraints imposed by supporting a wider range of runtime > requirements. > > > Even worse, it then puts a > > path link to the current directory under the system-wide > easy_install.pth. > > This means that if you install from the source directory via 'sudo > > ./setup.py install', you get a Cython egg downloaded into that directory, > > and its path added to sys.path (!!). Needless to say this can break in > many > > nasty ways... > > Huh?? I'm not sure we're talking about the same thing now. The local > .eggs cache directory created for setup_requires downloads most > certainly does not get added to easy_install.pth. Are you sure you're > not thinking of `setup.py develop`? > > [1] https://pypi.python.org/pypi/d2to1 > [2] https://bitbucket.org/dholth/setup-requires/src > > > On Sat, Apr 16, 2016 at 1:10 PM, Erik Bray > wrote: > >> > >> On Apr 14, 2016 21:07, "Manuel Nuno Melo" > >> wrote: > >> > > >> > Our need to control cythonization comes from the fact that we > implement > >> > cython as a lazy and optional dependency. Lazy in the sense that we > delay as > >> > much as possible cythonization so that setuptools or pip have time to > >> > install cython, if needed. Optional because we distribute both .pyx > and > >> > cythonized .c files, and decide on which to use based on user flags. > >> > > >> > We, therefore, need an Extension class that only cythonizes if we > decide > >> > to. > >> > > >> > Thanks for the feedback, > >> > Manel > >> > >> I have a solution for exactly this in astropy_helpers which makes it > >> possible, for example, to pull in Cython via setup_requires, but only if > >> it's needed. > >> > >> Remind me on Monday and I can point out how it works. 
> >> > >> > On Apr 14, 2016 8:17 PM, "Matthew Brett" > >> > wrote: > >> > > > >> > > On Thu, Apr 14, 2016 at 6:08 AM, Erik Bray > >> > > wrote: > >> > > > On Wed, Apr 13, 2016 at 9:35 PM, Manuel Nuno Melo > >> > > > wrote: > >> > > >> Hello devs, > >> > > >> > >> > > >> I'm developing the setup.py for a scientific package, MDAnalysis > >> > > >> (see PR > >> > > >> #799). We depend on distutils and setuptool. Namely, we use > >> > > >> setuptools.extension.Extension class for our extensions. > >> > > >> > >> > > >> Some older versions of setuptools (<18.0) do filename > cythonization > >> > > >> themselves upon initialization of the Extension object. > >> > > >> > >> > > >> Because we want to control name cythonization ourselves I try to > >> > > >> directly > >> > > >> use distutils.extension.Extension, which has none of setuptools' > >> > > >> cythonization. However, this doesn't work because setuptools > >> > > >> patches > >> > > >> distutils, so that distutils.extension.Extension effectively > >> > > >> becomes > >> > > >> setuptools.extension.Extension. > >> > > > > >> > > > I'm wondering what it is specifically you need to do in your > >> > > > subclass--might it still be possible to do with a subclass of the > >> > > > setuptools Extension? Not saying I disagree with the overall > idea, > >> > > > but I also wonder if there isn't a better way. 
> >> > > > >> > > I know this is a terrible and ugly hack, but the projects I work in > >> > > have a 'fake_pyrex' directory, that fools setuptools into thinking > >> > > that 'pyrex' is installed, and therefore prevents it from doing the > >> > > .pyx -> .c filename conversions in the extension: > >> > > > >> > > https://github.com/regreg/regreg/blob/master/setup.py#L33 > >> > > > >> > > Cheers, > >> > > > >> > > Matthew > >> > > _______________________________________________ > >> > > cython-devel mailing list > >> > > cython-devel at python.org > >> > > https://mail.python.org/mailman/listinfo/cython-devel > >> > > >> > > >> > _______________________________________________ > >> > cython-devel mailing list > >> > cython-devel at python.org > >> > https://mail.python.org/mailman/listinfo/cython-devel > >> > > >> > >> > >> _______________________________________________ > >> cython-devel mailing list > >> cython-devel at python.org > >> https://mail.python.org/mailman/listinfo/cython-devel > >> > > > > > > _______________________________________________ > > cython-devel mailing list > > cython-devel at python.org > > https://mail.python.org/mailman/listinfo/cython-devel > > > _______________________________________________ > cython-devel mailing list > cython-devel at python.org > https://mail.python.org/mailman/listinfo/cython-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: From erik.m.bray at gmail.com Tue Apr 19 05:48:41 2016 From: erik.m.bray at gmail.com (Erik Bray) Date: Tue, 19 Apr 2016 11:48:41 +0200 Subject: [Cython] Cannot cythonize subclasses of setuptools.extension._Extension In-Reply-To: References: Message-ID: On Mon, Apr 18, 2016 at 6:56 PM, Manuel Nuno Melo wrote: > Ah, sorry Erik, you're absolutely right. I mixed my results a bit and must > elaborate: > > 'setup_requires' on its own will indeed not generate the behavior I > described. 
> > However, if you define the same dependency under 'setup_requires' AND
> > 'install_requires', then you get the mess I mentioned. Essentially, when you
> > reach 'install_requires' setuptools decides that the egg it has under .eggs
> > suffices, and just links to it.
> >
> > This isn't an issue when depending on cython because it's really just a
> > setup-time dependency. However, in our package we depend on numpy at both
> > setup- and runtime, and therefore makes sense to have it under both
> > 'requires' flags.
> >
> > This is why hooking in at the setup_requires step could be a potential
> > workaround, if we could get it to do a full dependency install instead of
> > local egg. I'm currently finishing up this approach.

Interesting. This sounds to me like a regression, as I don't think it used to do that. This should probably be brought up on distutils-sig or in the setuptools bug tracker, rather than spending a lot of time hacking together a workaround or requiring changes specifically in Cython (though I don't disagree that distutils.Extension should be allowed in the first place).

IIRC it used to be that if the same package was specified in both setup_requires and install_requires it would be built/installed twice. This was recognized as inefficient, and it was suggested that if the copy installed in the local .eggs directory satisfies an install_requires requirement, it should be copied from .eggs to the system site-packages (or whatever the install target is). I remember there at least being *discussion* about that, but not sure what changes were made.

It sounds like in your case it's just adding Cython from your .eggs into easy-install.pth, which I agree is wrong. If that's the case it should be fixed in setuptools over all else. I'll see if I can reproduce it. Jason Coombs, who maintains setuptools, is usually quick to release bug-fix versions if he gets a patch.
Erik > On Mon, Apr 18, 2016 at 11:16 AM, Erik Bray wrote: >> >> On Sat, Apr 16, 2016 at 1:29 PM, Manuel Nuno Melo >> wrote: >> > Hi Erik, >> > Please post your solution; I'm curious to see it. >> >> Will do in a bit. I need to see if I can distill it somewhat from its >> original context so that it can be better understood. >> >> > Currently, we're also using setup_requires but are moving away from it >> > because of two main issues: >> > >> > 1- there's no flexibility to hook into it via setuptools.Distribution >> > subclassing (unless you rewrite the entire __init__); >> >> I don't really think that's a big issue but maybe you've encountered >> some reason that it is. >> The worse thing about it is the general chicken-egg problem that you >> need to call setup() to get your setup_requires, but you may need some >> packages pulled in from setup_requires in order to do anything useful >> with setup(). >> >> d2to1 [1] solves this problem quite well, but unfortunately >> development on it is stalled for the time being. Another alternative >> that can work quite well is some version of Daniel Holth's >> "setup-requires" hack [2]. >> >> > 2- (and more serious) setuptools installs setup_requires dependencies in >> > the >> > local directory, not as a system-wide install. >> >> That's a feature, not a bug. Build requirements for a package might >> be in conflict with the versions of those same requirements you have >> already installed. It also prevents things from being installed that >> are *not* required at runtime. It makes it easier for projects to pin >> a little more strictly to their known-working build requirements, >> without the constraints imposed by supporting a wider range of runtime >> requirements. >> >> > Even worse, it then puts a >> > path link to the current directory under the system-wide >> > easy_install.pth. 
>> > This means that if you install from the source directory via 'sudo >> > ./setup.py install', you get a Cython egg downloaded into that >> > directory, >> > and its path added to sys.path (!!). Needless to say this can break in >> > many >> > nasty ways... >> >> Huh?? I'm not sure we're talking about the same thing now. The local >> .eggs cache directory created for setup_requires downloads most >> certainly does not get added to easy_install.pth. Are you sure you're >> not thinking of `setup.py develop`? >> >> [1] https://pypi.python.org/pypi/d2to1 >> [2] https://bitbucket.org/dholth/setup-requires/src >> >> > On Sat, Apr 16, 2016 at 1:10 PM, Erik Bray >> > wrote: >> >> >> >> On Apr 14, 2016 21:07, "Manuel Nuno Melo" >> >> wrote: >> >> > >> >> > Our need to control cythonization comes from the fact that we >> >> > implement >> >> > cython as a lazy and optional dependency. Lazy in the sense that we >> >> > delay as >> >> > much as possible cythonization so that setuptools or pip have time to >> >> > install cython, if needed. Optional because we distribute both .pyx >> >> > and >> >> > cythonized .c files, and decide on which to use based on user flags. >> >> > >> >> > We, therefore, need an Extension class that only cythonizes if we >> >> > decide >> >> > to. >> >> > >> >> > Thanks for the feedback, >> >> > Manel >> >> >> >> I have a solution for exactly this in astropy_helpers which makes it >> >> possible, for example, to pull in Cython via setup_requires, but only >> >> if >> >> it's needed. >> >> >> >> Remind me on Monday and I can point out how it works. >> >> >> >> > On Apr 14, 2016 8:17 PM, "Matthew Brett" >> >> > wrote: >> >> > > >> >> > > On Thu, Apr 14, 2016 at 6:08 AM, Erik Bray >> >> > > wrote: >> >> > > > On Wed, Apr 13, 2016 at 9:35 PM, Manuel Nuno Melo >> >> > > > wrote: >> >> > > >> Hello devs, >> >> > > >> >> >> > > >> I'm developing the setup.py for a scientific package, MDAnalysis >> >> > > >> (see PR >> >> > > >> #799). 
We depend on distutils and setuptool. Namely, we use >> >> > > >> setuptools.extension.Extension class for our extensions. >> >> > > >> >> >> > > >> Some older versions of setuptools (<18.0) do filename >> >> > > >> cythonization >> >> > > >> themselves upon initialization of the Extension object. >> >> > > >> >> >> > > >> Because we want to control name cythonization ourselves I try to >> >> > > >> directly >> >> > > >> use distutils.extension.Extension, which has none of setuptools' >> >> > > >> cythonization. However, this doesn't work because setuptools >> >> > > >> patches >> >> > > >> distutils, so that distutils.extension.Extension effectively >> >> > > >> becomes >> >> > > >> setuptools.extension.Extension. >> >> > > > >> >> > > > I'm wondering what it is specifically you need to do in your >> >> > > > subclass--might it still be possible to do with a subclass of the >> >> > > > setuptools Extension? Not saying I disagree with the overall >> >> > > > idea, >> >> > > > but I also wonder if there isn't a better way. 
>> >> > > >> >> > > I know this is a terrible and ugly hack, but the projects I work in >> >> > > have a 'fake_pyrex' directory, that fools setuptools into thinking >> >> > > that 'pyrex' is installed, and therefore prevents it from doing the >> >> > > .pyx -> .c filename conversions in the extension: >> >> > > >> >> > > https://github.com/regreg/regreg/blob/master/setup.py#L33 >> >> > > >> >> > > Cheers, >> >> > > >> >> > > Matthew >> >> > > _______________________________________________ >> >> > > cython-devel mailing list >> >> > > cython-devel at python.org >> >> > > https://mail.python.org/mailman/listinfo/cython-devel >> >> > >> >> > >> >> > _______________________________________________ >> >> > cython-devel mailing list >> >> > cython-devel at python.org >> >> > https://mail.python.org/mailman/listinfo/cython-devel >> >> > >> >> >> >> >> >> _______________________________________________ >> >> cython-devel mailing list >> >> cython-devel at python.org >> >> https://mail.python.org/mailman/listinfo/cython-devel >> >> >> > >> > >> > _______________________________________________ >> > cython-devel mailing list >> > cython-devel at python.org >> > https://mail.python.org/mailman/listinfo/cython-devel >> > >> _______________________________________________ >> cython-devel mailing list >> cython-devel at python.org >> https://mail.python.org/mailman/listinfo/cython-devel > > > > _______________________________________________ > cython-devel mailing list > cython-devel at python.org > https://mail.python.org/mailman/listinfo/cython-devel > From xuancong84 at gmail.com Tue Apr 19 05:13:39 2016 From: xuancong84 at gmail.com (Xuancong Wang) Date: Tue, 19 Apr 2016 17:13:39 +0800 Subject: [Cython] unsupported meta-programming-related features Message-ID: Dear Cython developers, Python supports meta-programming, in which a variable with name specified in a string can be created at run-time. One built-in library which make use of this is argparse. 
For example:

    parser.add_argument('-N', '--max_threads',
                        help='maximum number of concurrent decoding threads',
                        type=int, default=16)

In this case, the variable max_threads is created from the string argument. And then Cython will generate an incorrect C program with the following error:

    smt.py:78:88: undeclared name not builtin: headtail
    smt.c:1:2: error: #error Do not use this file, it is the result of a
    failed Cython compilation.

In comparison, I found that nuitka can convert this kind of Python program successfully. I hope Cython can be improved. Thanks!

Cheers,
Xuancong

From manuel.nuno.melo at gmail.com Tue Apr 19 08:35:03 2016
From: manuel.nuno.melo at gmail.com (Manuel Nuno Melo)
Date: Tue, 19 Apr 2016 14:35:03 +0200
Subject: [Cython] Cannot cythonize subclasses of setuptools.extension._Extension
In-Reply-To: References: Message-ID:

This strayed a bit from the original topic, but yes, I should bring this up with the setuptools project. Thanks for the feedback. (My feeling was that this was a somewhat stalled problem, reported in two issues and persisting after a couple of years: #209 and #391.)

As you said, even if it is a hackish workaround here, Cython should still allow setuptools.extension._Extension.

On Tue, Apr 19, 2016 at 11:48 AM, Erik Bray wrote:
> On Mon, Apr 18, 2016 at 6:56 PM, Manuel Nuno Melo wrote:
> > Ah, sorry Erik, you're absolutely right. I mixed my results a bit and must
> > elaborate:
> >
> > 'setup_requires' on its own will indeed not generate the behavior I
> > described.
> >
> > However, if you define the same dependency under 'setup_requires' AND
> > 'install_requires', then you get the mess I mentioned. Essentially, when you
> > reach 'install_requires' setuptools decides that the egg it has under .eggs
> > suffices, and just links to it.
> >
> > This isn't an issue when depending on cython because it's really just a
> > setup-time dependency.
However, in our package we depend on numpy at both > > setup- and runtime, and therefore makes sense to have it under both > > 'requires' flags. > > > > This is why hooking in at the setup_requires step could be a potential > > workaround, if we could get it to do a full dependency install instead of > > local egg. I'm currently finishing up this approach. > > Interesting. This sounds to me like a regression, as I don't think it > used to do that. This should probably be brought up on distutils-sig > or in the setuptools bug tracker, rather than spending a lot of time > hacking together a workaround or requiring changes specifically in > Cython (though I don't disagree that distutils.Extension should be > allowed in the first place). > > IIRC it used to be that if the same package was specified in both > setup_requires and install_requires it would be built/installed twice. > This was recognized as inefficient, and that the copy installed in the > local .eggs directory satisfies an install_requires requirement it > should be copied from .eggs to the system site-packages (or whatever > the install target is). I remember there at least being *discussion* > about that, but not sure what changes were made. > > It sounds like in your case it's just adding Cython from your .eggs > into easy-install.pth which I agree is wrong. If that's the case it > should be fixed in setuptools over all else. I'll see if I can > reproduce it. Jason Coombs who mantains setuptools is usually quick > to release bug fix versions if he gets a patch. > > Erik > > > On Mon, Apr 18, 2016 at 11:16 AM, Erik Bray > wrote: > >> > >> On Sat, Apr 16, 2016 at 1:29 PM, Manuel Nuno Melo > >> wrote: > >> > Hi Erik, > >> > Please post your solution; I'm curious to see it. > >> > >> Will do in a bit. I need to see if I can distill it somewhat from its > >> original context so that it can be better understood. 
> >> > >> > Currently, we're also using setup_requires but are moving away from it > >> > because of two main issues: > >> > > >> > 1- there's no flexibility to hook into it via setuptools.Distribution > >> > subclassing (unless you rewrite the entire __init__); > >> > >> I don't really think that's a big issue but maybe you've encountered > >> some reason that it is. > >> The worse thing about it is the general chicken-egg problem that you > >> need to call setup() to get your setup_requires, but you may need some > >> packages pulled in from setup_requires in order to do anything useful > >> with setup(). > >> > >> d2to1 [1] solves this problem quite well, but unfortunately > >> development on it is stalled for the time being. Another alternative > >> that can work quite well is some version of Daniel Holth's > >> "setup-requires" hack [2]. > >> > >> > 2- (and more serious) setuptools installs setup_requires dependencies > in > >> > the > >> > local directory, not as a system-wide install. > >> > >> That's a feature, not a bug. Build requirements for a package might > >> be in conflict with the versions of those same requirements you have > >> already installed. It also prevents things from being installed that > >> are *not* required at runtime. It makes it easier for projects to pin > >> a little more strictly to their known-working build requirements, > >> without the constraints imposed by supporting a wider range of runtime > >> requirements. > >> > >> > Even worse, it then puts a > >> > path link to the current directory under the system-wide > >> > easy_install.pth. > >> > This means that if you install from the source directory via 'sudo > >> > ./setup.py install', you get a Cython egg downloaded into that > >> > directory, > >> > and its path added to sys.path (!!). Needless to say this can break in > >> > many > >> > nasty ways... > >> > >> Huh?? I'm not sure we're talking about the same thing now. 
The local > >> .eggs cache directory created for setup_requires downloads most > >> certainly does not get added to easy_install.pth. Are you sure you're > >> not thinking of `setup.py develop`? > >> > >> [1] https://pypi.python.org/pypi/d2to1 > >> [2] https://bitbucket.org/dholth/setup-requires/src > >> > >> > On Sat, Apr 16, 2016 at 1:10 PM, Erik Bray > >> > wrote: > >> >> > >> >> On Apr 14, 2016 21:07, "Manuel Nuno Melo" < > manuel.nuno.melo at gmail.com> > >> >> wrote: > >> >> > > >> >> > Our need to control cythonization comes from the fact that we > >> >> > implement > >> >> > cython as a lazy and optional dependency. Lazy in the sense that we > >> >> > delay as > >> >> > much as possible cythonization so that setuptools or pip have time > to > >> >> > install cython, if needed. Optional because we distribute both .pyx > >> >> > and > >> >> > cythonized .c files, and decide on which to use based on user > flags. > >> >> > > >> >> > We, therefore, need an Extension class that only cythonizes if we > >> >> > decide > >> >> > to. > >> >> > > >> >> > Thanks for the feedback, > >> >> > Manel > >> >> > >> >> I have a solution for exactly this in astropy_helpers which makes it > >> >> possible, for example, to pull in Cython via setup_requires, but only > >> >> if > >> >> it's needed. > >> >> > >> >> Remind me on Monday and I can point out how it works. > >> >> > >> >> > On Apr 14, 2016 8:17 PM, "Matthew Brett" > >> >> > wrote: > >> >> > > > >> >> > > On Thu, Apr 14, 2016 at 6:08 AM, Erik Bray < > erik.m.bray at gmail.com> > >> >> > > wrote: > >> >> > > > On Wed, Apr 13, 2016 at 9:35 PM, Manuel Nuno Melo > >> >> > > > wrote: > >> >> > > >> Hello devs, > >> >> > > >> > >> >> > > >> I'm developing the setup.py for a scientific package, > MDAnalysis > >> >> > > >> (see PR > >> >> > > >> #799). We depend on distutils and setuptool. Namely, we use > >> >> > > >> setuptools.extension.Extension class for our extensions. 
> >> >> > > >> > >> >> > > >> Some older versions of setuptools (<18.0) do filename > >> >> > > >> cythonization > >> >> > > >> themselves upon initialization of the Extension object. > >> >> > > >> > >> >> > > >> Because we want to control name cythonization ourselves I try > to > >> >> > > >> directly > >> >> > > >> use distutils.extension.Extension, which has none of > setuptools' > >> >> > > >> cythonization. However, this doesn't work because setuptools > >> >> > > >> patches > >> >> > > >> distutils, so that distutils.extension.Extension effectively > >> >> > > >> becomes > >> >> > > >> setuptools.extension.Extension. > >> >> > > > > >> >> > > > I'm wondering what it is specifically you need to do in your > >> >> > > > subclass--might it still be possible to do with a subclass of > the > >> >> > > > setuptools Extension? Not saying I disagree with the overall > >> >> > > > idea, > >> >> > > > but I also wonder if there isn't a better way. > >> >> > > > >> >> > > I know this is a terrible and ugly hack, but the projects I work > in > >> >> > > have a 'fake_pyrex' directory, that fools setuptools into > thinking > >> >> > > that 'pyrex' is installed, and therefore prevents it from doing > the > >> >> > > .pyx -> .c filename conversions in the extension: > >> >> > > > >> >> > > https://github.com/regreg/regreg/blob/master/setup.py#L33 > >> >> > > > >> >> > > Cheers, > >> >> > > > >> >> > > Matthew > >> >> > > _______________________________________________ > >> >> > > cython-devel mailing list > >> >> > > cython-devel at python.org > >> >> > > https://mail.python.org/mailman/listinfo/cython-devel > >> >> > > >> >> > > >> >> > _______________________________________________ > >> >> > cython-devel mailing list > >> >> > cython-devel at python.org > >> >> > https://mail.python.org/mailman/listinfo/cython-devel > >> >> > > >> >> > >> >> > >> >> _______________________________________________ > >> >> cython-devel mailing list > >> >> cython-devel at python.org > >> 
>> https://mail.python.org/mailman/listinfo/cython-devel > >> >> > >> > > >> > > >> > _______________________________________________ > >> > cython-devel mailing list > >> > cython-devel at python.org > >> > https://mail.python.org/mailman/listinfo/cython-devel > >> > > >> _______________________________________________ > >> cython-devel mailing list > >> cython-devel at python.org > >> https://mail.python.org/mailman/listinfo/cython-devel > > > > > > > > _______________________________________________ > > cython-devel mailing list > > cython-devel at python.org > > https://mail.python.org/mailman/listinfo/cython-devel > > > _______________________________________________ > cython-devel mailing list > cython-devel at python.org > https://mail.python.org/mailman/listinfo/cython-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: From robertwb at gmail.com Tue Apr 19 13:41:26 2016 From: robertwb at gmail.com (Robert Bradshaw) Date: Tue, 19 Apr 2016 10:41:26 -0700 Subject: [Cython] unsupported meta-programming-related features In-Reply-To: References: Message-ID: On Tue, Apr 19, 2016 at 2:13 AM, Xuancong Wang wrote: > Dear Cython developers, > > Python supports meta-programming, in which a variable with name > specified in a string can be created at run-time. One built-in library > which make use of this is argparse. > > For example: > > parser.add_argument('-N', '--max_threads', help='maximum number of > concurrent decoding threads', type=int, default=16) > > in this case, the variable max_threads is created from the string > argument. And then Cython will generate an incorrect C program with > the following error: > > smt.py:78:88: undeclared name not builtin: headtail > smt.c:1:2: error: #error Do not use this file, it is the result of a > failed Cython compilation. 
Argparse works just fine in Cython:

import argparse
parser = argparse.ArgumentParser()
parser.add_argument('-N', '--max_threads', help='maximum number of
concurrent decoding threads', type=int, default=16)
args = parser.parse_args()
print args.max_threads

Could you provide a full example that you think should work? Where is
headtail defined? Is it another argument?

> In comparison, I found that nuitka can convert this kind of Python
> programs successfully. I hope Cython can be improved. Thanks!

argparse dynamically adds the attributes to the returned object, which
works fine in Cython. Unless you're doing something like

http://stackoverflow.com/questions/19299635/python-argparse-parse-args-into-global-namespace-or-a-reason-this-is-a-bad-idea

That's one of the few places we diverge, because people strongly preferred
compile-time errors to runtime errors in this case.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From stefan_ml at behnel.de Wed Apr 20 01:36:16 2016
From: stefan_ml at behnel.de (Stefan Behnel)
Date: Wed, 20 Apr 2016 07:36:16 +0200
Subject: [Cython] unsupported meta-programming-related features
In-Reply-To: 
References: 
Message-ID: <57171550.8090704@behnel.de>

Xuancong Wang wrote on 19.04.2016 at 11:13:
> Python supports meta-programming, in which a variable with name
> specified in a string can be created at run-time. One built-in library
> which makes use of this is argparse.
>
> For example:
>
> parser.add_argument('-N', '--max_threads', help='maximum number of
> concurrent decoding threads', type=int, default=16)
>
> in this case, the variable max_threads is created from the string
> argument.

Actually, no. What argparse returns on parsing is an object with
dynamically created attributes. It doesn't magically inject any global
variables into your module namespace.

Or were you trying to compile argparse itself? Then I don't see a reason
either why that shouldn't work.
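To make the distinction concrete, here is a tiny sketch in plain Python
(nothing Cython-specific is involved; the option name is taken from your
example):

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument('-N', '--max_threads', type=int, default=16)
opt = parser.parse_args([])  # parse an empty command line, so the default applies

# The parsed option lives as an attribute on the returned Namespace object...
print(opt.max_threads)             # prints 16
# ...but no global variable of that name was created behind your back.
print('max_threads' in globals())  # prints False
```

Plain attribute access like `opt.max_threads` is something Cython compiles
without needing to know the attribute names in advance; it is only names
injected into the module namespace at run-time that trip up the
compile-time check.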
> And then Cython will generate an incorrect C program with > the following error: > > smt.py:78:88: undeclared name not builtin: headtail > smt.c:1:2: error: #error Do not use this file, it is the result of a > failed Cython compilation. > > In comparison, I found that nuitka can convert this kind of Python > programs sucessfully. I hope Cython can be improved. Thanks! Cython makes this an error because in almost all cases, with very few exceptions, it is an actual bug in your program. For me personally, this already helped me find several bugs in the past, both in my code as well as other people's code, including at least one in Python's standard library. If you think you know better, you can disable the check by setting Cython.Compiler.Options.error_on_unknown_names = False Stefan From xuancong84 at gmail.com Wed Apr 20 02:22:25 2016 From: xuancong84 at gmail.com (Xuancong Wang) Date: Wed, 20 Apr 2016 14:22:25 +0800 Subject: [Cython] unsupported meta-programming-related features In-Reply-To: References: Message-ID: Sorry, I have a few extra steps which map all parser symbols into the global space. parser = argparse.ArgumentParser(...) parser.add_argument('-ht', '--headtail', help='add headtail', default=False, action='store_true') opt=parser.parse_args() globals().update(vars(opt)) xuancong On Wed, Apr 20, 2016 at 1:41 AM, Robert Bradshaw wrote: > On Tue, Apr 19, 2016 at 2:13 AM, Xuancong Wang wrote: >> >> Dear Cython developers, >> >> Python supports meta-programming, in which a variable with name >> specified in a string can be created at run-time. One built-in library >> which make use of this is argparse. >> >> For example: >> >> parser.add_argument('-N', '--max_threads', help='maximum number of >> concurrent decoding threads', type=int, default=16) >> >> in this case, the variable max_threads is created from the string >> argument. 
And then Cython will generate an incorrect C program with >> the following error: >> >> smt.py:78:88: undeclared name not builtin: headtail >> smt.c:1:2: error: #error Do not use this file, it is the result of a >> failed Cython compilation. > > > Argparse works just fine in Cython > > import argparse > parser = argparse.ArgumentParser() > parser.add_argument('-N', '--max_threads', help='maximum number of > concurrent decoding threads', type=int, default=16) > args = parser.parse_args() > print args.max_threads > > Could you provide a full example that you think should work? Where is > headtail defined? Is it another argument? > >> >> In comparison, I found that nuitka can convert this kind of Python >> programs sucessfully. I hope Cython can be improved. Thanks! > > > argsparse dynamically adds the attributes to the returned object, which > works fine in Cython. Unless you're doing something like > > http://stackoverflow.com/questions/19299635/python-argparse-parse-args-into-global-namespace-or-a-reason-this-is-a-bad-idea > > That's one of the few places we diverge, because people strongly preferred > compile-time errors to runtime errors in this case. > > > _______________________________________________ > cython-devel mailing list > cython-devel at python.org > https://mail.python.org/mailman/listinfo/cython-devel > From robertwb at gmail.com Thu Apr 21 01:51:51 2016 From: robertwb at gmail.com (Robert Bradshaw) Date: Wed, 20 Apr 2016 22:51:51 -0700 Subject: [Cython] unsupported meta-programming-related features In-Reply-To: References: Message-ID: On Tue, Apr 19, 2016 at 11:22 PM, Xuancong Wang wrote: > Sorry, I have a few extra steps which map all parser symbols into the > global space. > > parser = argparse.ArgumentParser(...) 
> parser.add_argument('-ht', '--headtail', help='add headtail', > default=False, action='store_true') > opt=parser.parse_args() > globals().update(vars(opt)) > Symbols injected into globals() manually are not supported by default, see Stefan's comments on how to support this. Documented at https://github.com/cython/cython/wiki/Unsupported On Wed, Apr 20, 2016 at 1:41 AM, Robert Bradshaw wrote: > > On Tue, Apr 19, 2016 at 2:13 AM, Xuancong Wang > wrote: > >> > >> Dear Cython developers, > >> > >> Python supports meta-programming, in which a variable with name > >> specified in a string can be created at run-time. One built-in library > >> which make use of this is argparse. > >> > >> For example: > >> > >> parser.add_argument('-N', '--max_threads', help='maximum number of > >> concurrent decoding threads', type=int, default=16) > >> > >> in this case, the variable max_threads is created from the string > >> argument. And then Cython will generate an incorrect C program with > >> the following error: > >> > >> smt.py:78:88: undeclared name not builtin: headtail > >> smt.c:1:2: error: #error Do not use this file, it is the result of a > >> failed Cython compilation. > > > > > > Argparse works just fine in Cython > > > > import argparse > > parser = argparse.ArgumentParser() > > parser.add_argument('-N', '--max_threads', help='maximum number of > > concurrent decoding threads', type=int, default=16) > > args = parser.parse_args() > > print args.max_threads > > > > Could you provide a full example that you think should work? Where is > > headtail defined? Is it another argument? > > > >> > >> In comparison, I found that nuitka can convert this kind of Python > >> programs sucessfully. I hope Cython can be improved. Thanks! > > > > > > argsparse dynamically adds the attributes to the returned object, which > > works fine in Cython. 
Unless you're doing something like > > > > > http://stackoverflow.com/questions/19299635/python-argparse-parse-args-into-global-namespace-or-a-reason-this-is-a-bad-idea > > > > That's one of the few places we diverge, because people strongly > preferred > > compile-time errors to runtime errors in this case. > > > > > > _______________________________________________ > > cython-devel mailing list > > cython-devel at python.org > > https://mail.python.org/mailman/listinfo/cython-devel > > > _______________________________________________ > cython-devel mailing list > cython-devel at python.org > https://mail.python.org/mailman/listinfo/cython-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: From matthew.brett at gmail.com Thu Apr 21 13:47:06 2016 From: matthew.brett at gmail.com (Matthew Brett) Date: Thu, 21 Apr 2016 10:47:06 -0700 Subject: [Cython] Manylinux wheels for Cython In-Reply-To: References: Message-ID: Hi, On Mon, Mar 28, 2016 at 10:54 AM, Matthew Brett wrote: > Hi, > > On Fri, Mar 25, 2016 at 11:46 AM, Matthew Brett wrote: >> Hi, >> >> On Tue, Mar 15, 2016 at 2:58 PM, Matthew Brett wrote: >>> Hi, >>> >>> On Mon, Mar 7, 2016 at 5:47 PM, Matthew Brett wrote: >>>> Hi, >>>> >>>> I don't know whether y'all have been following over at distutils-sig, >>>> but there's a new distutils PEP that defines a `manylinux` format for >>>> Linux wheels that work on many different x86 Linux distributions: >>>> >>>> https://www.python.org/dev/peps/pep-0513/ >>>> https://github.com/pypa/manylinux >>>> >>>> The latest version of pip will install these, if the client Linux >>>> system is compatible with the manylinux spec: >>>> >>>> https://pip.pypa.io/en/stable/news/ >>>> >>>> I've already built and used manylinux Cython wheels, which y'all are >>>> welcome to test with: >>>> >>>> pip install -f https://nipy.bic.berkeley.edu/manylinux cython >>>> >>>> (The wheels there don't have the right manylinux filenames yet, but >>>> they have the 
same contents as the ones that would go up to pypi). >>>> >>>> I've already had good use from these wheels in speeding up project >>>> builds into docker containers and virtualenvs, and I'd love to upload >>>> these to pypi. I have permissions on pypi to do this, but I wanted >>>> to check in with y'all first... >>> >>> There is now a test wheel for Cython 0.23.4 and Python 3.5 on the >>> testpypi server. >>> >>> This is me downloading and installing - a matter of a few seconds: >>> >>> $ python -m pip install -U pip >>> Downloading/unpacking pip from >>> https://pypi.python.org/packages/py2.py3/p/pip/pip-8.1.0-py2.py3-none-any.whl#md5=c6eca6736b2b8f7280fb25e44be7c51b >>> Downloading pip-8.1.0-py2.py3-none-any.whl (1.2MB): 1.2MB downloaded >>> Installing collected packages: pip >>> Found existing installation: pip 1.5.6 >>> Uninstalling pip: >>> Successfully uninstalled pip >>> Successfully installed pip >>> Cleaning up... >>> $ pip install -i https://testpypi.python.org/pypi/ cython >>> Collecting cython >>> Using cached https://testpypi.python.org/packages/cp35/C/Cython/Cython-0.23.4-cp35-cp35m-manylinux1_x86_64.whl >>> Installing collected packages: cython >>> Successfully installed cython-0.23.4 >>> $ cython --version >>> Cython version 0.23.4 >>> >>> The installed Cython version compiles all the Demo *.pyx files OK. >>> >>> See also : https://mail.python.org/pipermail/wheel-builders/2016-March/000050.html >> >> A manylinux wheel (gevent) is already the current most-downloaded >> binary wheel from pypi. >> >> A reminder that y'all can test the Cython wheels with: >> >> python -m pip install --upgrade pip # You need latest pip >> pip install -f https://nipy.bic.berkeley.edu/manylinux cython >> >> If I don't hear any objections, I plan to upload the Cython manylinux >> wheels on Monday 28th. > > I uploaded manylinux wheels for Cython 0.23.5. 
> > If you're on Linux, and you upgrade pip to 8.1.1 (current) you should > now get Cython via a manylinux wheel by default. > > Please do test and let me know of any problems. I haven't heard of any problems, and both current and historical numpy and scipy wheels appear to be working without problems as well. So, I propose to upload historical Cython wheels (for versions 0.17 and up) to speed up CI testing with older versions of Cython. You can test the built wheels now with something like: python -m pip install --upgrade pip pip install -f https://nipy.bic.berkeley.edu/manylinux cython==0.17 Cheers, Matthew From yury at shurup.com Thu Apr 21 15:14:04 2016 From: yury at shurup.com (Yury V. Zaytsev) Date: Thu, 21 Apr 2016 21:14:04 +0200 (CEST) Subject: [Cython] Manylinux wheels for Cython In-Reply-To: References: Message-ID: Hi Matthew, On Thu, 21 Apr 2016, Matthew Brett wrote: > I haven't heard of any problems, and both current and historical numpy > and scipy wheels appear to be working without problems as well. I've been playing with your wheels last week and they seem to work just fine. Not sure if there is much market for debug wheels, I'm currently fetching the pre-built ones for the release version and building the debug ones myself... which is something I can totally live with. Thank you for your efforts! -- Sincerely yours, Yury V. Zaytsev From stefan_ml at behnel.de Fri Apr 22 03:01:07 2016 From: stefan_ml at behnel.de (Stefan Behnel) Date: Fri, 22 Apr 2016 09:01:07 +0200 Subject: [Cython] Manylinux wheels for Cython In-Reply-To: References: Message-ID: <5719CC33.6090507@behnel.de> Matthew Brett schrieb am 21.04.2016 um 19:47: > I haven't heard of any problems, and both current and historical numpy > and scipy wheels appear to be working without problems as well. > > So, I propose to upload historical Cython wheels (for versions 0.17 > and up) to speed up CI testing with older versions of Cython. 
There wasn't much feedback overall, so I'll just say that I'm happy you've put some effort into this. Thanks! I'm all for uploading wheels for those older versions to reduce build times for users. Even in the worst case, we can always take them out if anything goes really wrong. Stefan From isuruf at gmail.com Fri Apr 22 05:14:57 2016 From: isuruf at gmail.com (Isuru Fernando) Date: Fri, 22 Apr 2016 14:44:57 +0530 Subject: [Cython] Cython compiler crash in 0.24 Message-ID: Hi, When cythonizing a .pyx I get an error in Cython 0.24 which was not there in Cython 0.23.3 After printing the stacktrace, this seems to be because of this line https://github.com/cython/cython/commit/6d55fd189f6ee9d4374d00b8c9c320bd04332bab#diff-28c66ef9e2ff564619ef82aa9d72ee7dR2762 where a None object is passed and the following line calls the None object. https://github.com/cython/cython/blob/6d55fd189f6ee9d4374d00b8c9c320bd04332bab/Cython/Compiler/ExprNodes.py#L1780 Let me know if you need more information. Thanks, Isuru Fernando [ 33%] Cythonizing symengine_wrapper.pyx Error compiling Cython file: ------------------------------------------------------------ ... 
cdef double complex[::1] cmplx_view if real: try: real_view = iterable except (ValueError, TypeError): real_view = cython.view.array(shape=(_size(iterable),), ^ ------------------------------------------------------------ symengine_wrapper.pyx:2464:54: Compiler crash in TransformBuiltinMethods ModuleNode.body = StatListNode(symengine_wrapper.pyx:1:0) StatListNode.stats[163] = StatListNode(symengine_wrapper.pyx:2455:0) StatListNode.stats[0] = DefNode(symengine_wrapper.pyx:2455:0, doc = ' if iterable supports the buffer interface: return iterable,\n if not, return a cython.view.array object (which does) ', modifiers = [...]/0, name = 'with_buffer', num_required_args = 1, py_wrapper_required = True, reqd_kw_flags_cname = '0') DefNode.body = StatListNode(symengine_wrapper.pyx:2456:4) StatListNode.stats[0] = IfStatNode(symengine_wrapper.pyx:2460:4) IfStatNode.if_clauses[0] = IfClauseNode(symengine_wrapper.pyx:2460:7) IfClauseNode.body = StatListNode(symengine_wrapper.pyx:2461:8) StatListNode.stats[0] = TryExceptStatNode(symengine_wrapper.pyx:2461:8) TryExceptStatNode.except_clauses[0] = ExceptClauseNode(symengine_wrapper.pyx:2463:8) ExceptClauseNode.body = StatListNode(symengine_wrapper.pyx:2464:12, is_terminator = True) StatListNode.stats[0] = SingleAssignmentNode(symengine_wrapper.pyx:2464:41) SingleAssignmentNode.rhs = GeneralCallNode(symengine_wrapper.pyx:2464:41, result_is_used = True, use_managed_ref = True) File 'ExprNodes.py', line 8035, in compile_time_value: DictNode(symengine_wrapper.pyx:2464:47, is_dict_literal = True, is_temp = 1, obj_conversion_errors = [...]/0, reject_duplicates = True, result_is_used = True, use_managed_ref = True) File 'ExprNodes.py', line 7334, in compile_time_value: TupleNode(symengine_wrapper.pyx:2464:49, is_sequence_constructor = 1, result_is_used = True, use_managed_ref = True) File 'ExprNodes.py', line 6730, in compile_time_value_list: TupleNode(symengine_wrapper.pyx:2464:49, is_sequence_constructor = 1, result_is_used = True, 
use_managed_ref = True) File 'ExprNodes.py', line 4981, in compile_time_value: SimpleCallNode(symengine_wrapper.pyx:2464:54, result_is_used = True, use_managed_ref = True) File 'ExprNodes.py', line 1783, in compile_time_value: NameNode(symengine_wrapper.pyx:2464:54, cf_maybe_null = True, is_name = True, name = '_size', result_is_used = True, use_managed_ref = True) Compiler crash traceback from this point on: File "/home/isuru/miniconda3/envs/test-cython/lib/python3.5/site-packages/Cython/Compiler/ExprNodes.py", line 1781, in compile_time_value return denv.lookup(self.name) AttributeError: 'NoneType' object has no attribute 'lookup' -------------- next part -------------- An HTML attachment was scrubbed... URL: From matthew.brett at gmail.com Fri Apr 22 17:02:07 2016 From: matthew.brett at gmail.com (Matthew Brett) Date: Fri, 22 Apr 2016 14:02:07 -0700 Subject: [Cython] Manylinux wheels for Cython In-Reply-To: <5719CC33.6090507@behnel.de> References: <5719CC33.6090507@behnel.de> Message-ID: On Fri, Apr 22, 2016 at 12:01 AM, Stefan Behnel wrote: > Matthew Brett schrieb am 21.04.2016 um 19:47: >> I haven't heard of any problems, and both current and historical numpy >> and scipy wheels appear to be working without problems as well. >> >> So, I propose to upload historical Cython wheels (for versions 0.17 >> and up) to speed up CI testing with older versions of Cython. > > There wasn't much feedback overall, so I'll just say that I'm happy you've > put some effort into this. Thanks! No problem, happy to return something for all the good use I get from Cython. > I'm all for uploading wheels for those older versions to reduce build times > for users. Even in the worst case, we can always take them out if anything > goes really wrong. OK - I uploaded 64-bit wheels back to Cython 0.17 - please let me know of any problems. I have built 32-bit wheels as well, but haven't uploaded them as yet. They are in https://nipy.bic.berkeley.edu/manylinux if y'all want to test. 
Cheers, Matthew From robertwb at gmail.com Sat Apr 23 05:48:22 2016 From: robertwb at gmail.com (Robert Bradshaw) Date: Sat, 23 Apr 2016 02:48:22 -0700 Subject: [Cython] Cython compiler crash in 0.24 In-Reply-To: References: Message-ID: Thanks for the report. Looking into it. On Fri, Apr 22, 2016 at 2:14 AM, Isuru Fernando wrote: > Hi, > > When cythonizing a .pyx I get an error in Cython 0.24 which was not there > in Cython 0.23.3 > > After printing the stacktrace, this seems to be because of this line > https://github.com/cython/cython/commit/6d55fd189f6ee9d4374d00b8c9c320bd04332bab#diff-28c66ef9e2ff564619ef82aa9d72ee7dR2762 > > where a None object is passed and the following line calls the None object. > > > https://github.com/cython/cython/blob/6d55fd189f6ee9d4374d00b8c9c320bd04332bab/Cython/Compiler/ExprNodes.py#L1780 > > Let me know if you need more information. > > Thanks, > > Isuru Fernando > > > > [ 33%] Cythonizing symengine_wrapper.pyx > > Error compiling Cython file: > ------------------------------------------------------------ > ... 
> cdef double complex[::1] cmplx_view > if real: > try: > real_view = iterable > except (ValueError, TypeError): > real_view = cython.view.array(shape=(_size(iterable),), > ^ > ------------------------------------------------------------ > > symengine_wrapper.pyx:2464:54: Compiler crash in TransformBuiltinMethods > > ModuleNode.body = StatListNode(symengine_wrapper.pyx:1:0) > StatListNode.stats[163] = StatListNode(symengine_wrapper.pyx:2455:0) > StatListNode.stats[0] = DefNode(symengine_wrapper.pyx:2455:0, > doc = ' if iterable supports the buffer interface: return iterable,\n > if not, return a cython.view.array object (which does) ', > modifiers = [...]/0, > name = 'with_buffer', > num_required_args = 1, > py_wrapper_required = True, > reqd_kw_flags_cname = '0') > DefNode.body = StatListNode(symengine_wrapper.pyx:2456:4) > StatListNode.stats[0] = IfStatNode(symengine_wrapper.pyx:2460:4) > IfStatNode.if_clauses[0] = IfClauseNode(symengine_wrapper.pyx:2460:7) > IfClauseNode.body = StatListNode(symengine_wrapper.pyx:2461:8) > StatListNode.stats[0] = TryExceptStatNode(symengine_wrapper.pyx:2461:8) > TryExceptStatNode.except_clauses[0] = > ExceptClauseNode(symengine_wrapper.pyx:2463:8) > ExceptClauseNode.body = StatListNode(symengine_wrapper.pyx:2464:12, > is_terminator = True) > StatListNode.stats[0] = SingleAssignmentNode(symengine_wrapper.pyx:2464:41) > SingleAssignmentNode.rhs = GeneralCallNode(symengine_wrapper.pyx:2464:41, > result_is_used = True, > use_managed_ref = True) > File 'ExprNodes.py', line 8035, in compile_time_value: > DictNode(symengine_wrapper.pyx:2464:47, > is_dict_literal = True, > is_temp = 1, > obj_conversion_errors = [...]/0, > reject_duplicates = True, > result_is_used = True, > use_managed_ref = True) > File 'ExprNodes.py', line 7334, in compile_time_value: > TupleNode(symengine_wrapper.pyx:2464:49, > is_sequence_constructor = 1, > result_is_used = True, > use_managed_ref = True) > File 'ExprNodes.py', line 6730, in compile_time_value_list: 
> TupleNode(symengine_wrapper.pyx:2464:49, > is_sequence_constructor = 1, > result_is_used = True, > use_managed_ref = True) > File 'ExprNodes.py', line 4981, in compile_time_value: > SimpleCallNode(symengine_wrapper.pyx:2464:54, > result_is_used = True, > use_managed_ref = True) > File 'ExprNodes.py', line 1783, in compile_time_value: > NameNode(symengine_wrapper.pyx:2464:54, > cf_maybe_null = True, > is_name = True, > name = '_size', > result_is_used = True, > use_managed_ref = True) > > Compiler crash traceback from this point on: > File > "/home/isuru/miniconda3/envs/test-cython/lib/python3.5/site-packages/Cython/Compiler/ExprNodes.py", > line 1781, in compile_time_value > return denv.lookup(self.name) > AttributeError: 'NoneType' object has no attribute 'lookup' > > > > _______________________________________________ > cython-devel mailing list > cython-devel at python.org > https://mail.python.org/mailman/listinfo/cython-devel > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From njs at vorpus.org Sun Apr 24 04:15:26 2016 From: njs at vorpus.org (Nathaniel Smith) Date: Sun, 24 Apr 2016 01:15:26 -0700 Subject: [Cython] bug report on cython-mode.el: freezes when using which-function-mode Message-ID: Hi all, Bug report here -- trying to edit some cython code in emacs just now, emacs was repeatedly freezing until I'd hit C-g repeatedly. Made things literally unusable -- I couldn't type characters into the buffer. 
M-x toggle-debug-on-quit gives the backtrace:

Debugger entered--Lisp error: (quit)
  syntax-ppss()
  python-nav-beginning-of-statement()
  cython-beginning-of-block()
  cython-current-defun()
  run-hook-with-args-until-success(cython-current-defun)
  which-function()
  which-func-update-1(#)
  which-func-update()
  apply(which-func-update nil)
  timer-event-handler([t 0 0 500000 t which-func-update nil idle 0])

Which strongly suggests that the problem has something to do with my
having which-function-mode enabled, and likely that something is wrong
with cython-current-defun. (which-function-mode is a minor mode built
into emacs.)

Toggling which-function-mode off seems tentatively to have fixed the
problem. So there's a workaround, but really cython-mode +
which-function-mode shouldn't cause freezes :-).

Possible contributing factor: this emacs is built from a git snapshot of
master ("GNU Emacs 25.1.50.1 (x86_64-pc-linux-gnu, GTK+ Version 3.18.9)
of 2016-04-22"), so it has the git versions of python-mode and
which-function-mode. (I'm just using the python.el that ships with
emacs, + elpy. But toggling elpy off didn't seem to affect the hangs.)

I don't know whether the same thing happens with released versions of
emacs.

-n

-- 
Nathaniel J. Smith -- https://vorpus.org

From jason.madden at nextthought.com Sun Apr 24 08:25:44 2016
From: jason.madden at nextthought.com (Jason Madden)
Date: Sun, 24 Apr 2016 07:25:44 -0500
Subject: [Cython] bug report on cython-mode.el: freezes when using which-function-mode
In-Reply-To: 
References: 
Message-ID: 

> On Apr 24, 2016, at 03:15, Nathaniel Smith wrote:
>
> I don't know whether the same thing happens with released versions of
> emacs.

I see the same behaviour with the 24.5 release of emacs (stock python.el
+ elpy). Turning off which-function-mode seems to solve it (thanks for
the tip, BTW).
From erik.m.bray at gmail.com Mon Apr 25 11:18:11 2016
From: erik.m.bray at gmail.com (Erik Bray)
Date: Mon, 25 Apr 2016 17:18:11 +0200
Subject: [Cython] Minor issue with running tests and -Wlong-long
Message-ID: 

Hello,

As some of you already know I've been doing some work on Cython on
Cygwin [I'm constantly mixing the two up in speech, but maybe in
writing I'll do better :)]. There are several issues with the tests on
Cygwin, and that's one thing I'll work on. But a major annoyance I've
encountered when running any tests is a huge number of warnings from
gcc such as:

embray at PC-pret-47 ~/src/cython
$ CFLAGS="-O0" ./runtests.py -vv --no-cpp addloop
Python 2.7.10 (default, Jun 1 2015, 18:05:38)
[GCC 4.9.2]
Running tests against Cython 0.24 f68b5bd0fa620d0dc26166bffe5fe42d94068720
Backends: c
runTest (__main__.CythonRunTestCase)
compiling (c) and running addloop ...
=== C/C++ compiler error output: ===
In file included from /usr/include/python2.7/Python.h:58:0,
                 from addloop.c:4:
/usr/include/python2.7/pyport.h:69:27: warning: ISO C90 does not support
'long long' [-Wlong-long]
 #define PY_LONG_LONG long long
 ^
/usr/include/python2.7/pyport.h:793:34: note: in definition of macro
'PyAPI_FUNC'
 # define PyAPI_FUNC(RTYPE) RTYPE
 ^
/usr/include/python2.7/intobject.h:46:21: note: in expansion of macro
'PY_LONG_LONG'
 PyAPI_FUNC(unsigned PY_LONG_LONG) PyInt_AsUnsignedLongLongMask(PyObject *);
 ^
In file included from /usr/include/python2.7/Python.h:58:0,
                 from addloop.c:4:
...

And so on. For now an easy workaround is to add -Wno-long-long to the
compiler flags. But I was curious why I was seeing this in Cygwin but
not on my Ubuntu system, and here's why: In runtests.py there is a check

https://github.com/cython/cython/blob/master/runtests.py#L829

    if self.language == 'c' and compiler == 'gcc':
        ext_compile_flags.extend(['-std=c89', '-pedantic'])

where in this case `compiler` is assigned `sysconfig.get_config_var('CC')`.
On my Linux system this expands to "x86_64-linux-gnu-gcc -pthread". Whereas on Cygwin (and probably many other systems) it expands simply to "gcc". I'm guessing that to do what it intended the above line should read "and 'gcc' in compiler". But this also raises the question: Why are the tests run with these flags? If Python was configured with HAVE_LONG_LONG, then these warnings will be inevitable. Thanks, Erik From erik.m.bray at gmail.com Tue Apr 26 08:49:55 2016 From: erik.m.bray at gmail.com (Erik Bray) Date: Tue, 26 Apr 2016 14:49:55 +0200 Subject: [Cython] Fwd: Question about how best require compiler options for C sources In-Reply-To: References: Message-ID: On Mon, Apr 11, 2016 at 7:49 PM, Ian Henriksen wrote: > On Mon, Apr 11, 2016 at 7:46 AM Erik Bray wrote: >> >> On Mon, Apr 11, 2016 at 2:51 PM, Nathaniel Smith wrote: >> > On Apr 11, 2016 04:18, "Erik Bray" wrote: >> >> >> >> On Fri, Apr 8, 2016 at 5:49 PM, Nathaniel Smith wrote: >> >> > Can you give a tiny concrete example? My questions are basic enough >> >> > that >> >> > I >> >> > feel like I'm missing something fundamental :-) >> >> >> >> Yes, I think you might be missing something, but I'm not sure exactly >> >> where. In the issue I provided a tiny example which you can see here: >> >> >> >> https://gist.github.com/embray/12a67edb82b213217e31f408007898e6 >> >> >> >> The C code generated by this example currently does not compile on >> >> Windows, because of how Cython uses DL_IMPORT incorrectly. Regardless >> >> of what it *does* do, Cython should not be using the DL_IMPORT macro >> >> at all actually since that no longer even exists in Python. >> > >> > Sure. >> > >> > For that example, the correct thing to do is to *not* export the >> > function. >> >> So as I wrote in my previous message, my example was incomplete. But >> indeed if the intent was *only* to share the declaration between TUs >> of the same library then I agree the *most* correct thing would be to >> not use dllexport. 
Unfortunately there isn't currently a way in Cython >> to make this distinction. >> >> > Backing up, to make sure we're on the same page: >> >> Yep, I agree with your analysis that follows, and it's helpful to have >> all the cases laid out in one place, so thanks for that! >> >> There are three levels of >> > symbol visibility in C: file-internal, shared library internal >> > (different .c >> > files that make up the library can see it, but users of the shared >> > library >> > can't see it), and shared library exported (everyone can see it; can >> > also >> > carry other consequences, e.g. on Linux then internal calls will become >> > noticeably slower, and it becomes possible for weird symbol >> > interposition >> > issues to occur). So the rule of thumb is to make everything as private >> > as >> > you can get away with. >> > >> > Making this more interesting: >> > - vanilla C only includes 2 ways to mark symbol visibility, which is not >> > enough to make a 3-way distinction. Hence the need for an extra >> > attribute >> > thingummy. >> > - everyone agrees that symbols marked 'static' should be file-internal, >> > but >> > different platforms disagree about what should happen if the extra >> > attribute >> > thingummy is missing. >> > >> > So on Windows, the convention is: >> > 'static' -> file internal >> > no marking -> shared library internal >> > 'dllexport' -> public >> > >> > And on Linux it's: >> > 'static' -> file internal >> > 'visibility (hidden)' -> shared library internal >> > no marking -> public >> > >> > It's generally agreed that Linux got this wrong and that you should >> > always >> > use '-fvisibility=hidden' to switch it to the windows style, but cython >> > doesn't control compiler options and thus should probably generate code >> > that >> > works correctly regardless of such compiler settings. 
Fortunately, Linux >> > does provide some markings to explicitly make things public: you can >> > mark a >> > symbol 'visibility (default)' (which means public), or you can use the >> > dllexport syntax, just like Windows, because gcc is helpful like that. >> > >> > OTOH, windows is annoying because of this dllimport thing that started >> > this >> > whole thread: on other systems just marking symbols as extern is enough >> > to >> > handle both shared-object-internal and shared-library-exported symbols, >> > and >> > the linker will sort it out. On Windows, you have to explicitly >> > distinguish >> > between these. (And annoyingly, if you accidentally leave out the >> > dllimport >> > making on functions then it will use some fallback hack that works but >> > silently degrades performance; on other symbols it just doesn't work, >> > and >> > ditto for if you use it when it isn't needed.) >> >> Yep. Agreed with all of the above. >> >> > So final conclusion: for non-static symbols, cython should first decide >> > whether they are supposed to be shared-library-internal or actually >> > exported >> > from the shared library. >> > >> > For shared-library-internal symbols: their definition should be marked >> > 'visibility(hidden)' on Linux, and unmarked on Windows. This is easy >> > using >> > some preprocessor gunk. (Or maybe simplest is: marked that everywhere >> > except >> > if using msvc, because I think everyone else will understand 'visibility >> > (hidden)' even if it's a no op.) Their declaration in the header file >> > should >> > just be 'extern' everywhere. >> > >> > For shared-library-exported symbols: I am dubious about whether cython >> > should even support these at all. But if it does, then the definitions >> > should be marked 'dllexport' (no macro trickery needed, because everyone >> > understands this syntax), and their declaration in the header file needs >> > some extra hack that is the subject of this thread. 
>> >> I understand your doubts here, but for the purpose of discussion let's >> just assume that it should be supported? :) >> >> And yes, the issue is that the header file needs some hacky >> context-dependent declarations. A possible alternative, which in fact >> is exactly the work-around I'm using right now, would be for Cython to >> generate two headers. In the case of my example they would be "foo.h" >> and "foo_internal.h". They would be almost exactly the same except >> that the former declares the function dllimport and the latter >> declares the function dllexport, and only the _internal.h should be >> used to share a declaration between TUs. I'm currently doing this by >> writing the "foo_internal.h" manually. >> >> > Now, back to your example: Here the caller and callee are both compiled >> > into >> > the same shared library, so you don't want dllexport/dllimport at all, >> > you >> > just want a shared-library-internal symbol, which as we see is much >> > easier. >> >> Well, again, in this case I do want dllexport/dllimport but I should >> have been more clear about that. But supposing I didn't want it, then >> this is true. >> >> > NumPy also ran into some problems with this in our experiments with >> > using >> > cython internally. Our temporary solution was to use the preprocessor to >> > monkeypatch DL_IMPORT into expanding to the appropriate >> > shared-library-internal thing :-). >> >> Ah, how did you manage that? Do you have to make sure to do it before >> Python.h is ever included? >> >> >> > My first question is why you even need this, since AFAIK there are no >> >> > cases >> >> > where it is correct to have a cython module dllexporting symbols that >> >> > appear >> >> > in header files. This is what cimport is for, right? >> >> >> >> I don't think this has anything to do with cimport. Could you explain >> >> what you mean?
>> > >> > We only need to solve the dllimport issue if we want to support >> > shared-library-exported symbols from cython extensions. This is only >> > useful >> > if you have different extensions that are directly linking to each other >> > using the platform linker. But this simply doesn't work in almost any >> > cases, >> > because platform linkers have all kinds of quirks that don't play well >> > with >> > python packages -- to start with, your runtime linker probably does not >> > follow Python's rules for which directories to search to find a shared >> > library... To solve these problems, the Cython devs invented the cimport >> > mechanism, which is basically a portable, python-centric way of doing >> > shared-library-exports while avoiding all the problems caused by using >> > the >> > platform linker. So my question was, what's your use case that's better >> > served by linking directly against an extension module and using the >> > platform's shared-library-export functionality, instead of cython's? >> >> Well, yes, but you can't use cimport from C code :) As for the use >> case I'll have to get back to you on that. I'm trying to help the >> Sage project fix some issues related to that and my understanding is >> that it is needed for C code to be able to link directly against code >> that's compiled into an extension module. I could be wrong about that >> in which case we're free to ignore that case (unless someone does come >> up with a use case). So yes, I had better double-check on that :) >> >> Still, I'm not necessarily talking about linking Cython modules >> together, which is why I don't think cimport really comes into it. >> >> >> > My second question is why you would want to do this via the command >> >> > line, >> >> > when compiling the dll means that you are compiling some >> >> > cython-generated >> >> > .c, which means that you can put the #define directly in the source >> >> > code, >> >> > no? >> >> >> >> Not necessarily. 
As you can see in my example the file fooutil.c is >> >> hand-written C code that was not generated by Cython, but which uses a >> >> function in the Cython-generated code. It includes "foo.h". In >> >> principle you're right--the hand-written C code could set the proper >> >> #defines but it would have to do so *before* including "foo.h". It's >> >> not very obvious that this would be needed. Most other code I've seen >> >> that addresses this issue--including cpython itself--does so by passing >> >> an appropriate define to the compiler via the command-line, and that >> >> seems the clearest to me. >> > >> > I see, right. I guess my suggestion would be that if a symbol really >> > does >> > need to be marked for shared-library-export *and simultaneously* used by >> > different files within the same shared library -- which is the only case >> > where this arises -- then possibly the simplest and most robust thing is >> > to >> > set up the header file so that external users just do #include "foo.h", >> > and >> > the internal users do >> > >> > #define FOO_INTERNAL >> > #include "foo.h" >> >> Okay, this is similar to my above suggestion of using separate >> headers. I for one prefer the separate headers, as I think >> it's easy to forget to set a #define like that--and to anyone not >> working on Windows it's not at all clear why that would be needed. In >> fact on Linux it will "just work" without the "#define FOO_INTERNAL" >> so without regular testing on Windows it will be too easy to forget. >> >> I'm not sure if having a separate _internal.h header is any better. >> But it *might* be--in particular if it's generated by Cython then it >> would force the developer to ask what the difference is between >> "foo.h" and "foo_internal.h". And they can both contain comments >> explaining when to use them.
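Concretely, the pair of generated headers could look something like the following (a sketch using the hypothetical foo.h / foo_internal.h names from the running example; Cython does not generate these today):

```c
/* ---- foo.h: included by external users that link against the DLL ---- */
#if defined(_WIN32) && !defined(__GNUC__)
  #define FOO_API __declspec(dllimport)
#else
  #define FOO_API
#endif
FOO_API int foo_answer(void);

/* ---- foo_internal.h: included only by other .c files compiled into the
 * same DLL; it would be identical except that on Windows the declaration
 * carries dllexport instead:
 *     __declspec(dllexport) int foo_answer(void);
 * ---- */

/* Definition, as it would appear in the Cython-generated foo.c: */
int foo_answer(void) { return 42; }
```

Both headers can then carry a comment stating which translation units should include which, making the Windows-only distinction visible to developers who never test there.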
>> >> > I think the obvious thing is that cython should provide a natural way to >> > mark a function as being shared-library-internal; that covers 99% of >> > real >> > use cases *and* just works without any further macros needed. Probably >> > the >> > existing "public" annotation should be changed to mean this. >> >> Probably, yeah. Ignoring the DL_IMPORT stuff the other effect of >> marking a symbol "public" in Cython is to add the appropriate >> __PYX_EXTERN_C. And if that were *all* it did then its behavior would >> be consistent between platforms :) >> >> > (Obviously it >> > wasn't quite fully thought through in the first place and has few if any >> > users, since it got this very wrong without anyone noticing. So fixing >> > it >> > seems viable to me.) >> >> +1 >> >> > And then *maybe* there should also be a way to make a symbol >> > shared-library-exported, if that really is useful, but as a non-default >> > experts-only kind of thing, and as such it would be OK to require these >> > rare >> > expert users to be a bit more careful about how they #include the >> > resulting >> > header within their own project. >> >> Okay. My belief is that there is a case for this, but I should >> substantiate it better. Would you be amenable to the generation of a >> "_internal.h"? The more I think about it the more I'm >> convinced this would be the simplest way to handle this, and would >> simplify matters by not requiring Cython to impose any compiler flags >> (making my original question moot). >> >> Agreed that it could be non-default. >> >> Thanks again, >> >> Erik >> > > To answer the original question about define macros, it appears that the > canonical > way to pass preprocessor defines through distutils is to use the > define_macros > keyword when constructing your Extension class.
You should also be able to > do > this within a Cython source file by including a directive like: > > # distutils: define_macros = MY_DEFINE, MY_DEFINE_2=2 > > Unfortunately, it looks like there's a bug there that's making it so that > these > macros are undef'ed rather than being defined, so, for now, just pass the > appropriate flags to your Extension object. > > That aside, I agree with Nathaniel that exporting public declarations as a > part of the > shared object interface was a design mistake. However, using an > api > declaration lets you get equivalent results without exposing anything as a > part of > the shared object API. Here's how this all works: > > public declarations: export things to C/C++ through the shared object > interface. > Provide a header that exports this interface. > api declarations: export things to C/C++ through capsule objects. Provide a > header > for the Python module that exports that interface. > cimports: Use capsule objects and parsing of pxd files to share things like > external > declarations, header includes, inline Cython functions, and Cython functions > exported by modules between Cython modules. > > The public and api use cases are essentially the same most of the time, but > api > declarations use capsules rather than the OS's linker. > > There are still some trade-offs between public and api functions. > Technically, the > api functions require that the initialization routine exported in the api > header be > called for each translation unit that uses them. The public api just > requires that the > module already be initialized. In cases where no Python functionality is > used in a > public function, you may be able to get away with using the function without > initializing the module, though I really wouldn't recommend that. > > There are some more subtle issues here though.
The reason api functions need > to > be initialized on a per-translation unit basis is that things exported as > api > declarations are exported as translation-unit-local (static) function > pointers. They > aren't shared by the different translation units within a module built from > multiple > source files. I think that's a mistake. It'd be ideal if we could have api > interfaces (or > something like them) provide things with shared object local visibility > rather than > translation unit local visibility. This would require that the API headers > have more > carefully structured ifdef directives so that a macro could be set in a > given > translation unit to designate when to emit the actual declarations for the > needed > pointers rather than just forward declaring them. It would also require that > the main > generated c/cpp file define the pointers it uses as shared-object-local > rather > than static. > > In dynd-python we currently solve this problem by defining shared object > local > wrappers for the api exported function pointers and then using those > instead, but > I'm not a huge fan of that approach. It works well, but results in another > unnecessary layer of indirection through the source files to connect the C++ > code > back to its Python bindings. > > With regards to the dllexporting/dllimporting of things: given that public > declarations > are already designed to export things through the shared object interface, > we may > as well fix the current setup to export the right things. It's a bad design > that > probably ought to be deprecated or at least documented better so people know > not > to use it unless their case actually requires sidestepping best practices. > On the > other hand, it's also a supported interface, so there's value in "fixing" > it. > > I think the best way to do that is the following: > - mark public symbols as dllimport unless a given (module specific) > preprocessor > define is set.
> - people using the public header outside of the module exporting the symbols > should not have to set the define at all. > - people using the public header to compile other source files that are > linked in to > the same Python module should set the preprocessor flag for that module. > > On top of that, at some point we still need to fix our api and public > headers so that > they still work when included into the translation unit for the main > Cython-generated > c/cpp file. This use-case should just forward-declare everything since the > needed > symbols are all defined later on in the Cython module. Since static > variables > cannot be forward declared in C, this will require that api declarations use > shared > object local symbols or that the main generated c/cpp file use some ifdef > guards > when it initializes the various pointers in question. > > As far as making an additional header goes, I personally prefer the extra > preprocessor define. On the other hand, if people think an additional header > is > easier to use, then why not make it do something like > > #define USE_DLLEXPORT_NOT_DLLIMPORT_FOR_PARTICULAR_MODULE > #include > > I think that'd cover all the use cases better. > > Anyway, sorry for the length of my remarks on this. There are several issues > here > that have been bothering me for quite some time. Hi all, Reviving this discussion again, since I'm back to working on it and in particular trying to improve Cython's test acceptance on Cygwin. There is another problem related to this that is not addressed at all by PR #360 [1] and is perhaps even trickier to handle in any sensible way. There is an example of this in the test suite under Cython.Debugger.Tests.TestLibCython. There is a Cython module in there called simply "codefile" (without .pyx). It contains the statement:

    cdef extern:
        void some_c_function()

The some_c_function() is defined in a separate file cfuncs.c.
Cythoning "codefile" results in a declaration for some_c_function(), since it is not provided by any header (?):

    __PYX_EXTERN_C DL_IMPORT(void) some_c_function(void); /*proto*/

However, this blows up at link time when codefile.o and cfuncs.o are linked. A workaround is to write a header file containing a declaration for some_c_function, and change the cdef extern statement to:

    cdef extern from "codefile.h":

This indicates that some_c_function is found in an existing header file, and that a declaration should not be generated for it, as expected. So perhaps it should simply be documented that "cdef extern" without a header specified should never be used for objects defined in another object file that is being statically linked to. Alternatively, the only other workable possibility is to add to Cython a way to explicitly indicate that a function is defined in another .c file that we are statically linking with without actually *including* that .c file, so that a declaration is generated with the correct DL_ macro. Thoughts? Erik [1] https://github.com/cython/cython/pull/360 From erik.m.bray at gmail.com Tue Apr 26 10:58:44 2016 From: erik.m.bray at gmail.com (Erik Bray) Date: Tue, 26 Apr 2016 16:58:44 +0200 Subject: [Cython] Cygwin: Handling missing C99 long double functions Message-ID: Hi again, Sorry if I'm spamming the list too much, but I've encountered another pretty serious and unfortunate issue with Cython on Cygwin. The problem now is that many of libm's long double functions like sqrtl, tanl, etc. are missing on Cygwin (missing from newlib to be specific). I think this is a previously known issue, but nothing's ever really been done about it. Well, to be clear, sometimes they're present, but only when sizeof(double) == sizeof(long double). However on 64-bit Cygwin sizeof(long double) == 16, so the functions are simply not defined.
This seems to be due to lack of interest / effort: https://www.cygwin.com/ml/cygwin/2011-04/msg00231.html That post is 5 years old, but I can't find any evidence that this has changed. There are quite a few tests in Cython's test suite that test long double support. I guess what I'm asking is what would be the best thing to do about it. I could just skip those tests on Cygwin, though I'm not sure the best way to go about skipping an entire test for a given platform. More generally though, outside the context of testing, this means Cython will sometimes generate code that cannot be compiled on this platform, and a question arises as to what to do about that. I have some thoughts, but am not sure if it's worth discussing any further or not. Thanks, Erik From dimpase+github at gmail.com Tue Apr 26 11:16:43 2016 From: dimpase+github at gmail.com (Dima Pasechnik) Date: Tue, 26 Apr 2016 16:16:43 +0100 Subject: [Cython] Cygwin: Handling missing C99 long double functions In-Reply-To: References: Message-ID: Hi, certainly we did something with Sage on cygwin to work around these... Just in case, Dima On Tue, Apr 26, 2016 at 3:58 PM, Erik Bray wrote: > Hi again, > > Sorry if I'm spamming the list too much, but I've encountered another > pretty serious and unfortunate issue with Cython on Cygwin. > > The problem now is that many of libm long double functions like sqrtl, > tanl, etc. are missing on Cygwin (missing from newlib to be specific). > I think this is a previously known issue, but nothing's ever really > been done about it. Well, to be clear, sometimes they're present, but > only when sizeof(double) == sizeof(long double). However on 64-bit > Cygwin sizeof(long double) == 16, so the functions are simply not > defined.
> > There are quite a few tests in Cygwin's test suite that test long > double support. I guess what I'm asking is what would be the best > thing to do about it. > > I could just skip those tests on Cygwin, though I'm not sure the best > way to go about skipping an entire test for a given platform. > > More generally though, outside the context of testing, this means > Cygwin will sometimes generate code that cannot be compiled on this > platform, and a question arises as to what to do about that. I have > some thoughts, but am not sure if it's worth discussing any further or > not. > > Thanks, > Erik > _______________________________________________ > cython-devel mailing list > cython-devel at python.org > https://mail.python.org/mailman/listinfo/cython-devel From erik.m.bray at gmail.com Tue Apr 26 11:36:20 2016 From: erik.m.bray at gmail.com (Erik Bray) Date: Tue, 26 Apr 2016 17:36:20 +0200 Subject: [Cython] Cygwin: Handling missing C99 long double functions In-Reply-To: References: Message-ID: On Tue, Apr 26, 2016 at 5:16 PM, Dima Pasechnik wrote: > Hi, > certainly we did something with Sage on cygwin to work around these... > Just in case, From what I can tell there are several places where Sage has hacked around this issue in different packages, but it's not doing anything specifically with it for Cython. Sage uses the Cephes math lib to support these functions on FreeBSD--and apparently used to use it on Cygwin too, but disabled that for some reason. Regardless, Cython should ultimately do something sensible in these cases. My general thinking is that in cases where Cython generates code containing C math functions, it ought to support a fallback. This will require some feature checks so that Cython can generate wrappers, when necessary, around the double versions of those functions (as Numpy currently does).
Erik > On Tue, Apr 26, 2016 at 3:58 PM, Erik Bray wrote: >> Hi again, >> >> Sorry if I'm spamming the list too much, but I've encountered another >> pretty serious and unfortunate issue with Cython on Cygwin. >> >> The problem now is that many of libm long double functions like sqrtl, >> tanl, etc. are missing on Cygwin (missing from newlib to be specific). >> I think this is a previously known issue, but nothing's ever really >> been done about it. Well, to be clear, sometimes they're present, but >> only when sizeof(double) == sizeof(long double). However on 64-bit >> Cygwin sizeof(long double) == 16, so the functions are simply not >> defined. >> >> This seems to be due to lack of interest / effort: >> https://www.cygwin.com/ml/cygwin/2011-04/msg00231.html That post is 5 >> years old, but I can't find any evidence that this has changed. >> >> There are quite a few tests in Cygwin's test suite that test long >> double support. I guess what I'm asking is what would be the best >> thing to do about it. >> >> I could just skip those tests on Cygwin, though I'm not sure the best >> way to go about skipping an entire test for a given platform. >> >> More generally though, outside the context of testing, this means >> Cygwin will sometimes generate code that cannot be compiled on this >> platform, and a question arises as to what to do about that. I have >> some thoughts, but am not sure if it's worth discussing any further or >> not. 
>> >> Thanks, >> Erik >> _______________________________________________ >> cython-devel mailing list >> cython-devel at python.org >> https://mail.python.org/mailman/listinfo/cython-devel > _______________________________________________ > cython-devel mailing list > cython-devel at python.org > https://mail.python.org/mailman/listinfo/cython-devel From jdemeyer at cage.ugent.be Tue Apr 26 14:02:42 2016 From: jdemeyer at cage.ugent.be (Jeroen Demeyer) Date: Tue, 26 Apr 2016 20:02:42 +0200 Subject: [Cython] Cygwin: Handling missing C99 long double functions In-Reply-To: References: Message-ID: <571FAD42.8020106@cage.ugent.be> On 2016-04-26 16:58, Erik Bray wrote: > The problem now is that many of libm long double functions like sqrtl, > tanl, etc. are missing on Cygwin Just to understand the issue better, can you elaborate how this impacts Cython? From robertwb at gmail.com Tue Apr 26 16:55:53 2016 From: robertwb at gmail.com (Robert Bradshaw) Date: Tue, 26 Apr 2016 13:55:53 -0700 Subject: [Cython] Cygwin: Handling missing C99 long double functions In-Reply-To: References: Message-ID: On Tue, Apr 26, 2016 at 8:36 AM, Erik Bray wrote: > On Tue, Apr 26, 2016 at 5:16 PM, Dima Pasechnik > wrote: > > Hi, > > certainly we did something with Sage on cygwin to work around these... > > Just in case, > > From what I can tell there are several places where Sage has hacked > around this issue in different packages, but it's not doing anything > specifically with it for Cython. Sage uses the Cephes math lib to > support these functions on FreeBSD--and apparently used to use it on > Cygwin too, but disabled that for some reason. > > Regardless, Cython should ultimately do something sensible in these cases. > > My general thinking is that in cases where Cython generates code > containing C math functions, it ought to support a fallback. 
This > will require some feature checks so that Cython can generate wrappers, > when necessary, around the double versions of those functions (as > Numpy currently does). > Let's make things concrete. You're complaining that something like

    cdef extern from "math.h":
        long double sqrtl(long double)

    def foo(long double x):
        return sqrtl(x)

Doesn't work on Cygwin? The same is true for *any* C function that you use that's not totally portable (this is the bane of trying to use C). I don't think Cython should be detecting this and substituting a (less accurate) sqrt for sqrtl in this case. If you want to do this, write your own headers that (conditionally) define these things however you want. Or, are you complaining that Cython's test suite doesn't pass on some Cygwin because there are tests of features not available on Cygwin? (Your original email isn't clear.) If so, the test framework can be set up to exclude these tests on that platform. > > Erik > > > On Tue, Apr 26, 2016 at 3:58 PM, Erik Bray > wrote: > >> Hi again, > >> > >> Sorry if I'm spamming the list too much, but I've encountered another > >> pretty serious and unfortunate issue with Cython on Cygwin. > >> > >> The problem now is that many of libm long double functions like sqrtl, > >> tanl, etc. are missing on Cygwin (missing from newlib to be specific). > >> I think this is a previously known issue, but nothing's ever really > >> been done about it. Well, to be clear, sometimes they're present, but > >> only when sizeof(double) == sizeof(long double). However on 64-bit > >> Cygwin sizeof(long double) == 16, so the functions are simply not > >> defined. > >> > >> This seems to be due to lack of interest / effort: > >> https://www.cygwin.com/ml/cygwin/2011-04/msg00231.html That post is 5 > >> years old, but I can't find any evidence that this has changed. > >> > >> There are quite a few tests in Cygwin's test suite that test long > >> double support.
I guess what I'm asking is what would be the best > >> thing to do about it. > >> > >> I could just skip those tests on Cygwin, though I'm not sure the best > >> way to go about skipping an entire test for a given platform. > >> > >> More generally though, outside the context of testing, this means > >> Cygwin will sometimes generate code that cannot be compiled on this > >> platform, and a question arises as to what to do about that. I have > >> some thoughts, but am not sure if it's worth discussing any further or > >> not. > >> > >> Thanks, > >> Erik > >> _______________________________________________ > >> cython-devel mailing list > >> cython-devel at python.org > >> https://mail.python.org/mailman/listinfo/cython-devel > > _______________________________________________ > > cython-devel mailing list > > cython-devel at python.org > > https://mail.python.org/mailman/listinfo/cython-devel > _______________________________________________ > cython-devel mailing list > cython-devel at python.org > https://mail.python.org/mailman/listinfo/cython-devel > -------------- next part -------------- An HTML attachment was scrubbed... URL: From insertinterestingnamehere at gmail.com Tue Apr 26 17:04:28 2016 From: insertinterestingnamehere at gmail.com (Ian Henriksen) Date: Tue, 26 Apr 2016 21:04:28 +0000 Subject: [Cython] Cygwin: Handling missing C99 long double functions In-Reply-To: References: Message-ID: On Tue, Apr 26, 2016 at 2:56 PM Robert Bradshaw wrote: > On Tue, Apr 26, 2016 at 8:36 AM, Erik Bray wrote: > >> On Tue, Apr 26, 2016 at 5:16 PM, Dima Pasechnik >> wrote: >> > Hi, >> > certainly we did something with Sage on cygwin to work around these... >> > Just in case, >> >> From what I can tell there are several places where Sage has hacked >> around this issue in different packages, but it's not doing anything >> specifically with it for Cython. 
Sage uses the Cephes math lib to >> support these functions on FreeBSD--and apparently used to use it on >> Cygwin too, but disabled that for some reason. >> >> Regardless, Cython should ultimately do something sensible in these cases. >> >> My general thinking is that in cases where Cython generates code >> containing C math functions, it ought to support a fallback. This >> will require some feature checks so that Cython can generate wrappers, >> when necessary, around the double versions of those functions (as >> Numpy currently does). >> > > Let's make things concrete. You're complaining that something like > > cdef extern from "math.h": > long double sqrtl(long double) > > def foo(long double x): > return sqrtl(x) > > Doesn't work on Cygwin? > > The same is true for *any* C function that you use that's not totally > portable (this is the bane of trying to use C). I don't think Cython should > be detecting this and substituting a (less accurate) sqrt for sqrtl in this > case. If you want do do this, write your own headers that (conditionally) > define these things however you want. > > Or, are you complaining that Cython's test suite doesn't pass on some > Cygwin because there are tests of features not available on Cygwin? (Your > original email isn't clear.) If so, the test framework can be set up to > exclude these tests on that platform. > > Right, this sounds like a good place to exclude some tests. long double's behavior is pretty platform dependent. I wouldn't expect to be able to write platform independent code that uses it. There's not much Cython can do to change the situation either. Best, -Ian -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From erik.m.bray at gmail.com Wed Apr 27 06:07:51 2016 From: erik.m.bray at gmail.com (Erik Bray) Date: Wed, 27 Apr 2016 12:07:51 +0200 Subject: [Cython] Cygwin: Handling missing C99 long double functions In-Reply-To: References: Message-ID: On Tue, Apr 26, 2016 at 10:55 PM, Robert Bradshaw wrote: > On Tue, Apr 26, 2016 at 8:36 AM, Erik Bray wrote: >> >> On Tue, Apr 26, 2016 at 5:16 PM, Dima Pasechnik >> wrote: >> > Hi, >> > certainly we did something with Sage on cygwin to work around these... >> > Just in case, >> >> From what I can tell there are several places where Sage has hacked >> around this issue in different packages, but it's not doing anything >> specifically with it for Cython. Sage uses the Cephes math lib to >> support these functions on FreeBSD--and apparently used to use it on >> Cygwin too, but disabled that for some reason. >> >> Regardless, Cython should ultimately do something sensible in these cases. >> >> My general thinking is that in cases where Cython generates code >> containing C math functions, it ought to support a fallback. This >> will require some feature checks so that Cython can generate wrappers, >> when necessary, around the double versions of those functions (as >> Numpy currently does). > > > Let's make things concrete. You're complaining that something like > > cdef extern from "math.h": > long double sqrtl(long double) > > def foo(long double x): > return sqrtl(x) > > Doesn't work on Cygwin? > > The same is true for *any* C function that you use that's not totally > portable (this is the bane of trying to use C). I don't think Cython should > be detecting this and substituting a (less accurate) sqrt for sqrtl in this > case. If you want do do this, write your own headers that (conditionally) > define these things however you want. No, not at all. That would be silly. Let me be clearer... 
> Or, are you complaining that Cython's test suite doesn't pass on some Cygwin > because there are tests of features not available on Cygwin? (Your original > email isn't clear.) If so, the test framework can be set up to exclude these > tests on that platform. There are really two concerns. This is one of them, yes. The first question is what would be the best way to exclude certain tests on certain platforms? There's no clear documentation on that (which is fine, but it's why I'm asking :) The second concern, and more serious, is that there *are* cases where Cython generates code that uses long double functions where, for example, long double is passed as an argument. For example the following code

    def truncate_long_double(long double x):
        cdef float r = int(x)
        return r

compiles to something that includes:

    /* "truncl.pyx":2
     * def truncate_long_double(long double x):
     *     cdef float r = int(x)             # <<<<<<<<<<<<<<
     *     return r
     */
      __pyx_t_1 = truncl(__pyx_v_x);
      __pyx_v_r = __pyx_t_1;

    /* "truncl.pyx":3
     * def truncate_long_double(long double x):
     *     cdef float r = int(x)
     *     return r             # <<<<<<<<<<<<<<
     */
      __Pyx_XDECREF(__pyx_r);
      __pyx_t_2 = PyFloat_FromDouble(__pyx_v_r); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 3, __pyx_L1_error)

It's not clear what the best way would be to *not* use this code on platforms where truncl is missing (and there are a handful of other examples besides truncl). This code is generated by an optimization that was added here: http://trac.cython.org/ticket/400 In this case replacing truncl with trunc shouldn't hurt. It's not uncommon (I think) to ship pre-cythonized C sources in a package's source distribution so that it can be installed (especially by pip) without requiring the user to have Cython. This code will fail to compile on platforms like Cygwin. But it's not clear how to handle this at Cythonization time either since it doesn't know in advance what platform it will be targeting.
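For the truncl case, one possible shape for such a guard in the generated sources is sketched below. PYX_HAVE_TRUNCL is a hypothetical feature-test macro that some build-time check would have to define; Cython defines nothing like it today, and the fallback shown is only valid while the value fits in long long (which covers the int(x) conversion being compiled here):

```c
/* Fallback sketch: call truncl where the build reports it exists, otherwise
 * truncate toward zero via an integer cast, avoiding libm entirely. */
#ifdef PYX_HAVE_TRUNCL
  #include <math.h>
  #define pyx_truncl truncl
#else
  static long double pyx_truncl(long double x)
  {
      return (long double)(long long)x;   /* truncation toward zero */
  }
#endif

/* Stand-in for the generated call site shown above. */
long double truncate_long_double_demo(long double x)
{
    return pyx_truncl(x);
}
```

With PYX_HAVE_TRUNCL unset the code degrades gracefully on platforms like 64-bit Cygwin, at the cost of a narrower valid input range than a real truncl.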
I'm suggesting that maybe rather than using these functions directly Cython should generate some fallbacks for where they're not available, or at least have the option to. Erik P.S. additional long double functions that can be generated by Cython include powl and fmodl, and long double complex functions like conjl, cabsl, and cpowl. So it's really only a limited number at the moment. From erik.m.bray at gmail.com Wed Apr 27 06:12:53 2016 From: erik.m.bray at gmail.com (Erik Bray) Date: Wed, 27 Apr 2016 12:12:53 +0200 Subject: [Cython] Cygwin: Handling missing C99 long double functions In-Reply-To: References: Message-ID: On Wed, Apr 27, 2016 at 12:07 PM, Erik Bray wrote: > On Tue, Apr 26, 2016 at 10:55 PM, Robert Bradshaw wrote: >> On Tue, Apr 26, 2016 at 8:36 AM, Erik Bray wrote: >>> >>> On Tue, Apr 26, 2016 at 5:16 PM, Dima Pasechnik >>> wrote: >>> > Hi, >>> > certainly we did something with Sage on cygwin to work around these... >>> > Just in case, >>> >>> From what I can tell there are several places where Sage has hacked >>> around this issue in different packages, but it's not doing anything >>> specifically with it for Cython. Sage uses the Cephes math lib to >>> support these functions on FreeBSD--and apparently used to use it on >>> Cygwin too, but disabled that for some reason. >>> >>> Regardless, Cython should ultimately do something sensible in these cases. >>> >>> My general thinking is that in cases where Cython generates code >>> containing C math functions, it ought to support a fallback. This >>> will require some feature checks so that Cython can generate wrappers, >>> when necessary, around the double versions of those functions (as >>> Numpy currently does). >> >> >> Let's make things concrete. You're complaining that something like >> >> cdef extern from "math.h": >> long double sqrtl(long double) >> >> def foo(long double x): >> return sqrtl(x) >> >> Doesn't work on Cygwin? 
>> The same is true for *any* C function that you use that's not totally
>> portable (this is the bane of trying to use C). I don't think Cython should
>> be detecting this and substituting a (less accurate) sqrt for sqrtl in this
>> case. If you want to do this, write your own headers that (conditionally)
>> define these things however you want.
>
> No, not at all. That would be silly. Let me be clearer...
>
>> Or, are you complaining that Cython's test suite doesn't pass on some Cygwin
>> because there are tests of features not available on Cygwin? (Your original
>> email isn't clear.) If so, the test framework can be set up to exclude these
>> tests on that platform.
>
> There are really two concerns. This is one of them, yes. The first
> question is what would be the best way to exclude certain tests on
> certain platforms? There's no clear documentation on that (which is
> fine, but it's why I'm asking :)
>
> The second concern, and more serious, is that there *are* cases where
> Cython generates code that uses long double functions where, for
> example, long double is passed as an argument. For example the
> following code
>
> def truncate_long_double(long double x):
>     cdef float r = int(x)
>     return r
>
> compiles to something that includes:
>
> /* "truncl.pyx":2
>  * def truncate_long_double(long double x):
>  *     cdef float r = int(x)             # <<<<<<<<<<<<<<
>  *     return r
>  */
>   __pyx_t_1 = truncl(__pyx_v_x);
>   __pyx_v_r = __pyx_t_1;
>
> /* "truncl.pyx":3
>  * def truncate_long_double(long double x):
>  *     cdef float r = int(x)
>  *     return r             # <<<<<<<<<<<<<<
>  */
>   __Pyx_XDECREF(__pyx_r);
>   __pyx_t_2 = PyFloat_FromDouble(__pyx_v_r); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 3, __pyx_L1_error)
>
> It's not clear what the best way would be to *not* use this code
> on platforms where truncl is missing (and there are a handful of other
> examples besides truncl).
> This code is generated by an optimization
> that was added here: http://trac.cython.org/ticket/400  In this case
> replacing truncl with trunc shouldn't hurt.
>
> It's not uncommon (I think) to ship pre-cythonized C sources in a
> package's source distribution so that it can be installed (especially
> by pip) without requiring the user to have Cython. This code will
> fail to compile on platforms like Cygwin.
>
> But it's not clear how to handle this at Cythonization time either
> since it doesn't know in advance what platform it will be targeting.
> I'm suggesting that maybe rather than using these functions directly
> Cython should generate some fallbacks for where they're not available,
> or at least have the option to.

To be clear, when I talk about "generate some fallbacks" I don't think
it's obvious that there's a great way to do that, but that's why I
bring it up. I'll keep thinking about it in the meantime.

Thanks,
Erik

From robertwb at math.washington.edu  Thu Apr 28 03:29:52 2016
From: robertwb at math.washington.edu (Robert Bradshaw)
Date: Thu, 28 Apr 2016 00:29:52 -0700
Subject: [Cython] Cygwin: Handling missing C99 long double functions
In-Reply-To:
References:
Message-ID:

On Wed, Apr 27, 2016 at 3:07 AM, Erik Bray wrote:
> On Tue, Apr 26, 2016 at 10:55 PM, Robert Bradshaw wrote:
> > On Tue, Apr 26, 2016 at 8:36 AM, Erik Bray wrote:
> >>
> >> On Tue, Apr 26, 2016 at 5:16 PM, Dima Pasechnik wrote:
> >> > Hi,
> >> > certainly we did something with Sage on cygwin to work around these...
> >> > Just in case,
> >>
> >> From what I can tell there are several places where Sage has hacked
> >> around this issue in different packages, but it's not doing anything
> >> specifically with it for Cython. Sage uses the Cephes math lib to
> >> support these functions on FreeBSD--and apparently used to use it on
> >> Cygwin too, but disabled that for some reason.
> >>
> >> Regardless, Cython should ultimately do something sensible in these cases.
> >> My general thinking is that in cases where Cython generates code
> >> containing C math functions, it ought to support a fallback. This
> >> will require some feature checks so that Cython can generate wrappers,
> >> when necessary, around the double versions of those functions (as
> >> Numpy currently does).
> >
> > Let's make things concrete. You're complaining that something like
> >
> > cdef extern from "math.h":
> >     long double sqrtl(long double)
> >
> > def foo(long double x):
> >     return sqrtl(x)
> >
> > Doesn't work on Cygwin?
> >
> > The same is true for *any* C function that you use that's not totally
> > portable (this is the bane of trying to use C). I don't think Cython should
> > be detecting this and substituting a (less accurate) sqrt for sqrtl in this
> > case. If you want to do this, write your own headers that (conditionally)
> > define these things however you want.
>
> No, not at all. That would be silly. Let me be clearer...
>
> > Or, are you complaining that Cython's test suite doesn't pass on some Cygwin
> > because there are tests of features not available on Cygwin? (Your original
> > email isn't clear.) If so, the test framework can be set up to exclude these
> > tests on that platform.
>
> There are really two concerns. This is one of them, yes. The first
> question is what would be the best way to exclude certain tests on
> certain platforms? There's no clear documentation on that (which is
> fine, but it's why I'm asking :)

Probably add a tag and then modify runtests.py to detect Cygwin and
automatically add this to the exclusion list.

> The second concern, and more serious, is that there *are* cases where
> Cython generates code that uses long double functions where, for
> example, long double is passed as an argument.
> For example the
> following code
>
> def truncate_long_double(long double x):
>     cdef float r = int(x)
>     return r
>
> compiles to something that includes:
>
> /* "truncl.pyx":2
>  * def truncate_long_double(long double x):
>  *     cdef float r = int(x)             # <<<<<<<<<<<<<<
>  *     return r
>  */
>   __pyx_t_1 = truncl(__pyx_v_x);
>   __pyx_v_r = __pyx_t_1;
>
> /* "truncl.pyx":3
>  * def truncate_long_double(long double x):
>  *     cdef float r = int(x)
>  *     return r             # <<<<<<<<<<<<<<
>  */
>   __Pyx_XDECREF(__pyx_r);
>   __pyx_t_2 = PyFloat_FromDouble(__pyx_v_r); if (unlikely(!__pyx_t_2)) __PYX_ERR(0, 3, __pyx_L1_error)
>
> It's not clear what the best way would be to *not* use this code
> on platforms where truncl is missing (and there are a handful of other
> examples besides truncl). This code is generated by an optimization
> that was added here: http://trac.cython.org/ticket/400  In this case
> replacing truncl with trunc shouldn't hurt.

Here, yes, but in general truncl(x) != trunc(x) for, say, 2**54 + 1.

> It's not uncommon (I think) to ship pre-cythonized C sources in a
> package's source distribution so that it can be installed (especially
> by pip) without requiring the user to have Cython. This code will
> fail to compile on platforms like Cygwin.
>
> But it's not clear how to handle this at Cythonization time either
> since it doesn't know in advance what platform it will be targeting.
> I'm suggesting that maybe rather than using these functions directly
> Cython should generate some fallbacks for where they're not available,
> or at least have the option to.

The safest is to never use the long double type in this case--we
assume that if long double is present (used), so are the corresponding
functions in math.h (which is wrong for this platform, but if we need
to use powl I don't know what else to do).

Alternatively, ship your own (conditionally defined) fallbacks.
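[Editorial note: the tag-based test exclusion suggested in this thread could be sketched as follows. The tag name "longdouble" and the helper functions here are illustrative only, not the actual names used by Cython's runtests.py.]

```python
# Sketch of platform-based test exclusion via tags, as suggested above:
# tag the long-double tests, then have the runner skip that tag on Cygwin.

def excluded_tags(platform):
    """Tags to skip for a given sys.platform value."""
    if platform == "cygwin":
        # Cygwin's math.h lacks several C99 long double functions
        # (truncl, powl, fmodl, conjl, cabsl, cpowl, ...).
        return {"longdouble"}
    return set()

def should_run(test_tags, platform):
    """True unless the test carries a tag excluded on this platform."""
    return not (set(test_tags) & excluded_tags(platform))

print(should_run(["longdouble"], "cygwin"))  # False: skipped on Cygwin
print(should_run(["longdouble"], "linux"))   # True: runs elsewhere
```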