[Cython] patch for #655

Felix Salfelder felix at salfelder.org
Thu Jun 27 21:18:32 CEST 2013


Hi Robert.

On Thu, Jun 27, 2013 at 11:05:48AM -0700, Robert Bradshaw wrote:
> And you're planning on calling cython manually, cutting distutils out
> of the loop completely?

If someone tells me how to fix distutils (better: does it), I might
change my mind. Also, I need VPATH... just something that works.

> > sage currently uses hardwired paths for all and everything. in
> > particular for header locations. it works right now, but the plan is to
> > support packages installed to the host system.
> 
> I don't see how that would change anything.

Well, what would $SAGE_LOCAL/include/something.h be then? And how do you
tell, without reimplementing gcc's -M functionality?

> > That's exactly what i want to do. use cython to translate .pyx->.c[pp]
> > and nothing else. the existing tool (make) needs to know when cython
> > needs to be called, so it has to know the dependency chain.
> 
> It also needs to know how cython needs to be called, and then how gcc
> needs to be called (or, would you invoke setup.py when any .pyx file
> changes, in which case you don't need a more granular rules).

Autotools takes care of that; for C/C++ this has worked automatically
for ages. Cython rules currently need to be added manually (until
somebody tweaks autotools a bit). setup.py is not needed.

> In general, I'm +1 on providing a mechanism for exporting dependencies
> for tools to do with whatever they like. I have a couple of issues
> with the current approach:
> 
> (1) Doing this on a file-by-file basis is quadratic time (which for
> something like Sage takes unbearably long as you have to actually read
> and parse the entire file to understand its dependencies, and then
> recursively merge them up to the leaves). This could be mitigated (the
> parsing at least) by writing dep files and re-using them, but it's
> still going to be sub-optimal.

I do not understand. The -MF approach writes out dependencies *during*
compilation and does no extra parsing. It can't get more efficient than
that (can it? how?).

Maybe the makefiles are the dep files you are referring to. A second
make run will just read the dependency output and compare timestamps.

> The exact dependencies may also depend
> on the options passed into cythonize (e.g. the specific include
> directories, some dynamically computed like numpy_get_includes()).

Do you want to change include paths (reconfigure the whole thing)
between two runs? In general this won't work without "make clean"... If
it's options within a config.h file, it may of course trigger
recompilation.

(Also, you can add any sort of dependency, if you want that.)

> (2) I don't think we need to co-opt gcc's flags for this.

See e.g. /usr/share/automake-1.11/depcomp to get an idea of how many
compilers support the -M family. There is at least "hp", "aix", "icc",
"tru64", "gcc"; the "sgi" case looks similar. I don't know who started
it.

> A single
> flag that writes its output to a named file should be sufficient. No
> one expects to be able to pass gcc options to Cython, and Cython can
> be used with more C compilers than just gcc.

Okay, if you feel like it, let's translate -M, -MF, -MD, -MP into
something more Pythonic. It would be great to use single-letter options,
as otherwise the commands become unnecessarily lengthy. My current rules
just set -M -MD -MP (== -MDP).

> (3) The implementation is a bit hackish, with global dictionaries and
> random printing.

I need a global dictionary, as some files are accessed multiple times.
How can I avoid it? And what is "random printing"?

I'm not a Cython expert, but with some hints I might be able to improve
the patch.

thanks
felix
