How to install Python package from source on Windows

Chris Angelico rosuav at gmail.com
Sun May 21 07:06:50 EDT 2017


On Sun, May 21, 2017 at 8:23 PM, bartc <bc at freeuk.com> wrote:
> On 20/05/2017 19:37, Chris Angelico wrote:
>
>> rosuav at sikorsky:~/linux$ find -name \*.c -or -name \*.h | wc -l
>> 44546
>>
>> These repositories, by the way, correspond to git URLs
>> https://github.com/python/cpython,
>> git://pike-git.lysator.liu.se/pike.git,
>> git://source.winehq.org/git/wine, and
>> https://github.com/torvalds/linux respectively, if you want to check
>> my numbers. Two language interpreters, a popular execution subsystem,
>> and an OS kernel.
>>
>> I'd like to see you create a single-file version of the Linux kernel
>> that compiles flawlessly on any modern compiler and has no configure
>> script.
>
>
> I've had a look at the Linux stuff. (BTW, when copying to Windows, the file
> "aux" occurs several times, which causes problems as it's a reserved
> filename I think. Also there were a dozen conflicts where different versions
> of the same file wanted to be stored at the same location.)

I don't understand where you obtained sources that contain duplicate
files. It's easiest just to clone someone's git repo (e.g. Linus
Torvalds', from the GitHub URL quoted above).
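(For concreteness, assuming you have git installed, getting a pristine
tree is a single command:

git clone https://github.com/torvalds/linux

using the URL already quoted above.)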

> So, it /is/ big: 24000 .c files, 19000 .h files, out of 59000 total. (And
> 12000 directories, but I think that includes multiple "." and ".."
> instances, so probably 'only' about 4000.)
>
> However, I assume that when it is at some point compiled to binary, a
> single image file results.

There are kernel modules too.
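A build produces one kernel image (vmlinux, typically packaged as
something like bzImage), plus a large pile of separately loadable .ko
modules that are not part of that image at all. As a rough sketch of
what one of those looks like in source form (generic boilerplate of my
own, not a file from the kernel tree):

#include <linux/init.h>
#include <linux/module.h>
#include <linux/printk.h>

MODULE_LICENSE("GPL");
MODULE_DESCRIPTION("Minimal example of a loadable module");

static int __init example_init(void)
{
        /* runs when the module is loaded (insmod/modprobe) */
        pr_info("example: loaded\n");
        return 0;
}

static void __exit example_exit(void)
{
        /* runs when the module is unloaded (rmmod) */
        pr_info("example: unloaded\n");
}

module_init(example_init);
module_exit(example_exit);

That gets built against the kernel headers with a kbuild makefile
(roughly "obj-m += example.o"), and the result is example.ko, loaded at
runtime rather than linked into the image. So even after a full build,
"one binary" isn't really what you end up with.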

> An attempt to create a single source file would result in a representation
> somewhere between those two points. But it sounds like it wouldn't be
> possible, or practical, to have a single source that works for every
> possible target; it might have to be one source file for each of a number of
> the most common targets.
>
> I've noticed a number of files that look like assembly code - not C anyway -
> that also need to be taken into account. But on something this size, it is
> not absolutely essential the end result is exactly one file. A dozen files
> in one flat directory that can be trivially compiled would still be an
> improvement.
>
> (If you imagine a future where the number of targets has increased a
> hundred-fold (or we colonise the galaxy and there are a million possible
> targets), then it might become clearer that the approach used here - putting
> files for every conceivable machine in the same place - is not scalable.)

Actually, it stops being scalable as soon as you have TWO targets:
every change has to be made in both source files, and wherever the
targets differ you need markers saying which variant belongs to which.
You know, like #ifdef lines. I wonder if maybe the current system isn't
the result of incompetence after all?
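To make the marker point concrete, here is a minimal sketch (mine, not
lifted from any of the projects above) of the usual alternative to
per-target source files: one file, with only the genuinely divergent
part selected by the preprocessor at build time:

/* One translation unit shared by several targets; the difference is
   isolated behind preprocessor markers instead of being copied into
   per-target files. */
#include <stdio.h>

#if defined(_WIN32)
static const char *path_separator(void) { return "\\"; }
#elif defined(__linux__)
static const char *path_separator(void) { return "/"; }
#else
static const char *path_separator(void) { return "/"; }  /* fallback */
#endif

int main(void)
{
    printf("path separator on this target: %s\n", path_separator());
    return 0;
}

A change that affects every target is made once; only the lines inside
the #if blocks vary. Duplicate whole files per target instead, and
every shared fix has to be applied N times and kept in sync by hand.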

ChrisA
