urllib/ftpwrapper

Guido van Rossum guido at python.org
Fri May 19 23:10:50 EDT 2000


Oleg Broytmann <phd at phd.russ.ru> writes:

>    Cannot believe it's just so simple :( My robot uses urllib.urlretrieve;
> and any test program hangs. Just run in python:
> 
> fname, headers = urllib.urlretrieve("ftp://starship.python.net/")
> 
>    and it hangs :( Almost all FTP URLs will hang, but there are some that
> do not: ftp://sun.med.ru/ worked (at least for me); there is a pretty
> standard wu-ftpd running there.
> 
> > all" FTP URLs.  The proper fix would be the same as in webchecker --
> > read the rest of the data.
> 
>    I believe URLopener.retrieve eats all data...

Yes it does.

I played with this a bit and found that ftp.python.org works but
starship.python.net hangs...  Then I had a sudden idea: maybe these
ftp daemons only send the 226 message once the data socket has been
closed!

The following change works for me: in urllib.py, class addclosehook,
method close(), move the call addbase.close(self) to the top of the
method, so the data socket gets closed before the hook (which waits
for the 226 code on the control socket) is called.

    def close(self):
        # Close the data socket first, so the server sends its 226
        # reply before the hook (which waits for that reply) runs.
        addbase.close(self)
        if self.closehook:
            apply(self.closehook, self.hookargs)
            self.closehook = None
            self.hookargs = None

Let me know if this fixes your problem (with no other changes to
urllib).
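The ordering the patch establishes can be sketched with stand-in classes
(AddBase and AddCloseHook below are illustrative mocks, not urllib's real
addbase and addclosehook; the real hook blocks waiting for the FTP 226
reply on the control socket):

```python
# Minimal sketch of the close-ordering fix, using mock classes.
events = []

class AddBase:
    def close(self):
        # Stands in for closing the data socket.
        events.append("data socket closed")

class AddCloseHook(AddBase):
    def __init__(self, closehook):
        self.closehook = closehook

    def close(self):
        # Close the data socket FIRST; some ftp daemons only send the
        # 226 completion reply once the data connection is gone.
        AddBase.close(self)
        if self.closehook:
            self.closehook()
            self.closehook = None

AddCloseHook(lambda: events.append("hook ran (226 received)")).close()
print(events)
```

With the old ordering (hook before AddBase.close), the hook would block
forever against such servers, since the 226 it waits for never arrives
while the data socket is still open.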

-- 
--Guido van Rossum (home page: http://www.python.org/~guido/)



More information about the Python-list mailing list