urllib.urlretrieve problem

Ritesh Raj Sarraf riteshsarraf at users.sourceforge.net
Wed Mar 30 12:46:58 EST 2005



Larry Bates wrote:

> I noticed you hadn't gotten a reply.  When I execute this it puts the
> following in the retrieved file:
> 
> <!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
> <HTML><HEAD>
> <TITLE>404 Not Found</TITLE>
> </HEAD><BODY>
> <H1>Not Found</H1>
> The requested URL /pool/updates/main/p/perl/libparl5.6_5.6.1-8.9_i386.deb
> was not found on this server.<P>
> </BODY></HTML>
> 
> You will probably need to use something else to first determine if the URL
> actually exists.
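
Something like the pre-check you suggest could probably be done with a HEAD
request before fetching the file. A rough Python 2 sketch -- the host and
path below are only placeholders, not the real mirror:

    import httplib

    def url_exists(host, path):
        # Ask the server for the headers only and inspect the status code
        # before deciding whether to download the file at all.
        conn = httplib.HTTPConnection(host)
        conn.request("HEAD", path)
        response = conn.getresponse()
        conn.close()
        return response.status == 200

    if url_exists("archive.example.org",
                  "/pool/updates/main/p/perl/somepackage.deb"):
        print "URL looks valid, safe to retrieve it"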

I'm happy that at least someone responded, as this was my first post to the
Python mailing list.

I'm writing a program for offline package management.
The link that I provided could be made obsolete by newer packages; that is
where my problem lies. I want to know how to get an exception raised here so
that, depending on the type of exception, I can make my program react
accordingly.

For example, for a temporary name resolution failure Python raises an
exception, which I've handled fine. The problem lies with obsolete URLs,
where no exception is raised and I end up with a 404 error page as my data.
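
For illustration, this is roughly the behaviour I'm after, using urllib2
instead of urllib -- a minimal Python 2 sketch with a placeholder URL and
filename; as far as I can tell, urllib2.urlopen raises urllib2.HTTPError on
a 404 instead of handing back the error page as data:

    import urllib2

    url = "http://archive.example.org/pool/updates/main/p/perl/somepackage.deb"

    try:
        data = urllib2.urlopen(url).read()
        open("somepackage.deb", "wb").write(data)
    except urllib2.HTTPError, e:
        # Obsolete URL: the server answered 404 (or another error code)
        print "Server returned %d for %s" % (e.code, url)
    except urllib2.URLError, e:
        # e.g. temporary name resolution failure
        print "Could not reach the server:", e.reason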

Can we have an exception for that? Or can the return value of
urllib.urlretrieve tell us whether it actually downloaded the desired file?
I think my problem is fixable with urllib.urlopen; I just find
urllib.urlretrieve more convenient and want to know whether it can be done
with it.
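
If I read the urllib documentation correctly, one possible route is to
subclass urllib.FancyURLopener (which urlretrieve uses behind the scenes) so
that HTTP errors raise instead of being written to disk -- a rough, untested
sketch, again with placeholder URL and filename:

    import urllib

    class StrictURLopener(urllib.FancyURLopener):
        def http_error_default(self, url, fp, errcode, errmsg, headers):
            # Raise instead of saving the server's error page to the file
            raise IOError("HTTP error %d (%s) for %s" % (errcode, errmsg, url))

    opener = StrictURLopener()
    try:
        filename, headers = opener.retrieve(
            "http://archive.example.org/pool/updates/main/p/perl/somepackage.deb",
            "somepackage.deb")
    except IOError, e:
        print "Download failed:", e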

Thanks for responding.

rrs
-- 
Ritesh Raj Sarraf
RESEARCHUT -- http://www.researchut.com
Gnupg Key ID: 04F130BC
"Stealing logic from one person is plagiarism, stealing from many is
research".



