Logging tracebacks to MySQL (was RE: hang in urllib.read()?)

Nick Arnett narnett at mccmedia.com
Tue Mar 26 18:15:05 EST 2002


> -----Original Message-----
> From: python-list-admin at python.org
> [mailto:python-list-admin at python.org]On Behalf Of Daniel Ortmann

[snip]

> Could this information help?
>
> The web page, "Known bugs in Python 2.2", says:
>
>     "The ftplib module's FTP class was supposed to default to paassive
>     mode.  Unfortunately it doesn't.  This means that urllib.urlopen()
>     doesn't work from inside most firewalls.  If you have this problem,
>     delete or comment out line 117, 'self.passiveserver = 0', from file
>     ftplib.py".

Nope, doesn't apply, but thanks.

At this point, my robot is nicely stable, but I'm now starting to think that
the real problem all along was time-outs, though I still don't see why they
would last for hours.  Aahz's suggestion of using timeoutsocket was right on
the mark.  I am definitely seeing timeouts now, and they recover nicely by
simply starting over with urlopen.

I was confused because I thought that urlopen() establishes the connection,
but it now seems clear to me that the real work doesn't start until read().
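
In case it's useful to anyone, here's a rough sketch of the retry logic --
not my actual code, and assuming timeoutsocket exposes setDefaultSocketTimeout()
and a Timeout exception (that's how I remember its interface), plus a made-up
fetch_page() wrapper:

    import urllib
    import timeoutsocket  # third-party module that patches socket with timeouts

    # Assumed interface: give up on any blocked socket call after 60 seconds.
    timeoutsocket.setDefaultSocketTimeout(60)

    def fetch_page(url, max_tries=3):
        """Fetch a URL, starting over from urlopen() whenever a timeout hits."""
        for attempt in range(max_tries):
            try:
                f = urllib.urlopen(url)
                # The network traffic really starts with read(), so the
                # timeout is just as likely to show up here as in urlopen().
                return f.read()
            except timeoutsocket.Timeout:
                print "Timed out on attempt %d, retrying" % (attempt + 1)
        return None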

And after a fairly hideous time figuring out how to build an appropriate
error message and escape it for MySQL, timeouts are now even being logged to
a table.  If anyone has an example of getting a full traceback into a MySQL
field, I'd be grateful to see it.  What seemingly should work appears to
recurse a few times, so that backslashes and the like get repeatedly
escaped... at least with every way I could see to use the traceback module.
I gave up and I'm using sys.exc_info()[:2] instead, which is probably
sufficient information.
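
For what it's worth, the direction I keep circling back to looks roughly like
the sketch below -- a sketch only, assuming MySQLdb and a hypothetical
error_log table (url VARCHAR, tb TEXT), and letting the driver's parameter
substitution do the escaping instead of quoting the string by hand:

    import sys
    import traceback
    import MySQLdb  # assumed driver; any DB-API module with %s paramstyle would do

    def log_traceback(conn, url):
        """Store the full traceback for the current exception in error_log."""
        tb_text = ''.join(traceback.format_exception(*sys.exc_info()))
        cursor = conn.cursor()
        # The second argument to execute() does the quoting, so the traceback
        # text goes in verbatim -- backslashes, newlines and all.
        cursor.execute("INSERT INTO error_log (url, tb) VALUES (%s, %s)",
                       (url, tb_text))
        cursor.close()
        # conn.commit() here if the table/driver combination needs it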

Nick




