download limit

"Martin v. Löwis" martin at v.loewis.de
Sun Aug 10 10:05:55 EDT 2008


> I have a multithreaded script that mainly creates several wget
> processes to download files. I would like to check/see and eventually
> limit the bandwidth of the pool of processes. One way to do this is to
> change the number of wget instances, but that's a workaround.
> 
> What do you recommend to do the following in python:
> 1) know the bitrate at the script scale
> 2) control, and limit or not this bitrate

I recommend not using wget, and instead implementing the network access
directly in Python. Then you can easily measure the bitrate, and also
limit it (by having some threads sleep).
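As a minimal sketch of the sleep-based approach: a shared limiter tracks
the total bytes transferred and sleeps whenever the pool runs ahead of
the target rate. The `RateLimiter` class and `download` helper below are
hypothetical names, not part of any library; sharing one limiter object
across threads caps the aggregate bandwidth of the whole pool.

```python
import threading
import time
import urllib.request


class RateLimiter:
    """Caps aggregate throughput by sleeping when bytes arrive too fast.

    One instance shared by all download threads limits the pool as a
    whole, not each download individually.
    """

    def __init__(self, max_bytes_per_sec):
        self.max_bps = max_bytes_per_sec
        self.start = time.monotonic()
        self.total = 0
        self.lock = threading.Lock()

    def throttle(self, nbytes):
        # Record the new bytes, then sleep until the elapsed time
        # matches the time this many bytes *should* have taken.
        with self.lock:
            self.total += nbytes
            expected = self.total / self.max_bps
        elapsed = time.monotonic() - self.start
        if expected > elapsed:
            time.sleep(expected - elapsed)

    def rate(self):
        """Observed bytes/sec since the limiter was created."""
        elapsed = time.monotonic() - self.start
        return self.total / elapsed if elapsed > 0 else 0.0


def download(url, path, limiter, chunk_size=8192):
    """Fetch url to path in chunks, throttled by the shared limiter."""
    with urllib.request.urlopen(url) as resp, open(path, "wb") as out:
        while True:
            chunk = resp.read(chunk_size)
            if not chunk:
                break
            out.write(chunk)
            limiter.throttle(len(chunk))
```

Each thread would call `download(url, path, limiter)` with the same
`limiter`; `limiter.rate()` gives the observed bitrate at the script
scale, answering point 1, while `throttle()` handles point 2.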

Once you fork out new processes, you lose, unless the operating system
provides a bandwidth-limitation framework that works on groups of
processes. Solaris projects may provide such a thing, but apart from
that, I don't think it's available in any operating system that you
might be using.

Regards,
Martin
