download x bytes at a time over network

n.s.buttar at gmail.com
Tue Mar 17 03:05:41 EDT 2009


2009/3/16 Saurabh <phonethics at gmail.com>:
> I want to download content from the net - in chunks of x bytes or characters
> at a time - so that it doesn't pull the entire content in one shot.
>
> import urllib2
> url = "http://python.org/"
> handler = urllib2.urlopen(url)
>
> data = handler.read(100)
> print """Content :\n%s \n%s \n%s""" % ('=' * 100, data, '=' * 100)
>
> # Disconnect the internet
>
> data = handler.read(100)  # I want this to throw an exception because it
> # can't fetch from the net
> print """Content :\n%s \n%s \n%s""" % ('=' * 100, data, '=' * 100)
> # But it still works!
>
> Apparently, handler = urllib2.urlopen(url) reads the entire data into a
> buffer, and handler.read(100) is just reading from that buffer?
>
> Is it possible to get content from the net in chunks? Or do I need to use
> HTTPClient/sockets directly?
>
> Thanks
>

Yes, you can do this with urllib(2). Please take a look at the following
HTTP headers, which facilitate this kind of transfer:

Content-Range http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.16
Range http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.35

You can set the values for these and send the request to get partial content.
Be warned, however, that not all servers support this.
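
For example, something along these lines (a rough, untested sketch; the URL
is just the one from your post) adds a Range header to the request and asks
only for the first 100 bytes:

import urllib2

url = "http://python.org/"
request = urllib2.Request(url)
# Ask the server for bytes 0-99 only.
request.add_header("Range", "bytes=0-99")

handler = urllib2.urlopen(request)
# A server that honours Range replies with 206 (Partial Content);
# one that ignores it replies with 200 and sends the whole body.
print handler.code
data = handler.read()
print "got %d bytes" % len(data)

To pull the rest, you can issue further requests with successive ranges
(bytes=100-199, and so on), checking the Content-Range header in each
response to see what the server actually returned.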


