client to upload big files via https and get progress info

News123 news1234 at free.fr
Thu May 13 12:39:37 EDT 2010


Hi Aahz,

Aahz wrote:
> In article <4bea6b50$0$8925$426a74cc at news.free.fr>,
> News123  <news1234 at free.fr> wrote:
>> I'd like to perform huge file uploads via https.
>> I'd like to make sure,
>> - that I can obtain upload progress info (sometimes the nw is very slow)
>> - that (if the file exceeds a certain size) I don't have to
>>  read the entire file into RAM.
> 
> Based on my experience with this, you really need to send multiple
> requests (i.e. "chunking").  There are ways around this (you can look
> into curl's resumable uploads), but you will need to maintain state no
> matter what, and I think that chunking is the best/simplest.
I agree that I need chunking. (The question is just at which level of
the protocol it should happen.)

I just don't know how to do a chunk-wise file upload, or which library
is best suited for it.

Can you recommend any libraries or do you have a link to an example?
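
The chunk-wise reading itself seems easy enough: a generator like the
one below (the chunk size and file name are just examples) never holds
more than one chunk in RAM and gives a natural place to report
progress. It's the HTTP side that I'm unsure about.

import os

def read_in_chunks(path, chunk_size=64 * 1024):
    """Yield (bytes_read_so_far, total_size, chunk) for the file at 'path'."""
    total = os.path.getsize(path)
    done = 0
    with open(path, 'rb') as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            done += len(chunk)
            yield done, total, chunk

for done, total, chunk in read_in_chunks('bigfile.bin'):
    # hand 'chunk' to whatever does the actual upload here
    print('%d/%d bytes (%.1f%%)' % (done, total, 100.0 * done / total))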


I'd like to avoid making separate HTTPS POST requests for the chunks
(at least if the underlying module does NOT support keep-alive
connections).
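
If separate requests turn out to be unavoidable: as far as I can tell,
httplib's HTTPSConnection (http.client in Python 3) will reuse the
underlying TLS connection across requests, as long as the server agrees
to keep-alive and each response is read fully. A rough, untested
sketch; the '/upload?part=N' URL is a made-up placeholder for whatever
the real server-side protocol would be:

import http.client  # called 'httplib' in Python 2

def upload_in_requests(host, chunks):
    """POST each chunk as its own request over one keep-alive connection."""
    conn = http.client.HTTPSConnection(host)
    try:
        for i, chunk in enumerate(chunks):
            conn.request('POST', '/upload?part=%d' % i, body=chunk,
                         headers={'Content-Type': 'application/octet-stream'})
            resp = conn.getresponse()
            resp.read()  # must drain the response before reusing the connection
            if resp.status != 200:
                raise RuntimeError('chunk %d failed: %d' % (i, resp.status))
    finally:
        conn.close()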


I made some tests with high-level chunking (separate, sequential HTTPS
POST requests) and noticed a rather high penalty in data throughput.
The reason is probably that each request opens its own HTTPS connection
(so every chunk pays for a fresh TCP handshake, a TLS handshake, and
TCP slow start), and that either the network driver or the TCP/IP stack
never allocates enough bandwidth to my request.

Therefore I'd like to do the chunking at a 'lower' level.
One option would be an HTTPS module that supports keep-alive;
the other would be a library that builds the HTTP POST body chunk by
chunk, so the whole file goes over a single connection.
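
For the second option no extra library may even be needed: httplib
(http.client in Python 3) lets me build the request by hand and push
the body out in pieces with send(), so a single POST carries the whole
file while I still get per-chunk progress. A sketch of what I mean --
untested, and '/upload' is again a placeholder path:

import http.client  # 'httplib' in Python 2
import os

def upload_file(host, path, chunk_size=64 * 1024):
    """Stream one HTTPS POST body chunk by chunk, reporting progress."""
    total = os.path.getsize(path)
    conn = http.client.HTTPSConnection(host)
    conn.putrequest('POST', '/upload')
    conn.putheader('Content-Type', 'application/octet-stream')
    conn.putheader('Content-Length', str(total))
    conn.endheaders()
    sent = 0
    with open(path, 'rb') as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            conn.send(chunk)  # one send() per chunk = one progress update
            sent += len(chunk)
            print('%d/%d bytes' % (sent, total))
    resp = conn.getresponse()
    resp.read()
    conn.close()
    return resp.status

Since everything goes over one TCP/TLS connection, this should also
avoid the per-request connection overhead I saw in my tests.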


What do others do for huge file uploads?
(The uploader might be connected via Ethernet, WLAN, UMTS, EDGE, or
GPRS.)

N


