[BangPypers] AWS S3 Routine with Python

Anand Chitipothu anandology at gmail.com
Wed Jun 22 12:58:25 EDT 2016


On Wed, 22 Jun 2016 at 22:18 Sundar N <suntracks at gmail.com> wrote:

> Hi,
> Looking for some pointers on using Python to decompress files on AWS S3.
> I have been using the Boto 2.x library.
> As of now I have a requirement to extract a compressed file in one of the
> S3 buckets. There is no direct API to handle this at the moment.
>
> It would be of great assistance if there are any pointers to tackle this
> problem.


S3 is just a storage service. Uncompressing an archive requires some
computation, which you have to handle separately.

You need to download that file, extract it locally and then upload all the
files back to S3. If the file is too big or the bandwidth on your local
machine is not great, you can try it on a server with a fat pipe.
Remember that S3 also charges you for the transfer. If you care about those
charges, then try using an EC2 server (IIRC transfers among AWS services
are not billed). If you need to do that operation many times, try
exploring AWS Lambda.
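
A minimal sketch of that download/extract/upload round trip with boto 2,
assuming a zip archive and placeholder bucket and key names (adjust for
tar.gz etc.):

    import os
    import zipfile
    import boto

    # Placeholder names; replace with your bucket and the key of the archive.
    BUCKET_NAME = 'my-bucket'
    ARCHIVE_KEY = 'incoming/data.zip'
    EXTRACT_DIR = 'extracted'

    conn = boto.connect_s3()
    bucket = conn.get_bucket(BUCKET_NAME)

    # 1. Download the archive from S3 to local disk.
    key = bucket.get_key(ARCHIVE_KEY)
    key.get_contents_to_filename('data.zip')

    # 2. Extract it locally.
    with zipfile.ZipFile('data.zip') as zf:
        zf.extractall(EXTRACT_DIR)

    # 3. Upload the extracted files back to S3.
    for root, dirs, files in os.walk(EXTRACT_DIR):
        for name in files:
            local_path = os.path.join(root, name)
            key_name = os.path.relpath(local_path, EXTRACT_DIR)
            k = bucket.new_key(key_name)
            k.set_contents_from_filename(local_path)

The same flow works unchanged on an EC2 instance; only the credentials and
the local temporary paths differ.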

Anand

