Thanks Anand. I was looking at avoiding an EC2 instance. Would there be any chance of using S3 as a filesystem and then uncompressing the contents?
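
Something along these lines is what I was hoping would work without spinning up an EC2 box. This is only an untested sketch with boto 2.x: it pulls the whole archive into memory, so it is practical only when the file fits in RAM, and the bucket, key and 'extracted/' prefix are just placeholders.

import zipfile
from io import BytesIO

import boto

conn = boto.connect_s3()                    # credentials from env/boto config
bucket = conn.get_bucket('my-bucket')       # placeholder bucket name
key = bucket.get_key('archives/data.zip')   # placeholder key for the archive

# Pull the whole archive into memory; no local disk or EC2 needed,
# but the file has to fit in RAM.
buf = BytesIO(key.get_contents_as_string())

with zipfile.ZipFile(buf) as zf:
    for name in zf.namelist():
        out = bucket.new_key('extracted/' + name)    # placeholder prefix
        out.set_contents_from_string(zf.read(name))
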
Regards,
Sundar.

> 1. AWS S3 Routine with Python (Sundar N)
> 2. Re: AWS S3 Routine with Python (Anand Chitipothu)
> 3. My proposal for Pycon India 2016 (Annapoornima Koppad)
> 4. Re: My proposal for Pycon India 2016 (Akshay Aradhya)
>
> ----------------------------------------------------------------------
>
> Message: 1
> Date: Wed, 22 Jun 2016 22:17:19 +0530
> From: Sundar N <suntra...@gmail.com>
> To: bangpypers@python.org
> Subject: [BangPypers] AWS S3 Routine with Python
> Message-ID: <CACWaKWY+Gs3j+yE=hmjvujkagxq4atravg8yhybmtrpn38q...@mail.gmail.com>
> Content-Type: text/plain; charset=UTF-8
>
> Hi,
> Looking for some pointers on using Python to decompress files on AWS S3.
> I have been using the Boto 2.x library.
> As of now I have a requirement to extract a compressed file in one of the
> s3 buckets. There is no direct API to handle this as of now.
>
> It would be of great assistance if there are any pointers to tackle this
> problem.
>
> Thanks in advance.
> Sundar.
>
> ------------------------------
>
> Message: 2
> Date: Wed, 22 Jun 2016 16:58:25 +0000
> From: Anand Chitipothu <anandol...@gmail.com>
> To: Bangalore Python Users Group - India <bangpypers@python.org>
> Subject: Re: [BangPypers] AWS S3 Routine with Python
> Message-ID: <CAC7wXFwFUK+XdL0+4kE943yG6vHWWYbQan-X6K6=aaurktt...@mail.gmail.com>
> Content-Type: text/plain; charset=UTF-8
>
> On Wed, 22 Jun 2016 at 22:18 Sundar N <suntra...@gmail.com> wrote:
>
> > Hi,
> > Looking for some pointers on using Python to decompress files on AWS S3.
> > I have been using the Boto 2.x library.
> > As of now I have a requirement to extract a compressed file in one of the
> > s3 buckets. There is no direct API to handle this as of now.
> >
> > It would be of great assistance if there are any pointers to tackle this
> > problem.
>
> S3 is just a storage service. To uncompress an archive you need to do some
> computation, which you have to handle separately.
>
> You need to download the file, extract it locally and then upload all the
> extracted files back to S3. If the file is too big or the bandwidth on your
> local machine is not that great, you can try it on a server with a fat pipe.
> Remember that S3 also charges you for the transfer. If you care about those
> charges, then try using an EC2 server (IIRC transfers among AWS services
> are not billed). If you need to do that operation a lot of times, try
> exploring AWS Lambda.
>
> Anand
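
If we do end up running this repeatedly, the AWS Lambda route suggested above would look roughly like the sketch below. It is untested and assumes a few things not in the thread: boto3 (which is what Lambda provides) rather than boto 2.x, an S3 object-created notification as the trigger, and an 'extracted/' prefix chosen purely for illustration. The same fits-in-memory caveat applies, and Lambda's time and memory limits mean very large archives would still need an EC2 instance.

import zipfile
from io import BytesIO

import boto3

s3 = boto3.client('s3')

def lambda_handler(event, context):
    # The S3 object-created notification carries the bucket and key
    # of the uploaded archive.
    record = event['Records'][0]['s3']
    bucket = record['bucket']['name']
    key = record['object']['key']

    # Read the archive into memory and write each member back to S3.
    body = s3.get_object(Bucket=bucket, Key=key)['Body'].read()
    with zipfile.ZipFile(BytesIO(body)) as zf:
        for name in zf.namelist():
            s3.put_object(Bucket=bucket,
                          Key='extracted/' + name,   # placeholder prefix
                          Body=zf.read(name))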