Hi all,

Over three months after the last release of s3cmd it's time for an
upgrade again ;-) As you may have already noticed on SourceForge or
Freshmeat, s3cmd 0.9.9 has finally been released, with a load of new
features and improvements.

To learn more about Amazon S3 and s3cmd visit http://s3tools.org/s3cmd

Major new features in s3cmd 0.9.9:
---------------------------------

== Improved [put], [get] and [sync] ==
All three commands can now do recursive transfers, accept multiple
sources, and support wildcards even in the S3-to-local direction. They
all also support --exclude / --include and optionally --skip-existing
filtering. For instance:
        s3cmd put --recursive dir1 dir2 file3.txt s3://bkt/backup/
        s3cmd get --exclude "tmp*.jpg" s3://bkt/backup/*.jpg /tmp/
Be aware that these commands may behave slightly differently from what
you were used to in older s3cmd versions. To make sure they do what you
mean and that your exclude / include patterns are correct, use --dry-run
mode. With that option s3cmd only prints which files would be
transferred, without actually transferring them.
Check out http://s3tools.org/s3cmd-sync for more details.
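For example, to preview what a sync would transfer (the bucket name and
local path here are illustrative):

```shell
# Preview the sync: list what would be uploaded, transfer nothing
s3cmd sync --dry-run --exclude "*.tmp" /home/user/work/ s3://bkt/backup/
```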

== CloudFront support ==
Amazon recently released their CloudFront Content Delivery Network to
the public, and thanks to sponsorship from Joseph Denne s3cmd can now
manage CloudFront for you. The commands are [cfcreate], [cfdelete],
[cfmodify] and [cfinfo]. See more at http://s3tools.org/s3cmd-cloudfront
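A typical workflow might look like this (bucket name illustrative; see
the page above for the exact options each command accepts):

```shell
# Create a CloudFront distribution backed by an existing bucket,
# then list your distributions to pick up the new domain name
s3cmd cfcreate s3://bkt/
s3cmd cfinfo
```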

== Progress meter ==
Uploading or downloading large files often meant waiting many minutes
with no way to check how far the transfer had got. Now there's a new
progress meter that keeps you updated on the progress:
        testfile.pdf -> s3://bkt/testfile.pdf  [1 of 1]
          495616 of 2304411    21% in    4s   112.52 kB/s
When s3cmd runs on a terminal it uses the progress meter by default.
When it runs from a cron job or otherwise without a terminal it
doesn't. These defaults can be overridden with the --progress and
--no-progress switches.

== Commands for copying and moving remote files ==
Andrew Ryan contributed support for Copy and Move operations,
accessible as the [cp] and [mv] commands. You can copy or move objects
within a bucket or from one bucket to another, even from the US to the
EU data center. At the moment these two commands don't support
recursive operation or multiple sources; that functionality will be
added in an upcoming version.
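For example (bucket and object names illustrative):

```shell
# Copy an object from one bucket to another
s3cmd cp s3://bkt1/report.pdf s3://bkt2/report.pdf
# Rename an object within a bucket
s3cmd mv s3://bkt1/old-name.txt s3://bkt1/new-name.txt
```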

== New command [setacl] for setting ACL on existing objects ==
To enable public access to your files over HTTP they need to have a
"public ACL". You can set it either during upload with the --acl-public
parameter or later on with the [setacl] command, again sponsored by
Joseph Denne. Use it with --acl-public or --acl-private to set the
desired Access Control.
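For instance (object name illustrative):

```shell
# Make an existing object publicly readable over HTTP
s3cmd setacl --acl-public s3://bkt/photo.jpg
# Revoke public access again later
s3cmd setacl --acl-private s3://bkt/photo.jpg
```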

== New --add-header option ==
One more feature for web hosters - occasionally you may want to set
specific headers on files that are to be accessed over HTTP, such as
"Content-Encoding: gzip" for, say, your gzipped JavaScript, or perhaps
Expires or Cache-Control headers (not only) for use with the CloudFront
CDN.

With --add-header, which is available with [put] and [sync] as well as
with [cp] and [mv], you can do all that:

        s3cmd put blah.js.gz s3://public-bkt/js/        \
             --mime-type "application/javascript"       \
             --add-header "Content-Encoding: gzip"      \
             --acl-public

== Recursive [del] and removal of non-empty buckets ==
Many people wanted an easy way to delete subtrees from S3 or to remove
non-empty buckets. Both are now possible - [del] understands --recursive
and [rb] honours --force (which in fact does a recursive delete first
internally):
        ~$ s3cmd rb --force s3://bkt-test
        WARNING: Bucket is not empty. Removing all the objects from it
                 first. This may take some time...
        File s3://bkt-test/testfile.txt deleted
        Bucket 's3://bkt-test/' removed
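And to remove just one subtree without touching the rest of the bucket
(prefix illustrative):

```shell
# Delete everything under a given prefix, leaving the bucket in place
s3cmd del --recursive s3://bkt/backup/2008/
```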

== Support for non-recursive [ls] ==
Listing large buckets with thousands of files used to be pretty
difficult with s3cmd ls. With the non-recursive [ls], which is now the
default, listings look like this:

        ~$ s3cmd ls s3://bkt/
                               DIR   s3://bkt/subdir1/
                               DIR   s3://bkt/subdir2/
        2009-02-11 03:32   2304411   s3://bkt/testfile.pdf

The old "list everything" behaviour is still available with the
--recursive option.

== Added --list-md5 option for [ls] ==
One more improvement for [ls] - the --list-md5 option prints the MD5
sums of the remote files:

        ~$ s3cmd ls --list-md5 s3://bkt/
                               DIR                   s3://bkt/subdir1/
        2009-02-11 03:32   2304411 6e4469df...7a8b88 s3://bkt/tst.pdf

== Improvements in handling non-ASCII filenames ==
The long-standing troubles with non-ASCII characters in filenames
should now be gone (or at least reduced). If you're on a UTF-8 system
and all your files have UTF-8 encoded filenames, s3cmd should work just
fine. In all other cases ... let me know ;-)

Your system encoding should be autodetected correctly, but for cron
jobs the detection may occasionally fail. In such a case use the
--encoding option to set the encoding explicitly:
        ~$ s3cmd put --encoding UTF-8 číča.txt s3://bkt/


Minor features and bugfixes:
----------------------------
* Improved resistance to communication errors (Connection
  reset by peer, etc.)
* Continue [get] of partially downloaded files with --continue
* Fixed GPG (--encrypt) compatibility with Python 2.6.
* Always send Content-Length header to satisfy some http proxies.
* Fixed installation on Windows and Mac OS X.
* Don't print a nasty backtrace on KeyboardInterrupt.
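The new --continue option to [get], for instance, picks up where an
interrupted download left off instead of starting over (object name and
local path illustrative):

```shell
# Resume a partially downloaded file
s3cmd get --continue s3://bkt/bigfile.iso /tmp/bigfile.iso
```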

Quite a bunch of new stuff, isn't it?


Download
--------
Users of RPM-based Linux distributions are advised to grab one of the
RPMs provided in our repositories, and optionally add our repos to
their package manager. Details are here: http://s3tools.org/repositories

All others should install from a source package. Get it from SourceForge:
https://sourceforge.net/project/showfiles.php?group_id=178907&package_id=218690


Many thanks to all of you who were testing the -pre and -rc releases and
reported back the problems encountered. And indeed many thanks to those
who contributed with a donation (http://s3tools.org/donations). Your
help is much appreciated!


Any questions or feedback? Please send an email to the mailing list:
s3tools-general@lists.sourceforge.net

Enjoy!

Michal

