You can also accomplish this with shell:<p>export BUCKET=____; aws s3 ls "$BUCKET" | tail -n+2 | awk '{print $4}' | while read -r k; do aws s3 cp "s3://$BUCKET/$k" -; done
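The key-extraction stage of that pipeline can be sanity-checked locally against canned `aws s3 ls` output (the four-column date/time/size/key layout below is the usual listing format; the keys themselves are made up):

```shell
# Canned listing in the usual `aws s3 ls` four-column layout:
# date  time  size  key  (keys here are made up for illustration)
listing='2020-01-01 12:00:00       1024 logs/app-1.log
2020-01-02 12:00:00       2048 logs/app-2.log'

# Column 4 is the object key, which the loop then feeds to `aws s3 cp`
printf '%s\n' "$listing" | awk '{print $4}'
```

Note that keys containing spaces would break the column-4 extraction; for those you'd want `aws s3api list-objects` with `--query` instead.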
Cool. I had a similar use case and created a tool to stream colorized logs from CloudWatch to your terminal that is a little more ergonomic to use than this:<p><a href="https://github.com/TylerBrock/saw" rel="nofollow">https://github.com/TylerBrock/saw</a>
One thing to be careful of with this is the Data Transfer (egress) cost you will incur streaming data out of S3.<p>If you just want to do a 'grep'-style action on an S3 prefix, it might be worth looking into "S3 Select" for your use case instead.
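For reference, a hedged sketch of what that could look like via the CLI; the key name and the CSV/GZIP serialization settings are assumptions about the data, not something tested here:

```shell
# S3 Select runs the filter server-side against a single object,
# so only matching rows leave S3 (and count as egress).
aws s3api select-object-content \
  --bucket "$BUCKET" \
  --key logs/app.log.gz \
  --expression "SELECT * FROM s3object s WHERE s._1 LIKE '%ERROR%'" \
  --expression-type SQL \
  --input-serialization '{"CSV": {}, "CompressionType": "GZIP"}' \
  --output-serialization '{"CSV": {}}' \
  /dev/stdout
```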
Pretty neat. I'm working on a product that relies heavily on S3 buckets and tagged files.<p>Does s3st support tags or other ways of identifying which files to stream other than filtering by the content of the files? Asking because I didn't see this feature in the demo.
We have a large S3 bucket (2 billion objects) and we're starting to think about cleaning it up a bit. Has anyone done this kind of thing, or are there any tools for:<p>- categorising what's inside<p>- checking what's used or not<p>Thanks!
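At that scale, listing the bucket object-by-object isn't practical; S3 Inventory is the usual starting point for the "categorising what's inside" part. A hedged sketch of enabling a daily inventory report via the CLI (the bucket names are placeholders):

```shell
# Hypothetical bucket names. S3 Inventory writes a daily CSV/Parquet
# listing of every object (size, last-modified, storage class) to a
# destination bucket, which can then be queried with Athena instead
# of paging through ListObjects 2 billion times.
aws s3api put-bucket-inventory-configuration \
  --bucket my-huge-bucket \
  --id daily-inventory \
  --inventory-configuration '{
    "Id": "daily-inventory",
    "IsEnabled": true,
    "IncludedObjectVersions": "Current",
    "Schedule": {"Frequency": "Daily"},
    "OptionalFields": ["Size", "LastModifiedDate", "StorageClass"],
    "Destination": {
      "S3BucketDestination": {
        "Bucket": "arn:aws:s3:::my-inventory-bucket",
        "Format": "CSV"
      }
    }
  }'
```

For the "what's used or not" part, enabling S3 server access logging or CloudTrail data events and joining access timestamps against the inventory is one common approach, though at 2 billion objects the logs themselves get large.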