AWS CLI: manage files on an S3 bucket

2 min read | by Jordi Prats

Using the AWS CLI we can perform most common operations on files sitting in an S3 bucket: list, copy, rename, print contents (cat) and so on.

We can use aws s3 ls to list all the available S3 buckets:

$ aws s3 ls
2021-01-01 19:01:55 tfstates
2021-01-11 21:32:11 ampa-preprod
2021-01-07 21:09:20 sonarqube-test
2021-04-15 21:14:11 pet2cattle
2021-01-27 18:46:40 ampa-prod

But it can also be used to list the contents of a bucket:

$ aws s3 ls s3://pet2cattle/
2021-04-15 22:23:49        129 demo.yml

We can also copy files from or to the bucket:

$ aws s3 cp s3://pet2cattle/demo.yml .
download: s3://pet2cattle/demo.yml to ./demo.yml
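
The same command works in the other direction to upload a local file to the bucket; here the local file demo.yml is just the one we downloaded above:

$ aws s3 cp demo.yml s3://pet2cattle/demo.yml
upload: ./demo.yml to s3://pet2cattle/demo.yml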

We can remove them from the bucket using aws s3 rm:

$ aws s3 rm s3://pet2cattle/demo.yml
delete: s3://pet2cattle/demo.yml
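
To rename a file we can use aws s3 mv, which copies the object to the new key and deletes the old one; the target name demo-renamed.yml is just an illustrative example:

$ aws s3 mv s3://pet2cattle/demo.yml s3://pet2cattle/demo-renamed.yml
move: s3://pet2cattle/demo.yml to s3://pet2cattle/demo-renamed.yml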

To print the contents of a file (as we would do with cat), we can also use cp, instructing it to copy the file to stdout by giving - as the destination:

$ aws s3 cp s3://pet2cattle/demo.yml -
contents: {}

Creating directories on an S3 bucket is trickier: S3 keys form a flat namespace, so a "directory" is really just an empty object whose key ends with a slash. We can create one using aws s3api put-object:

$ aws s3api put-object --bucket pet2cattle --key test/
{
    "ETag": "\"d41d8cd98f00b204e9800998ecf8427e\""
}
$ aws s3api put-object --bucket pet2cattle --key test/test2
{
    "ETag": "\"d41d8cd98f00b204e9800998ecf8427e\""
}

We can use aws s3 ls with the --recursive flag to see the directories we have created:

$ aws s3 ls --recursive s3://pet2cattle/
2021-04-25 16:38:58          0 test/
2021-04-25 16:39:36          0 test/test2

The --recursive flag can also be used with aws s3 rm to recursively remove every object under a given prefix of an S3 bucket:

$ aws s3 rm --recursive s3://pet2cattle/test
delete: s3://pet2cattle/test/test2
delete: s3://pet2cattle/test/

Posted on 28/04/2021