The AWS CLI should be your best friend

The AWS console is certainly very well laid out and, with time, becomes very easy to use. However, if you are not using the AWS CLI (Command Line Interface) from your local terminal, you may be missing out on a whole lot of great functionality and speed. If you are not yet comfortable with the AWS Command Line Interface, there’s a great course on the subject available right now on Cloud Academy.

Even if you are used to the AWS CLI, I encourage you to take a look at the commands below, as you may not be completely aware of the power of the AWS CLI, and you might just end up saving yourself a whole lot of time. One important note: the precise syntax of some commands can vary between versions and packages.

1. Delete an S3 bucket and all its contents with just one command

Sometimes you may end up with a bucket full of hundreds or thousands of files that you no longer need. If you have ever had to delete a substantial number of items in S3, you know that this can be a little time consuming. The following command will delete a bucket and all of its contents, including directories:

aws s3 rb s3://bucket-name --force

2. Recursively copy a directory and its subfolders from your PC to Amazon S3

If you have used the S3 Console, at some stage, you’ve probably found yourself having to copy a ton of files to a bucket from your PC. It can be a little clunky at times, especially if you have multiple directory levels that need to be copied. The following AWS CLI command will make the process a little easier, as it will copy a directory and all of its subfolders from your PC to Amazon S3, optionally in a specified region.

aws s3 cp MyFolder s3://bucket-name --recursive [--region us-west-2]

3. Display subsets of all available EC2 images

The following will display all available EC2 images, filtered to include only those built on Ubuntu (assuming, of course, that you’re working from a terminal on a Linux or Mac machine).

aws ec2 describe-images | grep ubuntu

Warning: this may take a few minutes.

4. List users in a different format

Sometimes, depending on the output format you chose as default, long lists – like a large set of users – can be a little hard to read. Including the --output parameter with, say, the table argument will display a nice, easy-to-read table this one time without having to change your default.

aws iam list-users --output table

5. List the sizes of an S3 bucket and its contents

The following command uses JSON output to list the size of a bucket and the items stored within. This might come in handy when auditing what is taking up all your S3 storage.
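One way to do this (a sketch; "my-bucket" is a placeholder for your own bucket name) is to combine s3api list-objects with a JMESPath --query expression:

```shell
# Report the total size (in bytes) and the number of objects in a bucket.
# "my-bucket" is a placeholder -- substitute your own bucket name.
aws s3api list-objects --bucket my-bucket --output json \
  --query "[sum(Contents[].Size), length(Contents[])]"
```

The result is a two-element JSON array: the total size in bytes, followed by the object count.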


6. Move S3 bucket to different location

If you need to quickly move an S3 bucket to a different location, then this command just might save you a ton of time.
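The CLI has no single "move bucket" command, so one approach (a sketch; the bucket names and regions below are placeholders) is to sync everything to the new bucket and then remove the source:

```shell
# Copy every object from the old bucket to the new one,
# then delete the now-empty source bucket and its contents.
# Bucket names and regions are placeholders.
aws s3 sync s3://old-bucket s3://new-bucket --source-region us-west-1 --region us-west-2
aws s3 rb s3://old-bucket --force
```

Note that bucket names are globally unique, so the new bucket must already exist under a different name before you sync.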


7. List users by ARN

“jq” is like sed for JSON data – you can use it to slice, filter, map, and transform structured data with the same ease that sed, awk, grep and friends let you play with non-JSON text.

Armed with that knowledge, we can now nicely list all our users, but only show their ARNs.

aws iam list-users --output json | jq -r .Users[].Arn

Note: jq might not be installed on your system by default. On Debian-based systems (including Ubuntu), use sudo apt-get install jq


8. List all of your instances that are currently stopped, and the reason for the stop

Here’s another use of the JSON output parameter. This one will list all of your stopped instances and, best of all, show the reason that they were stopped:
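A sketch of one way to do it with a --query expression; the region and the output field labels (ID, Reason) are illustrative choices, not required names:

```shell
# List stopped instances along with the reason each one was stopped.
# The region and the ID/Reason labels are illustrative.
aws ec2 describe-instances --region us-west-2 --output json \
  --filters Name=instance-state-name,Values=stopped \
  --query 'Reservations[].Instances[].{ID: InstanceId, Reason: StateReason.Message}'
```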

9. Test one of your public CloudFormation templates

If you have written a CloudFormation template and need to validate it before launching, you can do it from the CLI using the following format:
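A sketch, assuming your template is reachable at an S3 URL (the URL below is a placeholder):

```shell
# Validate a template before launching a stack.
# The S3 URL is a placeholder for wherever your template is hosted;
# for a local file, use --template-body file://my-template.json instead.
aws cloudformation validate-template \
  --template-url https://s3.amazonaws.com/my-bucket/my-template.json
```

If the template is invalid, the command returns a ValidationError describing the problem; otherwise it echoes the template's parameters.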

10. Other ways to pass input parameters to the AWS CLI with JSON

You can pass all sorts of input parameters to the AWS CLI. Here’s an example of how to do it:
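As a sketch, many parameters – --filters on describe-instances, for instance – accept either inline JSON or a JSON document referenced with the file:// prefix; the file name and filter values below are placeholders:

```shell
# Write a JSON filter document (the file name is a placeholder).
cat > filters.json <<'EOF'
[{"Name": "instance-type", "Values": ["t2.micro"]}]
EOF

# Pass the same JSON inline...
aws ec2 describe-instances --filters '[{"Name": "instance-type", "Values": ["t2.micro"]}]'

# ...or reference the file with the file:// prefix.
aws ec2 describe-instances --filters file://filters.json
```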

Learn to love the AWS CLI

The above commands are only the tip of the iceberg when it comes to using the AWS CLI, but they have hopefully given you some idea of how powerful it can be when used correctly. It’s easy to get too comfortable using the AWS Console for most of your work, but this post should have given you some incentive to try out the AWS CLI if you haven’t already. What are you waiting for? Give it a go: you just might surprise yourself.

  • Peter Moon

    Great post!
    By the way, #7 and #8 are also possible with the built-in "--query" and "--output text" options:
    aws iam list-users --query 'Users[].[Arn]' --output text
    aws ec2 describe-instances --filters Name=instance-state-name,Values=stopped --region us-west-2 --output text --query 'Reservations[].Instances[].[StateReason.Message]'

  • Rakz

    If I have files based on date like 08th aug, 09th aug, how can I download a selective date file? I used "aws s3 cp s3://bucketname/ folder/file --profile pname --exclude "*" --recursive --include "" + "2015-08-09" + "*""

  • Michael Sheehy

    Hi Rakz,

    I could be wrong, but I don’t think the AWS command line tools’ include filter supports this. You can use wildcards, but I think they are also limited in so far as they only work on file extensions.
    So like the following

    --include "*.txt"

    will work, but you can’t do it for dates as far as I am aware.

  • #2 has a typo it seems.

    Where "-- recursive" should have no space in between and instead read "--recursive"

  • Jamir Josimar Huamán Campos

    How can I create a directory inside EC2 with the AWS CLI?

  • Shanu Peer

    This is to list the number of objects and total size of the same

    aws s3 ls --summarize --human-readable --recursive s3://bucketname | tail -2
    Output is:
    Total Objects: 52264
    Total Size: 5.3 GiB