
AWS CLI Tutorial

Iva Todorova
DevOps & Cloud Engineer
14.01.2022
Reading time: 9 mins.
Last Updated: 10.06.2024

How to Install, Configure, and Use AWS Command Line Interface

There is no way to work in the DevOps industry without coming across the AWS CLI and the need to use it. Here is a brief guide on how to install and configure the AWS CLI, and on the cases in which it is more useful and convenient than the AWS Console.

The AWS Command Line Interface (CLI) is an open-source, unified tool for managing multiple AWS services and automating them with scripts. AWS CLI version 2 is the latest major version. To access AWS services, you need an AWS account, IAM credentials, and an IAM access key pair.

If you already have AWS CLI version 1 installed, you have two options:

  • Recommended – uninstall AWS CLI v1 and use only AWS CLI v2.
  • Use an alias or symlink to differentiate the versions, as they both use the aws command name.
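The alias/symlink option can be sketched as below. This is a minimal illustration: the two stand-in "binaries" and the temp directory are made up, so point the symlinks at the real v1 and v2 executables on your machine.

```shell
#!/bin/sh
# Keep both major versions callable under distinct names, so the bare
# `aws` command stays unambiguous. The stand-in scripts below only echo
# a version string; replace them with your real v1/v2 install paths.
BIN_DIR="$(mktemp -d)"

printf '#!/bin/sh\necho "aws-cli/1.x"\n' > "$BIN_DIR/aws-v1"
printf '#!/bin/sh\necho "aws-cli/2.x"\n' > "$BIN_DIR/aws-v2"
chmod +x "$BIN_DIR/aws-v1" "$BIN_DIR/aws-v2"

# One distinct command name per major version:
ln -s "$BIN_DIR/aws-v1" "$BIN_DIR/aws1"
ln -s "$BIN_DIR/aws-v2" "$BIN_DIR/aws2"

V1_OUT="$("$BIN_DIR/aws1")"
V2_OUT="$("$BIN_DIR/aws2")"
echo "$V1_OUT"   # aws-cli/1.x
echo "$V2_OUT"   # aws-cli/2.x
```

An alias line in your shell profile (for example `alias aws2=/path/to/v2/aws`) achieves the same effect without touching the filesystem.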

 1. Installation

Supported operating systems are Linux, macOS, and Windows.

The AWS CLI is supported on 64-bit versions of recent distributions of Linux (including ARM), CentOS, Fedora, Ubuntu, Amazon Linux 1, and Amazon Linux 2.

  1. Download the installation file using one of the following ways:
    • Use the curl command. The -o option specifies the file name that the package is written to. The following example command writes the downloaded file to the current directory with the name awscliv2.zip.
$ curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
  2. Unzip the installer. The following example command unzips the package and creates a directory named aws under the current directory.
$ unzip awscliv2.zip
  3. Run the install program. By default, the files are all installed to /usr/local/aws-cli, and a symbolic link is created in /usr/local/bin. The command includes sudo to grant write permissions to those directories.
$ sudo ./aws/install

You can install without sudo if you specify directories that you already have write permissions to, or you are the root user. 

Use the following instructions for the install command to specify the installation location:

  • Make sure that the paths you provide to the -i and -b parameters contain no volume or directory names with spaces or other whitespace characters. If there is a space, the installation fails.
  • --install-dir or -i – This option specifies the directory to which all files are copied.
    The default value is /usr/local/aws-cli.
  • --bin-dir or -b – This option specifies that the main aws program is symbolically linked to the file aws in the specified path. You need write permissions to the specified directory. Creating a symlink to a directory that is already in your path eliminates the need to add the install directory to the $PATH variable.
    The default value is /usr/local/bin.
$ ./aws/install -i /usr/local/aws-cli -b /usr/local/bin

Confirm the installation with the following command.

$ aws --version

aws-cli/2.4.6 Python/3.8.8 Linux/4.18.0-348.2.1.el8_5.x86_64 exe/x86_64.centos.8 prompt/off

If the aws command is still not found, restart your terminal so that your shell picks up the updated PATH.
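The "not found" symptom usually means the symlink directory is missing from PATH, or the shell has cached an old command lookup. A small sketch (the on_path helper is hypothetical, not part of the AWS CLI) to check the PATH side:

```shell
#!/bin/sh
# Check whether a given directory is on PATH; /usr/local/bin is where
# the installer links the aws executable by default.
on_path() {
  case ":$PATH:" in
    *":$1:"*) echo "$1 is on PATH" ;;
    *)        echo "$1 is NOT on PATH" ;;
  esac
}

PATH="/usr/local/bin:$PATH"   # ensure it for the demo
CHECK_OK="$(on_path /usr/local/bin)"
CHECK_MISS="$(on_path /not/a/real/dir)"
echo "$CHECK_OK"
echo "$CHECK_MISS"
```

In bash, running `hash -r` clears the cached command lookups, which is what restarting the terminal effectively does.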

 2. Configuring the AWS CLI

There are a few methods to configure AWS CLI, the most common of which are:

Quick configuration with aws configure 

You can generate an access key ID and a secret access key from the AWS Console -> IAM service. The purpose of the keys is to authenticate you to your AWS account and to authorize what you can do according to your permission level.

The configuration is stored in ~/.aws/credentials in a profile named [default]. You can also use this command to change or update existing values.

$ aws configure
AWS Access Key ID [None]: EXAMple/AcceSSKEY/ID
AWS Secret Access Key [None]: EXAMpleseCRET/AcceSSKEY
Default region name [None]: us-west-1
Default output format [None]: json

The default output format is JSON. However, there are several other formats that you can use – YAML, text, or table. 

You can also import the key pair from the .csv file downloaded after generating them.

$ aws configure import --csv file://credentials.csv
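For reference, the .csv file from the IAM console carries a header row like the one below, and aws configure import names the new profile after the User Name column. The sketch recreates that layout with the placeholder keys used above; the import command itself is left commented out because it needs the AWS CLI installed:

```shell
#!/bin/sh
# Recreate the IAM console's CSV layout in a temp file.
CSV_FILE="$(mktemp)"
cat > "$CSV_FILE" <<'EOF'
User Name,Access key ID,Secret access key
named_profile,EXAMple/AcceSSKEY/ID,EXAMpleseCRET/AcceSSKEY
EOF

CSV_HEADER="$(head -n 1 "$CSV_FILE")"
echo "$CSV_HEADER"

# With the AWS CLI installed, this would create profile "named_profile":
# aws configure import --csv "file://$CSV_FILE"
```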

AWS CLI stores the sensitive credential information in the local file credentials. The file config stores less sensitive configuration options. The default location for both files is the home directory ~/.aws.  

~/.aws/credentials

[default]
aws_access_key_id=EXAMple/AcceSSKEY/ID
aws_secret_access_key=EXAMpleseCRET/AcceSSKEY

~/.aws/config

[profile named_profile]
region=us-west-1
output=json

Usually, settings and credentials are sorted into named profiles. When you specify a profile to run a CLI command, that profile's settings and credentials are used to run the command, as shown below:

$ aws ec2 describe-instances --profile named_profile

Another way to specify a profile is with the AWS_PROFILE environment variable, which overrides the default profile for commands run in the current session. The AWS CLI supports multiple named profiles stored in the config and credentials files. This is a very useful feature when you run an AWS Organization and have different profiles for different environments, such as dev, stage, etc.

Additional profiles can be added with the aws configure --profile option, or by adding entries to the config and credentials files.

As the AWS CLI can read credentials from the config file, profile settings can be stored in a single file. If a profile with the same name exists in both files, the keys in the credentials file take precedence.
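A minimal sketch of that two-file layout with one extra named profile (the dev profile and every value in it are made up; a temp directory stands in for ~/.aws so the script is safe to run anywhere). Note that in the config file, unlike the credentials file, named profiles use a profile prefix in the section header:

```shell
#!/bin/sh
AWS_DIR="$(mktemp -d)"   # stand-in for ~/.aws

# Sensitive values: one section per profile, no "profile" prefix.
cat > "$AWS_DIR/credentials" <<'EOF'
[default]
aws_access_key_id=EXAMple/AcceSSKEY/ID
aws_secret_access_key=EXAMpleseCRET/AcceSSKEY

[dev]
aws_access_key_id=EXAMple/DEV/KEY/ID
aws_secret_access_key=EXAMple/DEV/seCRET
EOF

# Less sensitive settings: named profiles get a "profile" prefix here.
cat > "$AWS_DIR/config" <<'EOF'
[default]
region=us-west-1
output=json

[profile dev]
region=eu-central-1
output=yaml
EOF

# List the profiles defined in the credentials file.
PROFILES="$(grep '^\[' "$AWS_DIR/credentials" | tr -d '[]' | xargs)"
echo "$PROFILES"   # default dev
```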

AWS Single Sign-On

Automatically create a profile that uses SSO by using the command 

$ aws configure sso

or manually, by updating ~/.aws/config.

The result should be similar to the below example:

[profile my-named-profile] 
sso_start_url = https://my-sso-portal.awsapps.com/start 
sso_region = us-east-1
sso_account_id = 12345678910
sso_role_name = readOnly
region = us-west-2
output = json

After you create the SSO profile, you can invoke it to request temporary credentials from AWS; you must retrieve them before you can run an AWS CLI service command. To get the temporary credentials, run the following command in the terminal:

$ aws sso login --profile my-named-profile

The AWS CLI will open your default browser and verify your AWS SSO login. You will see the following output:

Attempting to automatically open the SSO authorization page in your default browser. If the browser does not open or you wish to use a different device to authorize this request, open the following URL:

https://device.sso.eu-central-1.amazonaws.com/

Then enter the code:

GJMX-DJRH

Successfully logged into Start URL: https://example.awsapps.com/start#/

Environment variables

Most of the time, this method is used as another way to specify configuration options and credentials. It can be useful for scripting, or for temporarily setting a named profile as the default.

Rule of precedence: if an option is specified with an environment variable, it overrides any value from the config file. If an option is specified as a parameter on the AWS CLI command line, it overrides any value from either the corresponding environment variable or a profile in the config file.

Setting up environment variables

Linux or macOS

$ export AWS_ACCESS_KEY_ID=EXAMple/AcceSSKEY/ID 
$ export AWS_SECRET_ACCESS_KEY=EXAMpleseCRET/AcceSSKEY
$ export AWS_DEFAULT_REGION=us-west-2
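One handy scripting pattern: a variable assigned inline applies only to that single command, so you can run a one-off command against another profile (for example, AWS_PROFILE=dev aws s3 ls) without changing your session. The sketch below demonstrates the scoping with a stand-in command instead of aws, so it runs without credentials:

```shell
#!/bin/sh
export AWS_PROFILE=default

# The inline assignment is visible only to the child command:
ONE_OFF="$(AWS_PROFILE=dev sh -c 'echo "$AWS_PROFILE"')"

echo "one-off command saw: $ONE_OFF"      # dev
echo "shell still has:     $AWS_PROFILE"  # default
```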

For further information on how to set variables and a list of supported environment variables, you can check the official AWS CLI documentation here.

  • Useful commands:

$ aws configure
$ aws configure set --profile
$ aws configure list-profiles
$ aws configure list
$ aws configure import
$ aws configure get
$ aws sso login --profile
$ aws sso logout

There are many more options in the official AWS command reference guide here.

 3. Using the AWS CLI

There are many tips and tricks that can be very useful while working with the AWS CLI.

Below are some basics that can ease your everyday job:

Command structure  

$ aws [options] <command> <subcommand> [parameters]
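To make the structure concrete, here is a hypothetical call taken apart piece by piece (dev is an assumed profile name; no AWS call is actually made):

```shell
#!/bin/sh
# The full call: aws --profile dev ec2 describe-instances --output table
set -- aws --profile dev ec2 describe-instances --output table

PROGRAM="$1"
OPTIONS="$2 $3"
COMMAND="$4"
SUBCOMMAND="$5"
PARAMETERS="$6 $7"

echo "program:    $PROGRAM"      # aws
echo "options:    $OPTIONS"      # --profile dev      (global [options])
echo "command:    $COMMAND"      # ec2                (the service)
echo "subcommand: $SUBCOMMAND"   # describe-instances (the operation)
echo "parameters: $PARAMETERS"   # --output table
```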

Getting help

You can get help for any command by appending help to it. At the top level:

$ aws help

This also works per service and per operation, for example aws ec2 help or aws ec2 describe-instances help. The help includes command descriptions, available filters, available commands, and output review, as well as useful examples of common command usage.

Enable command completion 

With the following set of commands, you can configure command completion to run every time you open a new shell:

[root@localhost ~]# which aws_completer
/usr/local/bin/aws_completer
[root@localhost ~]# echo 'export PATH=/usr/local/bin:$PATH' >> ~/.bash_profile
[root@localhost ~]# echo "complete -C '/usr/local/bin/aws_completer' aws" >> ~/.bashrc
[root@localhost ~]# source ~/.bash_profile
[root@localhost ~]# source ~/.bashrc

Using templates 

By adding the parameter --generate-cli-skeleton, you can generate and display a template that can be updated and customized to your needs, and later applied as input to a command.

The generated template output includes all of the parameters that the command supports.

The --generate-cli-skeleton parameter can accept one of the following values:

  • input – The template includes all input parameters formatted as JSON (default value).
  • yaml-input – The template includes all input parameters formatted as YAML.
  • output – The template includes all output parameters formatted as JSON. You cannot currently request the output parameters as YAML.

You can redirect the skeleton output to a file, then remove all the unnecessary parameters and update the values:

$ aws ec2 run-instances --generate-cli-skeleton input > ec2runinst.json

For example:

JSON format:

{
    "DryRun": true,
    "ImageId": "",
    "KeyName": "",
    "SecurityGroups": [
        ""
    ],
    "InstanceType": "",
    "Monitoring": {
        "Enabled": true
    }
}

YAML format:

DryRun: false
ImageId: 'ami-abc19ef'
KeyName: 'examplekey'
SecurityGroups:
- 'example-sg'
InstanceType: 't3.small'
Monitoring:
  Enabled: true

After that you can run the following command:

$ aws ec2 run-instances --cli-input-json file://ec2runinst.json --output json 

or

$ aws ec2 run-instances --cli-input-yaml file://ec2runinst.yaml --output yaml 

Using Amazon ECS Exec 

A recently added capability of the AWS CLI is access to containers running on AWS Fargate and Amazon EC2. This means that users can now run an interactive shell, or execute a single command, against a container.

Prerequisites for ECS Exec: make sure you have an IAM user, role, and policy that allow the proper access. ECS Exec leverages SSM Session Manager to create a secure channel between the user's device and the target container.

Example policy for a user with an IAM role and policy attached:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "ssmmessages:CreateControlChannel",
                "ssmmessages:CreateDataChannel",
                "ssmmessages:OpenControlChannel",
                "ssmmessages:OpenDataChannel"
            ],
            "Resource": "*"
        }
    ]
}

With the following commands, you can invoke an interactive shell or run a single command:

$ aws ecs execute-command \
    --region $AWS_REGION \
    --cluster ecs-exec-example-cluster \
    --task abcdef123499876gtr1234 \
    --container <name> \
    --command "/bin/bash" \
    --interactive

--container <name> is optional if the task has only one container.

$ aws ecs execute-command \
    --region $AWS_REGION \
    --cluster ecs-exec-example-cluster \
    --task abcdef123499876gtr1234 \
    --container <name> \
    --command "ls" \
    --interactive

You can monitor the logs in CloudTrail and CloudWatch.

 4. Security in the AWS CLI

It is very important that your AWS CLI configuration meets your security and compliance objectives. In turn, the AWS CLI can help you monitor and secure AWS resources.

Data protection 

The best practice for data protection is to follow the principle of least privilege: users receive only the amount of access required to complete their duties.
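As an illustration of least privilege, a policy like the following hypothetical example (the bucket name is made up) grants read-only access to a single S3 bucket and nothing else:

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:ListBucket",
                "s3:GetObject"
            ],
            "Resource": [
                "arn:aws:s3:::example-bucket",
                "arn:aws:s3:::example-bucket/*"
            ]
        }
    ]
}
```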

Other very important ways of securing your data are as follows:

  • Use MFA (Multi-Factor Authentication) with each account.
  • Use TLS 1.2 or later for communication with AWS resources.
  • Set up CloudTrail activity logging.
  • Use encryption whenever possible.
  • Use roles with temporary credentials.

By default, all data transmitted between the AWS CLI client and AWS services is encrypted.

 5. Troubleshooting 

When the AWS CLI reports an error, the first thing to do is gather as much information as possible. One of the most useful techniques is to run the failing command again with the --debug option at the end of the command line. The AWS CLI then prints details of every step it takes while executing the command.

Other commonly encountered errors and their possible causes are already documented; for further information on the subject, see the troubleshooting section of the official AWS CLI documentation.

 6. Summary

We went through the fundamentals of the AWS CLI: how to install it and what important considerations to keep in mind while using it. It is fast and very convenient for much everyday DevOps work. AWS is constantly adding new features, so it is a good idea to start using the CLI in your daily job.

There is much more to say about the capabilities of the AWS CLI. For further information and technical details, see the official AWS CLI documentation and command reference.

If you'd like to learn more about the DevOps world from professionals, check out the ITGix Blog. You will find a variety of subjects and interesting real-life cases documented by our experienced DevOps engineers.
