AWS CLI Automation: Writing Shell Scripts for Cloud Management

Managing cloud resources can feel like a chore, especially when you’re stuck doing the same tasks repeatedly. But here’s a secret that’s not a secret: you can automate almost anything. For AWS, we have tools like the AWS CLI, CloudFormation, and various third-party tools that work well with AWS.

In this guide, I’ll show you how to write simple scripts to automate common tasks.

Why Should You Care?

Rather than repeatedly clicking through the AWS Management Console or typing the same commands, you can write a script once and automate the entire process. Here's why you should automate your workflow:

  1. Saves time by automating repetitive tasks so you can focus on more important work.

  2. Scripts do the same thing every time, reducing the chance of mistakes and ensuring consistency.

  3. Automating tasks is a skill that’ll make your life (and your team’s) much easier.

What Do You Need to Get Started?

Before we jump into scripting, let’s make sure you have:

  1. AWS CLI Installed: If you don’t have it yet, download and install it from the AWS CLI Download page.

  2. AWS Credentials Set Up: Run aws configure and enter your access key, secret key, and default region. Alternatively, you can set these values through environment variables (see the example after this list).

  3. A Text Editor: Use anything you’re comfortable with: VS Code, Nano, Vim, or even Notepad.

  4. Basic Shell Knowledge: If you know how to open a terminal and type commands, you’re good to go.
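
For step 2, here’s a quick sketch of both options (the key values and region below are placeholders, so substitute your own):

# Interactive setup: prompts for access key, secret key, default region, and output format
aws configure

# Alternative: export credentials as environment variables for the current shell session
export AWS_ACCESS_KEY_ID="<your-access-key>"
export AWS_SECRET_ACCESS_KEY="<your-secret-key>"
export AWS_DEFAULT_REGION="us-east-1"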

Creating Your First Script

Let’s start with something easy: a script to list all your EC2 instances.

Example 1: List All EC2 Instances

#!/bin/bash
aws ec2 describe-instances --query "Reservations[].Instances[].{InstanceID: InstanceId, State: State.Name}" --output table

What’s Happening Here?

  • aws ec2 describe-instances: This is the command to get details about your EC2 instances.

  • --query: This filters the output to show only the instance ID and its state (like “running” or “stopped”).

  • --output table: This makes the output look nice and organized in a table.

Save this script as list_instances.sh, make it executable with chmod +x list_instances.sh, and run it.
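
If you only care about instances in a particular state, the same command also accepts a --filters flag; a minimal variation that lists only running instances might look like this (change the filter value, for example to "stopped", as needed):

#!/bin/bash
# List only instances that are currently running
aws ec2 describe-instances \
  --filters "Name=instance-state-name,Values=running" \
  --query "Reservations[].Instances[].{InstanceID: InstanceId, State: State.Name}" \
  --output table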

Example 2: Start or Stop an EC2 Instance

Now let’s make things a bit more interactive. This script will ask you for an instance ID and whether you want to start or stop it.

#!/bin/bash
read -p "Enter the EC2 instance ID: " INSTANCE_ID

read -p "Do you want to start or stop the instance? (start/stop): " ACTION

if [[ "$ACTION" == "start" ]]; then
  echo "Starting instance $INSTANCE_ID..."
  aws ec2 start-instances --instance-ids $INSTANCE_ID
elif [[ "$ACTION" == "stop" ]]; then
  echo "Stopping instance $INSTANCE_ID..."
  aws ec2 stop-instances --instance-ids $INSTANCE_ID
else
  echo "Oops! Please type 'start' or 'stop'."
fi

How It Works:

  • The script asks for an instance ID and whether you want to start or stop it.

  • It then uses the aws ec2 start-instances or aws ec2 stop-instances command to do the job.

  • If you type something wrong, it’ll gently remind you to type “start” or “stop.”
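
One thing to keep in mind: the start and stop commands return as soon as the request is accepted, not when the instance finishes changing state. If you want the script to block until the instance is actually running or stopped, a minimal sketch using the AWS CLI’s wait subcommands (reusing the INSTANCE_ID and ACTION variables from the script above) could be appended to it:

# Optional: wait for the instance to finish changing state before the script exits
if [[ "$ACTION" == "start" ]]; then
  aws ec2 wait instance-running --instance-ids "$INSTANCE_ID"
  echo "Instance $INSTANCE_ID is now running."
elif [[ "$ACTION" == "stop" ]]; then
  aws ec2 wait instance-stopped --instance-ids "$INSTANCE_ID"
  echo "Instance $INSTANCE_ID is now stopped."
fi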

Example 3: Create an S3 Bucket

Now let’s create an S3 bucket using the AWS CLI:

#!/bin/bash

BUCKET_NAME="my-unique-bucket-name-548752"

aws s3 mb s3://$BUCKET_NAME
aws s3 cp samplefile.txt s3://$BUCKET_NAME/
aws s3 ls s3://$BUCKET_NAME

# To delete the objects created in the bucket
# aws s3 rm s3://$BUCKET_NAME --recursive

# To delete the bucket
# aws s3 rb s3://$BUCKET_NAME

What’s Going On Here?

  • BUCKET_NAME="my-unique-bucket-name-548752": Defines a variable BUCKET_NAME to store the name of the S3 bucket.

  • aws s3 mb s3://$BUCKET_NAME: Creates a new S3 bucket with the name stored in BUCKET_NAME.

    Note: Bucket names must be globally unique (see the sketch after this list for one way to generate a unique name).

  • aws s3 cp samplefile.txt s3://$BUCKET_NAME/: Uploads the local file samplefile.txt to the newly created S3 bucket.

  • aws s3 ls s3://$BUCKET_NAME: Lists the contents of the bucket to confirm the file (samplefile.txt) was uploaded.
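
Because bucket names must be globally unique, a hard-coded name like the one above will eventually collide with someone else’s. One common workaround, sketched below, is to generate part of the name at runtime (the prefix here is just a placeholder):

#!/bin/bash
# Sketch: build a bucket name that is very likely to be globally unique
# by appending a timestamp and a random number to a placeholder prefix
BUCKET_NAME="my-demo-bucket-$(date +%s)-$RANDOM"

aws s3 mb s3://$BUCKET_NAME
echo "Created bucket: $BUCKET_NAME"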

Tips for Writing Better Scripts

  • Use IAM Roles: Avoid embedding credentials in scripts; use IAM roles where possible.

  • Validate Inputs: Use conditional statements to verify user inputs.

  • Error Handling: Implement error handling with set -e and || exit 1 (see the sketch after this list).

  • Logging: Redirect output to log files for debugging.

  • Keep It Simple: Start with small scripts and build up as you get more comfortable.

  • Add Comments: Use # to add notes in your script so you (or others) can understand what it does later.

  • Test, Test, Test: Always test your scripts in a safe environment before using them on important resources.

  • Use Variables: Store things like instance IDs or bucket names in variables to make your scripts reusable.
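
To make the error handling, input validation, and logging tips concrete, here’s a small sketch (the log file name and usage pattern are just examples):

#!/bin/bash
# Sketch combining a few of the tips above: strict error handling,
# input validation, and logging. Names below are placeholders.
set -euo pipefail

LOG_FILE="automation.log"
INSTANCE_ID="${1:-}"   # first command-line argument

# Validate input before calling AWS
if [[ -z "$INSTANCE_ID" ]]; then
  echo "Usage: $0 <instance-id>" >&2
  exit 1
fi

# Redirect output to a log file for debugging; exit with a message on failure
aws ec2 describe-instances --instance-ids "$INSTANCE_ID" >> "$LOG_FILE" 2>&1 || {
  echo "Failed to describe $INSTANCE_ID; check $LOG_FILE" >&2
  exit 1
}

echo "Details for $INSTANCE_ID appended to $LOG_FILE"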

Automating AWS tasks with shell scripts is a game-changer. It saves time, reduces mistakes, and makes your life easier. Whether you're managing EC2 instances or creating S3 buckets, these scripts can handle the heavy lifting for you. Start small, and soon you'll be automating like a pro.

RESOURCES

Let’s Connect: Lewis Sawe - LinkedIn