How to Launch an AWS S3 Bucket Using Shell Scripting

We all need a place to store data such as deployment scripts and deployment packages, and we also need space to host a website. In earlier days data lived on physical servers, copying it took a lot of time, and those servers were neither scalable nor fault tolerant. If there was an issue such as a server going down or getting corrupted, data was either lost or the application stayed down for long hours.

To solve this storage problem, Amazon Web Services provides AWS S3, a service that offers practically unlimited capacity, scales easily, and is fault tolerant.

In this tutorial we will demo how to launch an AWS S3 bucket in an Amazon account using Bash or shell scripting.

Table of Contents

  1. What is Shell Scripting or Bash Scripting?
  2. What is an Amazon AWS S3 bucket?
  3. Prerequisites
  4. Install AWS CLI Version 2 on a Windows machine
  5. How to launch or create an AWS S3 bucket in an Amazon account using a shell script
  6. Conclusion

What is Shell Scripting or Bash Scripting?

A shell script is simply a text file containing a list of commands that could also be run one by one on a terminal or shell. To make things a little easier, and to run the commands together as a group and in quick time, we write them in a single file and execute that file.

The main tasks performed by shell scripts are file manipulation, printing text, and program execution. We can include environment variables in a script that are used in multiple places; scripts that mainly set up an environment and then run other programs are known as wrapper scripts.

A good shell script will have comments, preceded by a pound sign or hash mark (#), describing the steps. We can also include conditions or pipe commands together to make more creative scripts.

When we execute a shell script or function, a command interpreter goes through the text line by line, loop by loop, test by test, and executes each statement as it is reached, from top to bottom.
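
To make this concrete, here is a minimal sketch of a shell script (a generic illustration, not part of this tutorial's demo) showing a comment, a variable, a condition, and a pipe:

#!/bin/bash
# Greet the current user and count the files in the current directory

name="$USER"                  # use an environment variable inside the script
echo "Hello, $name"

if [ -d /tmp ]; then          # a simple condition
    echo "/tmp exists"
fi

ls | wc -l                    # a pipe: count entries in the current directory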

What is an Amazon AWS S3 bucket?

AWS S3, why is it called S3? The name comes from three words that start with the letter “S”: the full form of AWS S3 is Simple Storage Service. AWS S3 helps you store an unlimited amount of data very safely and efficiently. The architecture of AWS S3 is very simple: everything stored in AWS S3 is an object, such as a PDF file, ZIP file, text file, or WAR file, and every object resides in a bucket.

AWS S3 Service  ➡️ Bucket  ➡️ Objects  ➡️ PDF , HTML DOCS, WAR , ZIP FILES etc.
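
As a quick illustration of the bucket and object model above, this is roughly how a file becomes an object in a bucket with the AWS CLI (my-demo-bucket and report.pdf are placeholder names used only for this example):

aws s3 cp report.pdf s3://my-demo-bucket/report.pdf    # upload: the file becomes an object in the bucket
aws s3 ls s3://my-demo-bucket/                         # list the objects stored in the bucket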

Some of the features of AWS S3 bucket are:

  • To store data in a bucket, you need to upload it.
  • To keep your bucket more secure, grant only the necessary permissions to an IAM role or IAM user.
  • Bucket names are globally unique, which means a given name can be used by only one bucket across all accounts and regions.
  • You can create 100 buckets in an AWS account by default; beyond that you need to raise a limit-increase request with Amazon.
  • The owner of a bucket is the AWS account that created it.
  • Buckets are created in a specific region, such as us-east-1, us-east-2, us-west-1, or us-west-2.
  • Bucket objects are accessed in AWS S3 using the AWS S3 API.
  • Buckets can be made publicly visible, which means anybody on the internet can access them, so it is always recommended to keep public access blocked for all buckets unless it is truly required.

Prerequisites

  1. An AWS account in which to create the S3 bucket. If you don’t have one, please create an AWS account first.
  2. A Windows 7 or later machine where you will execute the shell script.
  3. Python must be installed on the Windows machine, as it is required by the AWS CLI. If you need to install Python on the Windows machine, follow the steps here.
  4. Git Bash must already be installed on your Windows machine. If you don’t have it, install it from here.
  5. A code editor for writing the shell script on the Windows machine. I recommend Visual Studio Code; if you wish to install it, please find the steps here.

In this demo we will use a shell script to launch an AWS S3 bucket. To run shell scripts against AWS from your local Windows machine, you need the AWS CLI installed and configured, so let's first install the AWS CLI and then configure it.

Install AWS CLI Version 2 on a Windows machine

  • Download the installer for AWS CLI version 2 on the Windows machine from here
  • Select I accept the terms and then click the Next button
  • Customize the setup if needed, such as the installation location, and then click the Next button
  • Now you are ready to install AWS CLI version 2
  • Click Finish and then verify the AWS CLI
  • Verify the AWS CLI version by going to the command prompt and typing:
aws --version
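  • The output should look something like the following (the exact versions will differ on your machine):
aws-cli/2.4.7 Python/3.8.8 Windows/10 exe/AMD64 prompt/off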

Now that AWS CLI version 2 is successfully installed on the Windows machine, it's time to configure AWS credentials so that our shell script can connect to the AWS account and execute commands.

  • Configure AWS credentials by running the following command in the command prompt
aws configure
  • Enter the details such as the AWS Access Key ID, Secret Access Key, and region. You can leave the output format at its default.
  • Check the C:\Users\YOUR_USER\.aws folder on your system to confirm the AWS credentials were saved.
  • Now your AWS credentials are configured successfully.
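  • After running aws configure, the two files in that folder should look roughly like this (the values below are placeholders, not real credentials):
# C:\Users\YOUR_USER\.aws\credentials
[default]
aws_access_key_id = AKIAXXXXXXXXXXXXXXXX
aws_secret_access_key = xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx

# C:\Users\YOUR_USER\.aws\config
[default]
region = us-east-2
output = json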

How to launch or create AWS S3 bucket in Amazon account using shell script

Now that we have configured the AWS CLI on the Windows machine, it's time to write the shell script that creates the AWS S3 bucket.

  • Create a folder on your desktop and under that create a file named create-s3.sh with the following content
#!/bin/bash
# This Script will create S3 bucket and tag the bucket with appropriate name.

# Check whether an AWS access key is set up on your system

if ! grep -q aws_access_key_id ~/.aws/config; then
   if ! grep -q aws_access_key_id ~/.aws/credentials; then
      echo "AWS config not found or you don't have AWS CLI installed"
      exit 1
   fi
fi

# read will prompt you to enter the name of the bucket you wish to create


read -r -p "Enter the name of the bucket: " bucketname

# Creating first function to create a bucket 

function createbucket() {
    aws s3api create-bucket --bucket "$bucketname" --region us-east-2 --create-bucket-configuration LocationConstraint=us-east-2
}


# Creating Second function to tag a bucket 

function tagbucket() {
    aws s3api put-bucket-tagging --bucket "$bucketname" --tagging "TagSet=[{Key=Name,Value=$bucketname}]"
}


# echo command will print on the screen 

echo "Creating the AWS S3 bucket and Tagging it !! "
echo ""
createbucket    # Calling the createbucket function  
tagbucket       # calling our tagbucket function
echo "AWS S3 bucket $bucketname created successfully"
echo "AWS S3 bucket $bucketname tagged successfully "
  • Now open Visual Studio Code, open the folder containing create-s3.sh, and choose Bash as the terminal
  • Now run the script
./create-s3.sh
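  • If Git Bash reports a permission error, first make the script executable (a generic Bash step, not specific to this script) and run it again
chmod +x create-s3.sh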
  • The script ran successfully; now let's verify the AWS S3 bucket by going to the AWS account.
  • Click on the bucket name testing-s3buck2 and then click on Properties

  • Great, we can see that the tagging was also done successfully.
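  • If you prefer to verify from the command line instead of the AWS console, the following commands (using the same example bucket name) should show the bucket and its tag
aws s3 ls | grep testing-s3buck2
aws s3api get-bucket-tagging --bucket testing-s3buck2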

Conclusion

In this tutorial, we demonstrated some benefits of Amazon AWS S3 and learnt how to set up an AWS S3 bucket using a shell script, step by step. A huge amount of application and website data is stored on AWS S3, and it is also a popular choice for hosting static websites.

Hope this tutorial helps you understand shell scripting and provisioning AWS S3 on the Amazon cloud. Please share it with your friends.

How to Launch an AWS S3 Bucket on Amazon Using Terraform

Do you have issues with log rotation, does your system hang when lots of logs are generated on the disk, or do you have too little space to keep your important deployment JARs or WARs? These are all challenges everyone has faced while working with datacenter applications or with low-capacity VMs.

Now is the right time to move your logs, deployment code, and scripts to storage that is unlimited, safe, secure, and quick, and this is very much possible with Amazon's AWS S3. So in this tutorial we will go through what AWS S3 is, its features, and how to launch an S3 bucket using Terraform.

Table of Contents

  1. What is an Amazon AWS S3 bucket?
  2. Prerequisites
  3. How to Install Terraform on Ubuntu 18.04 LTS
  4. Terraform Configuration Files and Structure
  5. Launch AWS S3 bucket on AWS using Terraform
  6. Upload an object to the AWS S3 bucket
  7. Conclusion

What is an Amazon AWS S3 bucket?

AWS S3 and its key features were already covered in the “What is an Amazon AWS S3 bucket?” section earlier in this post, so we won't repeat them here. In short: everything stored in S3 is an object (PDF, ZIP, text, WAR files, and so on), objects live in buckets, bucket names are globally unique, and buckets are created in a specific region.

Prerequisites

  • An Ubuntu machine to run Terraform, preferably version 18.04 or later. If you don’t have a machine, you can create an EC2 instance in your AWS account.
  • At least 4 GB of RAM is recommended.
  • At least 5 GB of drive space.
  • The Ubuntu machine should have an IAM role attached with full access to create AWS S3 buckets, or, even better for demos, administrator permissions.
  • If you wish to create the bucket manually, click here for the setup instructions, but we will use Terraform for this demo.

You may incur a small charge for creating an EC2 instance on Amazon Web Services.

How to Install Terraform on Ubuntu 18.04 LTS

  • Update your existing system packages.
sudo apt update
  • Download Terraform (this demo uses version 0.14.8) into the /opt directory
wget https://releases.hashicorp.com/terraform/0.14.8/terraform_0.14.8_linux_amd64.zip
  • Install the zip package, which will be required to unzip the archive
sudo apt-get install zip -y
  • Unzip the downloaded Terraform zip file
unzip terraform*.zip
  • Move the Terraform executable to a directory on your PATH
sudo mv terraform /usr/local/bin
  • Verify Terraform by running the terraform command and checking its version
terraform               # To check if terraform is installed 

terraform -version      # To check the terraform version  
  • This confirms that Terraform has been successfully installed on the Ubuntu 18.04 machine.

Terraform Configuration Files and Structure

Let us first understand terraform configuration files before running Terraform commands.

  • main.tf: This file contains the code that creates or imports AWS resources.
  • vars.tf: This file declares the variable types and optionally sets default values.
  • output.tf: This file defines the outputs of AWS resources; the output is displayed after the terraform apply command is executed.
  • terraform.tfvars: This file contains the actual values of the variables declared in vars.tf.
  • provider.tf: This file is very important. You provide the details of the provider, such as AWS, Oracle, or Google, so that Terraform can communicate with that provider and work with its resources.

Launch AWS S3 bucket on AWS using Terraform

Now we will create all the configuration files required to create an S3 bucket in the AWS account.

  • Create a folder in the /opt directory and name it terraform-s3-demo
mkdir /opt/terraform-s3-demo
cd /opt/terraform-s3-demo
  • Create a main.tf file under the terraform-s3-demo folder and paste in the content below.
# Bucket Access

resource "aws_s3_bucket_public_access_block" "publicaccess" {
  bucket = aws_s3_bucket.demobucket.id
  block_public_acls       = false
  block_public_policy     = false
}

# Creating the encryption key which will encrypt the bucket objects

resource "aws_kms_key" "mykey" {
  deletion_window_in_days = "20"
}

# Creating the bucket

resource "aws_s3_bucket" "demobucket" {

  bucket          = var.bucket
  force_destroy   = var.force_destroy

  server_side_encryption_configuration {
    rule {
      apply_server_side_encryption_by_default {
        kms_master_key_id = aws_kms_key.mykey.arn
        sse_algorithm     = "aws:kms"
      }
    }
  }
  versioning {
    enabled               = true
  }
  lifecycle_rule {
    prefix  = "log/"
    enabled = true
    expiration {
      date = var.date
    }
  }
}
  • Create a vars.tf file under the terraform-s3-demo folder and paste in the content below
variable "bucket" {
 type = string
}
variable "force_destroy" {
 type = string
}
variable "date" {
 type = string
}
  • Create a provider.tf file under the terraform-s3-demo folder and paste in the content below.
provider "aws" {
  region = "us-east-2"
}
  • Create a terraform.tfvars file under the terraform-s3-demo folder and paste in the content below.
bucket          = "terraformdemobucket"
force_destroy   = false
date = "2022-01-12"
  • Now your files and code are ready for execution. Initialize Terraform:
terraform init
  • Terraform initialized successfully; now it's time to see the plan, which is a kind of blueprint before deployment. We generally use plan to confirm that the correct resources are going to be provisioned or deleted.
terraform plan
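  • The end of the plan output should include a summary roughly like the one below (this configuration creates three resources: the bucket, the KMS key, and the public access block)
Plan: 3 to add, 0 to change, 0 to destroy.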
  • After verification, it's time to actually deploy the code using apply. Type yes when Terraform prompts you to confirm.
terraform apply

The Terraform commands executed successfully, and you should now have an AWS S3 bucket launched in AWS. Verify it by navigating to your AWS account, searching for the AWS S3 service, and checking that the bucket has been created.
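
If the AWS CLI is installed on the same Ubuntu machine, you can also verify from the command line (assuming the bucket name terraformdemobucket from terraform.tfvars):

aws s3 ls | grep terraformdemobucket                              # the bucket exists
aws s3api get-bucket-versioning --bucket terraformdemobucket      # versioning is Enabled
aws s3api get-bucket-encryption --bucket terraformdemobucket      # aws:kms encryption is applied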

Upload an object to the AWS S3 bucket

All the files and folders inside an AWS S3 bucket are known as objects.

  • Now let us upload a sample text file to the bucket. Click on the terraformdemobucket bucket.
  • Next click on Upload and then Add files
  • Choose any file from your system; we used sample.txt
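  • Alternatively, the same upload can be done with the AWS CLI (assuming sample.txt is in your current directory)
aws s3 cp sample.txt s3://terraformdemobucket/
aws s3 ls s3://terraformdemobucket/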

Conclusion

In this tutorial, we demonstrated some benefits of Amazon AWS S3 and learnt how to set up an AWS S3 bucket using Terraform on AWS, step by step. A huge amount of application and website data is stored on AWS S3, and it is also a popular choice for hosting static websites.

Hope this tutorial helps you understand Terraform and provisioning AWS S3 on the Amazon cloud. Please share it with your friends.