How to Create an AWS S3 Bucket Using Python boto3?

Boto3 is the Amazon Web Services (AWS) Software Development Kit (SDK) for Python, which allows Python developers to write software that makes use of services like Amazon S3 (Simple Storage Service) and Amazon EC2 (Elastic Compute Cloud). boto3 is the name of the Python module that provides access to AWS services in a Pythonic way.

Amazon S3 is a service offered by AWS that provides object storage through a web service interface. S3 uses the concept of buckets (which can be thought of as top-level folders) and objects (which are essentially files within these buckets). S3 is widely used for backup and storage, serving web content, hosting static websites, and more.

Prerequisites

  • You can install boto3 using pip, which is the package installer for Python. The command to install it is pip install boto3.
  • Before you can start using boto3 to interact with AWS services, you need to set up authentication credentials. This typically involves creating an IAM (Identity and Access Management) user in your AWS account and configuring access keys on your local machine, either through environment variables, a configuration file, or directly in your code, as in the sketch just below.
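
For example, here is a minimal sketch of supplying credentials explicitly through a boto3 session. The key values and region shown are placeholders; in practice you would normally let boto3 pick up credentials from environment variables or from the ~/.aws/credentials file written by aws configure rather than hard-coding them:

    import boto3

    # Placeholder credentials for illustration only -- never hard-code real keys.
    # If these arguments are omitted, boto3 falls back to environment variables
    # (AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY) or ~/.aws/credentials.
    session = boto3.session.Session(
        aws_access_key_id="YOUR_ACCESS_KEY_ID",
        aws_secret_access_key="YOUR_SECRET_ACCESS_KEY",
        region_name="us-east-1",  # assumed region; change to your own
    )
    s3 = session.resource("s3")
    print(s3.meta.client.meta.region_name)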

How Python boto3 Connects with the AWS Cloud

S3 Client and Resource: In boto3, there are two ways to interact with AWS services: client and resource. The client provides a low-level service interface, while the resource provides a higher-level, object-oriented interface. For S3, you can create a client using boto3.client('s3') or a resource using boto3.resource('s3').
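
As a minimal sketch (assuming credentials are already configured), the following creates both interfaces and lists your buckets with each; list_buckets() and buckets.all() are standard boto3 calls:

    import boto3

    # Low-level client: methods map closely to the S3 REST API and return dicts.
    s3_client = boto3.client("s3")
    response = s3_client.list_buckets()
    print([bucket["Name"] for bucket in response["Buckets"]])

    # Higher-level resource: buckets and objects are plain Python objects.
    s3_resource = boto3.resource("s3")
    for bucket in s3_resource.buckets.all():
        print(bucket.name)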

With boto3, you can perform a variety of operations on S3, such as creating and deleting buckets, uploading and downloading objects, listing objects within a bucket, and managing object and bucket permissions.
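
For instance, a few of those operations through the client interface might look like the sketch below; the bucket name my-example-bucket and the file names are placeholders you would replace with your own:

    import boto3

    s3_client = boto3.client("s3")

    # Upload a local file as an object under the demo/ prefix.
    s3_client.upload_file("local-file.txt", "my-example-bucket", "demo/remote-file.txt")

    # List the keys of the objects currently in the bucket.
    listing = s3_client.list_objects_v2(Bucket="my-example-bucket")
    for obj in listing.get("Contents", []):
        print(obj["Key"])

    # Download the object back to a local path.
    s3_client.download_file("my-example-bucket", "demo/remote-file.txt", "downloaded-copy.txt")

The full script in the next section performs the same kinds of operations through the higher-level resource interface.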

Creating an AWS S3 Bucket Using the Python boto3 Library

In this section I will show you how to perform the following actions on an AWS S3 bucket using the boto3 Python library.

The provided Python script is a demonstration of how to interact with Amazon S3 using the boto3 library. It performs the following actions:

1. Creates a new S3 bucket with a unique name using uuid.uuid4().
2. Prompts the user to enter the name of a file to upload to the newly created bucket.
3. Uploads the specified file to the bucket.
4. Asks the user if they want to download the uploaded object into memory and, if so, downloads the first 20 bytes and prints them.
5. Asks the user if they want to copy the uploaded object to a subfolder within the same bucket and, if so, performs the copy operation.
6. Lists all the objects in the bucket and prints their keys.
7. Asks the user if they want to delete all objects in the bucket and the bucket itself and, if so, performs the deletion.
8. Prints a farewell message and ends the demonstration.

The script includes error handling for various operations, such as bucket creation, file upload, object download, object copy, and bucket deletion. It uses ClientError to catch exceptions related to AWS client operations and S3UploadFailedError for issues during file upload.

To run this script, you need to have the boto3 library installed and configured with AWS credentials that have the necessary permissions to perform S3 operations. The script is designed to be run from the command line and interacts with the user through input prompts.

Please note that running this script will incur costs on your AWS account for the S3 services used, and you should ensure that you have the appropriate permissions and understand the cost implications before executing the script.

    '''
    Create a bucket and upload a file to it.
    Download an object from a bucket.
    Copy an object to a subfolder in a bucket.
    List the objects in a bucket.
    Delete the bucket objects and the bucket.
    
    '''
    
    import io  # in-memory byte buffers, used below to download an object without touching disk
    import uuid
    import os
    import boto3
    from boto3.s3.transfer import S3UploadFailedError
    from botocore.exceptions import ClientError
    
    
    def my_function(s3_resource):
        print("Amazon S3 demo starting now")
    
      
        # uuid4() generates a random UUID so the bucket name is globally unique.
        bucket_name = f'my-bucket-{uuid.uuid4()}'
        bucket = s3_resource.Bucket(bucket_name)
    
        try:   # Creating the AWS S3 bucket
            # Note: when the configured region is us-east-1, CreateBucketConfiguration
            # must be omitted entirely; this call assumes a non-default region.
            bucket.create(
                CreateBucketConfiguration={
                    "LocationConstraint": s3_resource.meta.client.meta.region_name
                }
            )
            print(f"Created demo bucket named {bucket.name}.")    
        
        except ClientError as err:  #  Error if Bucket is not created
            print(f"Tried and failed to create demo bucket {bucket_name}.")
            print(f"\t{err.response['Error']['Code']}:{err.response['Error']['Message']}")
            print(f"\nCan't continue the demo without a bucket!")
            return
        
        file_name = None  # Checking the files Locally to upload in the AWS S3 Bucket
        while file_name is None:
            file_name = input("Enter the name of a file to upload: ")
            if not os.path.exists(file_name):
                print(f"Couldn't find file {file_name}. Are you sure it exists?")
                file_name = None
        
        obj = bucket.Object(os.path.basename(file_name))  # Uploading the file into the AWS S3 bucket
        try:
            obj.upload_file(file_name)
            print(f"Uploaded file {file_name} into bucket {bucket.name} with key {obj.key}.")
        except S3UploadFailedError as err:
            print(f"Couldn't upload file {file_name} to {bucket.name}.")
            print(f"\t{err}")
        
        download = input("Do you want to download the object into memory (y/n)? ")  # Downloading the object from the AWS S3 bucket
        if download.lower() == "y":
            data = io.BytesIO()  # In-memory buffer that acts as a file-like object for the download.
            try:
                obj.download_fileobj(data)
                data.seek(0)  # Move back to the start of the buffer before reading.
                print("Got your object. Here are the first 20 bytes:\n")
                print(f"\t{data.read(20)}")
            except ClientError as err:
                print(f"Couldn't download {obj.key}.")
                print(f"\t{err.response['Error']['Code']}:{err.response['Error']['Message']}")
        
        copy = input("Do you want to copy the object to a subfolder in the same bucket (y/n)? ")  # Copying the object within the AWS S3 bucket
        if copy.lower() == "y":
            dest_obj = bucket.Object(f"demo-folder/{obj.key}")  # Destination key under the demo-folder/ prefix
            try:
                dest_obj.copy({"Bucket": bucket.name, "Key": obj.key})
                print(f"Copied {obj.key} to {dest_obj.key}.")
            except ClientError as err:
                print(f"Couldn't copy {obj.key} to {dest_obj.key}.")
                print(f"\t{err.response['Error']['Code']}:{err.response['Error']['Message']}")
        
        print("Listing the objects in the bucket")
        try:
            for i in bucket.objects.all():
                print(f"\t{i.key}")
        except ClientError as err:
            print(f"Couldn't list the objects in bucket {bucket.name}.")
            print(f"\t{err.response['Error']['Code']}:{err.response['Error']['Message']}")
    
        print("Deleting the objects in the bucket & finally the bucket")
        try:
                bucket.objects.delete()
                bucket.delete()
                print(f"Emptied and deleted bucket {bucket.name}.\n")
        except ClientError as err:
                print(f"Couldn't empty and delete bucket {bucket.name}.")        
    
    if __name__ == "__main__":
        my_function(boto3.resource("s3"))
    

Conclusion