What are AWS Kinesis Data Firehose and Kinesis Data Streams?

What is Kinesis Data Streams?

Amazon Kinesis Data Streams collects and processes large streams of data records, which are consumed by data-processing applications known as Kinesis Data Streams applications.

A Kinesis Data Streams application reads data from a data stream as data records. These applications can use the Kinesis Client Library, and they can run on Amazon EC2 instances.

You can send the processed records to dashboards, use them to generate alerts, dynamically change pricing and advertising strategies, or send data to a variety of other AWS services.

The producers continually push data to Kinesis Data Streams, and the consumers process the data in real time. Consumers (such as a custom application running on Amazon EC2 or an Amazon Kinesis Data Firehose delivery stream) can store their results using an AWS service such as Amazon DynamoDB, Amazon Redshift, or Amazon S3.
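As an illustrative sketch of the producer side, the snippet below shapes a record for the Kinesis PutRecord API; the stream name, payload, and region are hypothetical placeholders, and `send_record` assumes boto3 is installed and AWS credentials are configured:

```python
import json

def build_kinesis_record(stream_name, payload, partition_key):
    # Shape the arguments for the Kinesis PutRecord API:
    # Data must be bytes, and PartitionKey decides which shard receives the record.
    return {
        "StreamName": stream_name,
        "Data": json.dumps(payload).encode("utf-8"),
        "PartitionKey": partition_key,
    }

def send_record(record):
    # Assumes boto3 is installed and AWS credentials are configured.
    import boto3
    kinesis = boto3.client("kinesis", region_name="us-east-1")
    kinesis.put_record(**record)  # producers call this repeatedly to push data continually
```

A producer would call `send_record(build_kinesis_record("my-data-stream", {"price": 10.5}, "sensor-1"))` in a loop; consumers then read those records off the stream in real time.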

What is Kinesis Data Firehose?

  • Kinesis Data Firehose delivers real-time streaming data to Amazon S3, Amazon Redshift, Amazon OpenSearch Service, any HTTP endpoint, or third-party providers such as Splunk, Dynatrace, or Datadog.
  • With Kinesis Data Firehose, you configure producers such as EC2 instances, AWS WAF, or CloudWatch Logs to send data to the delivery stream, which automatically delivers the data to the destination.
  • For Amazon Redshift destinations, streaming data is delivered to your S3 bucket first. Kinesis Data Firehose then issues an Amazon Redshift COPY command to load data from your S3 bucket to your Amazon Redshift cluster. 
  • You can also configure Kinesis Data Firehose to transform your data before delivering it.
  • Kinesis Data Firehose supports Amazon S3 server-side encryption with AWS Key Management Service (AWS KMS) for encrypting delivered data in Amazon S3.
  • If data transformation is enabled, Kinesis Data Firehose can log the Lambda invocation, and send data delivery errors to CloudWatch Logs.
  • Kinesis Data Firehose uses IAM roles for all the permissions that the delivery stream needs such as access to various services, including your S3 bucket, AWS KMS key (if data encryption is enabled), and Lambda function (if data transformation is enabled)
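To make the producer side concrete, here is a minimal sketch of sending one record to a delivery stream with boto3; the delivery stream name and payload are hypothetical, and `deliver` assumes boto3 is installed and AWS credentials are configured:

```python
import json

def build_firehose_record(payload):
    # Firehose's put_record takes a Record dict whose Data field is bytes;
    # a trailing newline keeps records separated when Firehose batches them into S3.
    return {"Data": (json.dumps(payload) + "\n").encode("utf-8")}

def deliver(delivery_stream, record):
    # Assumes boto3 is installed and AWS credentials are configured.
    import boto3
    firehose = boto3.client("firehose", region_name="us-east-1")
    firehose.put_record(DeliveryStreamName=delivery_stream, Record=record)
```

For example, `deliver("my-delivery-stream", build_firehose_record({"event": "click"}))` would hand one record to Firehose, which then buffers and delivers it to the configured destination.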

Kinesis Data Firehose delivery stream

  • You create a delivery stream and then send your data to it.


  • A record is the data of interest that your data producer sends to a Kinesis Data Firehose delivery stream.


  • Producers send records to Kinesis Data Firehose delivery streams. 
  • You can also configure your Kinesis Data Firehose delivery stream to automatically read data from an existing Kinesis data stream and load it into destinations.
  • You can create a new data stream and then select Kinesis Data Streams as the source instead of Direct PUT.
  • Producers can also use the Amazon Kinesis Agent, a standalone Java software application that offers an easy way to collect and send data to Kinesis Data Firehose.
  • You can install the agent on Linux-based server environments such as web servers, log servers, and database servers. The agent can pre-process the records parsed from monitored files before sending them to your delivery stream.
sudo yum install -y aws-kinesis-agent
  • To configure the agent, open and edit its configuration file (as superuser if using default file access permissions).

sudo service aws-kinesis-agent start
  • The IAM role or AWS credentials that you specify must have permission to perform the Kinesis Data Firehose PutRecordBatch operation for the agent to send data to your delivery stream.
   "flows": [
            "filePattern": "/tmp/app.log*", 
            "deliveryStream": "yourdeliverystream"

    "flows": [
            "filePattern": "/tmp/app.log*", 
            "deliveryStream": "my-delivery-stream",
            "dataProcessingOptions": [
                    "optionName": "LOGTOJSON",
                    "logFormat": "COMMONAPACHELOG"

How to create Kinesis Firehose delivery Stream with Dynamic partitioning enabled

  • Navigate to Amazon Kinesis and click on Delivery streams.
  • Next, choose the S3 bucket as the destination. Here, we selected Dynamic partitioning as Not enabled.

Note: Dynamic partitioning enables you to create targeted data sets by partitioning streaming S3 data based on partitioning keys. You can partition your source data with inline parsing and/or the specified AWS Lambda function. You can enable dynamic partitioning only when you create a new delivery stream. You cannot enable dynamic partitioning for an existing delivery stream.

  • Set Amazon CloudWatch error logging to Enabled and also create a new IAM role.

Kinesis Data Firehose uses Amazon S3 to back up all of the data, or only the data that it failed to deliver, that it attempts to deliver to your chosen destination. You can specify the S3 backup settings in the following cases:

  • If you set Amazon S3 as the destination for your Kinesis Data Firehose delivery stream and you choose to specify an AWS Lambda function to transform data records, or if you choose to convert data record formats for your delivery stream.
  • If you set Amazon Redshift as the destination for your Kinesis Data Firehose delivery stream and you choose to specify an AWS Lambda function to transform data records.
  • If you set any of the following services as the destination for your Kinesis Data Firehose delivery stream: Amazon OpenSearch Service, Datadog, Dynatrace, HTTP Endpoint, LogicMonitor, MongoDB Cloud, New Relic, Splunk, or Sumo Logic.

When you send data from your data producers to your data stream, Kinesis Data Streams encrypts your data using an AWS Key Management Service (AWS KMS) key before storing the data at rest.

When your Kinesis Data Firehose delivery stream reads the data from your data stream, Kinesis Data Streams first decrypts the data and then sends it to Kinesis Data Firehose.

Writing to a Kinesis Data Firehose delivery stream using CloudWatch Events

  • On the CloudWatch page, click on Rules.
  • Then click Create rule and provide the Source and Target details.

Sending Amazon VPC Flow Logs to a Kinesis Data Firehose Delivery Stream (Splunk) using CloudWatch

  • In Amazon VPC, create a VPC flow log with a CloudWatch log group as the destination.
  • In the CloudWatch service, create that log group under Log groups.
  • Create a Kinesis Data Firehose delivery stream with Splunk as the destination.
  • Now, create a CloudWatch subscription filter, which will send all the CloudWatch logs to the delivery stream.
aws logs put-subscription-filter --log-group-name "VPCtoSplunkLogGroup" --filter-name "Destination" --filter-pattern "" --destination-arn "arn:aws:firehose:your-region:your-aws-account-id:deliverystream/VPCtoSplunkStream" --role-arn "arn:aws:iam::your-aws-account-id:role/VPCtoSplunkCWtoFHRole"
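The same subscription filter can be created from Python with boto3's `put_subscription_filter`; a hedged sketch where the helper names are ours, and the log group, destination ARN, and role ARN mirror the placeholders in the CLI command above:

```python
def subscription_filter_args(log_group, destination_arn, role_arn):
    # Mirror the flags of `aws logs put-subscription-filter`; an empty
    # filterPattern forwards every log event to the delivery stream.
    return {
        "logGroupName": log_group,
        "filterName": "Destination",
        "filterPattern": "",
        "destinationArn": destination_arn,
        "roleArn": role_arn,
    }

def create_subscription(args):
    # Assumes boto3 is installed and AWS credentials are configured.
    import boto3
    boto3.client("logs").put_subscription_filter(**args)
```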

How to Create EC2 instance using AWS boto3 ec2 client

If you are looking to provision an EC2 instance in the AWS cloud, there are many ways to do it, and one of the best is to invoke a simple Python script using Boto3.

In this tutorial we will create an AWS EC2 instance using Python.


This post will be a step-by-step tutorial. If you’d like to follow along, ensure you have the following in place:

  • An IAM user set up for programmatic access and assigned to the existing AmazonEC2FullAccess policy.
  • Python v3.6 or later installed on your local machine. This tutorial will be using Python v3.11 on a Windows 10 machine.

Creating AWS EC2 instance using Python boto3 client

To create the Python script on your Windows or Linux machine, create a file named main.py and copy/paste the code below. The code below:

  • Imports the boto3 library, which is used to connect to the AWS APIs.
  • The next line of code creates an EC2 client (ec2_client). Boto3 supports two types of interactions with AWS: resource or client levels. The client level provides low-level service access, while the resource level provides higher-level, more abstracted access. This tutorial will use client access.
  • Next, we use the client to launch an instance (ec2_client.run_instances) and store the response in the instances variable.
  • The final line of code prints the instance ID from the instances variable, which is of type dictionary.
import boto3

def create_instance():
    ec2_client = boto3.client("ec2", region_name="us-east-1")
    instances = ec2_client.run_instances(
        ImageId="ami-0abcdef1234567890",  # replace with a valid AMI ID for your region
        MinCount=1,
        MaxCount=1,
        InstanceType="t2.micro",
    )
    print("The instance launched is", instances["Instances"][0]["InstanceId"])

create_instance()


Executing Python boto3 Script to Launch AWS EC2

Now that we have created the code, let's run the Python script with the command below.

python main.py

Verify the EC2 instance in the AWS Management Console.


You should now have the basic knowledge to manage EC2 instances with the Boto3 EC2 Python SDK. Performing tasks in the Management Console such as creating, tagging, listing, and describing instances should be a thing of the past!

Monitoring and Alerting with Prometheus

What is Prometheus?

Prometheus is a powerful, open-source monitoring system that collects metrics from services and stores them in a time-series database. It records real-time metrics and alerts. It is written in Go Language. It allows powerful queries and great visualization. Prometheus works very well with Grafana when it comes to Dashboards and alerting notifications.

Prometheus includes a flexible query language. Every time series is identified by a metric name and a set of key-value pairs called labels.

# Notation of time series
<metric name> {<label name>=<label value>,.....} 
# Example
node_boot_time {instance="localhost:9000",job="node_exporter"}
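The notation above can be pulled apart programmatically. Below is a small illustrative sketch (not part of Prometheus itself, and deliberately simple; it does not handle label values containing commas) that splits a time series string into its metric name and labels:

```python
import re

def parse_series(series):
    # Split '<metric name>{<label>=<value>,...}' into a name and a label dict.
    m = re.match(r'(?P<name>\w+)\s*\{(?P<labels>[^}]*)\}', series)
    pairs = (pair.split("=", 1) for pair in m.group("labels").split(",") if pair)
    labels = {key: value.strip('"') for key, value in pairs}
    return m.group("name"), labels

name, labels = parse_series('node_boot_time {instance="localhost:9000",job="node_exporter"}')
print(name, labels)
```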

Prometheus Architecture

How to Install Maven and Setup Maven in Jenkins

What is Apache Maven

Maven is a build tool primarily used for Java projects. Maven can also be used to build and manage projects written in various languages such as C#, Ruby, Scala, and others. The Maven project is hosted by the Apache Software Foundation, where it was formerly part of the Jakarta Project.

The most powerful feature of Maven is that it automatically downloads the project dependency libraries defined in pom.xml. You can also configure the project build cycle, for example to invoke JUnit for tests or SonarQube for static analysis.


  • Java Development Kit (JDK) and Eclipse: Maven 3.3+ requires JDK 1.7 or above to execute.
  • Memory: no minimum requirement.
  • Disk: approximately 10 MB is required for the Maven installation itself. In addition, disk space will be used for your local Maven repository. The size of your local repository will vary depending on usage, but expect at least 500 MB.
  • Operating system: no minimum requirement. Start-up scripts are included as shell scripts and Windows batch files.

Install Apache Maven

  • Next, unzip the file that you just downloaded on your machine.
  • Next, set the environment variables for Apache Maven by going into Environment Variables and adding the variable names M2_HOME and MAVEN_HOME.
  • Next, append the Maven bin directory to the PATH variable.
  • Next, check the Maven version.
  • Next, configure the local Maven repository by going into the conf folder, opening the settings.xml file, and updating the repository path after creating a maven_repo folder as shown below.

Setup Maven Project

  • The first step is to open Eclipse, navigate to New Project, and then look for Maven as shown below.
  • Select Create a simple project.
  • Provide the group ID and artifact ID as shown below.
  • It will create the new folder in your Eclipse workspace as shown below.
  • Next, create a new Java class and name it mycalculator.
  • After clicking on Finish you will see the class as below.
  • Next, add the methods to the class.
package mycalculator_package;

public class mycalculator {

    // Method to add two numbers
    public int add(int a, int b) {
        return a + b;
    }

    // Method to multiply two numbers
    public int multiple(int a, int b) {
        return a * b;
    }

    // Method to subtract two numbers
    public int subtract(int a, int b) {
        return a - b;
    }

    // Method to divide two numbers
    public int divide(int a, int b) {
        return a / b;
    }
}
  • Similarly, create a JUnit test class.
  • Next, we don't want Eclipse to add JUnit 4 automatically; instead, declare the JUnit dependency in the pom.xml file:
<!-- https://mvnrepository.com/artifact/junit/junit -->
  • Now, add the code below in the mycalculatortest.java as shown below.
package mycalculatortest_package;

import static org.junit.Assert.*;
import org.junit.Test;
import mycalculator_package.mycalculator;

public class mycalculatortest {

    @Test
    public void addtest() {
        mycalculator calc = new mycalculator();
        assertEquals(100, calc.add(80, 20));
    }

    @Test
    public void subtracttest() {
        mycalculator calc = new mycalculator();
        assertEquals(60, calc.subtract(80, 20));
    }

    @Test
    public void multipletest() {
        mycalculator calc = new mycalculator();
        assertEquals(100, calc.multiple(10, 10));
    }
}

  • Next, add the Maven compiler version in the POM file, as we are using JDK 1.8, and then go to Project > Maven > Update Project as shown below.
  • Next, run the Maven project as shown below by navigating to Run As and then Maven Build. All your Java source code remains in the src folder and all the compiled classes are present in the target folder.
  • Further, run mvn clean and mvn clean test: mvn clean removes all files generated by the previous build (the target folder), and mvn clean test cleans and then runs the tests.
  • Next, test locally on your machine some of the other commands, such as mvn compile, which compiles the source code of the project.
  • mvn test-compile: Compiles the test source code.
  • mvn test: Runs tests for the project.
  • mvn package: Creates JAR or WAR file for the project to convert it into a distributable format.
  • mvn install: Installs the packaged JAR/WAR file into the local repository.
  • mvn deploy: Copies the packaged JAR/ WAR file to the remote repository after compiling, running tests and building the project.
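These goals can also be driven from a script. Below is a minimal sketch (the helper names are ours, not Maven's; `run_goals` assumes the mvn binary is installed and on PATH) that assembles and runs a Maven command line:

```python
import subprocess

def mvn_command(goals, pom="pom.xml"):
    # Build the argument list, e.g. ["mvn", "-f", "pom.xml", "clean", "test"];
    # -f points Maven at an explicit POM file.
    return ["mvn", "-f", pom] + list(goals)

def run_goals(goals):
    # Assumes mvn is installed and on PATH; raises CalledProcessError if the build fails.
    subprocess.run(mvn_command(goals), check=True)
```

For example, `run_goals(["clean", "test", "package"])` mirrors the Goals field used in the Jenkins job below.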

Setting up Maven in Jenkins WAY 1

  • Create a new Job in Jenkins and call it Maven-JOB

  • Select Top-level Maven targets in the Build step.
  • Next, add clean test package in the Goals field and the location of the pom.xml file.
  • Next, trigger the Jenkins job and you should see the project compile successfully.

Setting up Maven in Jenkins WAY 2

  • Install the Maven plugin using Manage Jenkins.
  • Now create another job, where you will notice the Maven project option as shown below. Select Maven project and enter an item name such as maven-job2.
  • Inside the Build tab, add the workspace path, which is the location where your pom.xml file is located.
  • Make sure to select Resolve Dependencies during Pom parsing in the Build step.
  • Next, navigate to Manage Jenkins > Global Tool Configuration and add Maven and JDK as shown below.
  • Next, run the Jenkins job; it should now build successfully.

HTML full form (HyperText Markup Language): Learn complete HTML

HTML introduction

HTML stands for HyperText Markup Language, a markup language used to create web pages. A markup language is a language that uses tags to define elements within a document.

With HTML you can create static pages; however, if you combine CSS, JavaScript, and HTML together you will be able to create dynamic and more functional web pages or websites.

HTML Basic Example and HTML Syntax

Now that you have a basic idea of HTML, let’s kick off this tutorial by learning how to declare HTML Syntax. In the below HTML code:

  • <!DOCTYPE html> specifies that this is an HTML5 document.
  • <html> is the root of the HTML page.
  • <head> contains page information.
  • <title> is the title of the page.
  • <body> is the document's body.
  • <h1> is a heading.
  • <p> is a paragraph.
<!DOCTYPE html>
<html>
<head><title> Page title </title></head>
<body>
<h1> My first heading</h1>
<p> My first para </p>
</body>
</html>
<!DOCTYPE html>  # Document Type 
<html lang="en-US">    # Language Attribute to declare the language of the Web page
# Also <meta> element is used to specify the character set, page description, author and viewport.
<meta name="viewport" content="width=device-width, initial-scale=1.0">  # Setting the viewport, which is the user's visible area of a web page; initial-scale=1.0 sets the initial zoom level.
<style>                                                     # Head element is the container of title, style, meta, link, script etc.
body{background-color: red;}                                # Internal CSS and define style information for a single HTML page
h1{ color:red; }                                            # Internal CSS
p {
  border: 2px solid red;                                    # Border (border becomes more prominent as pixels increase)
  padding: 30px;                                            # Padding (space between text and border)
  margin: 10px;                                             # Margin (space outside the border)
}
a:link, a:visited, a:hover, a:active {                      # HTML links in their different states
  text-align: center;
  color: blue;
}
.city {                                                     # Declaring the CSS for class city
 background-color: tomato;
 color: white;
}
<link rel="stylesheet" href="styles.css">                   # External CSS; link defines a relationship to an external resource
<link rel="icon" type="image/x-icon" href="/images/favicon.ico">  # Adding the favicon to the HTML page
<div class="city">                                          # Creating a class named city 
  <h2> Hello this is my City </h2>
</div>                                                      # Class City ends here
<!--      -->                                               # Comments in HTML 
<p style="background: red; background-image: url('a.jpg'); background-repeat: repeat">...........</p>   # Paragraph with inline styles
<p><a href="#C4">Jump to Chapter 4</a></p>                  # Creating a link to a bookmark using the ID
<h2 id="C4"> Chapter 4 </h2>                                # Creating a heading with id that will be tagged with a link to create a bookmark. ID's are unique and used only with one HTML element rather than class being used by multiple HTML elements
<a href="google.com"> This is a link</a>                    # Link
<a href="google.com" target="_blank"> This is a link</a>    # Opens the document in a new window or tab
<a href="google.com" target="_parent"> This is a link</a>   # Opens the document in parent frame
<a href="google.com" target="_top"> This is a link</a>      # Opens the document in full body of the window
<a href="google.com" target="_self"> This is a link</a>     # Opens the document in same window
<iframe src="a.html" name="iframe_a" height="10" width="10" title="Title Iframe"></iframe>   # Creating an iframe (an inline frame)
<p><a href="google.com" target="iframe_a">Hello, the link will open when clicked on the link</a></p>   # Using Iframe in the link  
<ol>                                                        # Ordered list
  <li>Coffee</li>                                           # List item
</ol>
<img src="a.jpeg" alt="Image" width="2" height="2">         # Image
<img src="computer_table.jpeg" usemap="#workmap">           # Using an image map
<map name="workmap">
   <area shape="rect" coords="34,44,270,350" href="computer.htm">
   <area shape="rect" coords="31,41,21,35"   href="phone.htm">
</map>

<script>                                                   # Creating JavaScript inside the HTML page
  function myfunc() {
    document.getElementById("C4").innerHTML = "Have a nice DAY ";
    var x = document.getElementsByClassName("city");       # Using the class city within JavaScript in an HTML page
    for (var i = 0; i < x.length; i++) {
      x[i].style.display = "none";
    }
  }
</script>


<header> - Defines a header for a document or a section
<nav> - Defines a set of navigation links
<section> - Defines a section in a document
<article> - Defines an independent, self-contained content
<aside> - Defines content aside from the content (like a sidebar)
<footer> - Defines a footer for a document or a section
<details> - Defines additional details that the user can open and close on demand
<summary> - Defines a heading for the <details> element

How to Automate XML and YML and CSV files using Python

Reading and Writing a YML file using python

YAML (.yaml and .yml) files are a superset of JSON. Some automation tools, such as Ansible, use YAML-based files, referred to as playbooks, to define the actions you want to automate.

Working with YAML files in Python is fun, so let's get started. To work with YAML files in Python you need to install the PyYAML library, as Python's standard library doesn't include a YAML parser. PyYAML is a YAML parser and emitter for Python.

  • Run the following command to install PyYAML library in your favorite code editor terminal such as visual code studio.
pip install PyYAML
  • Next, create a folder named Python, and under that create a simple YML file named apache.yml; paste the content below and save it.
- hosts: webservers
  vars:
    http_port: 80
    max_clients: 200
  remote_user: root
  tasks:
  - name: ensure apache is at the latest version
    yum:
      name: httpd
      state: latest
  • Next, create another file in the same Python folder, name it read_write_yaml.py, and paste the Python code below.

The Python script below imports the yaml module to work with YAML files and the pprint module to print output in a well-formatted pattern. Using the open() function it opens the apache.yml file and reads the data with yaml.safe_load(). Later, yaml.dump() writes the data back out; note that yaml.dump() returns None when you pass it a stream, because the output goes to the file rather than the return value.

import yaml
from pprint import pprint

with open('apache.yml', 'r') as new_file:
     verify_apache = yaml.safe_load(new_file)

pprint(verify_apache)

with open('apache.yml', 'w') as new_file2:
     yaml.dump(verify_apache, new_file2)

  • Execute the above python script using python command and you should see the below output.
[{'hosts': 'webservers',
  'remote_user': 'root',
  'tasks': [{'name': 'ensure apache is at the latest version',
             'yum': {'name': 'httpd', 'state': 'latest'}}],
  'vars': {'http_port': 80, 'max_clients': 200}}]

Reading and Writing a XML file using python

XML files are used mostly for structured data. Many web systems use XML to transfer data; one of them is RSS (Really Simple Syndication) feeds, which help in finding the latest updates on websites from various sources. Python offers an XML library.

  • Next, in the same Python folder create a simple XML file, name it book.xml, paste the content below, and save it. XML has a tree-like structure: the top element is known as the root and the rest are elements.
<?xml version="1.0"?>
<catalog>
   <book id="bk109">
      <title>Automate Infra Part 2</title>
      <genre>Science Fiction</genre>
   </book>
   <book id="bk112">
      <title>Automate Infra Part 1</title>
   </book>
</catalog>

  • Next, create another file in the same Python folder, name it read_write_xml.py, and paste the Python code below.
  • In the script below, the xml.etree.ElementTree module helps in working with XML files and implements a simple and efficient API for parsing and creating XML data. The entire tree of book.xml is parsed, and then the tag and attributes of each element are printed.

    import xml.etree.ElementTree as ET
    tree = ET.parse('book.xml')           # parse the whole tree
    root = tree.getroot()                 # find the root element
    print(root.tag, root.attrib)          # root tag and its attributes
    for child in root:                    # each child and its attributes
        print(child.tag, child.attrib)

  • Execute the above python script using python command and you should see the below output.
  • O/P:
    catalog {}
    book {'id': 'bk109'}
    book {'id': 'bk112'}

    Reading and Writing a comma-separated values (CSV) file using python

    CSV is the most widely used spreadsheet format. To work with these files in Python you need to import the csv module. Let's learn how to read and write data in CSV files.

  • Next, in the same Python folder create a CSV file named devops.csv, add content similar to the rows below, and save it.
  • Next, create another file in the same Python folder, name it read_write_csv.py, and paste the Python code below.

    The script below uses the csv module to work with CSV files. When the script is executed, the open() function opens the CSV file; csv.reader() then reads it, and the rows are printed according to the defined range.

    import csv
    with open('devops.csv', 'r') as csv_file:
        read = csv.reader(csv_file, delimiter=',')
        for _ in range(5):
            print(next(read))
        print(read)
    • Execute the above python script using python command and you should see the below output.
    ['Date', ' PreviousUserCount', ' UserCountTotal', ' sitepage']
    ['02-01-2021', '61', '5336', ' automateinfra.com/blog']
    ['03-01-2021', '42', '5378', ' automateinfra.com/blog1']
    ['04-01-2021', '26', '5404', ' automateinfra.com/blog2']
    ['05-01-2021', '65', '5469', ' automateinfra.com/blog3']
    <_csv.reader object at 0x0336A370>

    Python – Pandas (Data Analysis 3rd Party Library)

    pandas.DataFrame acts like a data table, similar to a very powerful spreadsheet. If you want to work on rows or columns as you would in a spreadsheet, then DataFrames is the tool for you. So let's get started by installing it: pip install pandas

    import pandas as pd
    df = pd.read_csv('devops.csv')
    print(df.head(4)) # Seeing TOP 4 rows in devops.csv file
    print(df.describe()) # Statistical summary
             Date   PreviousUserCount   UserCountTotal                  sitepage
    0  02-01-2021                  61             5336    automateinfra.com/blog
    1  03-01-2021                  42             5378   automateinfra.com/blog1
    2  04-01-2021                  26             5404   automateinfra.com/blog2
    3  05-01-2021                  65             5469   automateinfra.com/blog3
            PreviousUserCount   UserCountTotal
    count            4.000000         4.000000
    mean            48.500000      5396.750000
    std             18.046237        55.721779
    min             26.000000      5336.000000
    25%             38.000000      5367.500000
    50%             51.500000      5391.000000
    75%             62.000000      5420.250000
    max             65.000000      5469.000000
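The describe() numbers above can be sanity-checked by hand; for example, the mean of PreviousUserCount over the four rows:

```python
counts = [61, 42, 26, 65]            # the PreviousUserCount column from devops.csv
mean = sum(counts) / len(counts)     # (61 + 42 + 26 + 65) / 4
print(mean)                          # 48.5, matching the mean reported by df.describe()
```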

    Python: Regular Expressions to Search Text (mostly used and important)

    Below are two good examples of searching, which can be used for different practices such as analysis, HR, sales, and many more.

    import re

    name_list = '''Ezra Sharma <esharma@automateinfra.com>,
       Rostam Bat   <rostam@automateinfra.com>,
       Chris Taylor <ctaylor@automateinfra.com,
       Bobbi Baio <bbaio@automateinfra.com'''
    # Some commonly used ones are \w, which is equivalent to [a-zA-Z0-9_] and \d, which is equivalent to [0-9]. 
    # You can use the + modifier to match for multiple characters:
    print(re.search(r'Rostam', name_list))
    print(re.search('[RB]obb[yi]',  name_list))
    print(re.search(r'Chr[a-z][a-z]', name_list))
    print(re.search(r'[A-Za-z]+', name_list))
    print(re.search(r'[A-Za-z]{5}', name_list))
    print(re.search(r'[A-Za-z]{7}', name_list))
    print(re.search(r'[A-Za-z]+@[a-z]+\.[a-z]+', name_list))
    print(re.search(r'\w+', name_list))
    print(re.search(r'\w+\@\w+\.\w+', name_list))
    print(re.search(r'(\w+)\@(\w+)\.(\w+)', name_list))
    <re.Match object; span=(49, 55), match='Rostam'>
    <re.Match object; span=(147, 152), match='Bobbi'>
    <re.Match object; span=(98, 103), match='Chris'>
    <re.Match object; span=(0, 4), match='Ezra'>
    <re.Match object; span=(5, 10), match='Sharm'>
    <re.Match object; span=(13, 20), match='esharma'>
    <re.Match object; span=(13, 38), match='esharma@automateinfra.com'>
    <re.Match object; span=(0, 4), match='Ezra'>
    <re.Match object; span=(13, 38), match='esharma@automateinfra.com'>
    <re.Match object; span=(13, 38), match='esharma@automateinfra.com'>
    # <IP Address> <Client Id> <User Id> <Time> <Request> <Status> <Size>
    # The IP addresses below are placeholders for illustration.
    Line1 = '127.0.0.1 - Automateinfra1 [13/Nov/2021:14:43:30 -0800] "GET /assets/234 HTTP/1.0" 200 2326'
    access_log = '''127.0.0.1 - Automateinfra1 [13/Nov/2021:14:43:30 -0800] "GET /assets/234 HTTP/1.0" 200 2326
    127.0.0.2 - Automateinfra2 [13/Nov/2021:14:43:30 -0800] "GET /assets/235 HTTP/1.0" 200 2324
    127.0.0.3 - Automateinfra3 [13/Nov/2021:14:43:30 -0800] "GET /assets/236 HTTP/1.0" 200 2325'''
    count_ip = r'(?P<IP>\d+\.\d+\.\d+\.\d+)'
    count_time = r'(?P<Time>\d\d/\w{3}/\d{4}:\d{2}:\d{2}:\d{2})'
    count_clientid = r'(?P<User>".+")'
    count_request = r'(?P<Request>".+")'
    sol = re.search(count_ip, Line1)
    print(re.search(count_request, Line1))
    print(re.search(count_time, Line1))
    value = re.finditer(count_ip, access_log)
    for sol in value:
        print(sol.group('IP'))
    <re.Match object; span=(56, 82), match='"GET /assets/234 HTTP/1.0"'>
    <re.Match object; span=(28, 48), match='13/Nov/2021:14:43:30'>


    Rather than loading the whole file into memory as you have done up until now, you can read one line at a time, process the line, and then move to the next. The lines are removed from memory automatically by Python’s garbage collector, freeing up memory.

    with open("devops.txt", mode="r") as mynewfile:      # for a binary file such as a PDF, use binary modes ("rb"/"wb")
        with open("devops-corrected.txt", "w") as target_file:
            for line in mynewfile:
                target_file.write(line)                  # process one line at a time
    # FILE BREAKER: read a file in chunks of a given number of bytes
    with open('book.xml', 'rb') as sourcefile:
        while True:
            chunk = sourcefile.read(1024)                # read 1024 bytes at a time
            if chunk:
                print(chunk)
            else:
                break
    b'<?xml version="1.0"?>\r\n<catalog>\r\n   <book id="bk109">\r\n      <author>Author1</author>\r\n      <title>Automate Infra Part 2</title>\r\n      <genre>Science Fiction</genre>\r\n      <price>6.95</price>\r\n    
      <publish_date>2000-11-02</publish_date>\r\n      <description>book1</description>\r\n   </book>\r\n   <book id="bk112">\r\n      <author>Author2</author>\r\n      <title>Automate Infra Part 1</title>\r\n      <genre>Computer</genre>\r\n      <price>49.95</price>\r\n      <publish_date>2001-04-16</publish_date>\r\n      <description>book2</description>\r\n   </book>\r\n</catalog>'


    There are many times you need to encrypt text to ensure security. In addition to Python's built-in hashlib package, there is a widely used third-party package called cryptography.

    HASHLIB: uses hash functions based on the SHA1, SHA224, SHA384, SHA512, and RSA's MD5 algorithms.


    symmetric key encryption: based on shared keys. These algorithms include the Advanced Encryption Standard (AES), Blowfish, Data Encryption Standard (DES), Serpent, and Twofish.

    asymmetric key encryption: based on public keys (which are widely shared) and private keys (which are kept secret).

    # Encryption using hashlib (hashing)
    import hashlib                  # Python built-in package
    line = "I like editing automateinfra.com"
    bline = line.encode()           # converting into a binary string
    print(bline)                    # print the converted binary string
    algo = hashlib.md5()            # choosing the MD5 algorithm via a hashlib object
    algo.update(bline)              # applying the algorithm
    print("Encrypted  text Message")
    print(algo.digest())            # print the hashed (digest) string
    # Encryption using Cryptography (symmetric key encryption)
    from cryptography.fernet import Fernet  # third-party package, so you need: pip install cryptography
    key = Fernet.generate_key()             # generating the key
    print("Generating the keys ")
    print(key)                              # printing the key
    algo = Fernet(key)                      # AES-based algorithm via the Fernet object
    message = b"I definetely like Editing AutomateInfra.com"
    encrypted = algo.encrypt(message)
    print("Encrypted  text Message ")
    print(algo.decrypt(encrypted))          # decrypting prints the original message
    # Encryption using cryptography (asymmetric key encryption)
    from cryptography.hazmat.backends import default_backend
    from cryptography.hazmat.primitives.asymmetric import padding, rsa
    from cryptography.hazmat.primitives import hashes
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=4096, backend=default_backend())  # Generate the private key
    print(private_key)   # Print the private key object
    public_key = private_key.public_key()   # Derive the public key from the private key
    print(public_key)    # Print the public key object
    message = b"I am equally liking Editing AutomateInfra.com"
    encrypted = public_key.encrypt(message, padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()), algorithm=hashes.SHA256(), label=None))
    print(encrypted)     # Print the ciphertext
    decrypted = private_key.decrypt(encrypted, padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()), algorithm=hashes.SHA256(), label=None))
    print(decrypted)     # Print the recovered plaintext
    b'I like editing automateinfra.com'
    Encrypted  text Message
    Generating the keys
    Encrypted  text Message
    b'I definetely like Editing AutomateInfra.com'
    <cryptography.hazmat.backends.openssl.rsa._RSAPrivateKey object at 0x036491D8>
    <cryptography.hazmat.backends.openssl.rsa._RSAPublicKey object at 0x03850E38>
    b"\x8b\xec\xb0\x91\xec\xe7\x8d;\x11\xbclch\xbdVD@c\xd3J\x07'\xe9\x07\x15\x1c@=^\xd2h\xcaDL\x95\xea[\x0fv\x012\xed\xd5\xed\x0e\x9b\x93V2\x00\xba\x9c\x07\xba\x8b\xf3\xcb\x03M\xa8\xb1\x12ro\xae\xc0\xfb$\xf9\xcc\x85\xe8s\xfc`{\xfe{\x88\xd2\xc3\xffI\x90\xe3\xd2\x1e\x82\x95\xdfe<\xd5\r\x0b\xc4z\xc4\xf7\x00\xcfr\x07npm0\xd4\xc4\xa4>w\x9d]\xcf\xae7F\x91&\x93\xd5\xda\xcaR\x13A\x8ewB\xf6\xd9\xae\xce\xca\x8f\xd6\x91\x06&:\x00\xa0\x84\x05#,\x7fdA\x87\xb2\xe7\x1d\x8b*\xa15\xf8\xb0\x07\xa0n\x1e\xeaI\x02\xbaA\x88ut\x8e\x82<\xfe\xbfM\xe6F\xa3\xcc\xd4\x8b\x80PY\xb5\xd3\x14}C\xe2\x83j\xaf\x85\xa6\x9e\x19\xb2\xd9\xb8\xac\xa4\xfb\x1f\x0c\xce\x9d4\x82\x1e\xfd5\xb49\xa5\xbbL\x01~\x8fA\xee\r\xc7\x84\x9e\x0c\t\x15z\r\xfd]\x0b\xcfW\x01\xd2\x16\x17btc\xeaSl\xf5\xb0\x8a\xe2X\xe7\xa7a\xa7\xf7M\x01\xa2\x0b8\xd6\xf2\xc5c\xbf\xea\xe0\x80\x15\xde-\x98\xa1\xc8ud*\xbel2\xb5\xc8:\x92\xd5\r(_8\xbd\xcb\x80\xf1\x93\x83\xe2\x9f\xed\x82f\xd0\xb2\x8f\x1b\x9eMC\x07\xf9\x08\xb0\x00QA\xea\x93\xc7@&\x84\xff<\xde\x80@\xc8\xc6\x83O&%\x91r-\xb0\xef}\x18tU{C\xa6\x17\x97\x1b\x95g\xc5\x0e>{\xb0\x94a)\xbc)*Sq\x98\xad\xf3>\x04\x9b+x\x95&\xa6\xe6,\xb4~\xf2Y\x06,\xab'uq \x9f0\x7f\xb5\xd50\xbdp\xbb\xdf\x1c\xe9\xb1\xc4\x88y\nq\\\x85\x1e\xd8\x18M\x87\x1aU.\x918;\xcd\x10 \x9b\x11\xf9R\xd3\x8fz\xe8\xf6|C\xfb\x1f\xfd1\x19\x10:>\x1c\x06\x8e\xda\x98\xb2\xf3aa^\xa54\x03\xf8\x03\xc4\xe6\xd9mw\r\x8b\x96\xa2rJ\x03\xe7\xda\x0f\rJ-iPo!^\x8a\xdcg\x8c!L\xa4\xedY\xe5\x12\xdf\xe8\xe7\x0cE\xcd\xa2\xa2Gr\xc0\xe1\xa6\xc5\x9a\x9f\x07\x89\x84\x8b\xb7"
    b'I am equally liking Editing AutomateInfra.com'
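    Beyond one-off digests, hashing is commonly used to verify integrity: the same input always produces the same digest. A minimal sketch using SHA-256 (a stronger choice than MD5 for new code):

```python
import hashlib

data = b"I like editing automateinfra.com"
digest1 = hashlib.sha256(data).hexdigest()  # hexdigest() returns a printable hex string
digest2 = hashlib.sha256(data).hexdigest()
print(digest1)
print(digest1 == digest2)  # hashing is deterministic, so digests can be compared
```

    Because a digest cannot be reversed, comparing digests is how hashes are used for integrity checks and password verification.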


    The os module connects Python to many low-level operating system calls and offers a consistent interface across operating systems such as Unix and Windows.

    import os   # Python built-in package
    print(os.listdir('.'))   # List the contents of the current directory
    os.chmod('automateinfra.txt', 0o777) # Set the file permissions (octal notation)
    os.mkdir('/tmp/automateinfra.pdf') # Make the directory
    os.rmdir('/tmp/automateinfra.pdf') # Remove the directory
    os.stat('b.txt')  #  These stats include st_mode, the file type and permissions, and st_atime, the time the item was last accessed.
    cur_dir = os.getcwd()  # Get the current working directory.
    print(os.path.dirname(cur_dir))   # Returns the parent directory path
    print(os.path.split(cur_dir))     # Splits the path into (parent directory, base name)
    print(os.path.basename(cur_dir))  # Returns the base name of the path
    while os.path.basename(cur_dir):   # Until the base name is empty (filesystem root reached), keep continuing
        cur_dir = os.path.dirname(cur_dir)  # Move to the parent directory on each iteration
    ('C:\\Users\\AutomateInfra\\Desktop\\GIT\\Python-Desktop', 'Basics')
    import os
    # Check the current working directory
    file_name = "automateinfra.txt"
    file_path = os.path.join(os.getcwd(), file_name)
    print(f"Checking {file_path}")
    if os.path.exists(file_path):
        print(f"Found {file_path}")
    # Check the user's home directory
    home_dir = os.path.expanduser("~/") # expanduser returns the path to the user’s home directory.
    file_path = os.path.join(home_dir, file_name)
    print(f"Checking {file_path}")
    if os.path.exists(file_path):
        print(f"Found {file_path}")
    Checking C:\Users\Automateinfra\Desktop\GIT\Python-Desktop\Basics\automateinfra.txt
    Checking C:\Users\Automateinfra/automateinfra.txt

    The Ultimate Guide on API Testing with Complete Automation

    API Automation with Rest Assured library

    What is an API ?

    An API is an interface that allows communication between a client and a server, simplifying the building of client-server software.

    An API is software that allows two applications to talk to each other. Each time you use an app like Facebook, send an instant message, or check the weather on your phone, you’re using an API.

    When you use an application on your mobile phone, the application connects to the Internet and sends data to a server. The server then retrieves that data, interprets it, performs the necessary actions and sends it back to your phone. The application then interprets that data and presents you with the information you wanted in a readable way. This is all possible with an API.

    Difference between Types of API’s [ SOAP v/s REST ]

    REST: Representational State Transfer. REST services are lightweight and scalable, use the HTTP protocol, and follow an architectural style rather than a strict protocol.

    Elements of REST API:

    • Methods: GET, POST, PUT, DELETE
      • POST – Sends data to the server, such as customer information, or uploads a file using the RESTful web service. To send the data, use form parameters and a body payload.
      • GET – Retrieves data from the server using the RESTful web service. It only extracts data; nothing is changed, so no payload or body is required. To filter the data, use query parameters.
      • PUT – Updates resources using the RESTful web service.
      • DELETE – Deletes resources using the RESTful web service.
    • Request headers: Additional instructions that are sent along with the request.
    • Request body: Data sent along with a POST request when it wants to add a resource to the server.
    • Response status code: Returned along with the response, such as 500, 200, etc.
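    These elements can be seen end to end using only Python's standard library. The sketch below spins up a tiny local server with a hypothetical /blog endpoint, so the URL and the field names in the response are illustrative only:

```python
import http.server
import json
import threading
import urllib.request

# A tiny local server standing in for a REST API (hypothetical /blog endpoint)
class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):            # GET: retrieve data, no request body needed
        body = json.dumps({"post": "automateinfra"}).encode()
        self.send_response(200)                               # Response status code
        self.send_header("Content-Type", "application/json")  # Response header
        self.end_headers()
        self.wfile.write(body)                                # Response body
    def log_message(self, *args):  # Silence per-request logging
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), Handler)  # Port 0 = any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_address[1]}/blog"
req = urllib.request.Request(url, headers={"Accept": "application/json"})  # Request header
with urllib.request.urlopen(req) as resp:
    print(resp.status)              # The status code (200 here)
    data = json.loads(resp.read())  # Parse the JSON response body
    print(data["post"])
server.shutdown()
```

    The same client calls work against any real REST endpoint; only the URL and the JSON fields change.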

    Characteristics of REST

    • REST is an architectural style in which a web service can only be treated as a RESTful service if it follows the constraints of being 1. Client-Server 2. Stateless 3. Cacheable 4. Layered System 5. Uniform Interface
    • Stateless means that the state of the application is not maintained in REST. For example, if you delete a resource from a server using the DELETE command, you cannot expect that delete information to be passed to the next request. This is required so that the server can process each request independently.
    • The cache concept helps with the stateless problem described in the last point. Since each client request is independent in nature, the client might sometimes ask the server for the same request again, and a cached response avoids repeating the work.
    • REST uses uniform resource locators (URLs) to access the components hosted on the server. For example, if an object representing employee data is hosted at automateinfra.com, URIs such as automateinfra.com/blog can exist to access it.

    SOAP: Simple Object Access Protocol.

    • SOAP follows strict rules for communication between client and server; it does not follow the REST constraints of Uniform Interface, Client-Server, Stateless, Cacheable, Layered System, and Code on Demand.
    • SOAP was designed with a specification. It includes a WSDL file which has the required information on what the web service does, in addition to the location of the web service.
    • Another key challenge is the size of the SOAP messages transferred from the client to the server. Because of the large messages, using SOAP where bandwidth is constrained can be a big issue.
    • SOAP uses service interfaces to expose its functionality to client applications. In SOAP, the WSDL file provides the client with the necessary information to understand what services the web service offers.
    • SOAP uses only XML to exchange information, whereas REST can use plain text, HTML, JSON, XML, and more.

    Application Programming Interface theory (API-theory)

    • When a website is owned by a single owner, such as Google: the frontend and backend may be written in different languages (for example, Angular on the frontend and Java on the backend), which can cause a lot of compatibility issues, so you need an API between them.
    • When your client needs to access data from your website, you expose an API rather than exposing your entire code and packages.
    • When a client connects to another client or server using an API, the data is transmitted as either XML or JSON, both of which are language-independent.
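    The language-independent exchange in the last point can be sketched in a few lines of Python; the field names here are made up for illustration:

```python
import json

payload = {"name": "automateinfra", "topic": "api"}  # Record a client might send
text = json.dumps(payload)    # Serialize to a JSON string for transmission
print(text)
received = json.loads(text)   # A server written in any language can parse this back
print(received["topic"])
```

    Because the wire format is plain text, the producing and consuming sides never need to share a programming language.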

    Ultimate Jenkins tutorial for DevOps Engineers

    Jenkins is an open-source automation tool for CI/CD, where CI stands for continuous integration and CD stands for continuous delivery. Jenkins has its own built-in Java servlet container, Jetty. Jenkins can also run in other servlet containers such as Apache Tomcat or GlassFish.

    • Jenkins is used to perform smooth and quick deployments. It can deploy to a local machine, an on-premises data center, or any cloud.
    • Jenkins takes your code, whether Python, Java, Go, JavaScript, etc., compiles it using a build tool such as Maven (one of the most widely used), packages it in WAR or ZIP format and sometimes as a Docker image, and finally, once everything is built properly, deploys it as and when required. It integrates very well with lots of third-party tools.

    JAVA_HOME and PATH are variables to enable your operating system to find required Java programs and utilities.

    JAVA_HOME: JAVA_HOME is an (OS) environment variable that can optionally be set after either the (JDK) or (JRE) is installed. The JAVA_HOME environment variable points to the file system location where the JDK or JRE was installed. This variable should be configured on all OS’s that have a Java installation, including Windows, Ubuntu, Linux, Mac, and Android. 

    The JAVA_HOME environment variable is not actually used by the locally installed Java runtime. Instead, other programs installed on a desktop computer that require a Java runtime will query the OS for the JAVA_HOME variable to find out where the runtime is installed. After the location of the JDK or JRE installation is found, those programs can initiate Java-based processes, start Java virtual machines and use command-line utilities such as the Java archive utility or the Java compiler, both of which are packaged inside the Java installation’s \bin directory.
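    As a small illustration of that lookup, a program can query the variable through the environment. A Python sketch (the printed path, if any, depends on your machine):

```python
import os

java_home = os.environ.get("JAVA_HOME")  # None if the variable is not set
if java_home:
    print(f"Java installation found at {java_home}")
else:
    print("JAVA_HOME is not set on this machine")
```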

    • JAVA_HOME if you installed the JDK (Java Development Kit)
    • JRE_HOME if you installed the JRE (Java Runtime Environment) 

    PATH: Set the PATH environment variable if you want to be able to conveniently run the executables (javac.exe, java.exe, javadoc.exe, and so on) from any directory without having to type the full path of the command. If you do not set the PATH variable, you need to specify the full path to the executable every time you run it, such as:

    C:\Java\jdk1.8.0\bin\javac Myprogram.java
    # The following is an example of a PATH environment variable:
    C:\Java\jdk1.8.0\bin;C:\Windows\System32

    Installing Jenkins using msi installer on Windows Machine

    MSI is an installer file that installs your program on the executing system. Setup.exe is an application (executable file) that has MSI file(s) as one of its resources. An MSI file is a compressed package of installer files; it carries all the information pertaining to adding, modifying, storing, or removing the respective software, including the data, instructions, processes, and add-ons that are necessary for the application to work normally.

    EXE is short for executable. This is any kind of binary file that can be executed; all Windows programs are .exe files, and prior to MSI files all installers were EXE files. An executable file runs a set of instructions or code when it is opened. It is compiled from source code to binary code, is understandable by the machine, and can be directly executed by the Windows OS.

    In short, MSI is the file extension of the Windows Installer, a software component of Microsoft Windows used for the installation, maintenance, and removal of software, whereas EXE is the file extension of an executable file that performs the indicated tasks according to its encoded instructions.

    1. Navigate to https://www.jenkins.io/download/ and select windows option and your download of Jenkins msi will begin.
    1. Once downloaded click on the jenkins.msi
    1. Continue the Jenkins setup.
    1. Select the Port 8080 and click on Test Port and then Hit Next.
    1. Provide the admin password from the file at the path highlighted in red.
    1. Further, install the plugins required for Jenkins.
    1. Next, it will prompt for the first admin user. Fill in the required information and keep it safe, as you will use it to log in.
    1. Now the Jenkins URL configuration screen will appear; keep it as it is for now.
    1. Click on Save and Finish.
    1. Now your Jenkins is ready. Click on Start using Jenkins and soon you will see the Jenkins dashboard. You can create new jobs by clicking on New Item.

    Installing Jenkins using jenkins exe on Windows Machine

    1. Similarly, download jenkins.war from the Jenkins URL by clicking on Generic Java package (.war).
    2. Next, run the command below.
    java -jar jenkins.war --httpPort=8181
    1. Next, copy the Jenkins password from the log output and paste it in as you did in the Windows MSI section, point (5), and follow the rest of the points.

    Installing jenkins on Apache Tomcat server on Windows Machine

    1. Install Apache Tomcat on the Windows machine from https://tomcat.apache.org/download-90.cgi and click on the Tomcat installer appropriate for your system. This tutorial is performed on a 64-bit Windows machine.
    1. Next, unzip the Tomcat installation folder and copy the jenkins.war file into the webapps folder.
    1. Next, go inside the bin folder and start Tomcat by clicking on the startup batch script.
    1. Finally, you will notice that Apache Tomcat has started, and Jenkins as well.
    1. Now, navigate to the localhost:8080 URL and you should see the Tomcat page as shown below.
    1. Further, navigate to localhost:8080/jenkins to be redirected to the Jenkins page.

    Configuring the Jenkins UI

    1. First click on Manage Jenkins and then navigate to Configure system.
    1. Next, add the system message and save it; this message should then be displayed on Jenkins every time, as below.
    1. To restrict the names of jobs, add a name pattern as below.
    1. Next, try creating a new Jenkins job with a random name; Jenkins will not allow it and will display the error message.

    Managing Users and Permissions in the Jenkins UI

    • Go to Manage Jenkins and navigate to Manage Users in the Jenkins UI.
    • Then create three users as shown below: admin, dev, qa.
    • Next, navigate to Manage Jenkins and choose Configure Global Security.
    • Next, select Project-based Matrix Authorization Strategy and define the permissions for all users as you want.

    Role-Based Strategy

    • In the previous section you noticed that adding every user and granting permissions individually is a tedious job. Instead, create a role and add users to it. To do that, the first step is to install the plugin as shown below.
    • Next, select Role-Based Strategy as shown below and define the permissions for all users as you want.
    • Next, navigate to Manage Jenkins, then to Manage and Assign Roles, and click on Manage Roles.
    • Add 3 global roles named DEV Team, QA Team, and admin.
    • Add 2 item roles, developers and testers, with defined name patterns so that job names are matched accordingly.
    • Next, click on Assign Roles.
    • Assign the roles as shown below.


    In this tutorial you learnt how to install Jenkins on Windows in various ways, how to configure the Jenkins dashboard UI, and how to manage users and permissions.

    How does Python work Internally with a computer or operating system

    Are you a Python developer trying to understand how the Python language works? This article is for you: you will learn every bit and piece of it. Let’s dive in!


    Python is a high-level language used in designing, deploying, and testing in lots of places. It is consistently ranked among today’s most popular programming languages. It is a dynamic, object-oriented language that also supports procedural styles, and it runs on all major hardware platforms. Python is an interpreted language.

    High Level v/s Low Level Languages

    High-Level Language: A high-level language is easier to understand because it is human-readable. It is either compiled or interpreted. It consumes more memory and is slower in execution, but it is portable. It requires a compiler or interpreter for translation.

    The fastest of these translators of high-level language is the compiler, since compiled code runs directly on the machine.

    Low-Level Language: Low-level languages are machine-friendly: machines can read the code, but humans cannot easily. They consume less memory and are fast to execute, but they cannot be ported. They require an assembler for translation.

    Interpreted v/s Compiled Language

    Compiled Language: A compiled language is first compiled and then expressed in the instructions of the target machine, that is, machine code. Examples: C, C++, C#, COBOL.

    Interpreted Language: An interpreter is a computer program that directly executes instructions written in a programming or scripting language, without requiring them to have been previously compiled into a machine-language program; languages executed this way are known as interpreted languages. Examples: JavaScript, Perl, Python, BASIC.

    Python vs C++/C Language Compilation Process

    C++ or C Language: These languages need compilation, which means human-readable code has to be translated into machine-readable code. The machine code is then executed by the CPU. Below is the sequence in which code execution takes place.

    1. Human-readable code is written.
    2. Compilation takes place.
    3. Compilation generates an executable file in machine-code format (understood by the hardware).
    4. The executable file is executed by the CPU.

    Python Language:

    Python is a high-level language

    Bytecode, also termed p-code, is a form of instruction set designed for efficient execution by a software interpreter.

    1. Python code is written in .py format, such as test.py.
    2. The Python interpreter compiles the code into .pyc or .pyo format, which is byte code, not machine code (not understood by the hardware).
    3. Once your program has been compiled to byte code (or the byte code has been loaded from existing .pyc files), it is shipped off for execution to something generally known as the Python Virtual Machine (PVM).
    4. The PVM converts the byte code, such as test.pyc, into machine code (such as 10101010100010101010).
    5. Finally, the program is executed and the output is displayed.
    How Python runs? – Indian Pythonista
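    The byte-code step above can be inspected directly with Python’s built-in dis module, which disassembles a function into the instructions the PVM executes:

```python
import dis

def demo():
    return "automateinfra"

# Prints one line per byte-code instruction, ending with a return instruction
dis.dis(demo)
```

    Running this shows the exact instruction names vary slightly between Python versions, but the idea is the same: the interpreter executes these instructions, not your source text.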


    In this tutorial, you learnt how the python language works and interacts with Operating systems and Hardware. So, which application are you planning to build using Python?

    How to run Python flask applications on Docker Engine

    Can we isolate our apps so that they are independent of each other and run reliably? The answer is absolutely yes: that is exactly what Docker and containers make possible. They provide you an isolated environment and are your friend for deploying many applications, each in its own container. You can run as many containers as you like in Docker, independent of each other, while they all share the same host kernel.

    In this tutorial we will go through a simple demonstration of a Python application running on Docker Engine.

    Table of Contents

    1. What is Python ?
    2. What is docker ?
    3. Prerequisites
    4. Create a Python flask application
    5. Create a Docker file
    6. Build Docker Image
    7. Run the Python flask application Container
    8. Conclusion

    What is Python ?

    Python is a language with which you create web applications and system scripts. It is used widely across organizations and is very easy to learn. Python apps need an isolated environment to run well, which is quite possible with Docker and containers, as we will see in this tutorial.

    If you wish to know more about python please visit our Python’s Page to learn all about Python.

    What is docker ?

    Docker is an open-source tool for developing, shipping and running applications. It has the ability to run applications in loosely isolated environments using containers. Docker is an application which helps in the management of containers in a very smooth and effective way. In containers you can isolate your applications. Docker is quite similar to a virtual machine, but it is lightweight and can be ported easily.

    Containers are lightweight as they are independent of the hypervisor’s load and configuration. They connect directly with the host machine, i.e., the host’s kernel.


    You may incur a small charge for creating an EC2 instance on Amazon Web Services.

    Create a Python flask application

    • Before we create our first program using Python Flask, we need to install Flask and a Python virtual environment for Flask to run in.
    pip install virtualenv # virtual python environment
    • Create and activate a virtual environment named virt:
    virtualenv virt
    source virt/bin/activate
    • Finally install Flask
    pip install flask # Install Flask from pip
    • Now create a text file, name it app.py, and write our first Python Flask code as below.
    from flask import Flask # Importing the Flask class
    app = Flask(__name__)   # Creating the Flask application object
    @app.route('/')         # app.route tells Flask which URL should invoke the function
    def func():             # Creating a view function
        return "I am from Automateinfra.com"
    if __name__ == "__main__":     # Program starts from here.
        app.run(host="0.0.0.0")    # Listen on all interfaces so the app is reachable inside a container
    • Create one more file in the same directory and name it requirements.txt, where we will define the dependencies of the Flask application.
    • Now our Python code app.py and requirements.txt are ready for execution. Let's execute our code using the below command.
    python app.py
    • Great, so our Python Flask application ran successfully on our local machine. Now we need to execute the same code on Docker. Let's now move to the Docker part.

    Create a docker file

    A Dockerfile is used to create a customized Docker image on top of a base Docker image. It is a text file that contains all the commands to build or assemble a new Docker image. Using the docker build command we can create new customized Docker images; each instruction basically adds another layer on top of the base image. Using the newly built Docker image we can run containers in the usual way.

    • Create a Dockerfile and name it Dockerfile (no extension). Keep this file in the same directory as app.py and requirements.txt.
    # Sets the base image
    FROM python:3.8
    # Sets the working directory in the container
    WORKDIR /code
    # Copy the dependencies file to the working directory
    COPY requirements.txt .
    # Install dependencies
    RUN pip install -r requirements.txt
    # Copy the application code to the working directory
    COPY app.py .
    # Command to run on container start
    CMD [ "python", "./app.py" ]

    Build docker Image

    • Now we are ready to build our new image, so let's build it.
    docker build -t myimage .
    • You should see the docker images by now.
    docker images

    Run the Python flask application Container

    • Now run our first container using the newly built Docker image (myimage).
    docker run -d -p 5000:5000 myimage
    • Verify that the container was successfully created.
    docker ps -a


    In this tutorial we covered what Docker is, what Python is, and how to run a Python Flask application on Docker Engine inside a container.

    Hope this tutorial helps you in understanding and setting up Python Flask applications on Docker Engine on an Ubuntu machine.

    Please share it with your friends.