Amazon Kinesis Data Streams collects and processes large streams of data records by using data-processing applications, known as Kinesis Data Streams applications.
A Kinesis Data Streams application reads data from a data stream as data records. These applications can use the Kinesis Client Library, and they can run on Amazon EC2 instances.
You can send the processed records to dashboards, use them to generate alerts, dynamically change pricing and advertising strategies, or send data to a variety of other AWS services.
The producers continually push data to Kinesis Data Streams, and the consumers process the data in real time. Consumers (such as a custom application running on Amazon EC2 or an Amazon Kinesis Data Firehose delivery stream) can store their results using an AWS service such as Amazon DynamoDB, Amazon Redshift, or Amazon S3.
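For example, a producer can push records to a stream with the AWS SDK for Python; the sketch below is only illustrative, and the region, stream name, and payload are placeholder values.
# A minimal boto3 producer sketch (stream name, region, and payload are placeholders)
import json
import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")
record = {"event": "page_view", "user_id": "u-123"}  # hypothetical payload
response = kinesis.put_record(
    StreamName="my-data-stream",       # hypothetical stream name
    Data=json.dumps(record).encode(),  # the payload must be bytes
    PartitionKey="u-123",              # determines which shard receives the record
)
print(response["ShardId"], response["SequenceNumber"])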
What is Kinesis Data Firehose?
Kinesis Data Firehose is used to deliver real-time streaming data to Amazon S3, Amazon Redshift, Amazon OpenSearch Service, HTTP endpoints, and third-party providers such as Splunk, Dynatrace, or Datadog.
With Kinesis Data Firehose, you configure producers such as Amazon EC2, AWS WAF, or CloudWatch Logs to send data to a delivery stream, which then automatically delivers the data to the configured destination.
For Amazon Redshift destinations, streaming data is delivered to your S3 bucket first. Kinesis Data Firehose then issues an Amazon Redshift COPY command to load data from your S3 bucket to your Amazon Redshift cluster.
You can also configure Kinesis Data Firehose to transform your data before delivering it.
Kinesis Data Firehose supports Amazon S3 server-side encryption with AWS Key Management Service (AWS KMS) for encrypting delivered data in Amazon S3.
If data transformation is enabled, Kinesis Data Firehose can log the Lambda invocation, and send data delivery errors to CloudWatch Logs.
Kinesis Data Firehose uses IAM roles for all the permissions that the delivery stream needs, such as access to your S3 bucket, your AWS KMS key (if data encryption is enabled), and your Lambda function (if data transformation is enabled).
Kinesis Data Firehose delivery stream
You create a delivery stream and then send your data to it.
Records
The data of interest that your data producer sends to a Kinesis Data Firehose delivery stream
Producers
Producers send records to Kinesis Data Firehose delivery streams.
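As an illustration, a producer can write a single record to a delivery stream with boto3; in this hedged sketch the delivery stream name, region, and payload are placeholders.
# A minimal boto3 producer sketch for Firehose (names and payload are placeholders)
import json
import boto3

firehose = boto3.client("firehose", region_name="us-east-1")
response = firehose.put_record(
    DeliveryStreamName="my-delivery-stream",  # hypothetical delivery stream name
    Record={"Data": (json.dumps({"status": "ok"}) + "\n").encode()},  # newline-delimited JSON
)
print(response["RecordId"])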
You can also configure your Kinesis Data Firehose delivery stream to automatically read data from an existing Kinesis data stream and load it into destinations.
You can create a new data stream and then select Amazon Kinesis Data Streams as the source instead of Direct PUT.
Another way to feed a delivery stream is the Amazon Kinesis Agent, a standalone Java software application that offers an easy way to collect and send data to Kinesis Data Firehose.
You can install the agent on Linux-based server environments such as web servers, log servers, and database servers. The agent can pre-process the records parsed from monitored files before sending them to your delivery stream.
sudo yum install -y aws-kinesis-agent
To configure the agent, open and edit the configuration file (as superuser if using default file access permissions).
/etc/aws-kinesis/agent.json
sudo service aws-kinesis-agent start
The IAM role or AWS credentials that you specify must have permission to perform the Kinesis Data Firehose PutRecordBatch operation for the agent to send data to your delivery stream.
How to create Kinesis Firehose delivery Stream with Dynamic partitioning enabled
Navigate to Amazon Kinesis in the AWS console and click on Delivery streams.
Next, choose the S3 bucket as the destination. Here, we left Dynamic partitioning as Not enabled.
Note: Dynamic partitioning enables you to create targeted data sets by partitioning streaming S3 data based on partitioning keys. You can partition your source data with inline parsing and/or the specified AWS Lambda function. You can enable dynamic partitioning only when you create a new delivery stream. You cannot enable dynamic partitioning for an existing delivery stream.
Select Amazon CloudWatch error logging as Enabled and also create a new IAM role. If you prefer to script this step rather than use the console, see the sketch below.
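If you would rather create the delivery stream from code than from the console, a hedged boto3 sketch is shown below; the role ARN, bucket ARN, and log group/stream names are placeholders, and the keys shown are only a subset of what the API accepts.
# Hedged sketch: creating a delivery stream with an S3 destination and CloudWatch error logging
import boto3

firehose = boto3.client("firehose", region_name="us-east-1")
firehose.create_delivery_stream(
    DeliveryStreamName="my-delivery-stream",
    DeliveryStreamType="DirectPut",
    ExtendedS3DestinationConfiguration={
        "RoleARN": "arn:aws:iam::123456789012:role/firehose-delivery-role",  # placeholder role
        "BucketARN": "arn:aws:s3:::my-destination-bucket",                   # placeholder bucket
        "BufferingHints": {"SizeInMBs": 5, "IntervalInSeconds": 300},
        "CompressionFormat": "GZIP",
        "CloudWatchLoggingOptions": {
            "Enabled": True,
            "LogGroupName": "/aws/kinesisfirehose/my-delivery-stream",       # placeholder log group
            "LogStreamName": "DestinationDelivery",
        },
    },
)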
Kinesis Data Firehose uses Amazon S3 to back up all data, or failed data only, that it attempts to deliver to your chosen destination. You can specify the S3 backup settings in the following cases:
If you set Amazon S3 as the destination for your Kinesis Data Firehose delivery stream and you choose to specify an AWS Lambda function to transform data records, or if you choose to convert data record formats for your delivery stream.
If you set Amazon Redshift as the destination for your Kinesis Data Firehose delivery stream and you choose to specify an AWS Lambda function to transform data records.
If you set any of the following services as the destination for your Kinesis Data Firehose delivery stream: Amazon OpenSearch Service, Datadog, Dynatrace, HTTP Endpoint, LogicMonitor, MongoDB Cloud, New Relic, Splunk, or Sumo Logic.
When you send data from your data producers to your data stream, Kinesis Data Streams encrypts your data using an AWS Key Management Service (AWS KMS) key before storing the data at rest.
When your Kinesis Data Firehose delivery stream reads the data from your data stream, Kinesis Data Streams first decrypts the data and then sends it to Kinesis Data Firehose.
Writing to a Kinesis Data Firehose delivery stream using CloudWatch Events
On the CloudWatch page, click on Rules.
Once you click on Rules, choose Create rule and provide the Source and Target details.
Sending Amazon VPC Logs to a Kinesis Data Firehose Delivery Stream (Splunk) using CloudWatch
In Amazon VPC, create a VPC flow log with the destination set to a CloudWatch log group.
In the CloudWatch service, choose Log groups and create a log group.
Create a Kinesis Data Firehose Delivery Stream with Splunk as a Destination.
Now, create a CloudWatch Logs subscription filter that will send all the CloudWatch logs to the delivery stream, as sketched below.
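The subscription filter can also be created with boto3; in the hedged sketch below the log group name, delivery stream ARN, and role ARN are placeholders.
# Hedged sketch: subscribing a CloudWatch Logs group to the Firehose delivery stream
import boto3

logs = boto3.client("logs", region_name="us-east-1")
logs.put_subscription_filter(
    logGroupName="/vpc/flow-logs",  # placeholder log group receiving the VPC flow logs
    filterName="send-to-firehose",
    filterPattern="",               # an empty pattern forwards every log event
    destinationArn="arn:aws:firehose:us-east-1:123456789012:deliverystream/my-delivery-stream",
    roleArn="arn:aws:iam::123456789012:role/cwlogs-to-firehose-role",
)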
If you are looking to provision an EC2 instance in the AWS cloud, there are many ways of doing it, and one of the best is to invoke a simple Python script using Boto3.
In this tutorial we will create an AWS EC2 instance using Python.
Prerequisites
This post will be a step-by-step tutorial. If you’d like to follow along, ensure you have the following in place:
Ensure the IAM user is set up for programmatic access and that you assign it to the existing policy of AmazonEC2FullAccess.
Python v3.6 or later installed on your local machine. This tutorial will be using Python v3.11 on a Windows 10 machine.
Creating AWS EC2 instance using Python boto3 client
To create the Python script on your Windows or Linux machine, create a file named main.py and copy/paste the code shown after this list. The code:
Imports the boto3 library, which is used to connect to the AWS APIs.
Creates a client (ec2_client). Boto3 supports two types of interactions with AWS: resource and client levels. The client level provides low-level service access, while the resource level provides higher-level, more abstracted access. This tutorial uses client access.
Uses the client to run an instance (ec2_client.run_instances) and stores the response in the instances variable.
Prints the instance ID from the instances variable, which is a dictionary.
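Here is a minimal sketch of what main.py can look like; the region, AMI ID, and instance type are placeholder values that you must replace with real ones for your account.
# main.py - a minimal sketch (region, AMI ID, and instance type are placeholders)
import boto3

# Create a low-level EC2 client
ec2_client = boto3.client("ec2", region_name="us-east-1")

# Launch a single instance
instances = ec2_client.run_instances(
    ImageId="ami-0123456789abcdef0",  # hypothetical AMI ID - replace with a real one
    InstanceType="t2.micro",
    MinCount=1,
    MaxCount=1,
)

# run_instances returns a dictionary; the new instance ID lives under the "Instances" key
print(instances["Instances"][0]["InstanceId"])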
Now that we have created the code, let's run the Python script with the below command.
python main.py
Verify the EC2 instance in the AWS Management Console.
Conclusion
You should now have the basic knowledge to manage EC2 instances with the Boto3 EC2 Python SDK. Performing tasks in the Management Console such as creating, tagging, listing, and describing instances should be a thing of the past!
Prometheus is a powerful, open-source monitoring system that collects metrics from services and stores them in a time-series database. It records real-time metrics and alerts. It is written in Go Language. It allows powerful queries and great visualization. Prometheus works very well with Grafana when it comes to Dashboards and alerting notifications.
Prometheus includes a flexible query language. Every time series is identified by a metric name and a set of key-value pairs called labels.
# Notation of time series
<metric name> {<label name>=<label value>,.....}
# Example
node_boot_time {instance="localhost:9000",job="node_exporter"}
Maven is a build tool used for Java projects. Maven can also be used to build and manage projects written in other languages such as C#, Ruby, and Scala. The Maven project is hosted by the Apache Software Foundation, where it was formerly part of the Jakarta Project.
The most powerful feature of Maven is that it automatically downloads the project dependency libraries defined in pom.xml. It also lets you configure the project build cycle, for example invoking JUnit for tests or SonarQube for static analysis.
Prerequisites
Java Development Kit (JDK)and Eclipse
Maven 3.3+ requires JDK 1.7 or above to execute.
Memory
No minimum requirement
Disk
Approximately 10MB is required for the Maven installation itself. In addition to that, additional disk space will be used for your local Maven repository. The size of your local repository will vary depending on usage but expect at least 500MB.
Operating System
No minimum requirement. Start-up scripts are included as shell scripts and Windows batch files.
Next, unzip the file that you just downloaded on your machine.
Next, set the environment variables for Apache Maven by opening the environment variables dialog and adding the variables M2_HOME and MAVEN_HOME.
Next, append the Maven bin directory to the PATH variable.
Next, check the Maven version.
Next, configure the local Maven repository: create a maven_repo folder, then open settings.xml inside the conf folder and update the local repository path as shown below.
Setup Maven Project
The first step is to open Eclipse, navigate to New Project, and then look for Maven as shown below.
Select Create a simple project.
Provide the group ID and artifact ID as shown below.
It will create the new project folder in your Eclipse workspace as shown below.
Next, create a Java package and name it something like mycalculator_package.
Create a new class mycalculator.
After clicking on Finish, you will see the class as shown below.
Next, add the methods to the class.
package mycalculator_package;
public class mycalculator {
// Method to add two numbers
public int add(int a, int b) {
return a + b;
}
// Method to multiply two numbers
public int multiple(int a, int b) {
return a * b;
}
// Methods to subtract two numbers
public int subtract(int a, int b) {
return a - b;
}
// Method to divide two numbers
public int divide(int a, int b) {
return a / b;
}
}
Similarly, create a JUnit test by creating a test class.
Next, we don't want JUnit 4 to be added.
Next, add the JUnit dependency in the pom.xml as shown below from https://mvnrepository.com/artifact/junit/junit. This will install the JUnit dependencies, and you can see them on the left side.
Next, run the Maven project as shown below by navigating to Run As and then Maven Build. All your Java source code remains in the src folder and all the compiled classes are present in the target folder.
Further, run mvn clean, which cleans the project and removes all files generated by the previous build, and mvn clean test, which cleans the target folder and then runs the tests.
Next, test some of the other commands locally on your machine, such as mvn compile, which compiles the source code of the project.
mvn test-compile: Compiles the test source code.
mvn test: Runs tests for the project.
mvn package: Creates JAR or WAR file for the project to convert it into a distributable format.
mvn install: Installs the packaged JAR/WAR file into the local repository.
mvn deploy: Copies the packaged JAR/ WAR file to the remote repository after compiling, running tests and building the project.
Setting up Maven in Jenkins WAY 1
Create a new Job in Jenkins and call it Maven-JOB
Select top level Maven targets in the Build Step
Next, add clean test package in the Goals field along with the location of the pom.xml file.
Next, trigger the Jenkins job and you should see the project compile successfully.
Setting up Maven in Jenkins WAY 2
Install Maven Plugin using Manage Jenkins
Now create another job where you will notice the Maven Project option as shown below. Select Maven Project and enter an item name as maven-job2
Inside the Build tab, add the workspace path, which is the location where your pom file is located.
Make sure to add Resolve Dependencies during Pom parsing in the Build step.
Next, navigate to Manage Jenkins, then Global Tool Configuration, and add Maven and JDK as shown below.
Next, run the Jenkins job and it should build successfully.
HTML stands for HyperText Markup Language. It is a markup language used to create web pages, and a markup language is a language that uses tags to define elements within a document.
With HTML you can create static pages; however, if you combine CSS, JavaScript, and HTML together, you can create dynamic and more functional web pages or websites.
HTML Basic Example and HTML Syntax
Now that you have a basic idea of HTML, let’s kick off this tutorial by learning how to declare HTML Syntax. In the below HTML code:
<!DOCTYPE html> specifies that it is an HTML5 document.
<html> is the root element of the HTML page.
<head> contains page information.
<title> is the title of the page.
<body> is the document's body.
<h1> is a heading.
<p> is a paragraph.
<!DOCTYPE html>
<html>
<head>
<title> Page title </title>
</head>
<body>
<h1> My first heading</h1>
<p> My first para </p>
</body>
</html>
<!DOCTYPE html> # Document Type
<html lang="en-US"> # Language Attribute to declare the language of the Web page
<head>
# The <meta> element is used to specify the character set, page description, author and viewport.
<meta name="viewport" content="width=device-width, initial-scale=1.0"> # Setting the viewport, which is the user's visible area of a web page; initial-scale=1.0 sets the initial zoom level.
<style> # Head element is the container of title, style, meta, link, script etc.
body{background-color: red;} # Internal CSS and define style information for a single HTML page
h1{ color:red; } # Internal CSS
p {
border: 2px solid red; # Border (a style such as solid is required; a larger pixel value makes the border thicker)
padding: 30px; # Padding ( Space between text and Border)
margin: 10px; # Margin (Space outside the border)
}
a:link, a:visited, a:hover, a:active { # HTML link states for different scenarios
text-align: center;
color: blue;
}
.city { # Declaring the CSS for the class city
background-color: tomato;
color: white;
}
<link rel="stylesheet" href="styles.css"> # External CSS and link is a
<link rel="icon" type="image/x-icon" href="/images/favicon.ico" # Adding the Favicon in HTML page.
</style>
</head>
<body>
<div class="city"> # Creating a class named city
<h2> Hello this is my City </h2>
</div> # Class City ends here
<!-- --> # Comments in HTML
<p style="background: red; background-image: url('a.jpg'); background-repeat: repeat;">...........</p> # Paragraph with inline CSS
<p><a href="#C4">Jump to Chapter 4</a></p> # Creating a link to a bookmark using the ID
<h2 id="C4"> Chapter 4 </h2> # Creating a heading with an id that the link above targets to create a bookmark. IDs are unique and used with only one HTML element, whereas a class can be used by multiple HTML elements
<a href="google.com"> This is a link</a> # Link
<a href="google.com" target="_blank"> This is a link</a> # Opens the document in a new window or tab
<a href="google.com" target="_parent"> This is a link</a> # Opens the document in parent frame
<a href="google.com" target="_top"> This is a link</a> # Opens the document in full body of the window
<a href="google.com" target="_self"> This is a link</a> # Opens the document in same window
<iframe src="a.html" name="iframe_a" height="10" width="10" title="Title Iframe"></iframe> # Creating an iframe (an inline frame)
<p><a href="google.com" target="iframe_a">Hello, the link will open when clicked on the link</a></p> # Using Iframe in the link
<ol> # Ordered List
<li>Coffee</li> # Lists
<li>Tea</li>
</ol>
<img src="a.jpeg" alt="Image" width="2" height="2"> # Image
<img src="computer_table.jpeg" usermap="#workmap"> # Using Image Map
<map name="workmap">
<area shape="" coords="34,44,270,350" href="computer.htm">
<area shape="" coords="31,41,21,35" href="phone.htm">
<map>
</body>
<script> # Creating a Javascript inside the Html Page
function myfunc() {
  document.getElementById("C4").innerHTML = "Have a nice DAY ";
  var x = document.getElementsByClassName("city"); // Using the class city within the JavaScript inside the HTML page
  for (var i = 0; i < x.length; i++) {
    x[i].style.display = "none";
  }
}
</script>
</html>
<header> - Defines a header for a document or a section
<nav> - Defines a set of navigation links
<section> - Defines a section in a document
<article> - Defines an independent, self-contained content
<aside> - Defines content aside from the content (like a sidebar)
<footer> - Defines a footer for a document or a section
<details> - Defines additional details that the user can open and close on demand
<summary> - Defines a heading for the <details> element
YAML (.yaml and .yml) files are a superset of JSON. Some automation tools, such as Ansible, use YAML-based files, referred to as playbooks, to define the actions you want to automate. These playbooks use the YAML format.
Working with YAML files in Python is fun, so let's get started. To work with YAML files in Python you need to install the PyYAML library, because the standard library does not include a YAML parser. PyYAML is a YAML parser and emitter for Python.
Run the following command to install the PyYAML library in your favorite code editor terminal, such as Visual Studio Code.
pip install PyYAML
Next, create a folder named Python, and under it create a simple YAML file named apache.yml; paste the below content and save it.
---
- hosts: webservers
  vars:
    http_port: 80
    max_clients: 200
  remote_user: root
  tasks:
    - name: ensure apache is at the latest version
      yum:
        name: httpd
        state: latest
Next, create another file in the same Python folder, name it read_write_yaml.py, and paste the below Python code.
The Python script below imports the yaml module to work with YAML files and the pprint module to print output in a readable format. Using the open() function it opens the apache.yml file and reads the data with the yaml.safe_load() method. Later, using yaml.dump() you can write the data back out. Because yaml.dump() writes to the file object and returns None, the final pprint() prints None.
import yaml
from pprint import pprint
with open('apache.yml', 'r') as new_file:
    verify_apache = yaml.safe_load(new_file)
pprint(verify_apache)
with open('apache.yml', 'w') as new_file2:
    verify_apache2 = yaml.dump(verify_apache, new_file2)
pprint(verify_apache2)
Execute the above python script using python command and you should see the below output.
[{'hosts': 'webservers',
'remote_user': 'root',
'tasks': [{'name': 'ensure apache is at the latest version',
'yum': {'name': 'httpd', 'state': 'latest'}}],
'vars': {'http_port': 80, 'max_clients': 200}}]
None
Reading and Writing an XML file using Python
XML files are used mostly for structured data. Many web systems use XML to transfer data; one example is RSS (Really Simple Syndication) feeds, which help you find the latest updates on websites from various sources. Python offers an XML library.
Next, in the same Python folder create a simple XML file named book.xml, paste the below content, and save it. XML has a tree-like structure: the top element is known as the root and the rest are elements.
<?xml version="1.0"?>
<catalog>
<book id="bk109">
<author>Author1</author>
<title>Automate Infra Part 2</title>
<genre>Science Fiction</genre>
<price>6.95</price>
<publish_date>2000-11-02</publish_date>
<description>book1</description>
</book>
<book id="bk112">
<author>Author2</author>
<title>Automate Infra Part 1</title>
<genre>Computer</genre>
<price>49.95</price>
<publish_date>2001-04-16</publish_date>
<description>book2</description>
</book>
</catalog>
Next, create another file in the same Python folder, name it read_write_xml.py, and paste the below Python code.
In the script below, importing the xml.etree.ElementTree module lets you work with XML files; it implements a simple and efficient API for parsing and creating XML data. The entire XML tree is parsed, that is, it reads the book.xml file and then prints the tags and attributes inside it.
import xml.etree.ElementTree as ET
tree = ET.parse('book.xml') # parse the whole document
root = tree.getroot() # finding the root
print(root.tag,root.attrib)
for child in root: # Each child and its attributes
    print(child.tag,child.attrib)
Execute the above python script using python command and you should see the below output.
O/P:
catalog {}
book {'id': 'bk109'}
book {'id': 'bk112'}
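The script above only covers the reading half; writing can look like the following sketch, where the new price value is an arbitrary example.
# Hedged sketch: updating an element and writing the tree back out
import xml.etree.ElementTree as ET

tree = ET.parse('book.xml')
root = tree.getroot()
for price in root.iter('price'):   # visit every <price> element
    price.text = '9.99'            # arbitrary new value, for illustration only
tree.write('book_updated.xml')     # write the modified tree to a new file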
Reading and Writing a comma-separated values (CSV) file using Python
CSV is the most widely used spreadsheet format. To work with these files in Python you need to import the csv module. Let's learn how to read and write CSV data.
Next, in the same Python folder create a CSV file named devops.csv, add at least five rows of your own (the script below reads five rows), and save it.
Next, create another file in the same Python folder, name it read_write_csv.py, and paste the below Python code.
The script below uses the csv module to work with CSV files. When the script is executed, the open() function opens the CSV file, csv.reader() reads it, and the rows are printed according to the defined range.
import csv
with open('devops.csv' , 'r') as csv_file:
    read = csv.reader(csv_file, delimiter=',')
    for _ in range(5):
        print(next(read))
    print(read)
Execute the above Python script using the python command; it prints the first five rows of your CSV file.
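The reading half is shown above; writing a CSV can look like the following sketch, where the output file name and rows are made-up examples.
# Hedged sketch: writing rows to a new CSV file (file name and rows are made-up examples)
import csv

rows = [
    ['name', 'role'],
    ['jenkins', 'ci-server'],
    ['maven', 'build-tool'],
]
with open('devops_copy.csv', 'w', newline='') as csv_file:
    writer = csv.writer(csv_file, delimiter=',')
    writer.writerows(rows)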
pandas.DataFrame acts like a data table, similar to a very powerful spreadsheet. If you want to work with rows or columns the way you would in a spreadsheet, DataFrames are the tool for you. Let's get started by installing pandas with pip install pandas.
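As a quick sketch, the same devops.csv can be loaded into a DataFrame as shown below; the columns printed will depend on what your file contains.
# Hedged sketch: loading the CSV into a pandas DataFrame (requires: pip install pandas)
import pandas as pd

df = pd.read_csv('devops.csv')
print(df.head())     # first five rows, similar to the csv.reader loop above
print(df.columns)    # the column (header) names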
PYTHON: DEALING WITH LARGE FILES (FILE BREAKER AND LINE BREAKER)
Rather than loading the whole file into memory as you have done up until now, you can read one line at a time, process the line, and then move to the next. The lines are removed from memory automatically by Python’s garbage collector, freeing up memory.
# LINE BREAKER
with open("devops.txt",mode="r") as mynewfile: # if you open any binary file such as pdf keep w as wb
with open("devops-corrected.txt", "w") as target_file:
for line in mynewfile:
print(target_file.write(line))
o/p:
Automateinfra.com
automateinfra.com/blog
automateinfra.com/solutions
# FILE BREAKER with chunk of data with number of bytes
with open('book.xml' , 'rb') as sourcefile:
    while True:
        chunk = sourcefile.read(1024) # break down in 1024-byte chunks
        if chunk:
            print(chunk)
        else:
            break
O/P:
b'<?xml version="1.0"?>\r\n<catalog>\r\n <book id="bk109">\r\n <author>Author1</author>\r\n <title>Automate Infra Part 2</title>\r\n <genre>Science Fiction</genre>\r\n <price>6.95</price>\r\n
<publish_date>2000-11-02</publish_date>\r\n <description>book1</description>\r\n </book>\r\n <book id="bk112">\r\n <author>Author2</author>\r\n <title>Automate Infra Part 1</title>\r\n <genre>Computer</genre>\r\n <price>49.95</price>\r\n <publish_date>2001-04-16</publish_date>\r\n <description>book2</description>\r\n </book>\r\n</catalog>'
PYTHON ENCRYPTION: MOST IMPORTANT TOPIC OF PYTHON FILE SYSTEM
There are many times you need to encrypt text to ensure security. In addition to Python's built-in package hashlib, there is a widely used third-party package called cryptography.
HASHLIB: uses hash functions based on the SHA-1, SHA-224, SHA-384, SHA-512, and RSA's MD5 algorithms.
CRYPTOGRAPHY:
Symmetric key encryption: it is based on shared keys. These algorithms include the Advanced Encryption Standard (AES), Blowfish, Data Encryption Standard (DES), Serpent, and Twofish.
Asymmetric key encryption: it is based on public keys (which are widely shared) and private keys (which are kept secret).
# Encryption using HashLib
import hashlib # Python Built in Package
line = "I like editing automateinfra.com"
bline = line.encode() # Convert the string into a binary (bytes) string
print(bline) # Print the converted binary string
algo = hashlib.md5() # Create the hash object using the hashlib MD5 algorithm
algo.update(bline) # Apply the hash algorithm to the bytes
print("Encrypted text Message")
print(algo.digest()) # Print the resulting digest
# Encryption using Cryptography (Symmetric key encryption)
from cryptography.fernet import Fernet # Third-party package, so you need: pip install cryptography
key = Fernet.generate_key() # Generate the key
print("Generating the keys ")
print(key) # Print the key
algo = Fernet(key) # Create the AES-based Fernet object with the key
message = b"I definetely like Editing AutomateInfra.com"
encrypted = algo.encrypt(message)
print("Encrypted text Message ")
print(encrypted)
print(algo.decrypt(encrypted))
# Encryption using Cryptography (ASymmetric key encryption)
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives.asymmetric import padding ,rsa
from cryptography.hazmat.primitives import hashes
private_key = rsa.generate_private_key(public_exponent=65537,key_size=4096,backend=default_backend()) # Generating the Private Key
print(private_key) # Printing the Private Key
public_key = private_key.public_key() # Generating the Public Key
print(public_key) # Printing the Public Key
message = b"I am equally liking Editing AutomateInfra.com"
encrypted = public_key.encrypt(message,padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()), algorithm=hashes.SHA256() , label=None))
print(encrypted)
decrypted = private_key.decrypt(encrypted,padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()), algorithm=hashes.SHA256(), label=None))
print(decrypted)
O/P:
b'I like editing automateinfra.com'
Encrypted text Message
b'v\x84*\xe55\x01\xa4z\x05\xa2\xa2\xdb\xd1y\xa9\x07'
Generating the keys
b'7trCiXpGuCfEnXoIcsFfCGOw-u_Qkas0tv1lBM8xmQo='
Encrypted text Message
b'gAAAAABgAJcPFj-aGttg8MRJQfRYGWyOWy44u-cLWGuDhqoyyvYP1uG4oQYms8BQMr4eExpv74LIZESGvpIUY88fE0_YQCQ32JH0DZsabLOAtc00QCwV8L51WktRjzUab0Fp3jnbOeb2'
b'I definetely like Editing AutomateInfra.com'
<cryptography.hazmat.backends.openssl.rsa._RSAPrivateKey object at 0x036491D8>
<cryptography.hazmat.backends.openssl.rsa._RSAPublicKey object at 0x03850E38>
b"\x8b\xec\xb0\x91\xec\xe7\x8d;\x11\xbclch\xbdVD@c\xd3J\x07'\xe9\x07\x15\x1c@=^\xd2h\xcaDL\x95\xea[\x0fv\x012\xed\xd5\xed\x0e\x9b\x93V2\x00\xba\x9c\x07\xba\x8b\xf3\xcb\x03M\xa8\xb1\x12ro\xae\xc0\xfb$\xf9\xcc\x85\xe8s\xfc`{\xfe{\x88\xd2\xc3\xffI\x90\xe3\xd2\x1e\x82\x95\xdfe<\xd5\r\x0b\xc4z\xc4\xf7\x00\xcfr\x07npm0\xd4\xc4\xa4>w\x9d]\xcf\xae7F\x91&\x93\xd5\xda\xcaR\x13A\x8ewB\xf6\xd9\xae\xce\xca\x8f\xd6\x91\x06&:\x00\xa0\x84\x05#,\x7fdA\x87\xb2\xe7\x1d\x8b*\xa15\xf8\xb0\x07\xa0n\x1e\xeaI\x02\xbaA\x88ut\x8e\x82<\xfe\xbfM\xe6F\xa3\xcc\xd4\x8b\x80PY\xb5\xd3\x14}C\xe2\x83j\xaf\x85\xa6\x9e\x19\xb2\xd9\xb8\xac\xa4\xfb\x1f\x0c\xce\x9d4\x82\x1e\xfd5\xb49\xa5\xbbL\x01~\x8fA\xee\r\xc7\x84\x9e\x0c\t\x15z\r\xfd]\x0b\xcfW\x01\xd2\x16\x17btc\xeaSl\xf5\xb0\x8a\xe2X\xe7\xa7a\xa7\xf7M\x01\xa2\x0b8\xd6\xf2\xc5c\xbf\xea\xe0\x80\x15\xde-\x98\xa1\xc8ud*\xbel2\xb5\xc8:\x92\xd5\r(_8\xbd\xcb\x80\xf1\x93\x83\xe2\x9f\xed\x82f\xd0\xb2\x8f\x1b\x9eMC\x07\xf9\x08\xb0\x00QA\xea\x93\xc7@&\x84\xff<\xde\x80@\xc8\xc6\x83O&%\x91r-\xb0\xef}\x18tU{C\xa6\x17\x97\x1b\x95g\xc5\x0e>{\xb0\x94a)\xbc)*Sq\x98\xad\xf3>\x04\x9b+x\x95&\xa6\xe6,\xb4~\xf2Y\x06,\xab'uq \x9f0\x7f\xb5\xd50\xbdp\xbb\xdf\x1c\xe9\xb1\xc4\x88y\nq\\\x85\x1e\xd8\x18M\x87\x1aU.\x918;\xcd\x10 \x9b\x11\xf9R\xd3\x8fz\xe8\xf6|C\xfb\x1f\xfd1\x19\x10:>\x1c\x06\x8e\xda\x98\xb2\xf3aa^\xa54\x03\xf8\x03\xc4\xe6\xd9mw\r\x8b\x96\xa2rJ\x03\xe7\xda\x0f\rJ-iPo!^\x8a\xdcg\x8c!L\xa4\xedY\xe5\x12\xdf\xe8\xe7\x0cE\xcd\xa2\xa2Gr\xc0\xe1\xa6\xc5\x9a\x9f\x07\x89\x84\x8b\xb7"
b'I am equally liking Editing AutomateInfra.com'
PYTHON OS MODULE:
This module helps you make many low-level operating system calls and offers a consistent interface across operating systems such as Unix and Windows.
import os # Python built-in package
print(os.listdir('.')) # List the directory contents
os.rename('automateinfra.txt','automateinfra_backup.txt') # Rename the file
os.chmod('automateinfra_backup.txt', 0o777) # Add permissions to the (renamed) file, using an octal mode
os.mkdir('/tmp/automateinfra.pdf') # Make the directory
os.rmdir('/tmp/automateinfra.pdf') # Remove the directory
os.stat('b.txt') # These stats include st_mode, the file type and permissions, and st_atime, the time the item was last accessed.
cur_dir = os.getcwd() # Get the current working directory.
print(os.path.dirname(cur_dir)) # Returns the parent directory path
print(os.path.split(cur_dir)) # Splits the path into the parent directory and the base name
print(os.path.basename(cur_dir)) # Returns the base name
while os.path.basename(cur_dir): # Keep going until the base name is empty
    cur_dir = os.path.dirname(cur_dir) # Walk up to the parent directory
    print(cur_dir)
O/P:
C:\Users\AutomateInfra\Desktop\GIT\Python-Desktop
('C:\\Users\\AutomateInfra\\Desktop\\GIT\\Python-Desktop', 'Basics')
Basics
C:\Users\AutomateInfra\Desktop\GIT\Python-Desktop
C:\Users\AutomateInfra\Desktop\GIT
C:\Users\AutomateInfra\Desktop
C:\Users\AutomateInfra
C:\Users
C:\
import os
# Check the current working directory
file_name = "automateinfra.txt"
file_path = os.path.join(os.getcwd(), file_name)
print(f"Checking {file_path}")
if os.path.exists(file_path):
    print(file_path)
# Check user home directory
home_dir = os.path.expanduser("~/") #expanduser function to get the path to the user’s home directory.
file_path = os.path.join(home_dir,file_name)
print(f"Checking {file_path}")
if os.path.exists(file_path):
    print(file_path)
o/p:
Checking C:\Users\Automateinfra\Desktop\GIT\Python-Desktop\Basics\automateinfra.txt
C:\Users\Automateinfra\Desktop\GIT\Python-Desktop\Basics\automateinfra.txt
Checking C:\Users\Automateinfra/automateinfra.txt
An API is an interface that allows communication between a client and a server to simplify building client-server software.
An API is software that allows two applications to talk to each other. Each time you use an app like Facebook, send an instant message, or check the weather on your phone, you're using an API.
When you use an application on your mobile phone, the application connects to the Internet and sends data to a server. The server then retrieves that data, interprets it, performs the necessary actions, and sends it back to your phone. The application then interprets that data and presents you with the information you wanted in a readable way. This is all made possible by APIs.
Difference between Types of API’s [ SOAP v/s REST ]
REST: Representational State Transfer. It is a lightweight and scalable service built on the REST architectural pattern and uses the HTTP protocol.
Elements of REST API:
Method: GET, POST, PUT, DELETE
POST – used to send data to the server, such as customer information, or to upload a file using the RESTful web service. To send the data, use form parameters and the body payload.
GET – used to retrieve data from the server using the RESTful web service. It only extracts data; there is no change to the data. No payload or body is required. To filter the data, use query parameters.
PUT – used to update resources using the RESTful web service.
DELETE – used to delete a resource using the RESTful services.
Request headers: additional instructions that are sent along with the request.
Request body: data sent along with a POST request when the client wants to add a resource to the server.
Response status code: returned along with the response, such as 200 or 500. A short requests-based sketch of these elements follows.
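The sketch below illustrates these elements with the third-party requests library against a placeholder URL (api.example.com is not a real endpoint); the payload and parameters are made-up examples.
# Hedged sketch of the GET and POST elements described above (URL and payload are placeholders)
import requests

# GET: retrieve data, passing filters as query parameters
response = requests.get("https://api.example.com/customers", params={"id": 42})
print(response.status_code)              # e.g. 200 on success, 500 on a server error
print(response.headers["Content-Type"])  # a response header
print(response.json())                   # the parsed response body

# POST: send data to the server in the request body
payload = {"name": "automateinfra", "plan": "blog"}
created = requests.post("https://api.example.com/customers", json=payload)
print(created.status_code)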
Characteristics of REST
REST is an architectural style in which a web service can only be treated as a RESTful service if it follows these constraints: 1. Client-Server 2. Stateless 3. Cacheable 4. Layered System 5. Uniform Interface.
Stateless means that the state of the application is not maintained in REST. For example, if you delete a resource from a server using the DELETE command, you cannot expect that delete information to be passed to the next request. This is required so that the server can process each request appropriately on its own.
The cache concept helps with the statelessness described in the last point. Since each client request is independent in nature, sometimes the client might ask the server for the same request again, and caching lets that response be served without redoing the work.
REST uses uniform resource locators to access the components hosted on the server. For example, if an object representing the data of an employee is hosted at automateinfra.com, URIs such as automateinfra.com/blog can exist to access it.
SOAP: Simple Object Access Protocol.
SOAP follows strict rules for communication between client and server; unlike REST, it does not follow the constraints of Uniform Interface, Client-Server, Stateless, Cacheable, Layered System, and Code on Demand.
SOAP was designed with a specification. It includes a WSDL file which has the required information on what the web service does in addition to the location of the web service.
The other key challenge is the size of the SOAP messages which get transferred from the client to the server. Because of the large messages, using SOAP in places where bandwidth is a constraint can be a big issue.
SOAP uses service interfaces to expose its functionality to client applications. In SOAP, the WSDL file provides the client with the necessary information which can be used to understand what services the web service can offer.
SOAP uses only XML to transfer or exchange information, whereas REST can use plain text, HTML, JSON, XML, and more.
Application Programming Interface theory (API-theory)
When a website is owned by a single owner such as Google: the frontend and backend may be written in different languages, which can cause a lot of compatibility issues (for example, the frontend uses Angular and the backend uses Java), so you need an API to connect them.
When your client needs to access data from your website, you expose an API rather than exposing your entire code and packages.
When a client connects to another client or a server using an API, the data is transmitted using either XML or JSON, which are language independent.
Jenkins is an open source automated CI/CD tool, where CI stands for continuous integration and CD stands for continuous delivery. Jenkins has its own built-in Java servlet container server, Jetty. Jenkins can also run in other servlet containers such as Apache Tomcat or GlassFish.
Jenkins is used to perform smooth and quick deployments. It can deploy to a local machine, an on-premises data center, or any cloud.
Jenkins takes your code, whether Python, Java, Go, or JavaScript, and builds it using tools such as Maven, one of the most widely used build tools, packaging it as a WAR or ZIP file and sometimes as a Docker image. Finally, once everything is built properly, it deploys it as and when required. It integrates very well with lots of third-party tools.
JAVA_HOME and PATH are variables to enable your operating system to find required Java programs and utilities.
JAVA_HOME: JAVA_HOME is an operating system (OS) environment variable that can optionally be set after either the Java Development Kit (JDK) or the Java Runtime Environment (JRE) is installed. The JAVA_HOME environment variable points to the file system location where the JDK or JRE was installed. This variable should be configured on all OSs that have a Java installation, including Windows, Ubuntu, Linux, Mac, and Android.
The JAVA_HOME environment variable is not actually used by the locally installed Java runtime. Instead, other programs installed on a desktop computer that requires a Java runtime will query the OS for the JAVA_HOME variable to find out where the runtime is installed. After the location of the JDK or JRE installation is found, those programs can initiate Java-based processes, start Java virtual machines and use command-line utilities such as the Java archive utility or the Java compiler, both of which are packaged inside the Java installation’s \bin directory.
JAVA_HOME if you installed the JDK (Java Development Kit) or
JRE_HOME if you installed the JRE (Java Runtime Environment)
PATH: Set the PATH environment variable if you want to be able to conveniently run the executables (javac.exe, java.exe, javadoc.exe, and so on) from any directory without having to type the full path of the command. If you do not set the PATH variable, you need to specify the full path to the executable every time you run it, such as:
C:\Java\jdk1.8.0\bin\javac Myprogram.java
# The following is an example of a PATH environment variable:
C:\Java\jdk1.7.0\bin;C:\Windows\System32\;C:\Windows\;C:\Windows\System32\Wbem
Installing Jenkins using msi installer on Windows Machine
MSI is an installer file format that installs your program on the target system. Setup.exe is an application (executable file) that can have MSI file(s) as one of its resources. .msi is the file extension of Windows Installer packages. An MSI file is a compressed package of installer files. It consists of all the information pertaining to adding, modifying, storing, or removing the respective software, and includes the data, instructions, processes, and add-ons that are necessary for the application to work normally.
EXE is short for Executable. This is any kind of binary file that can be executed. All windows programs are exe files. Prior to MSI files, all installers were EXE files. The exe is a file extension of an executable file. An executable file executes a set of instructions or a code when opening it. An executable file is compiled from source code to binary code. It can be directly executed by the Windows OS. These files are understandable by the machine, and they can be directly executed by the operating system
MSI is a file extension of windows installer which is a software component of Microsoft Windows used for the installation, maintenance, and removal of software. Whereas, exe is a file extension of an executable file that performs indicated tasks according to the encoded instructions.
Select the Port 8080 and click on Test Port and then Hit Next.
Provide the admin password from the provided Path mentioned in RED color.
Further install the plugins required for jenkins.
Next, it will prompt for the first admin user. Please fill in the required information and keep it safe, as you will use it to log in.
Now the Jenkins URL configuration screen will appear; keep it as it is for now.
Click on Save and Finish.
Now your Jenkins is ready. Click on Start using Jenkins, and soon you will see the Jenkins Dashboard. You can create new jobs by clicking on New Item.
Installing Jenkins using jenkins.war on Windows Machine
Similarly, now download jenkins.war from the Jenkins URL by clicking on Generic Java package (.war).
Next run the command as below.
java -jar jenkins.war --httpPort=8181
Next, copy the Jenkins password from the log output and paste it as you did earlier in the Windows MSI section (point 5), then follow the rest of the points.
Installing Jenkins on Apache Tomcat server on Windows Machine
Install Apache Tomcat on the Windows machine from https://tomcat.apache.org/download-90.cgi by clicking on the Tomcat installer that matches your system. This tutorial was performed on a 64-bit Windows machine.
Next, unzip the Tomcat installation folder and copy the jenkins.war file into the webapps folder.
Next, go inside the bin folder and start Tomcat by running the startup batch script.
Finally, you will notice that Apache Tomcat has started, and Jenkins along with it.
Now, navigate to the localhost:8080 URL and you should see the Tomcat page as shown below.
Further, navigate to localhost:8080/jenkins to redirect to Jenkins Page.
Configuring the Jenkins UI
First, click on Manage Jenkins and then navigate to Configure System.
Next, add a system message and save it; this message is then displayed on the Jenkins dashboard every time, as below.
To configure the names of the jobs, add the name pattern as below.
Next, try creating a new Jenkins job with a random name; Jenkins will not allow it and will display an error message.
Managing User’s and Permission’s in Jenkins UI
Go to Manage Jenkins and Navigate to Manage users in the Jenkins UI.
Then Create three users as shown below admin, dev, qa.
Next, Navigate to Manage Jenkins and choose Configure Global Security.
Next select Project-based Matrix Authorization Strategy and define the permissions for all users as you want.
Role Based Strategy
In the previous section you noticed that adding every user and granting permissions individually is a tedious job. So, instead, create a role and add users to it. To do that, the first step is to install the plugin as shown below.
Next, select Role-Based Strategy as shown below and define the permissions for all users as you want.
Next, navigate to Manage Jenkins, then to Manage and Assign Roles, and then click on Manage Roles.
Add 3 global roles named DEV Team, QA Team, and admin.
Add 2 item roles, developers and testers, with defined patterns so that job names are matched accordingly.
Next, click on Assign Roles.
Assign the roles as shown below.
Conclusion
In this tutorial you learned several ways to install Jenkins on Windows, how to configure the Jenkins dashboard UI, and how to manage users and permissions.
Are you a Python developer trying to understand how the Python language works? This article is for you; you will learn every bit and piece of the Python language. Let's dive in!
Python
Python is a high-level language used for designing, deploying, and testing in lots of places. It is consistently ranked among today's most popular programming languages. It is a dynamic, object-oriented language that also supports procedural styles, and it runs on all major hardware platforms. Python is an interpreted language.
High Level v/s Low Level Languages
High-Level Language: a high-level language is easier to understand because it is human readable. It is either compiled or interpreted. It consumes more memory and is slower in execution. It is portable. It requires a compiler or an interpreter for translation.
Low-Level Language: low-level languages are machine friendly, that is, machines can read the code but humans cannot easily. They consume less memory and are fast to execute. They cannot be ported. They require an assembler for translation.
Interpreted v/s Compiled Language
Compiled Language: a compiled language is first compiled and then expressed in the instructions of the target machine, that is, machine code. For example: C, C++, C#, COBOL.
Interpreted Language: an interpreter is a computer program that directly executes instructions written in a programming or scripting language, without requiring them to have been previously compiled into a machine language program; these kinds of languages are known as interpreted languages. For example: JavaScript, Perl, Python, BASIC.
Python vs C++/C Language Compilation Process
C++ or C Language: these languages need compilation, which means the human-readable code has to be translated into machine-readable code. The machine code is executed by the CPU. Below is the sequence in which code execution takes place.
Human-readable code is written.
Compilation takes place.
The compiled code is an executable file in machine code format (understood by the hardware).
The executable file is executed by the CPU.
Python Language:
Python is a high-level language.
Bytecode, also termed p-code, is a form of instruction set designed for efficient execution by a software interpreter.
Python code is written in .py format, such as test.py.
Python code is then compiled into .pyc or .pyo format, which is byte code, not machine code (not understood by the hardware), using the Python interpreter.
Once your program has been compiled to byte code (or the byte code has been loaded from existing .pyc files), it is shipped off for execution to something generally known as the Python Virtual Machine (PVM).
Byte code is converted into machine code by the PVM (Python Virtual Machine).
The byte code, that is test.pyc, is further converted into machine code (such as 10101010100010101010) by the virtual machine.
Finally, the program is executed and the output is displayed.
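To see the byte-code step yourself, the small sketch below assumes a test.py file sits next to it; py_compile writes the .pyc file and dis shows the instructions the PVM executes.
# Hedged sketch: compiling a module to byte code and inspecting byte-code instructions
import dis
import py_compile

# Compile test.py to a .pyc file (written under __pycache__ on Python 3)
pyc_path = py_compile.compile('test.py')
print(pyc_path)

# Disassemble a tiny function to see the byte code the Python Virtual Machine executes
def add(a, b):
    return a + b

dis.dis(add)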
Conclusion
In this tutorial, you learned how the Python language works and interacts with the operating system and hardware. So, which application are you planning to build using Python?
Can't we isolate our apps so that they are independent of each other and run perfectly? The answer is absolutely "YES"; that's very much possible with Docker and containers. They provide an isolated environment and are your friend for deploying many applications, each in its own container. You can run as many containers in Docker as you like, and they are independent of each other while all sharing the same kernel.
In this tutorial we will go through a simple demonstration of a Python application running on the Docker engine.
Table of content
What is Python ?
What is docker ?
Prerequisites
Create a Python flask application
Create a Docker file
Build Docker Image
Run the Python flask application Container
Conclusion
What is Python ?
Python is a language with which you create web applications and system scripts. It is used widely across organizations and is very easy to learn. Python apps benefit from an isolated environment to run well, which is exactly what Docker and containers, used in this tutorial, provide.
If you wish to know more about python please visit our Python’s Page to learn all about Python.
What is docker ?
Docker is an open source tool for developing, shipping, and running applications. It has the ability to run applications in a loosely isolated environment using containers, in which you can isolate your applications, and it helps you manage containers in a very smooth and effective way. Docker is quite similar to a virtual machine, but it is lightweight and can be ported easily.
Containers are lightweight as they are independent of a hypervisor's load and configuration. They connect directly with the machine, i.e., the host's kernel.
Prerequisites
An Ubuntu machine, preferably version 18.04 or later; if you don't have a machine you can create an EC2 instance in your AWS account.
Create and activate a virtual environment named venv:
virtualenv venv
source venv/bin/activate
Finally, install Flask:
pip install flask # Install Flask from pip
Now create a file named app.py, where we will write our first Python Flask code as below.
from flask import Flask # Importing the class Flask
app = Flask(__name__) # Creating the Flask class object.
@app.route('/') # app.route informs Flask about the URL to be used by the function
def func(): # Creating a function
    return("Iam from Automateinfra.com")
if __name__ == "__main__": # Program starts from here.
    app.run(debug=True)
Create one more file in the same directory, name it requirements.txt, and define the Flask application's dependency in it.
Flask==1.1.1
Now our Python code app.py and requirements.txt are ready. Let's execute the code using the below command.
python app.py
Great, our Python Flask application ran successfully on our local machine. Now we need to run the same code on Docker, so let's move to the Docker part.
Create a Dockerfile
A Dockerfile is used to create customized Docker images on top of a base Docker image. It is a text file that contains all the commands needed to build or assemble a new Docker image. Using the docker build command we can create new customized Docker images; each instruction basically adds another layer on top of the base image. Using the newly built Docker image we can run containers in the usual way.
Create the Dockerfile and name it Dockerfile. Keep this file in the same directory as app.py and requirements.txt.
# Sets the base image
FROM python:3.8
# Sets the working directory in the container
WORKDIR /code
# Copy the dependencies file to the working directory
COPY requirements.txt .
# Install dependencies
RUN pip install -r requirements.txt
# Copy the application code from the local directory to the working directory
COPY app.py .
# Command to run on container start
CMD [ "python", "./app.py" ]
Build docker Image
Now we are ready to build our new image, so let's build it.
docker build -t myimage .
You should now see the Docker image.
docker images
Run the Python flask application Container
Now run our first container using the same Docker image (myimage).
docker run -d -p 5000:5000 myimage
Verify that the container was successfully created.
docker ps -a
Conclusion
In this tutorial we covered what Docker is, what Python is, and how to run a Python Flask application on the Docker engine inside a container.
Hopefully this tutorial helps you understand and set up Python Flask applications on the Docker engine on an Ubuntu machine.