The following tasks will be performed in this article:
The first step in this task is to start our VM running RHEL 8 and install the 'httpd' package, where 'httpd' is the Apache HyperText Transfer Protocol (HTTP) server program. It is designed to run as a standalone daemon process; when run this way, it creates a pool of child processes or threads to handle requests.
yum install httpd
→ Collecting Data
import cv2
import numpy as np

# Load HAAR face classifier
face_classifier = cv2.CascadeClassifier('haarcascade_frontalface_default.xml')

# Load functions
# Function detects faces and returns the cropped face
# If no face is detected, it returns None
def face_extractor(img):
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    faces = face_classifier.detectMultiScale(gray, 1.3, 5)
    if len(faces) == 0:
        return None
    # Crop all faces found
    for (x, y, w, h) in faces:
        cropped_face = img[y:y+h, x:x+w]
    return cropped_face

# Initialize Webcam
cap = cv2.VideoCapture(0)
count = 0

# Collect 100 samples of your face from webcam input
ret, frame = cap.read()…
Cyber-attacks have become the biggest threat to computer and network systems around the world. Because of this, it is important to deploy an IDS that can detect and analyze data with high accuracy (i.e., true positives and true negatives) and low false detection (i.e., false positives and false negatives) in minimal detection time. So a K-Means clustering detection model, applying data mining and particularly clustering methods, is a notable approach that can be explored to overcome this problem. …
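As a rough sketch of the idea (the connection features and values below are invented for illustration, not taken from any real IDS dataset), K-Means can partition connection records into clusters, after which the smaller cluster is flagged as suspicious:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical connection features: [duration, bytes_sent, bytes_received]
rng = np.random.default_rng(42)
normal = rng.normal(loc=[1.0, 500, 800], scale=[0.5, 100, 150], size=(200, 3))
attack = rng.normal(loc=[0.1, 9000, 50], scale=[0.05, 500, 20], size=(20, 3))
X = np.vstack([normal, attack])

# Scale features so no single feature dominates the distance metric
X_scaled = StandardScaler().fit_transform(X)

# Cluster into two groups; the smaller cluster is treated as anomalous
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X_scaled)
labels = kmeans.labels_
suspicious = min(set(labels), key=lambda c: (labels == c).sum())
print("Connections flagged as suspicious:", int((labels == suspicious).sum()))
```

On real traffic the features would come from network logs and the number of clusters would need tuning; this only demonstrates the clustering step itself.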
- Pull the CentOS container image from Docker Hub and create a new container
- Install Python on top of the Docker container
- In the container, copy/create the machine learning model you built in a Jupyter notebook
- Create a blog/article/video describing, step by step, how you completed this task
- Submit the link to the blog/article or video
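The first three steps above could also be sketched as a Dockerfile (the CentOS tag, the installed packages, and file names such as salary_model.pkl are assumptions; the task itself can equally be done interactively inside a running container):

```dockerfile
# Pull the CentOS base image from Docker Hub
FROM centos:7

# Install Python and the libraries the model needs
RUN yum install -y python3 && \
    pip3 install scikit-learn pandas joblib

# Copy the trained model exported from the Jupyter notebook
COPY salary_model.pkl /model/salary_model.pkl
COPY predict.py /app/predict.py

CMD ["python3", "/app/predict.py"]
```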
Step 1 — Here, we are using a Jupyter Notebook to create a machine learning model with a salary dataset that predicts the estimated salary.
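A minimal sketch of such a model, using a tiny hand-made stand-in for the salary dataset (the real task would load a CSV; the column meanings and values here are assumptions):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Tiny stand-in for the salary dataset (years of experience vs salary);
# in the actual task this would be loaded from a CSV with pandas
years = np.array([[1.0], [2.0], [3.0], [4.0], [5.0]])
salary = np.array([40000, 50000, 60000, 70000, 80000])

# Fit a simple linear regression: salary as a function of experience
model = LinearRegression().fit(years, salary)

# Predict the estimated salary for 6 years of experience
predicted = model.predict([[6.0]])[0]
print(round(predicted))  # → 90000
```

The fitted model can then be saved (for example with joblib) so it can be copied into the container in the later steps.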
Under The Guidance Of Mr. Vimal Daga Sir !!
We have finally completed it!! Here I have learned topics from zero to hero:
→ grep and Linux basic commands
→ useradd
→ Linux permissions on files and folders
etc., etc., etc.
Thank You So Much, Sir, for the Wonderful Sessions!
→ AI ON AWS — I utilized my last weekend by attending a 2-day workshop on “AI Services on AWS”. The workshop was mentored by Mr. Vimal Daga Sir.
The way Sir conducts the live sessions inspires us to think of great projects to build based on what he taught us.
We Explored These Services OF AI on AWS →
— Amazon Rekognition
— Amazon Polly
— Amazon Lex
— Amazon Kendra
— Amazon Comprehend
— Amazon CodeGuru
— Amazon Forecast
— Amazon Textract
— Amazon Fraud Detector
— Amazon Personalize
— Amazon Translate
— Amazon Transcribe
During the 2015 Google I/O keynote address in San Francisco, Google revealed that it was working on improving its search engine.
These improvements are powered by a 30-layer-deep Artificial Neural Network.
This depth of layers, Google believes, allows the search engine to process complicated searches, such as those involving shapes and colors.
Using an Artificial Neural Network allows the system to learn continuously, which in turn lets Google keep improving its search engine.
Within a few months, Google was already noticing improvements in search results.
The company reported that its error rate had dropped from 23% down to just 8%. …