Human Face, Emotion and Race Detection with Python

A mini OpenCV machine learning project.

Raj Kumar
Analytics Vidhya

--

We use the OpenCV and deepface libraries, along with the haarcascade_frontalface_default.xml file, to detect a human face and predict the facial emotion and race of a person in an image.

Before detection
After detection

You can download this image from https://unsplash.com/photos/vtwjyEelY08

Before we start, I assume that you know Python basics and have already installed Python.

Other installations required are:

OpenCV — install it using pip install opencv-python

deepface — install it using pip install deepface

matplotlib — install it using pip install matplotlib

Download haarcascade_frontalface_default.xml from https://raw.githubusercontent.com/opencv/opencv/master/data/haarcascades/haarcascade_frontalface_default.xml and save it in the same folder as your project.

What is the haarcascade_frontalface_default.xml file?

It is a pre-trained classifier that detects faces in an image and returns the coordinates of each face, which we can then use to draw rectangles, squares, or any other shape around the face.

Now let's get into the coding part (I’m doing this in a Jupyter notebook).

First, let's import the required libraries, load happy_boy.jpg (the image we downloaded earlier), and display it.
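Here is a minimal sketch of that step (the file name happy_boy.jpg is the image downloaded above; the variable names are just my choices):

```python
import cv2
import matplotlib.pyplot as plt

# Read the downloaded image from the project folder.
img = cv2.imread('happy_boy.jpg')

# Show the image (the colors will look wrong here; see the next step).
plt.imshow(img)
plt.show()
```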

Output:

OpenCV loads images in BGR order, so the colors look wrong when we display the image directly. Before displaying, we need to convert it to RGB so it shows in its true colors. To do so we use the cvtColor() function from cv2.
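A small sketch of that conversion, assuming img is the image we loaded above:

```python
# OpenCV stores images in BGR order; matplotlib expects RGB.
color_img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)

plt.imshow(color_img)
plt.show()
```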

Output:

Predicting the emotion and race of the kid in the picture using DeepFace

Import DeepFace and analyze our color_img.

It will analyze the image and save the detected emotion and race to the ‘prediction’ variable.
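Roughly, the call looks like this. Restricting actions to emotion and race is my assumption (by default analyze() also predicts age and gender), and newer deepface versions return a list with one dictionary per detected face rather than a single dictionary:

```python
from deepface import DeepFace

# Analyze the image; the first run downloads the required model weights.
prediction = DeepFace.analyze(color_img, actions=['emotion', 'race'])
```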

Let’s see what it predicted.
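For example, printing the variable shows all the fields returned by the analysis:

```python
# Inspect everything the analysis returned.
print(prediction)
```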

Output:

In the output, the predicted values are dominant_emotion and dominant_race. The rest of the fields are the per-class scores used to arrive at those predictions; dominant_emotion and dominant_race are the only ones we need to look at.

We can access the dominant_emotion and dominant_race values the same way we read any other Python dictionary.
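A quick sketch (if your deepface version returns a list, read prediction[0] first):

```python
# Read the two predicted labels like ordinary dictionary values.
print(prediction['dominant_emotion'])
print(prediction['dominant_race'])
```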

So DeepFace.analyze() has predicted the kid’s emotion as happy and his race as asian.

Detecting Faces

Detecting faces takes only a few lines with OpenCV.

We only need the haarcascade_frontalface_default.xml file and the OpenCV library to detect a face in an image. Here is the code.
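The exact cell is in the notebook linked at the end; below is a sketch of how this typically looks, reusing img and color_img from the earlier cells (the scaleFactor and minNeighbors values are my assumptions):

```python
import cv2
import matplotlib.pyplot as plt

# Load the pre-trained Haar cascade face detector downloaded earlier.
face_cascade = cv2.CascadeClassifier('haarcascade_frontalface_default.xml')

# Haar cascades work on grayscale images.
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

# detectMultiScale returns one (x, y, w, h) rectangle per detected face.
faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=4)

# Draw a green rectangle around each detected face.
for (x, y, w, h) in faces:
    cv2.rectangle(color_img, (x, y), (x + w, y + h), (0, 255, 0), 2)

plt.imshow(color_img)
plt.show()
```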

Output:

Putting it all together

As you can see, we have detected the emotion, race, and face of the kid in the image. The final step is to put everything together, which means displaying the predicted emotion and race on the image itself.
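A sketch of this final step, reusing prediction and faces from the previous cells (the font, position offset, text size, and color below are my choices for illustration):

```python
# Build the label text from the two predicted values.
label = prediction['dominant_emotion'] + ', ' + prediction['dominant_race']
font = cv2.FONT_HERSHEY_SIMPLEX

for (x, y, w, h) in faces:
    # Box the face and write the label just above the box.
    cv2.rectangle(color_img, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.putText(color_img, label, (x, y - 10), font, 0.9, (0, 255, 0), 2, cv2.LINE_AA)

plt.imshow(color_img)
plt.show()
```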

Output:

If you are not sure about all the arguments we passed to the cv2.putText() function, here is the syntax:

cv2.putText(img, ‘ text we want to put on the image ’, ‘ (x, y) position where the text starts ’, font, text size (font scale), color, text thickness, line type)

So this is how we detect a face, human emotion, and race.

Now go and try this on other images, including your own.

We can also use this with a webcam or front camera, but it is quite slow. You can visit the repository (link given below) for that code (file name: deep_face.py).

If you want to see the full project, visit https://github.com/raj26kumar/human_emotion_race_detection/blob/main/face_emotion.ipynb

Thanks for reading.
