FaceLog2 (DEPRECATED, favour FaceLog6)

Description

FaceLog2 is an advanced analytic that tracks unique faces in a video stream, classifies and optionally identifies them against a supplied watchlist.

  • Product name: FaceLog2
  • Version: 1.0

How to create a PAnalytics FaceLog2 object from this plugin

PAnalytics faceLog2;
PProperties faceLog2Options;
PAnalytics::Create("FaceLog2", faceLog2Options, faceLog2).OrDie();

As with all PAnalytics, it works on PFrames pulled from a video stream and releases PEvents when they are ready. FaceLog2 releases three main kinds of events: "Face", "Track" and "Sighting"; see the sections below to learn more.

Region-Of-Interest (ROI)

It is possible to restrict the video analysis area using a region-of-interest (see the parameters table below). It can be used to select a specific area of the image and/or to speed up processing.

faceLog.Set("roi", PRectanglei(20, 110, 640, 480));

Warning: coordinates of detected faces, tracks and sightings are given in the region-of-interest coordinate system; they will be shifted by the origin of the area (e.g. (20,110) in the previous example). You can use PRectanglef::Translated(), PFeatureMap::Translate(), PDetection::Translate() and PDetectionList::Translate() to relocate detections in the original full frame. The origin of the region-of-interest is also stored in PDetection: see PDetection::GetOrigin(). See also PUtils::DrawLabel() which handles this offset.

Detections

A detection represents a single detection of a face in the image of a frame. If there is more than one face in the image then multiple detection events will be released. Detection events are released per frame; there is no delay (other than the processing of the frame).

Each detection has its own unique id, called a detection-id.

Tracks

A track is a collection of detections, in successive frames, that belong to the same subject. A track event is released as soon as the track has ended, i.e. when no detection has been added to it in the next frame.

Each track has its own unique id, called a track-id.

Sightings

A sighting is a collection of tracks of the same subject. The tracks do not need to be in successive frames, there can be a break (lag) between them.

A track of a person walking through a video is often lost; for example, when they turn their head or become obscured by someone else walking in front of them. FaceLog2 attempts to join similar tracks back together using both spatial information and face-recognition. This collection of tracks is called a sighting.

After a configurable amount of time (measured in frames) during which no track has been added to a sighting, the sighting is released as an event. A sighting will therefore always be released a short while after the subject has left the scene.

Each sighting has its own unique id called a sighting-id.

Face Recognition

If a PWatchlist is provided to FaceLog2 as a parameter, then face recognition is performed on each track and sighting. Inside a watchlist each subject is assigned a unique id (subject-id) along with a description. As each track and sighting is raised, face-recognition is performed against the watchlist. If a match is found (above a user-set threshold), the track/sighting is labelled with the subject-id from the watchlist. A confidence of the match is also assigned.

It should be noted that the most reliable identity is the one from the sighting, as it is aggregated over the tracks.

Face Recognition Performance

The face recognition algorithms in Papillon are based on a powerful technique called Deep Learning. They are vastly superior to what was available a few years ago; however, they still have their limitations.

You will get the best performance when the faces in the video are within +/-30 degrees of frontal, are diffusely illuminated (with no strong shadows) and have a minimum of 50 pixels between the eyes. The algorithms will still function outside these parameters, but performance will be degraded. Ideally, face-recognition should be used as an aid to a human operator performing their task, helping them reduce the amount of data they have to search or ordering the priority of the data they have to review.

Face MetaData Classifier

A face-meta-data classifier is applied to the faces in each track and sighting. This provides extra information about the faces in the track and sighting. Currently this is limited to gender information.

FaceLog2 Parameters (how to configure FaceLog2)

Parameters are set through a PProperties class which is supplied to the PAnalytics::Create() factory constructor.

PFaceLog2Parameters is a class which can be used to help configure FaceLog2.

We aim to have a sensible set of default parameters that work effectively in many scenarios, however given the range of different use-cases for this analytic this is not always possible.

Parameter Type Required Default Description
Watchlist PWatchlist No None The watchlist of identities. If not present no face-rec performed
WatchlistOptions PWatchlistOptions No Its constructor The watchlist options
MinDetectionSize int32 No 80 The minimum detection size, in pixels. This is the minimum face it will detect (minimum is 40)
MaxDetectionSize int32 No 1000 The maximum detection size, in pixels
Threshold double No 0.0 Threshold for the face-detector
MinCertainty double No 0.0 Threshold for a track
MinDetectionsPerTrack int32 No 3 The minimum number of detections to consider it is a track
MaxDetectionsPerTrack int32 No 100 The maximum number of detections before releasing a track (see also FrameLag); this parameter should be > FrameLag
SightingSimilarity double No 0.6 The recognition threshold to use when considering to join two tracks together
FrameLag int32 No 50 How many frames to elapse before releasing a sighting (see also MaxDetectionsPerTrack)
SpatialFilter bool No true Apply spatial constraints to tracks
ROI PRectanglei No None (0,0,0,0) Region-Of-Interest: restrict video analysis to the specified area of frames

The code below shows a small snippet of how to create a FaceLog2 analytic, attach a watchlist and run it on the frames of a video.

#include <PapillonCore.h>

// Create and use a FaceLog2 analytic
void doFaceLog2(const PWatchlist& watchlist, PInputVideoStream& inputVideoStream)
{
    // Set face-log options
    PProperties faceLogOptions;
    faceLogOptions.Set("Watchlist", watchlist);
    PAnalytics faceLog;
    PAnalytics::Create("FaceLog2", faceLogOptions, faceLog).OrDie();

    // Specify a Region-Of-Interest (optional)
    //faceLog.Set("roi", PRectanglei(20, 110, 320, 200));

    // Loop through the video and get face-log events
    PFrame frame;
    PList events;
    while (inputVideoStream.GetFrame(frame).Ok())
    {
        faceLog.Apply(frame, events);
        // you can now process events (see Events section below)
        // typical event types are "Face", "Track" or "Sighting"
    }

    // Mop up any left-over events
    faceLog.Finish(events);
    // process final events...
}

Events

This analytic provides three main types of events: Face, Track and Sighting (a Track-In-Progress event is also released while a track is being built, see below). The type of event can be checked through PEvent::GetType().

Face Events

If PEvent::GetType() returns "Face", then the event is a face event. A face event is a detection of a face in an image.

The payload of the face event includes the following information (use PEvent::GetPayload() to access these data):

Parameter Type Description
Detection PDetection The detection
Description PDescription A description of the face (can be used to extract metadata)

Track and Track-In-Progress Events

If PEvent::GetType() returns "Track", then the event is a track event. A track event is released when a track has finished. It contains the normal PEvent information. The payload consists of the following properties (use PEvent::GetPayload() to access these data):

Parameter Type Description
StartFrame int64 The start frame of the track
EndFrame int64 The end frame of the track
NumberOfFrames int32 The number of frames in the track
StartTime PDateTime The start time of the track
EndTime PDateTime The end time of the track
Thumbnail PImage A thumbnail that represents the track
FaceRectangles PList Bounding rectangles (elements are PRectanglef)
Description PDescription A description of the track
SubjectId PGuid The subject id of the track
SubjectConfidence double The confidence of the subject

If no PWatchlist was supplied to the FaceLog2 analytic then the subject id will be PGuid::Null() and the confidence will be set to -1.

The description will contain information about the subject in the track that can be stored in a watchlist, and later searched.

The description will also contain meta-data about the faces in the track, e.g. their gender.

If PEvent::GetType() returns "Track-In-Progress", then the event is a track-in-progress event. Track-in-progress events is sent when a track start and is in-progress. The payload consists of the following properties (use PEvent::GetPayload() to access these data):

Parameter Type Description
StartFrame int64 The start frame of the track
NumberOfFrames int32 The number of frames in the track
StartTime PDateTime The start time of the track
FaceRectangles PList Bounding rectangles (elements are PRectanglef)
Description PDescription A description of the track

Sighting Events

If PEvent::GetType() returns "Sighting", then the event is a sighting event.

The payload of a sighting event contains the following information (use PEvent::GetPayload() to access these data):

Parameter Type Description
Thumbnail PImage A thumbnail that represents the sighting
StartFrame int64 The start frame index
EndFrame int64 The end frame index
NumberOfFrames int32 The number of frames
StartTime PDateTime The start time of the sighting
EndTime PDateTime The end time of the sighting
DetectionList PDetectionList A list of all the detections in the sighting
TrackList PList A list of all the track ids (PGuid objects) in the sighting
SubjectId PGuid The global id of the subject in the sighting, PGuid::Null() if not known
SubjectConfidence double The confidence of the identity, range between 0 and 1
Description PDescription The description of the subject (you can store this in a watchlist, see PWatchlist::Add())

The following snippet shows how to process events generated by FaceLog2:

void ProcessEvents(const PList& events)
{
    for (int32 i = 0; i < events.Size(); ++i)
    {
        PEvent event;
        if (events.Get(i, event).Failed()) continue;
        PProperties payload = event.GetPayload();
        if (event.GetType() == "Sighting")
        {
            // For example we can get the subjectId from the sighting
            PGuid subjectId;
            if (payload.Get("SubjectId", subjectId).Ok())
            {
                // do something...
            }
        }
    }
}

Example

This example will first enroll two subjects into a watchlist. This watchlist will then get saved and passed to the FaceLog2 analytic. FaceLog2 is then used to analyse a further video clip and perform identification on the sightings in the video.

/*
* Copyright (C) 2015-2018 Digital Barriers plc. All rights reserved.
* Contact: http://www.digitalbarriers.com/
*
* This file is part of the Papillon SDK.
*
* You can't use, modify or distribute any part of this file without
* the explicit written agreements of Digital Barriers.
*/
#include <PapillonCore.h>
#include "PFaceLog2Parameters.h"
USING_NAMESPACE_PAPILLON
const PString SAMPLE_DIR = PPath::Join(PUtils::GetEnv("PAPILLON_INSTALL_DIR"), "Data", "Samples"); // path to find sample data: $PAPILLON_INSTALL_DIR/Data/Samples
const float SCALE_FACTOR = 1.0f;
PResult HandleSighting(const PEvent& sighting, const PWatchlist& watchlist, bool writeVideo, bool display = false)
{
    PGuid sightingId = sighting.GetSource();
    PString shortSightingId = sightingId.ToString().Substring(0, 3);
    PString videoFile = PString("sighting_%0.avi?encode_with=opencv&fourcc=XVID&fps=10").Arg(shortSightingId);
    PString thumbnailFile = PString("thumbnail_%0.jpg").Arg(shortSightingId);
    POutputVideoStream ovs;
    if (writeVideo) {
        if (POutputVideoStream::Open(videoFile, ovs).Failed()) {
            P_LOG_ERROR << "Failed to open " << videoFile << " for writing";
            writeVideo = false;
        }
    }
    // Each sighting has a thumbnail
    PImage thumbnail;
    sighting.GetPayload().Get(papillon::C_PROPERTY_THUMBNAIL, thumbnail);
    if (display)
        thumbnail.Display("Sighting", 40);
    // We can get all the detections from the sighting
    PDetectionList detectionList;
    sighting.GetPayload().Get("DetectionList", detectionList);
    // We can get the subjectId from the sighting
    PGuid subjectId;
    sighting.GetPayload().Get("SubjectId", subjectId);
    // We can get how confident we are of the subject
    double subjectConfidence = -1.0;
    sighting.GetPayload().Get("SubjectConfidence", subjectConfidence);
    // We can also get the description
    PDescription description;
    sighting.GetPayload().Get("Description", description);
    // From the description we can get the face meta-data
    PFaceMetaData faceMetaData = PFaceMetaData::FromDescription(description);
    // We go to the watchlist to get the identity
    PString name = "Unknown";
    PDescription wd;
    if (watchlist.GetDescription(subjectId, wd)) {
        name = PString("%0: %1").Arg(wd.GetName()).Arg(subjectConfidence, 0, 'g', 2);
    }
    // Choose label colours from the meta-data (the defaults here are illustrative)
    auto colourScheme = PUtils::E_ORANGE;
    if (faceMetaData.GetGender() == PFaceMetaData::E_GENDER_FEMALE) {
        colourScheme = PUtils::E_PINK;
    } else if (faceMetaData.GetGender() == PFaceMetaData::E_GENDER_UNKNOWN) {
        colourScheme = PUtils::E_ORANGE;
    }
    auto colourScheme2 = PUtils::E_ORANGE;
    // Now we want to go through the actual detections in the sighting
    for (int32 j = 0; j < detectionList.Size(); j++) {
        PDetection detection = detectionList.Get(j);
        // handle region-of-interest: reset origin of detections to be able to
        // correctly draw detected faces on cropped frames
        // (because PUtils::DrawLabel() is designed for full frames)
        detection.SetOrigin(PPoint2Df::Zero());
        PImage image = detection.GetFrame().GetImage().Clone();
        // Draw a label at the top of the image with the frame number and timestamp
        int32 centreWidth = image.GetWidth() / 2;
        image.DrawLabel(
            PString("%0 Frame #: %1").Arg(detection.GetFrame().GetTimestampUTC().ToStringISO()).Arg(
                detection.GetFrame().GetFrameNumber()), PPoint2Di(centreWidth, 20));
        // Draw a label at the bottom with the sighting id and meta-data
        PUtils::DrawLabel(image, detection, PString("Id: %0 : %1").Arg(shortSightingId).Arg(faceMetaData.ToString()),
                          PUtils::E_BOTTOM_CENTRE, colourScheme);
        // Draw a label at the top with the identity
        if (subjectConfidence > 0.0) {
            colourScheme2 = PUtils::E_GREEN;
        }
        PUtils::DrawLabel(image, detection, name, PUtils::E_TOP_CENTRE, colourScheme2, 1.0, 0);
        // Display the image
        if (display)
            image.Display("Sighting", 40);
        // Write video if asked
        if (writeVideo) {
            ovs.PutFrame(image);
        }
    }
    P_LOG_INFO << PString("Sighting: %0 Id: %1 Confidence: %2").Arg(sightingId.ToString()).Arg(subjectId.ToString()).Arg(subjectConfidence);
    return PResult::C_OK;
}
PResult ProcessEvents(const PList& events, const PWatchlist& watchlist, bool writeVideo, bool display) {
    // Go through each event
    for (int32 i = 0, ni = events.Size(); i < ni; i++) {
        PEvent event;
        events.Get(i, event);
        if (event.GetType() == "Sighting") {
            HandleSighting(event, watchlist, writeVideo, display);
        }
    }
    return PResult::C_OK;
}
PResult RunDemo(int argc, char** argv)
{
    POption opt(argc, argv);
    opt.AddStandardOptions(); // set-up logging
    PString inputVideoFile = opt.String("inputVideo,iv", PPath::Join(SAMPLE_DIR, "busy_office.avi"), "Input video file");
    int32 maxFrames = opt.Int(",f", 0, "Maximum number of frames to process");
    int32 skip = opt.Int("skip,skip", 0, "Skip this number of frames at the beginning");
    bool createWatchlist = opt.Bool("createWatchlist,cw", false, "Create a watchlist from the watchlistConfigFile");
    PString watchlistConfigFile = opt.String("watchlistConfig,wlc", PPath::Join(SAMPLE_DIR, "watchlist.config"), "Use a config file to build a watchlist");
    PString watchlistFile = opt.String("watchlist,wl", "", "Load watchlist from file");
    double watchlistThreshold = opt.Double("watchlistThreshold,wt", 0.5, "The threshold to use for identification");
    bool gpu = !opt.Option("nogpu,ng", "Do not use GPU");
    bool display = !opt.Option("noDisplay,nd", "Display video as we go");
    bool writeVideo = opt.Bool("outputVideo,ov", false, "Write labeled video for each sighting");
    ReturnIfFailed(opt.Check());
    if (opt.Has("h") || opt.Has("help")) {
        P_LOG_INFO << opt.ToStringHelp();
        return PResult::C_OK;
    }

    // Open video stream
    PInputVideoStream ivs;
    ReturnIfFailed(PInputVideoStream::Open(PString("%0").Arg(inputVideoFile), ivs));

    PFaceLog2Parameters faceLogParameters = PFaceLog2Parameters::BuildDefaultParameters();
    faceLogParameters.Set("GPU", gpu);
    faceLogParameters.Set("Detector", "FaceDetector2");

    // If a watchlist is provided then an identity check will be performed
    // for each sighting. You can either provide a watchlist from file or
    // provide a config file and a watchlist will be created for you. If you
    // do not provide a watchlist no identification will be done and all the
    // sightings will be unknown.
    PWatchlist watchlist;
    if (createWatchlist) {
        P_LOG_INFO << "Creating watchlist using configuration file " << watchlistConfigFile;
        PEnrollment enrollment;
        LogAndReturnIfFailed(enrollment.CreateWatchlist(watchlistConfigFile, watchlist));
    }
    else if (!watchlistFile.IsEmpty()) {
        LogAndReturnIfFailed(PFileIO::ReadFromFile(watchlistFile, watchlist));
        P_LOG_INFO << "Loaded watchlist " << watchlistFile << " with " << watchlist.Size() << " subjects";
    }
    else {
        P_LOG_INFO << "Not using a watchlist, anonymous sightings will be reported.";
    }

    // Set up the watchlist and the options for facelog
    PWatchlistOptions watchlistOptions;
    watchlistOptions.SetNormalise(false);
    watchlistOptions.SetTopN(1);
    watchlistOptions.SetThreshold(static_cast<float>(watchlistThreshold));
    faceLogParameters.SetWatchlist(watchlist);
    faceLogParameters.SetWatchlistOptions(watchlistOptions);

    // Finally create the face-logger
    PAnalytics faceLog;
    ReturnIfFailed(PAnalytics::Create("FaceLog2", faceLogParameters, faceLog));

    // Specify a Region-Of-Interest (optional)
    faceLog.Set("roi", PRectanglei(100, 150, 600, 400));

    PList events;
    PFrame frame;
    int frameNumber = 0;
    while (ivs.GetFrame(frame).Ok()) {
        frameNumber++;
        if (frameNumber < skip)
            continue;
        if (faceLog.Apply(frame, events).Failed())
            continue;
        P_LOG_DEBUG << "(" << frameNumber << "/" << maxFrames << ") found " << events.Size() << " events.";

        // We can display things as we go.
        // However, the information is not complete as it is per image.
        // We really need the sightings to get the whole story.
        if (display) {
            PImage displayImage = frame.GetImage().Clone();
            faceLog.DrawOSD(displayImage);
            displayImage.Resize(displayImage, SCALE_FACTOR, PImage::E_INTERPOLATION_NEAREST);
            displayImage.Display("Face Log", 1);
        }

        // Let's process the events
        ProcessEvents(events, watchlist, writeVideo, display);

        // Check if we are exiting early
        if (frameNumber == maxFrames) {
            P_LOG_INFO << "Maximum number of frames reached.";
            break;
        }
    } // end while

    // We have finished the video; mop up any remaining events
    faceLog.Finish(events);
    ProcessEvents(events, watchlist, writeVideo, display);
    return PResult::C_OK;
}
int main(int argc, char** argv)
{
    PapillonSDK::Initialise().OrDie(); // by default, open a console logger with INFO level
    RunDemo(argc, argv).LogIfError();
    return 0;
}