Many individuals and privacy advocates found it comforting that a mask could offer some measure of invisibility from computerized surveillance.
While this might seem like a relief to those concerned with privacy, it is just another problem that facial recognition companies aim to solve.
Before the global pandemic, facial recognition systems typically worked by measuring the distances between facial features and then comparing those measurements with the features in another picture.
A mask covers most of the features the facial recognition software needs to measure before it can make a comparison.
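A minimal sketch of that pre-pandemic approach, using hypothetical landmark coordinates and an arbitrary matching threshold (real systems use learned embeddings, not raw pixel distances):

```python
import math

def feature_vector(landmarks):
    """Flatten the pairwise distances between facial landmarks
    (eye corners, nose tip, mouth corners, ...) into one vector."""
    points = list(landmarks.values())
    return [
        math.dist(points[i], points[j])
        for i in range(len(points))
        for j in range(i + 1, len(points))
    ]

def same_face(landmarks_a, landmarks_b, threshold=10.0):
    """Declare a match if the two feature vectors are close enough."""
    va = feature_vector(landmarks_a)
    vb = feature_vector(landmarks_b)
    return math.dist(va, vb) < threshold

# Hypothetical (x, y) landmark positions from two photos of one person
photo_1 = {"left_eye": (30, 40), "right_eye": (70, 40),
           "nose_tip": (50, 65), "mouth_left": (38, 85), "mouth_right": (62, 85)}
photo_2 = {"left_eye": (31, 41), "right_eye": (69, 40),
           "nose_tip": (50, 66), "mouth_left": (39, 84), "mouth_right": (63, 86)}
```

A mask removes the nose and mouth landmarks entirely, which is exactly why such a comparison breaks down on masked faces.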
Now that wearing a face mask has become a civic responsibility, facial recognition companies have devised ways to improve the accuracy of their algorithms by relying on features that a mask does not cover.
Facial recognition software is trained to use the facial features above the mask, especially around the eye region.
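One way to picture that adjustment: restrict the comparison to landmarks above the mask line. The landmark names and threshold below are illustrative assumptions, not any vendor's actual method:

```python
import math

# Landmarks a mask leaves visible: the eye and eyebrow region.
PERIOCULAR = ("left_brow", "right_brow", "left_eye", "right_eye")

def periocular_vector(landmarks):
    """Pairwise distances between only the visible periocular landmarks."""
    points = [landmarks[name] for name in PERIOCULAR]
    return [math.dist(points[i], points[j])
            for i in range(len(points))
            for j in range(i + 1, len(points))]

def match_masked(landmarks_a, landmarks_b, threshold=6.0):
    """Compare two faces using only the eye/eyebrow region."""
    return math.dist(periocular_vector(landmarks_a),
                     periocular_vector(landmarks_b)) < threshold

# An unmasked reference photo versus a masked one of the same person;
# the masked photo simply lacks the lower-face landmarks.
unmasked = {"left_brow": (28, 30), "right_brow": (72, 30),
            "left_eye": (30, 40), "right_eye": (70, 40),
            "nose_tip": (50, 65), "mouth_left": (38, 85)}
masked = {"left_brow": (29, 31), "right_brow": (71, 30),
          "left_eye": (31, 40), "right_eye": (69, 41)}
```

Because the comparison never asks for the nose or mouth, the missing lower-face landmarks no longer cause a failure.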
Shawn Moore, CEO of Trueface, told CNN:
“If the (facial recognition) companies aren’t looking at this, aren’t taking it seriously, I don’t foresee them being around much longer.”
Trueface’s facial recognition software is used by the US Air Force to authenticate the identities of people entering bases.
Demand for facial recognition has been rising since the beginning of the pandemic, as businesses look for ways to avoid physical contact while still ensuring security on their premises.
Facial recognition has given a sense of security and has grown to be trusted by security firms. However, a report by the National Institute of Standards and Technology (NIST) confirmed that many pre-Covid algorithms could not do their job.
The most accurate facial recognition algorithms that the lab tested failed to make a correct match between 5% and 50% of the time.
One caveat to the test: many of the algorithms submitted were developed before Covid-19 made mask wearing widespread, so they were not built with face masks in mind.
Companies like Tech5 have been working on technology that recognizes a partially covered face, using the retina and minor facial features to produce results.
Although Tech5's software didn't perform especially well when tested by NIST, it fared better than several competitors and ranked in NIST's top ten.
To improve on the errors of earlier systems, Moore said Trueface's researchers are working on analyzing only the visible portion of the face, rather than first trying to detect, say, a mask or a pair of sunglasses.
By ignoring those objects, Trueface may speed up the overall process of recognizing the person in an image. Moore hopes to roll out this feature in four to six weeks.
“Masks have definitely caused us to rethink how we make our processes more efficient with the algorithms,” Moore said.
Marios Savvides, a professor at Carnegie Mellon University, said focusing on the area of the face around the eyes and forehead makes a lot of sense.
Savvides also said that the eye and eyebrow area (often referred to as the periocular region) is the part of the face that changes the least as you age, even if you gain weight.
This means it's likely to look quite similar in different images of the same person, even if other parts of the face (the lips, for instance) grow or shrink.
Even if these new algorithms succeed, there may still be ways to cloak yourself from facial recognition software. Wearing a mask and sunglasses together, for instance, leaves hardly any of your face visible in a picture.