For me, facial recognition seemed like something only Jimmy Neutron had growing up. Or, maybe, something found only in the secret places of the White House that could be accessed with President Bush's face. (Clearly, you know in which decade I was raised.)
Today, my phone unlocks when I look at it, I have a talking speaker on my desk that plays Bruce Springsteen when I ask it to, and I'm cynically unamused by the wonders of technology because I studied cognitive psychology and read textbooks about computer science. #adulting
For kids, and most non-Big Bang Theory-esque individuals, however, the advancements technology has made in the past two decades have come so fast they may not even realize just how much has changed in the blink of an eye. Nor do they realize the effort and sheer milestones these changes represent.
For this article, let's focus on facial recognition: how to easily get a kid (and an adult) to understand it, and, most importantly, why you should first understand how humans recognize faces.
P. SHERMAN, 42 WALLABY WAY, SYDNEY
If you recognize where that sub-header is from, good! You were also raised in the late 90s/early 2000s, had kids during that time, or just really like Disney Pixar. No idea what that is or what it has to do with facial recognition? Let me explain it the same way I do for students in my Artificial Intelligence class:
How we recognize pretty much anything, whether it's a cartoon fish or a face, relies on memory.
Our memory uses maps, or as we glorified psychology majors like to call them, schemas, to assemble individual features into a full perception and recognition of something.
Features Are Data
By using features already in our memory, our brains are able to piece together that this little orange, little-finned cartoon character is Nemo -- we've seen him before. Features are data. And, as you may guess, they are much more stratified than simply "orange" and "little fin," just as a face is more than just "nose," "eyes" and "mouth." Statistical data can be derived from these features implicitly and automatically by our brain. How long are that person's lips? What is the diameter of their pupils? How far apart are their nostrils? We aren't walking tape measures. We don't know these exact data points, but we all know Ross Geller has a longer nose than Chandler Bing. (That is, if we've seen Friends and remember their features.)
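If it helps to see "features are data" made literal, here's a minimal sketch in Python. Every name and number is hypothetical: a face is just a handful of measurements, and two faces are "similar" when those measurements are close together. (Real systems use thousands of learned features, not three hand-picked ones.)

```python
import math

# Two made-up "faces": each is just a bundle of measured features.
# All numbers are invented for illustration.
ross = {"lip_length_mm": 52.0, "pupil_diameter_mm": 4.1, "nostril_gap_mm": 16.5}
chandler = {"lip_length_mm": 45.0, "pupil_diameter_mm": 4.0, "nostril_gap_mm": 15.8}

def distance(face_a, face_b):
    """How different two faces are: straight-line (Euclidean) distance
    across all of their shared features. 0.0 means identical."""
    return math.sqrt(sum((face_a[f] - face_b[f]) ** 2 for f in face_a))

print(distance(ross, ross))      # identical faces: distance is 0.0
print(distance(ross, chandler))  # different faces: a noticeably larger number
```

That's roughly the implicit arithmetic our brains do when we "just know" Ross's nose is longer than Chandler's.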
Computer biometrics, the application of statistical analysis to biological data (thanks for the definition, Google), is used to recognize facial features. These recognition algorithms are trained on large data sets (i.e., a lot of different human faces) so that they can repeatedly and reliably recognize the features of a human face again.
This technology is able to remember your face specifically by pairing the biometric data of your features with other data, such as your name or other security parameters you input. "This is MY iPhone. Now, remember my face, ONLY."
Just because your iPhone only opens for your face doesn't mean it's not seeing other faces, too. It is just able to single out (aka, filter out) your features compared to anyone else's.
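The enroll-then-match idea can be sketched the same way, using the feature-distance trick from above plus a threshold. This is a toy, not how Face ID actually works (Apple uses learned 3D depth maps, not three measurements), and every name and number here is invented:

```python
import math

def distance(face_a, face_b):
    """Euclidean distance between two faces' shared feature measurements."""
    return math.sqrt(sum((face_a[f] - face_b[f]) ** 2 for f in face_a))

class PhoneLock:
    """Toy face unlock: remember ONE owner, open only for faces close enough."""

    def __init__(self, threshold=2.0):
        self.owner_face = None
        self.threshold = threshold  # how forgiving the match is (made-up units)

    def enroll(self, face):
        # "This is MY iPhone. Now, remember my face, ONLY."
        self.owner_face = face

    def try_unlock(self, face):
        # The phone still *sees* every face; it only *opens* for a close match.
        return distance(self.owner_face, face) < self.threshold

me = {"lip_length_mm": 48.0, "pupil_diameter_mm": 4.2, "nostril_gap_mm": 16.0}
stranger = {"lip_length_mm": 55.0, "pupil_diameter_mm": 3.8, "nostril_gap_mm": 18.0}

phone = PhoneLock()
phone.enroll(me)
print(phone.try_unlock(me))        # True
print(phone.try_unlock(stranger))  # False
```

Notice that `try_unlock` measures the stranger's features just as carefully as yours; the filtering happens at the threshold, not at the camera.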
So, computers are just as smart as humans now, right?
You wouldn't believe how many times students assume that machines are the better "species" after I teach them this. We are so used to recognizing faces naturally that we take for granted the brain power involved in the process.
And, because kids are so used to technology that does what their brains do, they are completely unaware of just how extraordinary their own brains are. When teaching Artificial Intelligence (under which facial recognition more or less falls), my greatest emphasis is on developing a humanistic understanding of technological advancements in my students' wheelhouse. We, after all, made machines. Many, if not all, of their functions are developed from observing and studying natural behaviors (in humans and animals).
I've observed some children fearing a machine takeover after first realizing machines' increasing ability and autonomy. When students receive a balanced perspective on human cognition and technological function, they begin to see themselves as intelligent beings capable of creating an Alexa, not being overtaken by it.
They understand just how narrow AI is (capable of only single, specific tasks) and just how incredible we are. (I guess I'm not that cynical, after all.)