The Coffee Table


Face Facts

 

When I was maybe ten or eleven years old, my father showed me a photograph of my Aunt Florence when she was roughly that age and asked me who it was. I didn’t hesitate: “Me!” 

I guess it didn’t occur to me to wonder why he was asking, when the answer was obvious. (Of course, I was wrong.) The photo was from a previous generation, yet my aunt and I looked so much alike that the time lapse cues—dress, background, even hairstyle—were easily overlooked in favor of the obvious answer. I did not doubt for a second that I was looking at a photograph of myself.

Makes me wonder if Aunt Florence could open my iPhone if she were still alive.

The Almighty Google shared a story with me about a 10-year-old boy who opened his mother’s smartphone with his face. He was trying to enter her passcode—with her permission—but discovered he didn’t need it. His face looked enough like hers to get the job done. Whoa!

And there are multiple reports of twins opening each other’s phones via facial recognition. I guess that makes sense.

When I first moved to Carroll County, I had the experience of being called by somebody else’s name. When I explained that was not my name, the person who made the error told me I looked just like some other local woman (whom I’ve never met). 

In my twenties, I was told I looked like an actress on a short-lived television show about a pioneer family. I watched the program, and indeed it was true. I’m sorry the actress’s career was so short. I might have gotten a job as her double. 

My point is, despite the commonly held belief that faces are like snowflakes—no two are exactly alike—some faces look enough like others to create room for errors in identification.

This is apparently true even when a computer, with all its Artificial Intelligence superpowers, is the one making the judgment.

Now, I’m not too concerned that my dead aunt or some long-forgotten actress—or even some Carroll County resident I don’t know—is going to gain access to my iPhone and commit dirty deeds. But I am aware that AI facial recognition is used by airports, shopping centers, banks… and law enforcement. And while I understand how useful this might be, if used appropriately, the potential for misuse gives me pause.

Apparently police departments are sometimes using AI facial recognition technology as the primary—or even sole—means of identifying a criminal suspect. Last month the Washington Post reported at least eight cases in which people were wrongfully arrested after being identified through facial recognition. All were eventually cleared of wrongdoing, but in at least one case, it took the accused two years to clear his name. This hardly meets our national standard of innocent until proven guilty.

Multiple sources report that facial recognition is least reliable for people of color, women, and nonbinary individuals. Yet it’s sometimes used as the primary basis for arresting these very individuals, in lieu of more substantial evidence.

If you’re a white man committing a crime, watch out! You might get caught via facial recognition. But for the rest of us, maybe an Aunt Florence or a Carroll County doppelgänger will take the fall. 

Some cities—and even states—have banned or heavily restricted the use of facial recognition technology by law enforcement. Some folks might regard this as hamstringing the police. But if you look closely, you’ll see it’s a necessary protection of your own civil liberties.
