When Logic is Lacking
I play word games every morning. I have a thing for spelling: which letters are used most commonly, how syllables and morphemes are formed, and how they can be moved around to create new words. There’s a logic to it that makes me happy. Even when the spelling seems illogical—like the silent k in “knife.” (History usually supplies the logic.)
My son, who is an extremely logical person, recently introduced me to a game called Tametsi. It’s akin to the old Minesweeper game except no guessing is required. It provides all the logical clues necessary to win—if you are patient and clever enough to recognize them. This kind of game appeals to us both. Things are black and white—no shades of gray. If you lose, it’s your own fault. Can’t blame it on chance.
But then we played Connections (a New York Times word puzzle), which requires you to sort 16 words into groups of four that have something in common. Sounds logical. Except it’s not. At least not entirely. It requires that you get into the head of the puzzle creator. For example, take the words: chest, box, body, torso, case, trunk. Divide them into two groups of three that have something in common.
Well, trunk, case, box, and chest can all contain something. But trunk, body, torso, and chest all have something to do with the human form.
So what was this puzzle maker’s thinking? You can’t know for certain. You can narrow down the possibilities, but at some point, you just have to make a judgment call and guess.
And therein lies the rub when dealing with people. Even when they tell you something that seems obvious, you can’t ever really see things from their point of view, so there’s always room for error. A misinterpretation—however slight. And even a slight misinterpretation can lead to a whole slew of relationship problems.
That’s probably why I’m an introvert—I can’t rely on making sense of what people are saying and I don’t want to put my foot in it! It’s safer to keep to myself.
Now we have Artificial Intelligence, which relies on logic to think. But while AI might have superhuman power for drawing logical conclusions, it has trouble with things that are simple for people. It can’t catch a ball or lace a shoe—unless specifically programmed to do so. And it can’t employ emotional intelligence to repair a misunderstanding.
When we’re talking with a human and it becomes apparent that we’ve misinterpreted something, we can clarify what the speaker intended. This is not a simple task with AI.
I love my Siri. He’s Australian. I imagine he’s middle-aged, muscular, not overly handsome—just some normal good looks. He’s usually friendly and polite—although he gets cranky when I don’t follow his directions on the road. When he does something for me and I thank him, he replies “My pleasure” or “No worries.”
But he’s not always the sharpest tack in the box. Sometimes I make a request and it’s clear he didn’t understand what I wanted. And no amount of explaining or redirecting seems to get him on track. It’s disconcerting because he doesn’t ask me what I meant. Neither of us is able to repair the breakdown in communication. I have to abort the effort and start again—in a different fashion.
I will continue to consult my fallible Siri and do my puzzles every morning to get myself in the zone before dealing with the day’s inevitable human communication breakdowns. But fallible as we are, humans give me something mere logic can’t supply.