I have been thinking lately of the minister and novelist Frederick Buechner, who recounts in one of his books that, in the middle of his morning routine, bleary-eyed and sleep-drunk, he sometimes looks up from the sink and into the mirror.
“What bothers me is simply the everlasting sameness of my face,” he writes. “Those eyes, that nose, that mouth—the variations of expression they’re capable of is really so restricted. The grimmest human tragedy can furrow the brow little more than the momentary pain of the dentist’s drill.”
He thinks of his family, his friends, all the people in his life who know him in large part through his countenance. “I am forced to conclude,” he says, “that to an alarming degree I am my face.”
Facebook has also been thinking about faces.
Last summer, the company’s artificial intelligence team announced that its facial-recognition software passed key tests with near human-level accuracy. Last week, it presented a further development: Yann LeCun, the AI team’s director, boasted that a different algorithm could identify people 83 percent of the time even if their faces were not in the picture. The program instead works from a person’s hairdo, posture, and body type.
Buechner’s statement is phrased generally, but it’s no less profound in the domain of computer security and identity. We are our faces, in a way we are not our Twitter profiles, social-security numbers, or even legal names. Although vast amounts of data are collected about most Internet users, they’re tied to what are essentially bureaucratic identifiers, like browser cookies or email addresses.
Almost everything that represents me online is ultimately a jumble of numbers and letters, and nearly all of it, with some cost or sacrifice, can be changed. Even victims of fraud or domestic violence can apply to the government for a new social-security number.
A face, though, is different. We’re stuck with our faces. It’s prohibitively expensive to change them beyond recognition, if it’s even possible. Facial recognition and other biometrics bind data about us to us like nothing else. And once corporate metadata can recognize and glom onto our bodies, in all their “everlasting sameness,” we can never escape that link.
So what’s to be done? In 2014, the U.S. Department of Commerce held talks about how and whether facial-recognition technology should be regulated. The talks, officially called the “privacy multi-stakeholder process,” were convened by the National Telecommunications and Information Administration (NTIA), the government agency that advises the president on technology policy. The negotiations included representatives from both consumer-advocacy groups and the tech industry.
The talks are still ongoing, but they no longer include the consumer advocates. User-privacy groups, including the EFF and the Consumer Federation of America, walked out in June over what they described as industry obstinacy. The industry and its lobbyists, they said, would not agree that users should be able to consent to facial-recognition software even in the most extreme scenarios imaginable, so there was no point in continuing to participate in the talks.
As they walked out, the press promptly rushed in, perhaps because the failure of negotiations about digital privacy sounds foreboding and science-fictional.