
  • In a tweet thread exchanged with JetBlue on April 17th, writer Mackenzie Fegan had a few questions about the airline's new boarding procedure.

  • No boarding pass.

  • No ID.

  • Instead, a camera and screen verified her identity against a US Customs and Border Protection database, then let her on the plane.

  • Some passengers might consider the increasing use of facial recognition in everyday life convenient, some might think it's Orwellian, but it's already everywhere.

  • The question is, how far will it go?

  • Turns out, even some of those developing the technology are scared of what the answer might be.

  • This is your Bloomberg QuickTake on Facial Recognition.

  • In May, San Francisco became the first American city to block police and other agencies from using facial recognition software.

  • The biggest concerns are all really around civil liberties and whether you're essentially enabling a kind of totalitarian state with this technology, which seems to be sort of the way things are heading.

  • So how'd we get here?

  • Like other artificial intelligence applications, facial recognition initially developed slowly, beginning in the 1960s.

  • With the help of newly available high-def cameras, machine learning, and giant databases of photos to increase accuracy, it advanced in a hurry.

  • Facial recognition is a basic technology: it takes images from video cameras and tries to identify the faces of people in those images.

  • It does so by taking key points on the face and measuring the distances between those various points.
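The key-points-and-distances idea described in the clip can be sketched in a few lines of code. This is only an illustration of the general approach, not any vendor's actual algorithm: the landmark points, coordinates, and scoring rule below are invented for the example, and a real system would detect landmarks from camera images with a trained face-landmark model.

```python
# Sketch of landmark-distance matching: represent a face by the pairwise
# distances between a handful of key points, then compare two faces by the
# similarity of those distance vectors. Coordinates are made up for
# illustration only.

import math
from itertools import combinations

def distance_features(landmarks):
    """Turn a list of (x, y) landmark points into a vector of pairwise
    distances, normalized so overall face size cancels out."""
    dists = [math.dist(a, b) for a, b in combinations(landmarks, 2)]
    scale = max(dists) or 1.0
    return [d / scale for d in dists]

def similarity(features_a, features_b):
    """Simple similarity score: 1 minus the mean absolute difference
    between two normalized distance vectors (1.0 = identical)."""
    diffs = [abs(a - b) for a, b in zip(features_a, features_b)]
    return 1.0 - sum(diffs) / len(diffs)

# Hypothetical landmarks (eye corners, nose tip, mouth corners) for a
# probe image and a stored database photo.
probe = [(30, 40), (70, 40), (50, 60), (38, 80), (62, 80)]
database_photo = [(31, 41), (69, 40), (50, 61), (39, 79), (61, 80)]

score = similarity(distance_features(probe), distance_features(database_photo))
print(f"match score: {score:.3f}")  # closer to 1.0 means a likelier match
```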

  • In December 2018, London police made their first arrest based on facial recognition, after cross-checking photos of pedestrians in tourist hotspots against a database of known criminals.

  • In New Delhi, a police trial reportedly identified 3,000 missing children in just four days.

  • So if it involves catching criminals and finding missing children, why would anyone be against it?

  • For that, one might look at the most developed facial recognition network in the world, in China.

  • CCTV and facial recognition have been used in combination to create this sort of vast surveillance apparatus, and that's been particularly applied against certain ethnic minority groups.

  • But even in western democracies there's a concern about police departments using this technology to try to find suspects, or even people who might be involved in legitimate protests, so they could be tracked.

  • And those are just concerns about the technology when it works as intended.

  • A study from the MIT Media Lab found that white men in a sample were correctly identified 99 percent of the time, while error rates of up to 35 percent were found when it came to darker-skinned women.

  • Microsoft came out last year.

  • It was the first major tech company to do so, saying they really feel uncomfortable deploying this technology until there's clear regulation around it.

  • They were then joined by Amazon, which kind of seconded those calls.

  • It seems like some other companies are plowing ahead without any such qualms at all.

  • To stake out some guidelines, the Algorithmic Justice League and Georgetown University Law Center unveiled the Safe Face Pledge, which asks companies not to provide facial AI for autonomous weapons or sell to law enforcement unless explicit laws are debated and passed to allow it.

  • A few companies have signed on, but notably not Microsoft or Amazon, possibly loath to lose the opportunity to sell facial recognition to police departments and governments the world over.

  • So what's preventing your image from saying a whole lot more about you than it used to?

  • Face it, not much.
