Subtitles

  • JOE: Hey Joss, I have a question for you. Do you know how these Snapchat filters work?

  • like behind the scenes?

  • JOSS: Hmm, I have no idea.

  • JOE: Well do you think you can find out?

  • JOSS: You got it!

  • These are what Snapchat calls their lenses, but everyone else calls filters.

  • They are very silly but the engineering behind them is serious.

  • JOSS: Oh my god.

  • The technology came from a Ukrainian startup called Looksery

  • which Snapchat acquired in September 2015 for $150 million.

  • That's reportedly the largest tech acquisition in Ukrainian history.

  • Their augmented reality filters tap into the large and rapidly

  • growing field of "computer vision" --

  • those are applications that use pixel data from a camera in order to identify objects

  • and interpret 3D space. Computer vision is how you can deposit checks,

  • it's how Facebook knows who's in your photos, how self-driving cars can avoid

  • running over people and how you can give yourself a doggy nose.

  • So how do Snapchat filters work? They wouldn't let us talk to any of the Looksery

  • engineers but their patents are online.

  • The first step is detection. How does the computer know which part of an image is

  • a face?

  • This is something that human brains are fantastic at. Too good even.

  • But this is what a photo looks like to a computer. If all you have is the data for

  • the color value of each individual pixel, how do you find a face?

  • Well the key is looking for areas of contrast between light and dark parts of

  • the image. The pioneering facial detection tool is called the

  • Viola-Jones algorithm.

  • It works by repeatedly scanning through the image data calculating the

  • difference between the grayscale pixel values underneath the white boxes and

  • the black boxes. For instance, the bridge of the nose is usually lighter than the surrounding area on both sides,

  • the eye sockets are darker than the forehead, and the middle of the forehead

  • is lighter than the sides of it.

  • These are crude tests for facial features, but if they find enough matches in one

  • area of the image,

  • it concludes that there is a face there. This kind of algorithm won't find your

  • face if you're really tilted or facing sideways, but it's really accurate for

  • frontal faces, and it's how digital cameras have been putting boxes around

  • faces for years. But in order to apply this virtual lipstick, the app needs to

  • do more than just detect my face.

  • It has to locate my facial features.

  • According to the patents, it does this with an “active shape model” -- a statistical

  • model of a face shape that's been trained by people manually marking the

  • borders of facial features on hundreds, sometimes thousands of sample images.

  • The algorithm takes an average face from that trained data and aligns it with the

  • image from your phone's camera, scaling it and rotating it according to where it

  • already knows your face is located.

  • But it's not a perfect fit, so the model analyzes the pixel data around each of the points,

  • looking for edges defined by brightness and darkness. From the training images,

  • the model has a template for what the bottom of your lips should look like,

  • for example, so it looks for that pattern in your image and adjusts the point to match it.

  • Because some of these individual guesses might be wrong,

  • the model can correct and smooth them by taking into account the locations of all

  • the other points. Once it locates your facial features, those points are used as

  • coordinates to create a mesh.

  • That's a 3D mask that can move, rotate, and scale along with your face as the

  • video data comes in for every frame, and once they've got that, they can do a lot with it.

  • They can deform the mask to change your face shape, change your eye color,

  • add accessories, and set animations to trigger when you open your mouth

  • or move your eyebrows.

  • And like the iOS app Face Swap Live, Snapchat can switch your face with a

  • friend's, although that involves a bunch more data.

  • The main components of this technology are not new. What's new is the ability to

  • run them in real time, from a mobile device. That level of processing speed is

  • a pretty recent development.

  • So why go through all this trouble just to give people a virtual flower crown?

  • Well Snapchat sees a revenue opportunity here. In a world that's

  • flooded with advertisements,

  • maybe the best hope that brands have to get us to look at their ads... is to

  • put them on our faces.

  • Facial detection has a creepy side too, particularly when it's used to identify you by name.

  • Both the FBI and private companies like Facebook and Google are amassing huge

  • databases of faces and there's currently no federal law regulating it.

  • So some privacy advocates have come up with ways to camouflage your face from

  • facial detection algorithms.

  • It's actually illegal in a lot of places to wear a face mask in public,

  • so this project by artist Adam Harvey suggests some things that you can do with

  • your hair and your makeup that can, for now, make your face invisible to computers.
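
The contrast checks the video describes (the bridge of the nose brighter than the strips beside it, the eye sockets darker than the forehead) are what the Viola-Jones approach calls Haar-like features, and they stay fast because every box sum is read off an integral image instead of being re-added pixel by pixel. Here is a minimal NumPy sketch of that idea, assuming a plain grayscale array; the box coordinates and the split-into-thirds "nose bridge" test are illustrative, not Looksery's or OpenCV's actual feature set.

```python
import numpy as np

def integral_image(gray):
    """Summed-area table: once built, the sum of any rectangle of pixels
    can be read with four lookups, which is what makes scanning thousands
    of candidate boxes per frame affordable."""
    return gray.astype(np.int64).cumsum(axis=0).cumsum(axis=1)

def box_sum(ii, x, y, w, h):
    """Sum of the grayscale values in the w-by-h box with top-left corner (x, y)."""
    total = ii[y + h - 1, x + w - 1]
    if x > 0:
        total -= ii[y + h - 1, x - 1]
    if y > 0:
        total -= ii[y - 1, x + w - 1]
    if x > 0 and y > 0:
        total += ii[y - 1, x - 1]
    return total

def bridge_of_nose_feature(ii, x, y, w, h):
    """One Haar-like test: is the middle third of this strip (the nose bridge)
    brighter than the thirds on either side of it?"""
    third = w // 3
    left = box_sum(ii, x, y, third, h)
    middle = box_sum(ii, x + third, y, third, h)
    right = box_sum(ii, x + 2 * third, y, third, h)
    # The center is weighted double so the three equal-area thirds compare
    # fairly; a strongly positive value means nose-bridge-like contrast.
    return 2 * middle - (left + right)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fake_gray = rng.integers(0, 256, size=(120, 120))  # stand-in for real pixel data
    ii = integral_image(fake_gray)
    print(bridge_of_nose_feature(ii, x=30, y=40, w=60, h=20))
```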
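
The full Viola-Jones detector chains thousands of these weak tests and only reports a face where enough of them agree in one place, which is how cameras have been drawing boxes around faces for years. OpenCV ships pretrained Haar cascades of that kind, so a rough sketch of the detection step could look like the following; the image path and the tuning parameters are placeholder assumptions, and this is OpenCV's stock detector, not Snapchat's code.

```python
import cv2

# OpenCV bundles trained Viola-Jones (Haar cascade) face detectors.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

frame = cv2.imread("selfie.jpg")                # hypothetical test photo
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)  # the cascade only looks at grayscale contrast

# The window is scanned at many scales; minNeighbors is how many overlapping
# positive windows must agree before a region counts as a frontal face.
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

for (x, y, w, h) in faces:
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)

cv2.imwrite("faces_boxed.jpg", frame)
```

As the video notes, a detector like this is accurate for frontal faces but tends to miss heavily tilted or profile views.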
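
The "active shape model" step in the patents has three parts: place a trained average face inside the detected box, nudge each point toward nearby brightness edges, then let the statistical shape model smooth out bad guesses. The toy sketch below covers only the first two parts, assumes a normalized mean shape and a face box from the detection step, and leaves out the shape-model constraint entirely, so it illustrates the idea rather than the patented method.

```python
import numpy as np

def place_mean_shape(mean_shape, face_box):
    """Scale and translate a trained mean shape into a detected face box.
    mean_shape: (N, 2) landmark coordinates normalized to the unit square
    (a real model comes from hand-labeled training images).
    face_box: (x, y, w, h) from the detection step."""
    x, y, w, h = face_box
    return mean_shape * np.array([w, h]) + np.array([x, y])

def nudge_points_to_edges(gray, points, radius=5, iterations=3):
    """Toy version of the refinement step: move each landmark toward the
    strongest brightness edge in a small window around it. A real active
    shape model would also project the points back onto a statistical
    shape model after each pass so no single bad guess distorts the face."""
    gy, gx = np.gradient(gray.astype(float))
    edges = np.hypot(gx, gy)
    h, w = gray.shape
    pts = points.astype(float)
    for _ in range(iterations):
        for i, (px, py) in enumerate(pts):
            x0, x1 = max(int(px) - radius, 0), min(int(px) + radius + 1, w)
            y0, y1 = max(int(py) - radius, 0), min(int(py) + radius + 1, h)
            window = edges[y0:y1, x0:x1]
            dy, dx = np.unravel_index(np.argmax(window), window.shape)
            pts[i] = [x0 + dx, y0 + dy]
    return pts
```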
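
Once the feature points are located, they become the vertices of the mesh the video mentions. One common way to turn a set of landmark points into triangles (not necessarily the one Looksery uses) is Delaunay triangulation; the coordinates below are made up for the example.

```python
import numpy as np
from scipy.spatial import Delaunay

# Hypothetical landmark coordinates from the fitting step; a real model
# would have dozens of points tracing the eyes, brows, nose, mouth and jaw.
landmarks = np.array([
    [120, 140], [200, 140],              # eye centers
    [160, 190],                          # nose tip
    [130, 230], [160, 240], [190, 230],  # mouth corners and lower lip
    [100, 110], [220, 110], [160, 290],  # brow edges and chin
])

mesh = Delaunay(landmarks)

# Each row of `simplices` is one triangle, given as indices into `landmarks`.
# Re-warping these triangles every frame is what lets a fitted mask move,
# rotate and scale with the face, carry virtual lipstick, or swap in a
# friend's face texture.
print(mesh.simplices)
```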
