
  • Let's play a game.

  • Close your eyes and picture a shoe.

  • OK.

  • Did anyone picture this?

  • This?

  • How about this?

  • We may not even know why, but each of us

  • is biased toward one shoe over the others.

  • Now, imagine that you're trying to teach a computer

  • to recognize a shoe.

  • You may end up exposing it to your own bias.

  • That's how bias happens in machine learning.

  • But first, what is machine learning?

  • Well, it's used in a lot of technology we use today.

  • Machine learning helps us get from place to place,

  • gives us suggestions, translates stuff,

  • even understands what you say to it.

  • How does it work?

  • With traditional programming,

  • people hand code the solution to a problem, step by step.

  • With machine learning, computers learn the solution by finding patterns in data,

  • so it's easy to think there's no human bias in that.

  • But just because something is based on data doesn't automatically make it neutral.

  • Even with good intentions, it's impossible to separate ourselves from our own human biases,

  • so our human biases become part of the technology we create in many different ways.

  • There's interaction bias, like this recent game

  • where people were asked to draw shoes for the computer.

  • Most people drew ones like this.

  • So as more people interacted with the game,

  • the computer didn't even recognize these.

  • Latent bias-- for example, if you're training a computer

  • on what a physicist looks like, and you're using pictures of past physicists,

  • your algorithm will end up with a latent bias skewed toward men.

  • And selection bias-- say you're training a model to recognize faces.

  • Whether you grab images from the internet or your own photo library,

  • are you making sure to select photos that represent everyone?

  • Since some of our most advanced products use machine learning,

  • we've been working to prevent that technology from perpetuating negative human bias--

  • from tackling offensive or clearly misleading information

  • from appearing at the top of your search results page

  • to adding a feedback tool in the search bar

  • so people can flag hateful or inappropriate autocomplete suggestions.

  • It's a complex issue, and there is no magic bullet,

  • but it starts with all of us being aware of it,

  • so we can all be part of the conversation,

  • because technology should work for everyone.
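The transcript's contrast between hand-coding a solution and learning one from data can be sketched in a few lines of Python. This toy "learner" and its features (`has_sole`, `has_laces`) are illustrative assumptions, not anything from the video; it simply shows how a rule inferred from biased examples inherits that bias, the way the shoe-drawing game's model failed on heels and sandals.

```python
# Traditional programming: a person hand-codes the rule.
def is_shoe_rule_based(has_sole, has_laces):
    # The programmer's own idea of a shoe is baked in:
    # anything without laces (sandals, heels) is rejected.
    return has_sole and has_laces

# Machine learning: the rule is inferred from labeled examples.
# (A deliberately trivial "learner" for illustration only.)
def learn_is_shoe(examples):
    # examples: list of ((has_sole, has_laces), label) pairs.
    # Memorize which feature combinations were labeled "shoe".
    positives = {features for features, label in examples if label}
    return lambda has_sole, has_laces: (has_sole, has_laces) in positives

# Training data drawn only from sneakers: every shoe shown has laces,
# so the bias lives in the data rather than in hand-written code.
biased_examples = [((True, True), True), ((True, False), False)]
model = learn_is_shoe(biased_examples)

print(model(True, True))   # sneaker -> True
print(model(True, False))  # high heel: a shoe, but the biased data says False
```

Neither path is neutral: the first bakes the programmer's bias into the rule directly, while the second absorbs whatever bias the training examples carry.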

Machine Learning and Human Bias

Jessieeee posted on 2018/08/18