A journalist was asking Mark Zuckerberg a question about the news feed: "Why is this so important?" And Zuckerberg said, "A squirrel dying in your front yard may be more relevant to your interests right now than people dying in Africa." And I want to talk about what a Web based on that idea of relevance might look like.
So when I was growing up in a really rural area in Maine, the Internet meant something very different to me. It meant a connection to the world. It meant something that would connect us all together. And I was sure that it was going to be great for democracy and for our society.
But there's this shift in how information is flowing online, and it's invisible. And if we don't pay attention to it, it could be a real problem.
So I first noticed this in a place I spend a lot of time -- my Facebook page. I'm progressive, politically -- big surprise -- but I've always gone out of my way to meet conservatives. I like hearing what they're thinking about; I like seeing what they link to; I like learning a thing or two. And so I was surprised when I noticed one day that the conservatives had disappeared from my Facebook feed.
It turned out that Facebook was looking at which links I clicked on, and it was noticing that, actually, I was clicking more on my liberal friends' links than on my conservative friends' links. And without consulting me about it, it had edited them out. They disappeared.
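The mechanism at work here is, at bottom, just click-through filtering. Here is a minimal, purely hypothetical Python sketch of that idea -- the function, data shapes, and threshold are invented for illustration, not Facebook's actual code: score each friend by how often you click their links, and quietly drop anyone who falls below a cutoff.

```python
from collections import Counter

def filter_feed(posts, clicks, min_click_rate=0.05):
    """Hypothetical sketch of click-based feed filtering, not Facebook's
    actual algorithm: hide posts from friends whose links you rarely click."""
    shown = Counter(p["friend"] for p in posts)       # links each friend showed you
    clicked = Counter(c["friend"] for c in clicks)    # how many of those you clicked
    rate = {f: clicked[f] / shown[f] for f in shown}  # per-friend click-through rate

    # Friends below the cutoff get edited out -- without consulting you.
    return [p for p in posts if rate[p["friend"]] >= min_click_rate]

# You click some of your liberal friend's links and none of your
# conservative friend's, so the conservative friend simply disappears.
posts = [{"friend": "liberal_pal"}] * 10 + [{"friend": "conservative_pal"}] * 10
clicks = [{"friend": "liberal_pal"}] * 4
print({p["friend"] for p in filter_feed(posts, clicks)})  # {'liberal_pal'}
```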
So Facebook isn't the only place that's doing this kind of invisible, algorithmic editing of the Web. Google's doing it too. If I search for something, and you search for something, even right now at the very same time, we may get very different search results. Even if you're logged out, one engineer told me, there are 57 signals that Google looks at -- everything from what kind of computer you're on to what kind of browser you're using to where you're located -- that it uses to personally tailor your query results. Think about it for a second: there is no standard Google anymore.
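Nobody outside Google knows what those 57 signals actually are, but the shape of the idea is easy to sketch. Here is a purely hypothetical Python illustration -- the signal names, weights, and scoring rule are invented for this example, not Google's real ranking: the same query, re-scored against each user's signals, returns a different first result for different people.

```python
def personalized_rank(results, signals, weights):
    """Hypothetical sketch of signal-based re-ranking, not Google's actual
    algorithm: nudge each result's base score by how well it matches the
    user's signals (location, browser, ...). All names here are invented."""
    def score(result):
        boost = sum(weights[name]
                    for name, value in signals.items()
                    if value in result.get("matches", {}).get(name, ()))
        return result["base_score"] + boost
    return sorted(results, key=score, reverse=True)

# The identical query, two users with different signals, two different
# front pages -- no "standard" result list.
results = [
    {"url": "example.com/egypt-protests", "base_score": 1.0,
     "matches": {"location": ("US",)}},
    {"url": "example.com/egypt-holidays", "base_score": 1.0,
     "matches": {"location": ("UK",)}},
]
weights = {"location": 0.5, "browser": 0.1}
scott = {"location": "US", "browser": "firefox"}
daniel = {"location": "UK", "browser": "safari"}
print(personalized_rank(results, scott, weights)[0]["url"])   # .../egypt-protests
print(personalized_rank(results, daniel, weights)[0]["url"])  # .../egypt-holidays
```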
And you know, the funny thing about this is that it's hard to see. You can't see how different your search results are from anyone else's. But a couple of weeks ago, I asked a bunch of friends to Google "Egypt" and to send me screenshots of what they got. So here's my friend Scott's screenshot. And here's my friend Daniel's screenshot. When you put them side by side, you don't even have to read the links to see how different these two pages are. But when you do read the links, it's really quite remarkable. Daniel didn't get anything about the protests in Egypt at all in his first page of Google results. Scott's results were full of them. And this was the big story of the day at that time. That's how different these results are becoming.
So it's not just Google and Facebook either. This is something that's sweeping the Web. There are a whole host of companies that are doing this kind of personalization. Yahoo News, the biggest news site on the Internet, is now personalized -- different people get different things. Huffington Post, the Washington Post, the New York Times -- all flirting with personalization in various ways. And this moves us very quickly toward a world in which the Internet is showing us what it thinks we want to see, but not necessarily what we need to see.
As Eric Schmidt said, "It will be very hard for people to watch or consume something that has not in some sense been tailored for them."

So I do think this is a problem. And I think, if you take all of these filters together, you take all these algorithms, you get what I call a filter bubble. And your filter bubble is your own personal, unique universe of information that you live in online. And what's in your filter bubble depends on who you are, and it depends on what you do. But the thing is that you don't decide what gets in. And more importantly, you don't actually see what gets edited out.
So one of the problems with the filter bubble was discovered by some researchers at Netflix. They were looking at the Netflix queues, and they noticed something kind of funny that a lot of us probably have noticed, which is that there are some movies that just sort of zip right up and out to our houses. They enter the queue, and they just zip right out. So "Iron Man" zips right out, and "Waiting for Superman" can wait for a really long time.

What they discovered was that in our Netflix queues there's this epic struggle going on between our future aspirational selves and our more impulsive present selves. You know, we all want to be someone who has watched "Rashomon," but right now we want to watch "Ace Ventura" for the fourth time.

(Laughter)

So the best editing gives us a bit of both. It gives us a little bit of Justin Bieber and a little bit of Afghanistan. It gives us some information vegetables; it gives us some information dessert.
And the challenge with these kinds of algorithmic filters, these personalized filters, is that, because they're mainly looking at what you click on first, they can throw off that balance. And instead of a balanced information diet, you can end up surrounded by information junk food.
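That imbalance is easy to reproduce. What follows is a small, hypothetical simulation -- the items, click probabilities, and ranking rule are all invented -- of a feed that ranks purely by observed click-through rate: the dessert gets clicked more, so after a few rounds the vegetables stop surfacing at all.

```python
import random

random.seed(0)

# Hypothetical catalog: (title, true probability you click it right now).
items = {"celebrity gossip": 0.30, "cat videos": 0.25,
         "foreign affairs": 0.05, "city budget report": 0.03}
shown = {t: 1 for t in items}    # start every item with one "free" impression
clicked = {t: 1 for t in items}  # and one click, so everything begins visible

for _ in range(500):
    # Rank purely by observed click-through rate and show only the top 2.
    feed = sorted(items, key=lambda t: clicked[t] / shown[t], reverse=True)[:2]
    for title in feed:
        shown[title] += 1
        if random.random() < items[title]:  # the impulsive present self clicks
            clicked[title] += 1

for title in sorted(items, key=lambda t: shown[t], reverse=True):
    print(f"{title:20s} shown {shown[title]:4d} times")
```

Nothing in that loop is malicious; the starvation of the low-click items falls straight out of optimizing for clicks alone.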
What this suggests is actually that we may have the story about the Internet wrong. In a broadcast society -- this is how the founding mythology goes -- in a broadcast society, there were these gatekeepers, the editors, and they controlled the flows of information. And along came the Internet and it swept them out of the way, and it allowed all of us to connect together, and it was awesome. But that's not actually what's happening right now. What we're seeing is more of a passing of the torch from human gatekeepers to algorithmic ones. And the thing is that the algorithms don't yet have the kind of embedded ethics that the editors did.
So if algorithms are going to curate the world for us, if they're going to decide what we get to see and what we don't get to see, then we need to make sure that they're not just keyed to relevance. We need to make sure that they also show us things that are uncomfortable or challenging or important -- this is what TED does -- other points of view. And the thing is, we've actually been here before as a society.
In 1915, it's not like newspapers were sweating a lot about their civic responsibilities. Then people noticed that they were doing something really important -- that, in fact, you couldn't have a functioning democracy if citizens didn't get a good flow of information; that the newspapers were critical, because they were acting as the filter; and then journalistic ethics developed. It wasn't perfect, but it got us through the last century. And so now, we're kind of back in 1915 on the Web. And we need the new gatekeepers to encode that kind of responsibility into the code that they're writing.
I know that there are a lot of people here from Facebook and from Google -- Larry and Sergey -- people who have helped build the Web as it is, and I'm grateful for that. But we really need you to make sure that these algorithms have encoded in them a sense of the public life, a sense of civic responsibility. We need you to make sure that they're transparent enough that we can see what the rules are that determine what gets through our filters. And we need you to give us some control so that we can decide what gets through and what doesn't.
Because I think we really need the Internet to be that thing that we all dreamed of it being. We need it to connect us all together. We need it to introduce us to new ideas and new people and different perspectives. And it's not going to do that if it leaves us all isolated in a Web of one.

Thank you.

(Applause)