Around five years ago, it struck me that I was losing the ability to engage with people who aren't like-minded. The idea of discussing hot-button issues with my fellow Americans was starting to give me more heartburn than the times that I engaged with suspected extremists overseas. It was starting to leave me feeling more embittered and frustrated. And so just like that, I shifted my entire focus from global national security threats to trying to understand what was causing this push towards extreme polarization at home.

As a former CIA officer and diplomat who spent years working on counterextremism issues, I started to fear that this was becoming a far greater threat to our democracy than any foreign adversary. And so I started digging in, and I started speaking out, which eventually led me to being hired at Facebook and ultimately brought me here today to continue warning you about how these platforms are manipulating and radicalizing so many of us, and to talk about how to reclaim our public square.
I was a foreign service officer in Kenya just a few years after the September 11 attacks, and I led what some call "hearts and minds" campaigns along the Somalia border. A big part of my job was to build trust with communities deemed the most susceptible to extremist messaging. I spent hours drinking tea with outspoken anti-Western clerics and even dialogued with some suspected terrorists. And while many of these engagements began with mutual suspicion, I don't recall any of them resulting in shouting or insults, and in some cases we even worked together on areas of mutual interest.

The most powerful tools we had were to simply listen, learn and build empathy. This is the essence of hearts and minds work, because what I found again and again is that what most people wanted was to feel heard, validated and respected. And I believe that's what most of us want.
So what I see happening online today is especially heartbreaking and a much harder problem to tackle. We are being manipulated by the current information ecosystem, which is entrenching so many of us so far into absolutism that compromise has become a dirty word. Because right now, social media companies like Facebook profit off of segmenting us and feeding us personalized content that both validates and exploits our biases. Their bottom line depends on provoking a strong emotion to keep us engaged, often incentivizing the most inflammatory and polarizing voices, to the point where finding common ground no longer feels possible.

And despite a growing chorus of people crying out for the platforms to change, it's clear they will not do enough on their own. So governments must define the responsibility for the real-world harms being caused by these business models and impose real costs on the damaging effects they're having on our public health, our public square and our democracy. But unfortunately, this won't happen in time for the US presidential election, so I am continuing to raise this alarm, because even if one day we do have strong rules in place, it will take all of us to fix this.
When I started shifting my focus from threats abroad to the breakdown in civil discourse at home, I wondered if we could repurpose some of these hearts and minds campaigns to help heal our divides. Our more than 200-year experiment with democracy works in large part because we are able to openly and passionately debate our ideas for the best solutions. But while I still deeply believe in the power of face-to-face civil discourse, it just cannot compete with the polarizing effects and scale of social media right now. The people who are sucked down these rabbit holes of social media outrage often seem far harder to break out of their ideological mindsets than those vulnerable communities I worked with ever were.

So when Facebook called me in 2018 and offered me this role heading its elections integrity operations for political advertising, I felt I had to say yes. I had no illusions that I would fix it all, but when offered the opportunity to help steer the ship in a better direction, I had to at least try.
I didn't work directly on polarization, but I did look at which issues were the most divisive in our society and therefore the most exploitable in elections interference efforts, which was Russia's tactic ahead of 2016. So I started by asking questions. I wanted to understand the underlying systemic issues that were allowing all of this to happen, in order to figure out how to fix it.
Now, I still do believe in the power of the internet to bring more voices to the table, but despite their stated goal of building community, the largest social media companies as currently constructed are antithetical to the concept of reasoned discourse. There's no way to reward listening, to encourage civil debate and to protect people who sincerely want to ask questions in a business where optimizing engagement and user growth are the two most important metrics for success. There's no incentive to help people slow down, to build in enough friction that people have to stop, recognize their emotional reaction to something, and question their own assumptions before engaging.

The unfortunate reality is: lies are more engaging online than truth, and salaciousness beats out wonky, fact-based reasoning in a world optimized for frictionless virality. As long as algorithms' goals are to keep us engaged, they will continue to feed us the poison that plays to our worst instincts and human weaknesses.
And yes, anger, mistrust, the culture of fear, hatred: none of this is new in America. But in recent years, social media has harnessed all of that and, as I see it, dramatically tipped the scales. And Facebook knows it. A recent "Wall Street Journal" article exposed an internal Facebook presentation from 2018 that specifically points to the company's own algorithms for growing extremist groups' presence on its platform and for polarizing its users. But keeping us engaged is how they make their money.
The modern information environment is crystallized around profiling us and then segmenting us into narrower and narrower categories to perfect this personalization process. We're then bombarded with information confirming our views, reinforcing our biases, and making us feel like we belong to something. These are the same tactics we would see terrorist recruiters using on vulnerable youth, albeit in smaller, more localized ways before social media, with the ultimate goal of changing their behavior.
Unfortunately, I was never empowered by Facebook to have an actual impact. In fact, on my second day, my title and job description were changed and I was cut out of decision-making meetings. My biggest efforts, trying to build plans to combat disinformation and voter suppression in political ads, were rejected. And so I lasted just shy of six months.

But here is my biggest takeaway from my time there. There are thousands of people at Facebook who are passionately working on a product that they truly believe makes the world a better place. But as long as the company continues to merely tinker around the margins of content policy and moderation, as opposed to considering how the entire machine is designed and monetized, it will never truly address how the platform is contributing to hatred, division and radicalization. And that's the one conversation I never heard happen during my time there, because that would require fundamentally accepting that the thing you built might not be the best thing for society, and agreeing to alter the entire product and profit model.
So what can we do about this? I'm not saying that social media bears the sole responsibility for the state that we're in today. Clearly, we have deep-seated societal issues that we need to solve. But Facebook's response, that it is just a mirror to society, is a convenient attempt to deflect any responsibility from the way their platform is amplifying harmful content and pushing some users towards extreme views.

And Facebook could, if they wanted to, fix some of this. They could stop amplifying and recommending the conspiracy theorists, the hate groups, the purveyors of disinformation and, yes, in some cases even our president. They could stop using the same personalization techniques to deliver political rhetoric that they use to sell us sneakers. They could retrain their algorithms to focus on a metric other than engagement, and they could build in guardrails to stop certain content from going viral before being reviewed. And they could do all of this without becoming what they call the arbiters of truth.

But they've made it clear that they will not go far enough to do the right thing without being forced to, and, to be frank, why should they? The markets keep rewarding them, and they're not breaking the law.
Because as it stands, there are no US laws compelling Facebook, or any social media company, to protect our public square, our democracy and even our elections. We have ceded the decision-making on what rules to write and what to enforce to the CEOs of for-profit internet companies. Is this what we want? A post-truth world where toxicity and tribalism trump bridge-building and consensus-seeking?

I do remain optimistic that we still have more in common with each other than the current media and online environment portray, and I do believe that having more perspectives surface makes for a more robust and inclusive democracy. But not the way it's happening right now.
And it bears emphasizing: I do not want to kill off these companies. I just want them held to a certain level of accountability, just like the rest of society. It is time for our governments to step up and do their jobs of protecting our citizenry. And while there isn't one magical piece of legislation that will fix this all, I do believe that governments can and must find the balance between protecting free speech and holding these platforms accountable for their effects on society. And they could do so in part by insisting on actual transparency around how these recommendation engines are working, around how the curation, amplification and targeting are happening.

You see, I want these companies held accountable not for whether an individual posts misinformation or extreme rhetoric, but for how their recommendation engines spread it, how their algorithms are steering people towards it, and how their tools are used to target people with it.
I tried to make change from within Facebook and failed, and so I've been using my voice again for the past few years to continue sounding this alarm and hopefully inspire more people to demand this accountability.

My message to you is simple: pressure your government representatives to step up and stop ceding our public square to for-profit interests. Help educate your friends and family about how they're being manipulated online. Push yourselves to engage with people who aren't like-minded. Make this issue a priority. We need a whole-society approach to fix this.
And my message to the leaders of my former employer Facebook is this: right now, people are using your tools exactly as they were designed, to sow hatred, division and distrust, and you're not just allowing it, you are enabling it. And yes, there are lots of great stories of positive things happening on your platform around the globe, but that doesn't make any of this OK. And it's only getting worse as we head into our election and, even more concerning, face our biggest potential crisis yet, if the results aren't trusted and if violence breaks out.

So when in 2021 you once again say, "We know we have to do better," I want you to remember this moment, because it's no longer just a few outlier voices. Civil rights leaders, academics, journalists, advertisers, your own employees are shouting from the rooftops