Civil Discourse in the Social Media Age #SkollWF 2017


Hey, everybody. Welcome.
We’re going to get started. I know it’s after lunch so I expect
more people will join us. But the clock’s ticking. We have
so much we want to talk about. So, welcome to the Future
of Media Session… Civil Discourse in the Social Media Age. My name is Manoush Zomorodi. I host a podcast from New York Public
Radio called ‘Note To Self’. Every week we talk about how technology
is changing the way we live, the way we work,
the way we communicate. And after the recent presidential
election in the US, we have certainly sharpened our focus on the capacity of online
platforms to be wonderful and absolutely horrible places
to get your information, to share stories and to be
with other human beings. So, today our goal
with this panel is to look into what responsibilities
these platforms have, what the next chapter
in digital literacy is, and what we can do to start getting
social media platforms to do better at bringing communities together
rather than pushing us apart. So, our three expert panelists
have done research into this. They’ve also built places for people to get
information, different ways of doing it. We’re going to chat for a bit and then of
course go to discussion and questions. So, it is my honor
to introduce Phil Howard who’s sitting right
here to my left. He’s a professor of Internet Studies
at the Oxford Internet Institute and a senior fellow at Balliol. He researches how digital
media can be used for both civic engagement and political
control around the world. And he’s been releasing some
really fascinating studies just over the last two weeks
that we’re going to get into. And then Eli Pariser
is right over there. He is the author of
‘The Filter Bubble’. Hard to believe this book
came out five years ago. And now… I mean we’re just talking
about everything that was in that book. Echo chambers… good stuff. So, Eli worked in community
and political organizing and in 2012 launched ‘Upworthy’
which is a news website that… now, this is how I describe it, Eli.
Tell me if I’m wrong. “Aims to get people sharing empathetic
stories about civic issues.” Fair? – Fair.
– Okay. Great. Thank you. And then Matthew Segal to his
left is the founder of ATTN:. This is a news website that aims to make
complex social issues palatable. Easy to understand, easy to take
action on for young people. Millions of millennials have shared
the site’s videos about gerrymandering, which I find absolutely amazing. It’s true. And Matthew told me he’s obsessed with
making political ideas go viral online. I know that this panel’s
called ‘Civil Discourse’ but we’ve all agreed that a civil conversation
makes for a very boring panel. So, hopefully there will be some
debate and maybe even an argument but we’ll all be friends afterwards. So, there’s this crazy statistic
which says that in 2016, over 60% of Americans got their news
through social networking sites. How many of you get your news
through social networking sites? Just out of curiosity. Okay. Definitely more
than 60% in this room. And that just reminds me. I totally
did not do what I was supposed to do, which was to tell you
to silence your phone and wait for the microphone
before you speak, and fill the session
survey out. Okay. That’s done. So, I want to start out
by asking you, Eli, what are the pros and
cons as you see it? Big question, and you
just got off a plane. But I feel that’s good almost. You can wax lyrical on getting most of our information
on a social media platform. So, let me start with the cons
and then we’ll move to the pros. So, you know, the challenge with
social media websites and I… let’s just do a little
taxonomy to start. When people say social media, there’s sort of Facebook and
there’s everything else. And Facebook is several
orders of magnitude larger in terms of the time people spend and
the number of people on the platform than something like Twitter
or Snapchat or whatever else. So, I think you kind of have to
separate out these phenomena. Facebook is… when
people are saying they’re getting news from
social media websites, most of them are talking
about Facebook. And, you know the
challenge is ultimately Facebook is a system that’s
optimizing for engagement. It’s optimizing for people
who spend time. And it’s relatively agnostic about
how you’re spending your time, but it’s looking to show you content
that you’re going to engage with. What the algorithm tends to do is to show
you things that it thinks you’ll like, which is in part a service. But it also means that
you end up seeing things that confirm what you already
believed to be true. And that was kind of the core
idea behind ‘The Filter Bubble’ as you know, increasingly
we have these algorithms. They’re not really… at the end of the day
they’re not serving us. They’re serving the interests
of these platforms. And the interests of the platforms are
to drive the advertiser eyeballs. So, there isn’t really an incentive
for Facebook to show us stuff that makes us feel doubtful, or bad about
ourselves or worried about, you know… Outrage sells better than all of
those [inaudible]. Outrage, love and hope, you know, other strong
emotions engage people. And you lose something there. I think the power of these… and there’s
many powers of these new media. But I think, you know, a key one is that they open up an opportunity
for new voices to be heard. It’s not as if
like the old media was so inclusive or great at lifting up
the voices of people across society. And actually when you look
at something like YouTube, when you look at
something like Facebook. On the one hand, you have
some really nasty people, but on the other hand you have
a lot of people who just like literally weren’t
allowed to speak who can now reach large audiences
and tell their stories. And that’s a potentially
transformative thing. So we could say Bernie Sanders may
not have existed as a candidate if we hadn’t had social media. Because it made it easier
to get to other people who may not have been taken
seriously by mainstream media. Yeah. Bernie Sanders, for sure. But I was thinking
even more of like… I met this woman recently who when she was 15 in Iran made this rap
video about ending child marriage and it went viral. And she
actually hadn’t told her parents. And it was this whole thing to
like explain to her parents that she had made a
viral rap video about how they shouldn’t sell
her into marriage. But that’s the kind of
person that I’m thinking of, that just would not have had
access to a million people ever in a previous media system. Beautiful thing. So, Phil, you did a study
that just came out which found that nearly a quarter of
the web content shared on Twitter by users in the key state of Michigan
in the days before the election was what you call in
the report junk news. What’s junk news and
what did you find, and why do you think
it’s significant? Well, we started that study because
of something Zuckerberg actually said when he said that only 1% of the
content on Facebook was junk. And it occurred to us that the key
issue isn’t the amount overall but its concentration. And so, if there was going to be
an impact of fake news on voters, we thought we’d find it in
Pennsylvania, Florida, Michigan. We took a close look at Michigan and we found a 1:1 ratio for every link to content produced
by a professional journalist, there was at least
one link to junk. And junk… well, it turned out to be very hard
to operationalize fake news as a concept. So we put together stuff that
was sensationalist, extremist, commentary masking as news. Quite a range of things. And so
that’s how we got the 1:1 ratio. If you add in the amount of links
to unverified Wikileaks content and to Russia Today
or Sputnik content, it’s almost half of what
Michigan voters were sharing, was stuff not produced by
professional journalists. Did that surprise you? I mean what do
you guys think when you hear that? Are you like, “Well, that’s
terrifying,” or, “Yeah. Of course.” To me it’s terrifying. We did the same study on Germany
around the same period of time. And there the ratio was 4:1. So four pieces of professionally produced
news content to one piece of junk. And my German collaborator
says that’s awful. So they have it
better in Germany. But they’re still really concerned about what’s happening to German public
life and political conversations. In the US, in the important swing
state where things were 50-50 right up until the last day. And I should also say that the
proportion of professional news reached its lowest amount the
day before the election. So these proportions
change over time. But the proportion of professionally
produced news content was at its lowest just the
day before everybody voted. Phil was telling me
before we started that they got totally hacked
after they put out this research which I thought was extremely
vindictive of them. So the website might be down at the
moment but I can send you a copy. So, Matthew, over at ATTN:. Are you finding that
when you are presenting— if you’re presenting a video about
something that’s controversial or a really important issue, what
is your tack to get people to— when there are so many things
playing to outrage and emotion, how do you go about it? Well, since we’re
talking about Facebook, you absolutely have to tap
into people’s emotions. Outrage, love, empathy,
excitement, inspiration. Those are all a lot
of the emotions that people will share or put stories
in front of their friends around. So, when we’re making a
video we often think, “What is the most visceral way to first get people into an issue
that otherwise might be boring?” And I think a lot of times people conflate
journalism with stories that are boring. Journalism doesn’t have to be boring. But in the world we live in, you
need to have a much quicker lead. You don’t have time
for slow play. So we think about
what are the images, what are the words that can
quickly pull people in? And then you can broaden and
contextualize as the video progresses. But you often have to find some kind
of emotion to pull people in with. And then often we leave them
with a call to action to share if they believe the issue
we’re covering should be spread. So, if they want
people to know about redistricting or gerrymandering
like you mentioned, we’ll say at the end, “Share this video if you think more
people should know about gerrymandering.” And that really tells people to, if they care about an issue, spread it to their friends. And I think that’s an important
part of how virality works today. I’m curious, though. Like there’s
this sense of fault lines, that there’s deeper
trenches between us. Is it… should we blame
social media for this, or is it just, you know, helping
nudge us in a direction that we were kind of
headed in anyway. And I guess I’m referring to both
the United States population but also Europe and actually
all kinds of places. I mean, I’ll just say like automation,
you can’t resist social media. It’s where things are going. So, to merely say we should not get
our news from social media is rather, in my opinion, Luddite. And
it doesn’t make any sense. You have to adapt the
product for social media. That said there’s
been a lot of talk… I was at a panel earlier
today about fake news. To a certain degree you have
to blame the electorate. The electorate… you know,
the burden is on the voter to look into the credibility of what
they’re reading from time to time. If journalists themselves can’t solve
fake news, and Facebook itself, which is a for-profit platform that’s
built on engagement like Eli said, can’t singlehandedly solve fake news. Yes, they’re starting to tag
articles with a red line that says, “This has been disputed.” But the electorate themselves have to look into the
veracity of what they read. And unfortunately that
burden does fall on people. So, it’s the hard truth but no one entity can stem the
spread of fake information. Eli, you agree with that? I have many thoughts. So, a couple of things. First off, I think part of the
reason that what you’re doing, what we’re trying
to do at Upworthy… you know, part of the
premise for that is that the audience for news
is much, much smaller than I think the folks in this
room typically would think. And there’s a study recently where they
actually installed a browser plug-in and watched just where
people went around online. I think it was about
15,000 people. Any guesses on the
percentage of people who visited more than one hard news
page in a three-month period? Oh, God. Don’t make us depressed. – Anyone? 4%.
– Oh, God. So, it’s not as if it’s a habit that is
evenly distributed across the electorate. That’s like part one. Part two is, and I think this is a really uncomfortable
thing to say in this conversation. And especially because
we don’t have someone I think from the right to
argue the other perspective, but this is a deeply
asymmetrical situation that we’re in in terms of the
nature of media on both sides. The nature of the
networks on both sides. And you can’t… and the consumption
patterns on both sides. So, what you see on the right
is you have a fairly small but extremely engaged
group of people who are pretty much only sharing fairly far-right information from Fox News
and Breitbart and other places like that. There’s no analogous
cluster on the left. It’s not that folks on the left don’t
share things that are left-wing more. But there isn’t that kind of
epistemic closure in the networks. It’s not a closed system. And so when we’re talking
about fake news, Brendan did this fascinating
study where he looked at the consumption. Same kind of design, like install a browser
plug-in and look at where people go. And he looked at which people
visit fact-checking pages, and which people visit
fake news pages. And you see this sort of beautiful
but somewhat dismaying curve, which is that the people who were
most exposed to fake news in the last 12 months are folks on the… strongly right folks. And the people who
were least exposed were people on the left
end of the spectrum. The people who visited fact-checking sites
the most were the people on the left, and the least the
people on the right. So, the consumption patterns, I think one of the
challenges we have is you have well-meaning progressives, and
I include myself in that category, trying to design solutions
the way that we think. And not actually
grappling with the fact that there’s a whole different
epistemology happening on the right. I mean, Phil, that’s
making me wonder. You’ve been studying
bot-generated Twitter traffic that was mostly pro-Trump. So does that sort of fill
the vacuum in a way? I mean, what role are they having in sort
of the political discourse right now? Well, I think that they’re
having enough of a role. And I’m not sure I would
either blame the electorate or sort of leave it to
differences in ideology. So, we think that bots had a role
in the US election in making Trump, 16 months ago, seem more
popular than he really was. They made him seem like somebody
your neighbor might quietly vote for even though they weren’t telling
pollsters that they’d vote for Trump. We think the automation was
behind his early popularity rise. And then we know the
automation was behind the stories about Clinton
in the last week. And so it’s true that most people most of
the time don’t use social media for news, but they do during elections. Especially
those last three days. In fact… – Phil, did you link that to Russia?
– Sorry? Did you link that to Russia?
Those bots? No. We’ve given up on trying to
work with the geotag data on bots. We’re still looking
into the sourcing, but it’s difficult to know
where they came from. So you don’t know where
they originated? No, we don’t know where
they originated. We know that they appeared slightly
before the 2016 campaign season began. We know that they occasionally tweet
things about Russian domestic politics that you wouldn’t expect a large number
of Trump voters to be keeping tabs on. But I don’t have
any data on that. So, they made Trump seem more
popular when he was largely a joke, right, compared to the other
Republican contenders. And they made the stories about
Hillary Clinton’s corruption stay in our feeds even after the FBI
said there was nothing to see here. So, I think it’s definitely true that
the electorate is changing. Millennials get their news in
very interesting, different ways. But there’s a very proximate cause here
of misinformation, and that is Facebook. Facebook served
misinformation to voters also during the Brexit
debates here in the UK. Facebook served bad
news… misinformation in the days before voters
had to make a big decision. And so, I think, definitely,
the economics of news is changing. There’s a lot of big
picture things. Higher education maybe isn’t
delivering critical thinking skills. There’s a lot of big picture
things we could say. But the proximate cause is that
Facebook is the platform for bad info. I want to read you a quote.
This is good. We’re getting somewhere. Okay. So, News Corp’s CEO yesterday had an op-ed
in The Wall Street Journal and he said, “Google and Facebook have created a dysfunctional and socially
destructive information ecosystem.” – Of course he’d say that.
– Yeah. He’s one to talk. Do you want to take
that first or…? Go for it. Okay. So, first of all, I think
it’s really easy to blame Facebook. I don’t work for Facebook but I’ll
take the role as their defender here. We did by the way invite
them to take part in this. It’s true. We did. Yeah. I think that fake news has always existed. There’s always been
propaganda newspapers. It’s always been spread. Often, historically, it was spread
through word of mouth. But the whole premise
of Facebook is putting things that you agree
with in front of your friends. So, if anything, fake news is a projection of people’s political identities on the
right that they’re sharing. And those people probably
supported Donald Trump anyway. And did they really change
a lot of hearts and minds if especially Eli’s right and we
live in these filter bubbles. Fake news can also be spread
through Google, through Wikipedia, through Twitter, through
Snapchat, through Instagram. I mean it’s not just
one platform to blame. But the fact is the only
way to really stop it, and I think Facebook is
making some good efforts to tag articles when they’re
strongly disputed, is for voters and citizens to
actually do the fact-checking and research the source
that they’re reading from. You know, if you’re getting your news
from a website called patriotnews.usa, you might want to think twice. – But there’s…
– But that’s… And I think that there’s too much blame on fake news for
Donald Trump’s win, in my opinion, and not enough of an earnest empathy to
understand where his voting base came from. Eli, where were you going with that… And I think that’s actually why I
frankly reject a lot of these premises. I mean, I think… so let’s take the fake news
and the election piece first. I think—on one hand like anything
could have swayed this election. Media focus on Clinton’s emails,
Comey’s announcement two weeks before. – You know, weather and—
– Anything Russian… sorry. I mean, it’s literally like
because there was a small margin, you can attribute it to almost
anything and be relatively right. I think Trump’s base and even
the swing voters are older, non-college educated men, who are not your standard
Twitter users at all. And are not even especially your
Facebook users, which skews female. So, I think the premise that this
was kind of the primary factor as opposed to like deeper struggles
over national identity and race. I think when you zoom
out and you look back, we’re not going to say this
was about social media. We’re going to say this was about some
much bigger things that were happening. But what’s interesting is, and I’m working on how to say this right, but it’s not
coincidental I think that you have this very
big set of questions playing out about national identity. At the same time that you have
these mediums which, as you said, are primarily vehicles for the expression
of identity and the creation of identity. And those things interact
and intersect in weird ways that we don’t totally
understand yet. But it’s definitely the case that, you know, when people share things, it’s
not really for the purpose of informing. It’s for the purpose primarily
of shaping how you’re viewed. And the sort of crazy place that we’re getting ourselves
into as a society is that, you know, you have the primary
form of media distribution. What’s becoming the primary
form of media distribution not really based on any kind of
sense of like is this right, but on a sense of just is this fun,
or does it say something about me. That’s where more and
more people are getting their information. So, that’s a problem because
it’s not even trying to align the question of what
people need to know with the question of what
they’re actually seeing. And it leaves people in— you know I always think about that thing
that people have on their Twitter bios, like ‘retweets don’t
constitute endorsement. That’s kind of a problem if retweets are
the way that people get their information. Who does have responsibility then for
the distribution of information? So I don’t want to let the
platforms off the hook. Facebook is the biggest media company
in the history of civilization, except it doesn’t think it’s a media
company, and that’s a problem. But also, as you said— It hasn’t admitted
it’s a media company. Yeah, no. But it is the biggest
aggregator of human attention that we’ve ever seen, you know. – And, yeah—
– So … So, to me where the problem
starts to be is when fake news turns into hate
speech essentially. Because one of my new favourite
terms is the ‘Overton Window’. Do you guys know this term?
Oh, I love it. It refers to what’s okay
to talk about in public without being considered
inappropriate. So, it’s like it used to be completely
wrong to be outright racist in America and now, you know, at
least it was on the DL. And now it’s okay. And the more people say it’s
okay in certain pockets, the more people are emboldened
for that to spread. And so—Eli, actually you’ve called
this a crisis of authority, which I find really interesting. And you’ve been
studying this as well, because you’re looking at the elections
happening here in Europe, right? So, like where does the fake news thing
turn into a question of hate speech? And, of course, the law in
Europe is very different when it comes to that than
it is in the United States. It is, and political speech gets a lot
of protections in the United States. And here in Europe there tends to
be a little bit more public policy— public oversight for
political speech. And people are much more
concerned about hate speech because this is a direct
memory of World War 2. So, I think in Europe anyway, the instinct is not to leave it to
voters— the voters’ sophistication. And I think the policy makers
I’m talking to would agree that we’re past the point that industry
self-regulation is the solution here. The options that we’re talking
about, and I’m not making these up, these are what’s on the table,
are 20,000 euros per post for each piece of fake news that’s
delivered to a voter as a fine, right, from the German
government. A fine. Yeah. 20,000 euros per post. Another option is algorithmic audits. So, we audit scrambling machines, we
audit finance trading algorithms. Facebook should be
audited in a regular way that doesn’t violate IP,
intellectual property, but does let us understand what
they’re doing to our news diets. And then, you know, the other options
are studied, and continued study. And Facebook doesn’t
collaborate with researchers. They don’t share data. So, in theory, they could
actually tell us all these stuff and explain how bots might have
had driven some news stories. But they haven’t or won’t. Let me—can I just underscore it? It is crazy that Facebook, with two point whatever billion users and this huge aggregation
of people’s time, is completely
opaque to research. Like you literally can’t
begin to research what is going on in the world’s largest
forum for media distribution. And it’s an enormous
problem because— and partly because—and the
research you’re doing is awesome, but all the researchers are
looking at Twitter saying, “Well, hopefully this has some bearing
on what’s going on at Facebook.” But we really have no idea. So, I mean I think
someone has to take on figuring out how to ensure that
Facebook opens up that data so that it can be studied because we’ve never had an institution
that big that’s that opaque. – Can I jump in on the research?
– Yes. So, what’s interesting about
this is that we do know that Facebook employs data scientists
and does occasionally do research. Little tweaks to the system
that they have demonstrated would bring hundreds of thousands
of people out to vote. Little experiments to see if
they could increase the vote, and they can tell it works. They do little
experiments, emotional contagion studies that demonstrate they
could do much more, but that’s all in-house
user experience stuff for a platform that is probably the
source of our public life at the moment. Can we go back to the crisis of
authority because I think like— so I think the challenge
in this conversation is— so after the election I was thinking
about the fake news problem and I started
writing a few ideas of how you just could design ways
to downrank or filter fake news. And I tweeted out a
link to this Google doc that I was working in and within hours
there were hundreds of people in it. It is now literally 160
pages or something, with thousands of contributors
including a lot of Facebook engineers,
Google engineers, and other folks who are actively working on it. And I’ll tweet it out afterwards. It’s [inaudible]. I had
almost nothing to do with it other than just happening
to tweet out the document, but people came together
and collaborated. And what was interesting in the
document and about the document is there was this tension between
sort of journalistic folks, who were essentially saying
the filtering mechanism should be you have a White
House press corps badge and that gets you
some little special gold star next to your name in the
Facebook algorithm, and you go up. And then you have people who were looking
for these decentralized solutions. And this conversation
about fake news to me reads as a kind of
proxy war between old media who doesn’t want to let
go of its authority and wants to find ways to instantiate
that in the new platforms, and these new forms of
authority that we’re seeing. Like the Google doc itself, which was
thousands of anonymous contributors – I have no idea who they were. But they were contributing and
collecting a lot of great ideas. And the Google and
Facebook engineers wouldn’t have been able to
contribute if their name had— if they’d had to credential
themselves, right? So, where we are is
this complicated place where I think the old systems of authority
are problematic and didn’t let people speak, and were corrupt in some
essential ways, the old media, by their advertisers, by their
connections to folks in power. But we don’t have a
way of thinking about what this new system of authority
should—who should have authority, who should be trusted
in this platform world. Okay. So, this is perfect. You guys are the best. 11
minutes till we go to Q&A. And like to me then that says— okay, so whose
responsibility is it? Let’s switch then to what
solutions you are finding. So, Matthew, you’re saying
that it’s up to normal people to figure out what’s real, what’s not
real, how can you trust this source. So, is that your job then, as somebody
making media, to educate people? Yeah. I mean, I think it’s
up to more than just people. I think all the
people certainly bear a significant portion
of the responsibility to discern what’s real and
what’s fake, in my opinion. But how do they learn
to do that, though? Well, I think that’s where credible
media companies have to come in. You know, we have to correct the record and consistently bust myths that are
not true. And that’s a big part of
our editorial each day. When we have our editorial
meeting we often brainstorm, “What are the myths that are floating
around on any issue being debated?” Issues around the budget, issues
around social justice topics. Issues around war and peace. And then you take those
misconceptions and you debunk them. And then once people say, “Wait.
This is based in fact, evidence, research, original quotes,
expert testimonial,” then they can deem you and categorize
you as one of the credible outlets. And, you know, when your
peers also review you— no different than
academia— as credible, then you have a reputation, and
then you maintain that reputation. So, that’s the way we
look at spreading truth. But I also think some of the solutions are
just to be a premium editorial outlet when you’re programming
on social platforms, and to really take your time and to
optimize your product for those platforms in a way that’s made
especially for them, so that people see something in their news feed that
actually looks real, as opposed to things that are
just sort of pushed through RSS. Okay. So, how do we do that then?
Because it’s interesting to me— there are three I think financial
models here on the stage, right? We’ve got the academic. Are you
[inaudible]? What… ? – Yeah.
– Okay. So, non-profit. – No, for-profit.
– For-profit but with a— Social mission. Social mission. Okay. Yeah.
And then you’re— We’re also for-profit
with a social mission. Okay. Is that how it has to work?
I mean because— Well, I think any good for-profit
today should have a social mission. It’s that simple, I mean. But that doesn’t mean that you
should not want to do your part. Speaking to the right
crowd for this. But I think that said, you have
to monetize to stay in business. I mean, look, Laurie’s here.
I saw her speak earlier, and she can testify to how hard it is to
raise philanthropic dollars for media. I think philanthropy in journalism
is incredibly important but I don’t think it’s going to
necessarily save the entire industry. I think— It’s a hundred million
dollars more, apparently, as of today, I think,
from the media. But you know how much Bill
O’Reilly’s show makes for Fox News in two years?
$460 million, right. And that’s just one show
on one cable channel. – And I think the idea that Facebook should—
– Thanks. But the point is you have to monetize your
editorial in a way to sustain yourself. And you have to be entrepreneurial
in how to go about that, and appeal to brands who
want to reach your consumer. But also not let the brands
compromise your editorial. And that’s been the
truth for decades. Perhaps that’s been the truth for
decades but not for the longer term. I think we often forget that this thing we call journalism started
with public investment. It was public money that
sustained, and set the rules, and allowed a culture of
professional journalism to kick in. I just want to say I think the— to me the idea that Facebook should
turn to civil society groups to do fact checking of content
on Facebook is inane. I’m totally grateful—the big money
coming in from the media makes sense and is a sensible investment, but why should the media be paying for
fact check—supporting civil society groups – They’d be working
for Facebook. Yeah. – to do the fact checking for
the content that’s on Facebook? You know, all the things
that could happen. Facebook could do some fact
checking with this. Like— I mean, I happen to agree. And I
think I find that with our show, a lot of teachers use it
as a tool in the classroom because it is that fundamental that this
is the next chapter in media literacy. And there are no resources
out there to get kids to the point where they know
whether something is reputable, or true, or even how to find
out, or what it even looks like. And so, I mean I think we
have to go so many steps back to the very, very basics
of what is a fact, period. But I think you need
media literacy for the media, and for students, right. Because I think one of the challenges
with public interest journalism is that it has been able
to presume an audience. You know, if you write the best
investigative piece at The New York Times, you get on the front page and you get seen
even if it’s not a super engaging piece. And that’s going away. And that has
to compete in the social news feed with someone twerking and whatever
other distractions are happening, and someone’s update
about their family. And we may not like that, but that is
the place where this is all going down. And so I do think—to
what you said, like you both need consumers
and the audience and citizens to take and accept responsibility
for their rebroadcast power and their creation power. And you need to be talking about
things that really matter in a way that works
for that medium. And I think for Upworthy, you know, we look
at pieces of that. One is about using kind of
emotional human storytelling to build empathy across groups. So how do we break down
some of these divisions? We know from research that one
of the most powerful ways to break down stereotypes
and implicit bias is simply to actually spend a little
bit of time in someone else’s world with who they really are. And when you look at—you
know, we have a video about a young Muslim woman who is
teaching a self-defense class. And you can just see
how the specific, concrete facts of this woman’s
life run totally counter to the sort of mythology around
what it means to be a Muslim. It just breaks through all of
that because she’s a person, and you root for her,
and you care about her. So, I think that
building empathy— you know, I think we think about the purpose of the media a lot of
the time in terms of information. But I think if you look at the
founders of the United States, they were equally concerned
about factionalism and rival tribes
going at each other. And that doesn’t happen
through information. It happens through a
shared sense of identity, a shared sense of purpose and
belonging, and an understanding that other people’s interests might be
your own interests in some bigger way. So I think these media can be
really extraordinary tools for telling those kinds of stories,
and actually building a sense of cross-group understanding. Okay. So, we’re going to questions.
I have one more. So, have your question ready. I think
there’s a microphone floating around. And if you want to get it
ready, we’ll go there. While that gets started I’m just
going to ask one final question. From my perspective, like
you wake up this morning and everybody knows about what
happened in Syria, and it’s horrible. And how do we then—it’s great.
We all know, but then I think a lot of people
in this room would like to know but then how do you get people to take
that empathy that you’re talking about, or the knowledge that they have
and then actually be invested and do something,
and make change. Well, one of the powerful
parts of social media, for some of the bad
press it’s getting, is that you can immediately and rapidly
start campaigns to raise money. We just covered a campaign around
the famine and hunger in Somalia, that was actually started
by a social media star—I guess you could call him that—he
was originally a Viner. He had a big Vine following.
His name is Jerome Jarre. And he raised two and a half million
dollars by tweeting at Turkish Airlines, which had the only direct
flight to Somalia, to ship cargoes of
food a few weeks ago. And it was an incredibly powerful
campaign. He started it. He created a hashtag. He got a bunch of
individuals and celebrities to retweet it. The next thing you know, Turkish
Airlines signs on board, and two and a half million dollars is
donated through a GoFundMe campaign. That could not have happened as
quickly—certainly as quickly, and probably as
easily—15, 20 years ago. It would have needed
the Ad Council to do a whole campaign on television
networks, and that is a huge bureaucracy. And so, social media has cut down that
kind of bureaucracy to do social activism. And you’ve seen it with Syria,
you’ve seen it with Somalia, you’ve seen it with disaster
relief in the United States. I totally agree with
that and I think one of the most powerful
disinhibitors to both action by actual engagement
with these kind of stories is despair and a feeling
of hopelessness. And I think people are actually rationally
looking at Syria’s story and saying, “I am not going to spend emotional
time and energy on that because there’s nothing I can do.”
Like if that’s what they believe, that there’s nothing to be
done, then they look away. And there’s some
fascinating research about this, about empathic withdrawal when we
see things that we despair about. And I think frankly—and I say this
as a former advocate—I move on. And as someone who cares a lot
about all these movements, I think people often feel
despair and hopeless when they think
about these things. And so, with Upworthy, you
know sometimes people say, “You’re being Pollyannaish,” or, “It’s
all this hopey, changey stuff.” And— How’s that going? I deeply believe in it.
You have to have hope. You have to believe that change is
possible, or you’ll never make change. And so, if you can give that to people
in little bursts, and show them how they can actually
participate in these things, and show them that
there are solutions, that is the beginning of
actually making change. And cover positive news, you know.
We cover good news occasionally. You know, there are good
things happening in the world, and we go out of our
way to amplify them. And by the way, they’re some of the
most shared articles or videos we make. Great. Well, let’s
go to questions. We have someone right here. I think there’s two microphones.
Right there in the front. She’s coming. You have to wait,
otherwise I get in trouble. Okay. Thank you. Thanks so much. Julia
Davies with DataKind. So, I’m curious – we’ve
just put out an open call for projects to do predictive
technology on issues in journalism. Combating fake news, combating
hate speech online. What do you think—you know, we
hear a lot about the bad bots, and all the things that automation
is going to do to ruin the world. What do you think are really
ripe opportunities for predictive technology
to actually make the world better in
this particular context? So, this is a great question because
we should talk about the good bots. The ones that make corrections
on Wikipedia, right, whenever a politician tries
to edit their own page. There’s some great examples of
computation journalism, right. Journalists doing creative work with
big datasets to break new stories. And I think the—so, one of the
things we’re going to work on next is the use of machine learning
to generate the content for manipulating public opinion. That’s the dark side of what
perhaps you’re describing, using the machine learning
algorithms or basic AI to get good information
on human rights abuses, to manage the drones that deliver
supplies in the case of a crisis. So, in my experience I just want to
echo what my co-panelists have said, that I think that activists,
democracy advocates, are always more creative and
more desperate than dictators. I think we’re at a sensitive moment,
where the dictators have learned— political elites have
learned some of the tricks. And the cycle of innovation
now sort of falls back on us as civil society actors
to get creative again. Start to work with the
internet of things, watch the privacy regulations, and you know, participate in
information policy making because that’s the one domain of policy
making that will affect all the others. Like, will ruin all the others. I’ll also just say—I mean, I think if you’ve been following
Facebook closely in the last few years, they’re very tuned into what
works outside of Facebook and copying Snapchat
features left and right. I think that’s an
entrepreneurial opportunity for the kind of people who
are working in this space on the public interest side, too. I see no reason why we shouldn’t see adoption
by these platforms of new methods and
ways of doing things that are pioneered and
described outside. On the one hand, that may be an
intellectual property problem. But on the other hand, I
think like with Facebook, you can’t overstate the
degree to which—sort of like, you know, someone woke up and found
themselves president of a small country that they didn’t want
to be or expect to be. They own the media and that
wasn’t even their thing. And so I think a lot of
what you see is just like coming to grips with an immense
amount of power that they’re not— they need ideas. And I think these ideas can come
from folks in this audience, and could be really powerful. I’d love to give a small
case study if I may, which is we did with our
listeners, 30,000 of them. Speaking of good bots. We decided to do a week where
you were your own filter, and see what sort of focus you
could—what you could achieve if you decided to filter
the news or whatever. We [inaudible]
information goal. And then for a week we had bots tweet
you to check how you are doing, to keep you sort of on track, and
to measure how well you did. The project had a really silly name.
It was called Infomagical. But the results were really
quite extraordinary. And one of those times, the
bots would send you, you know, “Do you want to talk to Manoush?”
And if you texted back yes, your phone would ring,
and it was me saying, “Hey, it’s Manoush. You
can leave me a message.” And I got two and a half
thousand voice mails that day. I listened to
almost all of them. And so I think there is the sense
like the individual—I really did, because people are so generous, and
they tell you so many amazing things. And this was like a way to be in
these people’s lives in one day, and telling me what mattered to
them, and what they had taken in, and how it had
changed their life. And I think this idea of going back
and forth with stories as well is incredibly powerful. And they were so upset
when the week was over. Because they were like, “But I need you. I need you to keep me on track because otherwise I’m looking
at couches on Pinterest for hours and hours,” you know. So, I think there’s really
small things we can do too that can make a very
big difference. And for me especially
with younger people, who I think are not
learning self-regulation when it comes to a lot of these stuff,
and there’s an opportunity there. I don’t know if anybody
else wants to— Is there another question?
Right here in the front. Right here. I have a— Oh, sorry. Wait. We’ve
got a mic here. Can we run the mic up there for
the next one? Oh, you have a mic. Duelling microphones.
Okay. Back. Go. Okay. Debra Dunn from
Stanford University. So, there have been
several studies, and in my jetlagged state I’m not going to remember
what the sources are— We’re with you. but I don’t think
it’s fake news, saying that because so many people
are consuming their news online, it’s radically changing the way
people read news and consume news, and leading to a more superficial
level of looking at news. Just looking at headlines, etc. This isn’t something
that you commented on, but clearly changes the
ability that people have to have a substantive
discourse around things as opposed to just babbling
over the headlines. So, I’m curious about
whether in your work, you see this as one of the significant
issues that needs to be addressed, or not so much. Matthew, do you want
to take that one? I think the fact
that your company is called Attention
is kind of perfect for— Yeah. Sure. Well, I mean, I
haven’t seen that study. I’m sure there’s
some truth to it. But candidly when I
read a hard-copy newspaper, I still scroll through and read the
articles where the headlines interest me. So, I think, you know, there’s still a natural filtering
that happens in headline competition. And editors have always tried
to come up with punchy, at times, horrible pun-like
headlines that you see. But I do think
there’s more competition just because like Eli said, you’re competing with family
photos and Miley Cyrus twerking. So, you have to be really clever and attention grabbing with
how you appeal to people. And we have conducted some research
as to what types of headlines work, and also what types
of images work. And I think you have to tell
people what they’re going to get. You have to promise them that they’re going
to get some kind of important information, some kind of emotion, or some
kind of necessary scoop, so that they can be in the know. And that’s what we focus on. We try to market our story accurately
to say what’s actually in it, but bring people to it in a way where we’re giving them
some kind of gratification after they’re finished
consuming it. Do you want to add to that? I also think— One is, David Carr of The New York
Times used to tell the story about being at an alternative weekly, and all of the reporters would fight
over who got to have the cover story. And when the newspaper
went online, they realized that
actually nobody reads the cover story, and everyone only reads
the astrology column in the back. And so, there’s a way in which I think
that’s partly what’s going on here is we’re just seeing with
data what was already true. Which is that people don’t engage
as much as you would hope. I think the other
piece, though is like so the power of stories is
that they build schemas. They build like mental
scaffolding for how to think about an issue
and the concepts around an issue. And one of the challenges in this area,
like if you want people to engage deeply, then you have to build this
kind of fluency with the area, and you have to build the
schemas from the bottom up, or it’s just going to
be too hard going. And so I actually think starting with something that is a fairly
simple emotional resonance schema gives you a foothold to then
build complexity around. Whereas if you go
directly to complexity, I think it’s often very challenging. Like imagine—when you think
about foreign journalism– when you know the characters in a country, and you know who they are and what
they’re doing with each other then it’s interesting
to read about. But the first time you drop into what’s
going on in the politics of the Congo, without that background it’s very
hard to see what is happening and what the story even is. So I just think those building
blocks, those schema,
are a really important piece of the process.
That’s an excellent point. Just to be clear. The suggestion in the
study was not just about you have to compete to
get people to read the story, but that not many people are
actually reading stories, period. That when they look at the data—
I can speak to that. A lot of what was stopping it— So, one researcher who’s doing amazing
work is Maryanne Wolf at Tufts University, and she’s looking a lot at the neuroscience
behind deep reading versus skimming, and this idea that it is
indeed a muscle that like, you know, if you don’t use your deep
reading skills they do indeed go away. She did this experiment on herself.
Like she tried to sit down and read her ‘The Glass Bead Game’ by Hermann Hesse,
and could not get through it. And so she sort of
powered through, and it took her two weeks to be able
to actually sit down and read a book. I mean, I certainly
see that in myself, that when I try to sit
down and read a book, it is really hard if
I don’t stick to it. So, what she’s indicating is that that’s actually another type of skill
that needs to be taught to younger people. That there’s the deep reading
skills versus the skimming skills and that you need to have both. I mean, even if you
don’t want to read some diatribe about a certain
Kurdish faction, you are going to need to read insurance
papers, or when you buy property. You have to be able to do—sit
and do this deep reading. So, I think there’s an interdisciplinary
sort of interesting approach to how we take in information as well. And I think the head of the
Institute for the Future was talking about that this
morning—this neuroscientific look at what it means to present
information and news. I think the findings
sound right to me and I also think that
this is the moment where we can start to
experiment a little bit not with how news
consumption happens, but how Facebook is
integrated in public life. The exit polling system in
most democracies is broken. It hasn’t worked in the
United States in 10 years. It hasn’t worked in the
UK for five, at least. Could Facebook have helped us do
some kind of exit polling system? There’s a lot of exciting
experimental deliberative polling, deliberative democracy
exercises that let people who actually are
interested in an issue
that could generate a voter’s guide, or could generate a public
policy recommendation. And social media is probably
going to be the best hope for those supporting those kinds
of institutional innovations. Just today, by way of
a kind of factual question: what country in the world will
ever hold another referendum? What government will ever
run another referendum? I would say referenda are
not likely to be popular modes of governance for
the next few years. But experimenting
with social media could get the people who are
interested in those news stories to generate trustworthy
findings—outcomes for the rest of us. To the good book problem, I think
this is actually a thing that— it feels like a problem that would
be hard for platforms to solve. But I actually think
it’s an interestingly solvable problem in a certain way. Which is what personalization generally
focuses on is prospective interest. Like, am I clicking on the headline?
Am I viewing this video now? What it doesn’t focus on is a month
later, two months later. Like, did it leave me with anything?
Do I care? And when you think back to
your peak media experiences, the place where you consumed something
that really changed your life, or changed how you thought
about something that you loved and that you would want
anyone else to have, that set of content looks really different from the set
of content that you’re served, if it’s just about what
you’re clicking right now. That’s a measurable thing. Like you can go back to
people and say, “Okay. Here’s 20 things you saw
on your Facebook feed. Put a star by the ones that now
a month later you care about, or you were glad
you saw, or you, you know, they’ve built
some kind of value for you.” And I think that could be
a powerful intervention. Part of what we’re
dealing with here is a war between our impulsive
selves and our aspirational selves. And our aspirational selves
could use a little help. Because it’s not like it’s fake. It’s not like– what some of the
data scientists would say is like, “Oh, yeah. People say that they want to
watch [inaudible] movies, but actually it’s all Dawn of
the Waking Dead 4 or whatever.” But that’s not really true
because in our lives, it was powerful to read that
Hermann Hesse book. It did matter, and actually mattered a lot more than
the 19 cellphone reviews that I read even though I’m never going to
have anything but an iPhone. That is true. I think like there’s
interesting startups like Pocket. Do you guys use Pocket? – Yeah. – which
is an app that saves your stuff. I try to save—I saved, and it kind
of made me so happy it’s so stupid, but remember when they named that
boat Boaty McBoatface or whatever? I don’t know why
that made me so happy. So I stuck it in my Pocket along
with like this really long deep article from The New
Yorker that I really must read. And actually then I read it, and it was
amazing. But they’re next to each other. Boaty McBoatface and the
long New Yorker article. But I’m trying to make space for the
things that really have moved me, and I think startups like Pocket can
be helpful with that. Your question? Hi. My name’s Meighan Stone. I spent the last three years
working with Malala Yousafzai. Right now I’m at Harvard at the
Kennedy School with Zack Exley. Actually at the Shorenstein Center.
I’m looking at the intersection between social media and the Muslim
ban policy in the United States. I’m working with MIT in the media
lab on it. But my question is, there’s a lot of hand-wringing at
Harvard and here and everywhere about Russian bots and
about what happened. And I’m wondering if you
have any advice to people who are actively organizing resistance
efforts in the United States and otherwise, like how to use
these tools better. My two friends who started a group called
Indivisible, they stumbled into it. They had an open source Google Doc and
now this has become 5,800 groups. So as much as we
see Russian bots, we see 5,800 independently
organized chapters of Americans pushing back on the Trump agenda with no resources and
no real apparatus. So, what kind of advice would you give
to these groups that are organizing, now knowing what you know about
how to do that well today? So, I’ll give a tactical answer
and a bigger answer. Tactically, I think you can’t underestimate the power of
simply explaining to people something that plausibly seems like it
will actually make a difference. And this is the sort of like let’s treat people who can potentially
be engaged as somewhat rational adults who are looking at your petition or
whatever, my petition, and saying, “That just doesn’t—it seems like bullshit.
It’s not going to do anything.” What was powerful about
Indivisible was that you read their
manual and you go, “Oh, yeah. This makes sense.
It’s doable. I can do that.” And I truly believe that
people are not apathetic. They’re just trying to figure out “Where can I put my energy and
get something to change?” And so I think anywhere
that you can show that theory of change in a
really clear, crystalline way, you often find that people
are really willing to do it. If you can tie that to
an emotional story, which is what we had
with the Muslim ban, and even more particularly
to individual people stories then it’s even more powerful. And I think part of the
power of Black Lives Matter was rooted in the stories of these
individuals whose lives you get to know, and whose deaths
you get to know. That combined with the theory
of change that makes sense, that’s very powerful. The broader thing I
would say is, one of the most powerful needs and
forces is the need to belong. And I think that’s
what’s animating actually a lot of the bad stuff out
there, as well as the good stuff. And I think building movements and
organizations and media companies that give people a sense
of belonging to something and having an identity
that they’re proud of, is just an incredibly important piece
of how we get out of this mess. Another positive thing of
Facebook by the way is the event. I mean, after the travel ban people
were basically planning a Saturday or Sunday protest at pretty much every
major airport across the country. Those weren’t organized by
some major Washington DC non-profit that has a national arm
that used its local affiliates. Those were organized by a random organizer
in LA. A random organizer in Seattle. And then they grew to having
8,000, 12,000, 15,000 RSVPs, to the point where people
actually showed up. And the women’s march actually
started as a Facebook group. I interviewed one of the co-founders
of the women’s march—co-organisers. And she said, “While the campaign was
happening we created this Facebook group. And all of a sudden
250,000 people RSVP’d.” So, I mean another positive benefit
of Facebook even though it does have its negative externalities, as I
think they’d say in academia, is that you can organize events
and quickly get people together in a really powerful
and impactful way. Phil, do you mind just commenting
on this idea of building good bots to combat bad bots? Like
is that a viable tactic? I think—so we once ran this experiment
where we tried to write bots that would convince somebody who
either thought the Earth was flat, that you shouldn’t
inoculate your kids, or that white supremacy
was the way to go. And we tried to write bots to
change their minds about things. And we had links to science, and we had
links to policy papers, things like that. And we had to stop this—I know, it
sounds funny. Links to policy papers. We had to stop because our bots got
locked into a debate with the bots… the anti-inoculation people had their bots,
and their bots found our bots. And our research was funded by
the National Science Foundation so that was a problem too. So,
I want to say no, don’t do it. But on the other hand, the people who are arguing the other
side of the issue are doing it. And even if we
stopped talking about Trump— I think Trump and Brexit were mistakes
based on misinformed public opinion. But let’s put those aside. There
are other issues to do with what I think of as facts and
science like climate change, or the link between
smoking and cancer. Some things that we’ve held as truth for a
long time that are starting to whittle away because of some very successful
automated social media campaigns that get people to think, “Well,
it’s not every form of cancer. It’s just some of the cancers,
and the link isn’t quite clear, and climate change is—
do we really know?” And these are
driven by issue-specific lobby groups who want to take down green technology,
or want to put up coal, clean coal. There’s a range of issues that
have their bots behind them. So I guess I would say you need
a bot strategy of some kind. Are arguments between people more or
less civil than arguments between bots? Between bots. That’s a good question. And what
came first? Yeah, what came first. Anne-Marie. Right
here in the front. Anne-Marie Slaughter,
president and CEO of New America. So I want to underline one thing
I’m taking away from this panel and then ask a question. This idea of the tension
between our impulsive and aspirational selves is
very important. I mean, I look at my son trying to block
social media when he’s working. And there are various apps you can get
that work. But I think about what
if you had a Newsbit? You know, my Fitbit means I walk
10,000, 12,000, 13,000 steps. If you do something like that
to get news from a wider range, it seems to me that that
would be quite helpful. The question that I want to ask Civil
Discourse in the Social Media Age. So, I use my Twitter feed
as an experiment routinely, and when people tell me horrible things
I go back to them and I say, “Really? I mean what’s the point of calling me
that name?” Do you always do that? I often do. It’s my little forum for
trying to improve civil discourse. And what surprises me is about half the
time I get an apology very quickly. They didn’t really think there
was somebody on the other side. They were just venting. But they
really use horrible language. Really repulsive language.
But I call them out. And so my question is, you know, when we talk about you’re
creating tribes and belongings, the biggest problem I see is we’re
demonizing people on the other side. They’re not just different
from us, they’re evil. And the left does
that to the right, as the right does
that to the left. So what are your thoughts
on how we really— I mean we don’t have hate speech laws
in the US, but that’s not it anyway. It’s just basic civility.
Like didn’t you have parents? Some kind of shocking people back into
a very basic norm of civil discourse. One of my favourite pieces
of research about this was looking at where there
is civil discourse online. And fascinatingly one of the places that
you can find it is on sports forums. And when you think about it,
it kind of makes sense, because there is a tribal
identity of being for the Patriots or the
Red Sox or whatever. Red Sox Nation! See, we’re bonded now,
and we can disagree. But it really is like that. That if you prime a different identity
and that’s the central identity, then you can have a good conversation
about really hard things underneath it. And I think finding spaces and ways to
prime identities other than partisan ones is a key piece of how you
build those connections. My other favorite study
on this front is about basically they took a football
stadium—another sports thing—and took people
whose team had just won and people whose
team had just lost, and asked them both
about points of view that they considered a
threat to them in some way and groups of people,
like implicit bias. And the group that had just
won was much more open to different points of view and
open to different kinds of people than the group that had just lost. Because when people feel like
their identity is threatened, they cling to the things that give
them that kind of sense of identity. So, what that
suggests is affirming one dimension of people’s identity
gives you space to create disagreement where people aren’t going
to freak out and go like, “I’ve gotta win because my
whole being is at stake.” I love that. You guys have
anything else to add to that? It’s tough. And one of the things I did after the
elections was I started following a bunch of news sites that I
disagreed with generally speaking because I wanted to
understand filter bubbles. I wanted to understand where the
other side was coming from. But then I also
think part of it is covering issues that reach
across the partisan divide. And one of the issues that Attention covers regularly and
proudly is marijuana reform. And that is a great uniter of libertarian
conservatives, social justice liberals and increasingly so old people who
need it for medicinal reasons. And so we’ll post on that topic and we’ll see the strangest
bedfellows come together. So I think it’s really important to cover
issues that will bring people together as part of your editorial
programming strategy. That’s probably the best
answer I can give you. But forcing trolls not to be
trolls—I mean, when people have the luxury of anonymity of course
they’re going to say terrible things. And I think that’s predated Facebook.
I think it’s predated Twitter. So forcing people to come out of
anonymity will make it more intimidating for them to use foul language
or harassment or hate speech. And some sites make you
create a public identity. It’s actually— the problem of
trolling is not as bad on Facebook because there are fewer fake
accounts than on Twitter. Trolling is much
worse on Twitter because it’s much easier to create an
account and it’s a lot less curated, and therefore people create these
bots, or these fake identities where they can basically
just be jerks all day. Dogs. Another good one. I’ve had the
Facebook—like somebody I know at Buzzfeed, she is really into the dog
Facebook group in her neighborhood because she’s like, “There’s
no talk about politics. All we’re doing is
talking about dogs.” And she feels really connected
to her neighborhood. Makes sense actually. I think we
have time for one more question. Anybody else? Right
here in the middle. Yeah. Hang on one sec. It’s [inaudible] Institute. We had this conversation this morning, too, and I just want to tap you
guys’ insights into truth. Because I always thought that
the tenet of a good journalist was to be objective
and share the facts. But I’m finding that that’s
not necessarily true anymore. And even this morning
on the panel, one of the panelists was talking
about her liberal media site, which is a fabulous
site if you’re liberal. But if you’re not, then
you’re not gonna go. So the question I
have is how do you… in this era of breaking down
into fractionalized media sites, and we only have so much
time because we all work. At least in this
room, I’m assuming. And I don’t have time to
look up all the facts. I don’t want to have
to look up the facts, I take issue, Matthew,
with what you said. I don’t have time to do that.
I’m depending on you as an objective, reasonable
journalist to tell the truth. And I just want to know where
you guys come out on that, because that I think is where
we start building trust, if people begin to understand that
what the media is saying is truth. Well, first of all, to clarify, I was
saying you have to vet the sources that you are
consuming news from. I think you need to look at where
you are consuming from and say, “Is this place credible?”
I’m not saying that’s your responsibility to fact check
every journalist you read. Of course not. If you’re reading from a
reputable, trusted outlet, of course you should take the
journalist at his or her word. But I like to define the role of good journalism as fairness, not objectivity. I think there’s such a thing
as false objectivity, and the climate change example is the best illustration of that. There’s no reason to have a climate [inaudible] on with someone who believes in climate change when the clear data, evidence, and facts show that climate change is real, exists, and is existentially threatening humanity. There is a need to be fair. And what we try to
do at Attention is to be fair and give a fair hearing
and argument to what we’re covering. And fairness means that you
don’t use strawmen arguments, you don’t lampoon the other side. But if 98% of data
proves one thing, you don’t have to create
some false objectivity that the other 2% somehow
has 48% more weight. And I think that’s where
cable news has gone awry. So we like to view our
job as being fair. That might coincide
with being objective, but I think when you’re
fair people respect you. And we’ve even had readers say, “I don’t agree with, for instance, taking marijuana. I don’t agree with marijuana legalization, but you present the data fairly. But I disagree with it because I still think, from my own personal experience, it’s a gateway drug.” And we’re never going
to disprove that person, but we still put out the fair evidence that it’s not a gateway drug, based on the numerous clinical studies that have shown it’s not. In fact, it’s an anti-gateway drug for opioid addiction. So the point is, you
have to be fair. But “objective,” I think, is a word journalists should no longer necessarily hitch their wagon to as a steadfast principle. So, a side experiment: you have two people. One person is cold, narcissistic, and self-interested, but factually accurate. And the other person is warm, cares about you, has your back,
but less factually accurate. Which one of those
people do you trust? I think on a human level, trust is about who is on
your side and has your back. And I think this is one of the big gaps
between how journalists think about trust, and how trust actually
works between people, is that you don’t trust people
based on factual accuracy. You trust people based on a sense of,
“You have my best interests at heart.” And I think people really feel like a
lot of media and a lot of journalists don’t actually care about
them, and have their backs. And there was a wonderful study by the
Maine– The Portland Press Herald looking at where their coverage
was relative to the vote. And they found that in the counties and cities in Maine where the Trump vote was highest, they had done the fewest stories, right? They really didn’t care about the stories
of those places and those towns, and it showed up. And so, I really believe
that if you want to build that trust— trust is a necessary
precondition for facts. It doesn’t go facts then trust.
It goes trust then facts. And if you want to build that trust, you
have to actually demonstrate to people that you care about
them in a real way, and that you have their
backs in a real way. And that’s the work that we have
to do And we have to do that, too. [inaudible] do it. Yeah. But I think that’s the way
you break through in this. You want to have the last word Phil?
Well, this one’s tough, because I do believe in truths. And I think that there are some truths that need to be communicated to the public. We started off earlier with a couple of jokes about whether Facebook was
a media company or not, or whether they were telling themselves
they were a media company or not, and I think they are. And that there are truths, but
that the public believes things based on many different kinds of info they
get from their friends, their family, the bus ads, and what
they see on social media. And so anything we
can do to inject or protect some truths in social media
feeds will only improve public life. And I don’t have a laundry list of
which truths should get through. We know that about two years ago Facebook employees suddenly got upset about payday loans. And they did something about payday loan advertising that wiped this issue, this economic inequality issue, off the social networks. There are a few other
issues like that that are public policy issues
that have truths about them, that Facebook could push to feeds. And I think we’re
past the point where we can just leave it to social
networks to sort of self-police. Yeah. See you next year, right?
Thank you so much, panelists. And I leave you with one thing. Sorry. Go
out, do one thing after you leave here. Go have one civil conversation
about something really hard. Because I think that’s part of it, too: we need to be brave in the face of all of this. So, thank you so much for having us.
