It’s beginning to feel a lot like 1984. Today, some politicians routinely use the term “fake news” to discredit any news they don’t like, or any news organization that asks hard questions. The term “alternative facts” is even being pushed by certain White House advisors.
Online, how big is the actual fake news epidemic? No one knows for sure, but the scale of the problem is potentially huge. Facebook has almost 2 billion users, Twitter has over 300 million, and according to Pew Research, about 60% of Americans get some news from social media. Assuming even a small percentage of users have nefarious intent, eliminating fake news and online abuse is a bit like King Canute trying to hold back the tide. But after accusations that Facebook turned a blind eye to rampant fake news and potentially impacted the U.S. election, the pressure to effectively and transparently root out fake news and online abuse is likely to intensify, especially with upcoming national elections in Europe.
“I think fake news impacted the election, just by the sheer volume… It can change your perception of the world… Even people who understand news and research can be tricked by fake news,” says Adam Schrader, one of 25 former editors on Facebook’s fact-checking team. Facebook fired the entire team last summer, just before the election.
At the recent Watermark Women’s conference in Silicon Valley, I spoke with Jessica Rothenberg-Aalami, CEO of Cell-Ed, an online education startup. Here’s the report I filed with the BBC’s Click Radio. It aired today on the BBC World Service.
Listen to the BBC’s “Fake it or Leave it” podcast here (first story in the program lineup) or listen to the 8 minute clip below:
Here’s a transcript of my report, edited for length and clarity.
Click Host, Gareth Mitchell: Misinformation is nothing new, as we heard last week from classics professor, Mary Beard. Today fake news has become a news story in itself. It’s becoming political, it’s undermining social media organizations, and mainstream media. Twitter and Facebook are taking action, but with so much being posted, isn’t it a bit like King Canute trying to hold back the tide, trying to monitor and correct fake news? Our Silicon Valley reporter Alison van Diggelen has been seeking some answers from the big social networking companies and catching up with CEOs of startups, people like this:
Jessica Rothenberg-Aalami: Technology has always been a source of incredible opportunity, unlimited potential pathway and it’s always been destructive.
Alison van Diggelen: Jessica Rothenberg-Aalami is just one of many critics who argue that social network platforms are not doing enough to curb the dark side of the internet.
Jessica Rothenberg-Aalami: I work in community technology access centers…Everybody tells me worldwide, if you have 100 countries…with all these community access centers, isn’t that wonderful? You can bring digital media, books to that village. I say: it’s always double sided – by day maybe it’ll be used for education, and health access, and how to get a better job. But by night it becomes a digital brothel…
Alison van Diggelen: What should be done about that?
Jessica Rothenberg-Aalami: Own it! Twitter not taking a stand around the blatant misogyny and hate language… strange politeness in the face of atrocity is very frustrating.
Alison van Diggelen: What do you feel people like Sheryl Sandberg, Mark Zuckerberg, the Twitter board should do?
Jessica Rothenberg-Aalami: There’s a responsibility to – at the very least – do one or two steps. Untruth is seen as truth because it’s relayed over a screen with a picture. You believe somebody’s story…If that story is a blatant lie, have a way to say “untrue.” Hashtag untrue.
Kara Swisher interviewed Sheryl Sandberg about why she didn’t attend, or even post, about The Women’s March. Sadly, Swisher didn’t ask Sandberg what she’s doing about fake news on Facebook. Next time, let’s hope!
Alison van Diggelen: I took Rothenberg-Aalami’s complaints to Twitter, which gave an off-the-record account of its completely new approach to online abuse. Twitter’s VP, Ed Ho, is leading the online safety efforts (via @TwitterSafety) and last week demonstrated his “test-fast, fail-fast, adjust-fast” mantra by rolling out a new feature – eliminating user list notifications – and then promptly reversing it within hours, after an avalanche of user complaints. Last year, Twitter formed a Safety and Trust Council, partnering with over a dozen organizations to tackle online abuse. One of the members, Emma Llanso, a director at the Center for Democracy and Technology, cautions against a one-size-fits-all solution.
Emma Llanso: The same tools that can be helpful in protecting against harassment by blocking abusive content and taking down accounts can be weaponized themselves if you don’t have the right safeguards in place.
Alison van Diggelen: Without careful protections, trolls can use blocking tools to silence their victims. Although Twitter has promised an open dialogue, Emma Llanso is concerned about lack of transparency.
Emma Llanso: If I had my druthers, we’d be getting a whole lot more reporting from Twitter about the numbers…What is the scope and scale of the content moderation? What is the level of content that gets removed? What are the biggest issues? It would help people pin down: harassment, terrorist content, hate speech…How are these moderation processes affecting public discourse?
Alison van Diggelen: As for fake news, why can’t Twitter and Facebook simply flag or censor what they deem fake? Llanso has this advice:
Emma Llanso: That puts way too much power in the company’s hands…Having one centralized decider is a really risky dynamic to set up…
Alison van Diggelen: I asked Facebook to comment and was directed to Mark Zuckerberg’s first post on fake news: “We do not want to be arbiters of truth,” he wrote.
Last week Zuckerberg wrote this update:
“Our approach will focus less on banning misinformation, and more on surfacing additional perspectives and information, including that fact checkers dispute an item’s accuracy.” Mark Zuckerberg
(To me, this sounds like an endorsement of the Orwellian concept “alternative facts.” – AV)
But Facebook fired its entire fact-checking editorial team after criticism last summer that it had a liberal bias and targeted right-wing fake news.
Adam Schrader was one of those 25 editors.
Adam Schrader: I think fake news impacted the election, just by the sheer volume of it that appears… Facebook has a bubble problem. It can change your perception of the world…Even people who understand news and research can be tricked by fake news.
Alison van Diggelen: Schrader told me he routinely flagged between 50 and 80 fake stories a day. He questions Mark Zuckerberg’s claim that fake news on Facebook is less than 1%.
Adam Schrader: I would question that statistic. I think it would be much higher…in the 5-10% range.
Alison van Diggelen: Since December, Facebook has begun partnerships with five media outlets, including the Associated Press and Snopes, that flag “suspect” stories…. But the AP’s Lauren Easton told me that it has fact-checked only 36 stories since the project began. Facebook recently announced fact-checking collaborations with German and French media. With national elections there this year, the pressure on Facebook and Twitter to tackle the deluge of fake news and abuse will only intensify.
Gareth Mitchell: So Bill Thompson, misinformation is nothing new is it?
Bill Thompson: I have a problem with the term “fake news” but the issue’s been around for a long time. In 2010, my Wikipedia entry was hacked to declare that I’d had a heart attack and died. I corrected it. (Today) it seems to me that there are actually four different things going on:
1. There’s the original fake news, which is overt lying by people who want to get clicks on their website and make money.
2. There’s fake news which is propaganda, designed to promote a particular ideology.
3. There’s just out-and-out lying, like “Bill Thompson is dead”…Maybe a joke? For whatever reason.
4. And then there’s stuff you don’t want people to read, which they* call fake news to distract you from what they’re really saying.
(*Donald Trump routinely calls unfavorable news stories “fake news” – AV)
The problem that Facebook, Twitter, and everyone else have is that no single tool, approach, or set of practices can possibly deal with all of those, so there will always be some material that fails to get stopped or flagged. We do need to be a better-educated and more aware population, to look out for these sorts of things and not instantly believe everything we read on a screen, just because it’s on a screen.
Gareth Mitchell: I absolutely agree and that extends to things that people listen to on this radio program, any information you receive. Check it out for yourself.
Bill Thompson: Learn how to check it.
Gareth Mitchell: It’s always been an important skill…all the more pertinent given what’s going on in these times.