TikTok, already under scrutiny over its Chinese ownership and threatened with a possible ban by US President Trump, is confronting another major obstacle: how to handle content around its first US presidential election.
Originally known for teens’ viral dance routines and prank videos, TikTok is now increasingly a destination for political content from its users.
But TikTok’s head of US safety Eric Han, in the first interview he has given about TikTok’s approach to election misinformation, told journalists at our partner news agency Reuters his team’s goal is to ensure the app can stay a place for entertainment and “silly self-expression”.
TikTok, which says it has about 100 million monthly active US users, is charting its own approach to election-related material, factoring in what Han called the “cautionary tales” of more-established social media rivals.
TikTok fact-checking partners Lead Stories and PolitiFact said they have reviewed hundreds of videos containing political misinformation on the app, such as claims that Democratic vice presidential candidate Kamala Harris had threatened revenge on Trump supporters, or about who appeared on disgraced financier Jeffrey Epstein’s flight logs.
Unlike rivals such as Facebook, however, the app keeps fact-checkers’ assessments internal, using them to remove content or, less often, to reduce its reach.
“A lot of us have come from other platforms; we’ve seen how fact-checking works, we’ve seen how labelling works,” Han said, adding that the company was “very well aware” that fact-checking labels could backfire by making users double down on incorrect beliefs or presume all unlabelled content was legitimate.
Social media companies came under pressure to fight misinformation after US intelligence agencies concluded Russia used such platforms to interfere in the 2016 election – which Moscow has denied. Facebook, which uses fact-checkers’ assessments to publicly label posts and reduce their distribution, said warning labels on coronavirus misinformation deterred users from viewing the flagged content 95 percent of the time.
TikTok, which does not accept political advertisements, says it does not allow misinformation that could cause harm, including content that misleads users. In addition, it has banned synthetic media, such as a recent video of House Speaker Nancy Pelosi manipulated to make her appear drunk.
“Their whole mission was to bring joy,” said David Ryan Polgar, a tech ethicist and member of TikTok’s new content advisory council, which helps it shape policies. “But with anything that is popular, you’re going to have somebody who is going to say, ‘How do I harness popularity?’”
To combat such exploitation before and after the election, Han said TikTok staff are meeting weekly to plan for scenarios ranging from contested election results to disinformation campaigns by “foreign state actors … or a kid in someone’s basement.”
Members of TikTok’s content advisory council told Reuters in interviews last week that they had discussed issues including voter suppression and whether supporters of the unfounded QAnon political conspiracy theory should be allowed on the platform, as well as what to do if the app is used to spread misinformation about contested results or incite post-election violence.
“Even if it’s not organic to TikTok, it’s going to end up there,” said Hany Farid, a digital forensics specialist and council member, who said that he expected TikTok’s election policies would eventually become “clearer.”
TikTok’s fact-checkers, who also partner with Facebook, said the political falsehoods found on TikTok were similar to those spread on Facebook. “It’s not just dance challenges any more,” said Alan Duke, co-founder of fact-checking partner Lead Stories.
Even as TikTok grapples with content around the US election, the fate of the ByteDance-owned app in the country remains unclear: the Trump administration is expected to decide shortly on a proposed deal with Oracle, designed to avert a US ban.
With the Nov. 3 election approaching, social media firms’ responses to misinformation on their platforms are in the spotlight. On TikTok, Reuters found videos containing false claims about mail-in voting and presidential candidates, several of which TikTok removed after they were flagged by Reuters.
Searching TikTok for ‘mailinvote’ returns suggestions including ‘mailinvotingfraud’, a hashtag used on videos both spreading and debunking concerns.
Following questions from Reuters, a TikTok spokeswoman said it was no longer serving results on those hashtags.
Allegations of child sexual abuse are a key element of the QAnon conspiracy theory, which holds that Trump is secretly fighting a cabal of child-sex predators including prominent Democrats and “deep state” allies.
TikTok recently said it had blocked dozens of QAnon-related hashtags. But misinformation researcher Rory Smith at non-profit First Draft identified several more still in use, including ‘thestormisuponus’, ‘2q2q’, ‘digitalsoldier’ and ‘jfkjr’, which collectively had many thousands of views. Following questions, TikTok said it had blocked a number of those hashtags.
TikTok’s content moderation practices have come under some scrutiny, including from US lawmakers concerned it may be censoring politically sensitive content following reports that it blocked videos of the Hong Kong protests. A TikTok spokeswoman said its content and moderation policies are led by a team in California and are not influenced by any foreign government.
In June, the company apologised after being accused of censoring #BlackLivesMatter content, blaming a technical glitch that it said made posts appear to have zero views.
This year, the company announced a council of outside experts to help shape its US content policies. Rob Atkinson, a council member and president of the Information Technology and Innovation Foundation think-tank, said he has advised the company several times on how policy decisions, such as approaches to hate speech, might play in Washington, DC.
But data security concerns stemming from TikTok’s ownership by Chinese tech giant ByteDance have kept major US political figures and groups largely off the app, allowing it to avoid the scrutiny faced by Facebook and Twitter over their handling of inflammatory posts by Trump or other candidates.
Democratic National Committee Chief Technology Officer Nellwyn Thomas told Reuters that the DNC has not engaged with TikTok in much depth and focuses its counter-disinformation work more on Facebook and Twitter. A TikTok spokeswoman said the company has provided the Republican and Democratic National Committees with direct channels to escalate problems.
Graham Brookie, director of the Atlantic Council’s Digital Forensic Research Lab, said the counter-disinformation community would be “remiss” not to engage with TikTok: “We don’t get to choose … where we have vulnerabilities.”
Misinformation experts remain concerned about the challenges of moderating TikTok’s many-layered videos, which can involve sound, visual effects, overlaid text and hashtags. Users can employ ‘green screen’ effects to display news articles behind them or make split-screen ‘duets’ with existing videos.
TikTok’s comedic tone also makes it difficult to tell spoof from skulduggery: last month, the company removed a video shared by the Republican Hype House, which the account’s 17-year-old founder Aubrey Moore said was satire.
In the video, which liberal press watchdog Media Matters for America said racked up 40,000 views, a Republican Hype House creator, alongside ‘BREAKING NEWS’ text, falsely claimed that because of COVID-19, Democrats should head to the polls after Election Day. Although Moore said it was a gag, she said the group didn’t bother to appeal the takedown: it pumps out several videos a day.
TikTok advisory council member Farid said he had suggested to the firm, partly to be provocative, that to curtail misinformation and abuse it could block new video uploads in the United States for a couple of days before and after the election.
His Plan B? “Honestly, I don’t know,” he said.
“I’m struggling with that.”
Initial reporting via our official content partners at Thomson Reuters. Reporting by Elizabeth Culliford. Editing by Greg Mitchell and Edward Tobin.