TikTok, already under scrutiny over its Chinese ownership and threatened with a possible ban by U.S. President Donald Trump, is facing another major challenge: how to handle content around its first U.S. presidential election.
Originally known for teenagers’ viral dance routines and prank videos, TikTok is now increasingly a destination for political content from its users. The hashtags Trump2020 and Biden2020 have collectively drawn more than 12 billion views on the app.
But TikTok’s head of U.S. safety, Eric Han, in the first interview he has given about TikTok’s approach to election misinformation, said his team’s goal was to ensure the app could stay a place for entertainment and “silly self-expression.”
With about 100 million monthly active U.S. users, TikTok continues to chart its own approach to election-related material, factoring in what Han called the “cautionary tales” of more-established social media rivals.
But unlike Facebook and Twitter, TikTok does not flag any misinformation to its users. Instead, the social media app keeps fact-checkers’ assessments internal and uses them to remove content, or, less frequently, reduce its reach.
Social media companies came under pressure to combat misinformation after U.S. intelligence agencies determined Russia used such platforms to interfere in the 2016 election – which Moscow has denied.
TikTok says it does not accept political ads and does not allow misinformation that could cause harm, including content that misleads users about elections. The platform has also banned synthetic media – such as a recent video of House Speaker Nancy Pelosi manipulated to make her seem drunk.
“Their whole mission was to bring joy,” said David Ryan Polgar, a tech ethicist and member of TikTok’s new content advisory council that helps it shape policies. “But with anything that is popular, you’re going to have somebody who is going to say ‘how can I exploit popularity?’”
To combat such exploitation before and after the election, Han said TikTok staff are meeting weekly to plan for scenarios, from contested election results to disinformation campaigns by “state foreign actors…or a kid in someone’s basement.”
They have also discussed issues including voter suppression and whether public supporters of the unfounded political conspiracy theory QAnon should be allowed on the platform. There was also the matter of what to do if the app is used to spread misinformation about contested results or incite post-election violence.
As TikTok grapples with the vitriol surrounding the U.S. election, the fate of the ByteDance-owned app in the country remains uncertain: the Trump administration has yet to rule on TikTok’s proposed deal with Oracle, a plan set in motion to avert the threatened U.S. ban.
[Sourced from Reuters]