Meta's Threads: A Welcoming Alternative to Twitter with Unique Challenges in Maintaining Constructive Online Discourse
Meta CEO Mark Zuckerberg has introduced Threads, the company's Twitter alternative, as a welcoming and friendly platform for constructive online conversation, drawing a contrast with the more confrontational Twitter, owned by Elon Musk. Soon after the launch, Zuckerberg said, "Our focus is on promoting kindness and creating a friendly environment."
However, ensuring that Threads maintains this idealistic vision poses a unique challenge. Meta Platforms, experienced in managing the often contentious nature of the internet, has emphasized that the same rules governing its photo and video-sharing social media service, Instagram, will be enforced on the Threads app.
Meta, the parent company of Facebook and Instagram, has been actively adopting an algorithmic approach to content curation, granting it greater control over the types of content that gain traction. The company aims to shift its focus more towards entertainment while distancing itself from news-related content.
By integrating Threads with other social media platforms like Mastodon and targeting news enthusiasts, politicians, and those fond of vigorous debates, Meta is venturing into uncharted territory, opening itself up to new challenges while attempting to carve out a distinct path.
Notably, Christine Pai, a spokesperson for Meta, explained in an email that the fact-checking program used across Meta's other apps will not be extended to Threads. The decision marks a departure from how Meta addresses misinformation on its other platforms.
Pai further clarified that if posts are flagged as false by fact-checking partners, including Reuters, on Facebook or Instagram, those labels will also be applied to the same posts if shared on Threads.
When asked about the divergence in handling misinformation on Threads, Meta declined to provide an explanation.
During a New York Times podcast, Adam Mosseri, the head of Instagram, acknowledged that Threads is more likely to attract a news-oriented audience and foster public discussions compared to Meta's other services. However, the company intends to concentrate on lighter topics such as sports, music, fashion, and design.
Meta's ability to distance itself from controversy faced immediate challenges. Shortly after its launch, Threads accounts were found to be posting about topics like the Illuminati and "billionaire satanists," while users engaged in heated discussions covering a wide range of subjects, including gender identity and violence in the West Bank. These exchanges often devolved into comparisons with Nazis.
Conservative figures, including the son of former U.S. President Donald Trump, voiced complaints about censorship when warning labels cautioning potential followers about false information appeared. A Meta spokesperson claimed that these labels were a mistake.
Content moderation will become even more complex when Meta integrates Threads with the "fediverse," enabling users from servers operated by non-Meta entities to interact with Threads users. Christine Pai of Meta clarified that Instagram's rules would also apply to these users.
Pai explained, "If an account or server, or if we discover multiple accounts from a particular server, is found violating our rules, they would be blocked from accessing Threads. As a result, content from that server would no longer appear on Threads, and vice versa."
Nevertheless, experts specializing in online media say that how Meta navigates these interactions will be key to the platform's success. Alex Stamos, director of the Stanford Internet Observatory and former head of security at Meta, emphasized the challenges Meta will face in enforcing content moderation without access to back-end data on users who post prohibited content.
Stamos explained that without federation, the metadata necessary to identify individual users or detect widespread abusive behavior would be unavailable. This would make it more difficult to combat spam, troll farms, and abusers driven by economic incentives.
In his posts, Stamos expressed the expectation that Threads would limit the visibility of fediverse servers with numerous abusive accounts and impose stricter penalties on those sharing illegal materials such as child pornography.
Nonetheless, these interactions present challenges of their own. Solomon Messing of the Center for Social Media and Politics at New York University pointed to the complexities surrounding illegal content, such as child exploitation, nonconsensual sexual imagery, and arms sales. If Threads indexes content from other servers, he noted, Meta bears some responsibility for addressing such material, raising questions that go beyond simply blocking it on Threads.