Users Will Decide What Threads Becomes, Not Zuckerberg

Mark Zuckerberg has pitched Meta’s Twitter copycat app, Threads, as a “friendly” refuge for public discourse online, framing it in sharp contrast to the more adversarial Twitter, which is owned by billionaire Elon Musk.

“We are definitely focusing on kindness and making this a friendly place,” Meta CEO Zuckerberg said on Wednesday, shortly after the service’s launch.

Maintaining that idealistic vision for Threads — which attracted more than 70 million users in its first two days — is another story.

To be sure, Meta Platforms is no newbie at managing the rage-baiting, smut-posting internet hordes. The company said it would hold users of the new Threads app to the same rules it maintains on its photo and video sharing social media service, Instagram.

The Facebook and Instagram owner has also been actively embracing an algorithmic approach to serving up content, which gives it greater control over the type of fare that does well as it tries to steer more towards entertainment and away from news.

However, by hooking up Threads with other social media services like Mastodon, and given microblogging’s appeal to news junkies, politicians and other fans of rhetorical combat, Meta is courting fresh challenges with Threads and seeking to chart a new path through them.

For starters, the company will not extend its existing fact-checking programme to Threads, spokeswoman Christine Pai said in an e-mailed statement on Thursday. This eliminates a distinguishing feature of how Meta has managed misinformation on its other apps. 

Labels

Pai added that posts on Facebook or Instagram rated as false by fact-checking partners will carry their labels over if posted on Threads, too.

Asked to explain why it was taking a different approach to misinformation on Threads, Meta declined to answer.

In a New York Times podcast on Thursday, Adam Mosseri, the head of Instagram, acknowledged that Threads was more “supportive of public discourse” than Meta’s other services and therefore more inclined to draw a news-focused crowd, but said the company aimed to focus on lighter subjects like sports, music, fashion and design.

Nevertheless, Meta’s ability to distance itself from controversy was challenged immediately.

Within hours of launch, Threads accounts were posting about the Illuminati and “billionaire satanists”, while other users compared each other to Nazis and battled over everything from gender identity to violence in the West Bank.

Conservative personalities, including the son of former US President Donald Trump, complained of censorship after labels appeared warning would-be followers that they had posted false information. Another Meta spokesman said those labels were an error.

Further challenges in moderating content are in store once Meta links Threads to the so-called fediverse, where users from servers operated by other non-Meta entities will be able to communicate with Threads users. Meta’s Pai said Instagram’s rules would likewise apply to those users.

“If an account or server, or if we find many accounts from a particular server, is found violating our rules then they would be blocked from accessing Threads, meaning that server’s content would no longer appear on Threads and vice versa,” she said. 

Still, researchers specialising in online media said the devil would be in the details of how Meta approaches those interactions.

Alex Stamos, the director of the Stanford Internet Observatory and former head of security at Meta, posted on Threads that the company would face greater challenges in performing key types of content moderation enforcement without access to backend data about users who post banned content.

“With federation, the metadata that big platforms use to tie accounts to a single actor or detect abusive behaviour at scale aren’t available,” said Stamos. “This is going to make stopping spammers, troll farms and economically driven abusers much harder.”

In his posts, he said he expected Threads to limit the visibility of fediverse servers with large numbers of abusive accounts and apply harsher penalties for those posting illegal materials like child pornography.

Even so, the interactions themselves raise challenges.

“There are some really weird complications that arise once you start to think about illegal stuff,” said Solomon Messing of the Center for Social Media and Politics at New York University. He cited examples like child exploitation, nonconsensual sexual imagery and arms sales.

“If you run into that kind of material while you’re indexing content [from other servers], do you have a responsibility beyond just blocking it from Threads?”

Kelechi Deca

Kelechi Deca has over two decades of media experience and has traveled to over 77 countries, reporting on multilateral development institutions, international business, trade, travel, culture, and diplomacy. He is also a petrol head with in-depth knowledge of automobiles and the auto industry.