In my little corner of the tech world, all anyone can talk about is Threads, the short-text platform Meta launched earlier this month as a potential replacement for Twitter, which has been losing users and ad revenue since Elon Musk’s takeover last year. The opportunity wasn’t lost on Mark Zuckerberg, CEO of Meta. “Twitter never succeeded as much as I think it should have,” he told The Guardian, “and we want to do it differently.”
Zuckerberg and his team are certainly doing something: Threads racked up more than 100 million users in a matter of days. Whether they’re doing it differently remains to be seen. As a former Trust and Safety domain expert at Twitter, and at Facebook before that, I have concerns, and those concerns led me to co-found T2.social, a new alternative platform that keeps Trust and Safety at its core. I worry that past mistakes will be repeated: that growth will once again come at the expense of safety.
With major launches at companies like Meta and Twitter, the focus is almost entirely on going live at all costs. Risks raised by researchers and operations colleagues are addressed only after the launch has been deemed “successful.” This backwards prioritization can lead to disastrous consequences.
How so? In May 2021, Twitter launched Spaces, its live audio conversation feature. Leading up to that launch, people across the company voiced concerns internally about how Spaces could be misused if the right safeguards were not in place. The company opted to move ahead quickly, disregarding the warnings.
The following December, the Washington Post reported that Spaces had become a megaphone for “Taliban supporters, white nationalists, and anti-vaccine activists sowing coronavirus misinformation,” and that some hosts “disparaged transgender people and Black Americans.” This happened largely because Twitter had not invested in human moderators or technologies capable of monitoring real-time audio. This could have been avoided if the company had made safety as important as shipping.
I’d like to think that the teams at Meta kept Twitter’s missteps in mind as they prepared to release Threads, but I’ve yet to see clear indicators of it. Facebook has a checkered past on these matters, especially in new markets where the platform was not prepared for integrity issues. A few days ago, civil society organizations called on the company in an open letter to share what’s different this time: How is the company prioritizing healthy interactions? What are Meta’s plans to fight abuse on the platform and prevent Threads from coming apart at the seams like its predecessors? In a response sent to Insider’s Grace Eliza Goodwin, Meta said that its enforcement tools and human review processes are “wired into Threads.”
Ultimately, there are three key initiatives that I know work to build safe online communities over the long term. I hope Meta has been taking these steps.
1. Set Healthy Norms And Make Them Easy To Follow
The first (and best) thing a platform can do to protect its community against abuse is to make sure abuse doesn’t materialize in the first place. Platforms can firmly establish norms by carefully crafting site guidelines so they are both easy to read and easy to find. Nobody joins an online community to read a bunch of legalese, so the most important rules must be stated in plain language and easy to locate on the site. Ideally, subtle reminders are integrated into the UI to reinforce the most crucial rules, as in the sketch below. Then, of course, the team must rapidly and consistently enforce these guidelines so that users know they’re backed by action.
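To make that concrete, here is a minimal sketch of what contextual, plain-language reminders could look like in code. The contexts, wording, and function names are my own inventions for illustration; they do not describe any platform’s actual system.

```python
# Hypothetical sketch: attach a one-line, plain-language rule reminder to the
# compose screen at the moment it is most relevant, instead of burying the
# rule in a terms-of-service page. Contexts and copy are invented examples.
REMINDERS = {
    "reply_to_stranger": "Reminder: debate the idea, not the person.",
    "trending_topic": "Reminder: check sources before sharing claims.",
    "first_post": "Welcome! Our most important rule: no harassment.",
}


def reminder_for(context: str) -> str | None:
    """Return the short rule reminder for a compose context, if any."""
    return REMINDERS.get(context)


print(reminder_for("reply_to_stranger"))  # Reminder: debate the idea, not the person.
```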
2. Encourage Positive Behavior
There are features that can encourage healthy behavior, working in tandem with established norms and enforced guidelines. Nudges, for example, were successful on Twitter before they were discontinued.
Beginning in 2020, teams at Twitter experimented with a series of automated “nudges” that would give users a moment to reconsider posting replies that might be problematic. A prompt would appear if a user attempted to post something with hateful language, giving them a momentary opportunity to edit or scrap their Tweet.
Users could still go ahead with their original versions if they wished, but those who were prompted ended up canceling their initial responses 9% of the time, and revising them before posting another 22% of the time. This safety feature was discontinued after Elon Musk assumed control of the platform and let most of the staff go, but it still stands as a successful strategy.
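The mechanism itself is simple to sketch. Below is a minimal, hypothetical version of a pre-publish nudge check, assuming some toxicity classifier is available; the scoring function, threshold, and names are my own placeholders, not Twitter’s actual implementation.

```python
# Hypothetical sketch of a pre-publish "nudge" flow. The classifier,
# threshold, and names are placeholders, not Twitter's real system.
from dataclasses import dataclass
from enum import Enum, auto


class NudgeAction(Enum):
    PUBLISH = auto()  # nothing flagged; the reply goes out normally
    PROMPT = auto()   # show a "want to review this first?" prompt


@dataclass
class NudgeDecision:
    action: NudgeAction
    score: float  # model's estimate that the reply is hateful or abusive


def score_toxicity(text: str) -> float:
    """Stand-in for a real text classifier; returns a score in [0, 1]."""
    flagged_terms = {"idiot", "trash"}  # toy word list for the sketch
    words = {w.strip(".,!?").lower() for w in text.split()}
    return 1.0 if words & flagged_terms else 0.0


def check_reply(draft: str, threshold: float = 0.8) -> NudgeDecision:
    """Decide whether to publish a draft reply or nudge the author first."""
    score = score_toxicity(draft)
    if score >= threshold:
        # The author can still edit, delete, or post the reply unchanged;
        # the nudge adds a moment of friction, it does not block.
        return NudgeDecision(NudgeAction.PROMPT, score)
    return NudgeDecision(NudgeAction.PUBLISH, score)


print(check_reply("You absolute idiot.").action)    # NudgeAction.PROMPT
print(check_reply("I disagree with this.").action)  # NudgeAction.PUBLISH
```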
3. Keep An Open Dialogue With People
I’m lucky because my co-founders at T2 share my belief in methodical growth that favors user experience over rapid scale. This approach has given me a unique opportunity to have deep, direct conversations with our early users as we’ve built the platform. The users we’ve spoken to at T2 have become skeptical of “growth at all costs” approaches. They say they don’t want to engage on sites that prize scale when it comes with toxicity and abuse.
Now, Meta is a public company focused on shareholder interests and therefore does not have that luxury. And by building on Instagram’s existing user base, Meta had a switch it could easily flip to flood the new platform with engagement, an opportunity too good to pass up. It’s no surprise that the Threads team has taken this route.
That said, a company this large also has enormous teams and myriad tools at its disposal that can help monitor community health and open channels for dialogue. I hope Meta will use them. Right now, Threads’ algorithms appear to prioritize high-visibility influencers and celebrities over everyone else, which already sets one-way conversations as the standard.
What I’ve learned from years in the trenches working on trust and safety is that if you want to foster a healthy community, listening and building with people is key. If the teams behind Threads neglect to listen, and if they favor engagement over healthy interactions, Threads will quickly become another unsatisfying experience that drives users away and misses an opportunity to deepen human connection. It won’t be any different from Twitter, no matter what Zuck says he wants.