Musk’s ‘Twitter Files’ Offer a Glimpse of the Raw, Complicated, and Thankless Task of Moderation | TechCrunch

Twitter’s new owner, Elon Musk, has been heavily promoting the “Twitter Files”: selected internal communications from the company, laboriously tweeted out by sympathetic amanuenses. But Musk’s apparent conviction that he has released some partisan kraken is mistaken; far from evidence of conspiracy or systemic abuse, these files are a valuable peek behind the curtain of moderation at scale, hinting at the Sisyphean labor undertaken by every social media platform.

For a decade, companies like Twitter, YouTube, and Facebook have performed an elaborate dance to keep the details of their moderation processes equally out of reach of bad actors, regulators, and the press.

Disclose too much and you expose your processes to abuse by spammers and scammers (who do, in fact, take advantage of every leaked or published detail); disclose too little and you invite damaging reports and rumors as you lose control of the narrative. Meanwhile, you must be ready to justify and document your methods or risk censure and fines from government bodies.

The result is that while everyone knows a little about exactly how these companies inspect, filter, and arrange the content posted on their platforms, it is just enough to be sure that what we are seeing is only the tip of the iceberg.

Sometimes there are exposés of the methods we suspected: hourly contractors clicking through violent and sexual imagery, a disgusting but apparently necessary industry. Sometimes the companies overplay their hands, as with repeated claims of how AI is revolutionizing moderation, followed by reports that AI systems for this purpose are inscrutable and unreliable.

What almost never happens (generally, companies don’t do this unless they’re forced to) is that the actual tools and processes of content moderation at scale are exposed with no filter whatsoever. And that is what Musk has done, perhaps to his own peril, but certainly to the great interest of anyone who has ever wondered what moderators actually do, say, and click as they make decisions that may affect millions.

The honest, complicated conversation behind the curtain

The email threads, Slack conversations, and screenshots (or rather, screenshots of screenshots) released over the past week provide a glimpse of this important and poorly understood process. What we see is a bit of the raw material, which is not the partisan illuminati some might have expected, though it is clear, from the highly selective presentation, that this is what we are meant to perceive.

Far from it: the people involved are by turns cautious and confident, pragmatic and philosophical, frank and accommodating, suggesting that the choice to restrict or ban is not made arbitrarily but according to an evolving consensus of opposing viewpoints.

In the lead-up to the choice to temporarily restrict the Hunter Biden laptop story (perhaps, at this point, the most contentious moderation decision of the last few years, behind banning Trump), there is none of the partisanship or conspiracy implied by the material’s bombshell packaging.

Instead, we find serious, thoughtful people attempting to reconcile conflicting and inadequate policies and definitions: What counts as “hacked” material? How confident are we in this or that assessment? What is a proportionate response? How should we communicate it, to whom, and when? What are the consequences if we do restrict, and if we don’t? What precedents do we set or break?

The answers to these questions are not at all obvious, and are the kind of thing usually hashed out over months of research and discussion, or even in court (legal precedent affects legal language and its repercussions). And they needed to be made quickly, before the situation got out of control one way or another. Dissent from within and without (from a U.S. Representative, no less, ironically exposed in the thread, along with Jack Dorsey, in violation of the very same policy) was considered and honestly integrated.

“This is an emerging situation where the facts remain unclear,” said former head of Trust and Safety Yoel Roth. “We’re erring on the side of including a warning and preventing this content from being amplified.”

Some question the decision. Some question the facts as they have been presented. Others say the action is not supported by their reading of the policy. One says they need to make the ad hoc basis and extent of the action very clear, since it will obviously be scrutinized as partisan. Deputy General Counsel Jim Baker calls for more information but says caution is warranted. There is no clear precedent; the facts are at this point absent or unverified; some of the material is plainly nonconsensual nudity.

“I believe Twitter itself should curtail what it recommends or puts in trending news, and your policy against QAnon groups is all fine,” Rep. Ro Khanna concedes, while also arguing that the action in question is a step too far. “It’s a hard balance.”

Neither the public nor the press has been privy to these conversations, and the truth is we are as curious, and largely as in the dark, as our readers. It would be wrong to call the published materials a complete or even accurate representation of the whole process (they are plainly, if ineffectively, cherry-picked to fit a narrative), but even so we are better informed than we were before.

Tools of the trade

Even more directly revealing was the next thread, which contained screenshots of the actual moderation tooling used by Twitter employees. While the thread attempts to equate the use of these tools with shadow banning, the screenshots do not show nefarious activity, nor do they need to in order to be interesting.

Image credits: Twitter

On the contrary, what is shown is compelling for the very reason that it is so prosaic, so blandly systematic. Here are the various techniques that all social media companies have explained over and over that they use, but which, where before they were couched in cheery PR diplo-speak, are now presented without comment: “Trends Blacklist,” “High Profile,” “DO NOT ACTION,” and the rest.

Meanwhile, Yoel Roth explains that the actions and policies need to be better aligned, that more research is needed, and that plans are underway to improve:

The hypothesis underlying much of what we’ve implemented is that if exposure to, e.g., misinformation directly causes harm, we should use remediations that reduce exposure, and limiting the spread/virality of content is a good way to do that… we’re going to need to make a more robust case to add this to our repertoire of policy remediations, especially for other policy domains.

Again the content belies the context in which it is presented: this is hardly the deliberation of a secret liberal cabal lashing out at its ideological enemies with the ban hammer. It’s an enterprise-grade dashboard, like you might see for lead tracking, logistics, or accounts, being discussed and iterated upon by sober-minded people working within practical limitations and aiming to satisfy multiple stakeholders.

As it should be: Twitter, like its fellow social media platforms, has worked for years to make the moderation process efficient and systematic enough to function at scale. Not just so the platform can be kept free of bots and spam, but to comply with regulatory frameworks like FTC orders and the GDPR. (The “extensive, unfiltered access” outsiders were granted to the pictured tool may well constitute a breach. The relevant authorities told TechCrunch they are “engaging” with Twitter on the matter.)

A handful of employees making arbitrary decisions with no rubric or oversight is no way to moderate effectively, or to meet such legal requirements; nor (as the resignation of several members of Twitter’s Trust & Safety Council today testifies) is automation. You need a large network of people cooperating and working according to a standardized system, with clear boundaries and escalation procedures. And that is certainly what seems to be depicted in the screenshots Musk has caused to be published.

What is not shown in the documents is any sort of systematic bias, which Musk’s stand-ins insinuate but do not quite manage to substantiate. But whether or not it fits the narrative they want, what is being published is of interest to anyone who thinks these companies ought to be more forthcoming about their policies. It’s a win for transparency, even if Musk’s opaque approach achieves it more or less by accident.

