The hunt for disinformation moles on Wikipedia


This network mapping can also identify a particular strategy used by bad actors: splitting their edit histories across several accounts to evade detection. These editors take pains to build reputation and status within the Wikipedia community, mixing legitimate page edits with more politically sensitive ones.
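The study doesn’t publish code, but the kind of network mapping it describes can be sketched in a few lines of Python: treat the edit log as editor-to-page pairs, then score every pair of accounts by how much their edited pages overlap. Everything below (account names, page titles, the threshold) is invented purely for illustration.

```python
# A minimal sketch of the co-edit network idea, assuming the edit log is a
# list of (editor, page) pairs. All names and the cutoff are invented.
from collections import defaultdict
from itertools import combinations

edits = [
    ("acct_a", "Article_1"), ("acct_a", "Article_2"), ("acct_a", "Article_3"),
    ("acct_b", "Article_2"), ("acct_b", "Article_3"), ("acct_b", "Article_4"),
    ("acct_c", "Article_9"),
]

pages_by_editor = defaultdict(set)
for editor, page in edits:
    pages_by_editor[editor].add(page)

def jaccard(a: set, b: set) -> float:
    """Overlap between two editors' page sets, from 0.0 to 1.0."""
    return len(a & b) / len(a | b)

# Flag account pairs whose edited pages overlap suspiciously much.
for e1, e2 in combinations(sorted(pages_by_editor), 2):
    score = jaccard(pages_by_editor[e1], pages_by_editor[e2])
    if score > 0.3:  # arbitrary threshold for this sketch
        print(f"{e1} <-> {e2}: overlap {score:.2f}")
```

Accounts run by a single persona tend to show far higher overlap than unrelated editors, which is what makes a split edit history detectable even when each account looks unremarkable on its own.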

“The main message I take from all of this is that the main danger is not vandalism,” Miller said.

However, if Miller is right, it means that state actors could also have spent years running disinformation campaigns that have so far slipped by unnoticed.

“Russian influence operations can be quite sophisticated and go on for a long time, but I’m not sure the payoff would be that big,” O’Neil said.

Governments also often have blunter tools at their disposal. Over the years, authoritarian leaders have blocked the site, taken its governing organization to court, and arrested its editors.

Wikipedia has been battling inaccuracies and misinformation for 21 years. One of the longest-running disinformation efforts went on for more than a decade after a group of radical nationalists gamed Wikipedia’s administrator rules to take over the Croatian-language community, rewriting history to rehabilitate the country’s World War II fascist leaders. The platform is also vulnerable to “reputation management” efforts aimed at embellishing the biographies of powerful people. Then there are the outright hoaxes. In 2021, a Chinese Wikipedia editor was discovered to have spent years writing 200 articles of fabricated history of medieval Russia, complete with imaginary states, aristocrats, and battles.

To combat this, Wikipedia has developed a complex collection of rules, governing bodies, and public discussion forums, wielded by a self-organizing and self-governing body of 43 million registered users around the world.

Nadee Gunasena, chief of staff and executive communications at the Wikimedia Foundation, said the organization “welcomes deep dives into the Wikimedia model and our projects,” particularly in the area of disinformation. But she also added that the study covers only a portion of the articles’ edit history.

“Wikipedia content is protected through a combination of machine learning tools and rigorous human oversight from volunteer editors,” says Gunasena. All content, including the history of every article, is publicly available, while sourcing is vetted for neutrality and reliability.

O’Neil adds that the research’s focus on bad actors who were found and rooted out may also show that Wikipedia’s system is working. But while the study produced no “smoking gun,” it could be invaluable to Wikipedia: “The study really is a first attempt at characterizing suspicious editing so that we can use those signals to find it elsewhere,” Miller said.

Victoria Doronina, a member of the Wikimedia Foundation’s board of trustees and a molecular biologist, says that Wikipedia has historically been targeted by coordinated attacks from “cabals” seeking to bias its content.

“While individual editors act in good faith, and the combination of different points of view allows for the creation of neutral content, off-Wiki coordination by a particular group allows it to skew the narrative,” she said. If Miller and his fellow researchers are right in identifying state strategies for influencing Wikipedia, the next struggle could be “Wikimedians versus state propaganda,” says Doronina.

Analyzing the behavior of bad actors can be used to create models that detect disinformation and reveal how vulnerable a platform is to the kinds of systematic manipulation that have been exposed on Facebook, Twitter, YouTube, Reddit, and other major platforms, Miller says.
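The article doesn’t describe any model in detail, but a minimal sketch of the idea would be to fit a classifier on per-account behavioral features, using accounts already banned for coordinated editing as labels. The features and all numbers below are fabricated purely for illustration.

```python
# A toy sketch of a behavior-based detector. The features (edits per day,
# share of politically sensitive pages, revert rate) and the data are
# invented; a real model would be trained on Wikipedia's public edit logs.
from sklearn.linear_model import LogisticRegression

# rows: [edits_per_day, sensitive_page_share, revert_rate]
X = [
    [1.2, 0.05, 0.01],  # ordinary editor
    [0.8, 0.02, 0.00],  # ordinary editor
    [4.5, 0.60, 0.15],  # account later banned for coordinated editing
    [3.9, 0.55, 0.20],  # account later banned for coordinated editing
]
y = [0, 0, 1, 1]  # 1 = banned account

model = LogisticRegression().fit(X, y)

# Suspicion score for a previously unseen account.
print(model.predict_proba([[3.0, 0.40, 0.10]]))
```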

The English-language edition of Wikipedia has 1,026 administrators monitoring over 6.5 million pages, the most articles of any edition. Tracking down bad actors has mostly relied on someone reporting suspicious behavior. But much of this behavior can’t be seen without the right tools. In data-science terms, Wikipedia is difficult to analyze because, unlike a tweet or a Facebook post, an article exists as many versions of the same text.
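That versioning is visible in Wikipedia’s public MediaWiki API, which returns an article not as a single document but as a chain of revisions, each with its own author and timestamp; any analysis has to work over that chain. A minimal sketch (the article title and revision count here are arbitrary examples):

```python
# Fetch recent revision metadata for one article via the MediaWiki API.
# Unlike a tweet, the "content" is a sequence of versions, so analysis
# means reasoning over diffs between revisions, not over a single text.
import requests

API = "https://en.wikipedia.org/w/api.php"
params = {
    "action": "query",
    "prop": "revisions",
    "titles": "Wikipedia",  # example article
    "rvprop": "ids|timestamp|user|comment",
    "rvlimit": 20,          # last 20 revisions
    "format": "json",
}

# Wikimedia asks API clients to send a descriptive User-Agent.
resp = requests.get(API, params=params,
                    headers={"User-Agent": "revision-history-sketch/0.1"},
                    timeout=30)
resp.raise_for_status()

for page in resp.json()["query"]["pages"].values():
    for rev in page.get("revisions", []):
        print(rev["revid"], rev["timestamp"],
              rev.get("user", "?"), rev.get("comment", ""))
```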

As Miller explains, “the human brain simply cannot determine what hundreds of thousands of edits across hundreds of thousands of pages look like.”
