
That article, which as of Friday has been viewed more than 900,000 times, has since undergone 1,071 edits by 223 editors who’ve voluntarily updated the page on the internet’s free and largest crowdsourced encyclopedia. Moore, who works as a strategist for a digital creative agency, has made nearly 500,000 edits to Wikipedia articles over the past 15 years. He is also ranked as one of the 50 most active English-language Wikipedia users of all time, based on the number of edits. (Wikipedia editors do not get paid.)
“It’s a hobby,” Moore told CNN Business. “I sometimes spend a lot of time diving in and fleshing out an article, but other times I’m writing one or two sentences to get the ball rolling and watching other editors improve upon my work. I get a lot of satisfaction out of planting the seed and watching it evolve over time.”
In the middle of breaking news, when people are searching for information, some platforms can present more questions than answers. Although Wikipedia is not staffed with professional journalists, it is viewed as an authoritative source by much of the public, for better or for worse. Its entries are also used for fact-checking purposes by some of the biggest social platforms, adding to the stakes and reach of the work from Moore and others.
“Editing Wikipedia can absolutely take an emotional toll on me, especially when working on difficult topics such as the COVID-19 pandemic, mass shootings, terrorist attacks, and other disasters,” he said. “I’ve learned how to minimize this by stepping away if needed and revisiting tasks at a later time.”
Moore is part of a subculture of Wikipedia users who spend hours each day contributing to the platform, helping to fulfill the organization’s mission to “create and distribute a free encyclopedia of the highest possible quality to every single person on the planet in their own language.” He calls his work as a volunteer editor “rewarding.”
“I like the instant gratification of making the internet better,” he said. “I want to direct people to something that is going to provide them with much more reliable information at a time when it’s very difficult for people to understand what sources they can trust.”
Some of these expert users attend Wikipedia editor conferences and meetups all over the world. “We’re kind of like ants,” Moore said. “You kind of find how you fit in and how you can help.”
Cutting out the noise
Lane Rasberry, who is employed at the School of Data Science at the University of Virginia and was a volunteer Wikipedia editor for 10 years, said there’s also an allure and a culture around people involved in high-profile breaking news situations on Wikipedia.
“It is considered cool if you’re the first person who creates an article, especially if you do it well with high-quality contributions,” said Rasberry. “Just like when a celebrity dies, there’s a rush to go to Wikipedia and change their [date of] death. People like to be first … and also make an impact” in getting reliable and accurate information out quickly.
To help patrol incoming edits and predict misconduct or errors, Wikipedia, like Twitter, uses artificial intelligence bots that can flag suspicious edits and escalate them to human reviewers. Ultimately, though, it is the volunteer editors of the Wikipedia community who decide what to remove or change. The platform also relies on admins, known as “trusted users,” who apply or are nominated for the role and help monitor content.
Another issue is vandalism: purposefully erroneous edits made to Wikipedia pages. But Moore said he doesn’t worry about the pages he works on falling victim to vandalism, because he believes Wikipedia’s guidelines and policies work in his favor.
“I’ve got many other editors that I’m working with who will back me, so when we encounter vandalism or trolls or misinformation or disinformation, editors are very quick to revert inappropriate edits or remove inappropriate content or poorly sourced content,” Moore said.
While “edit wars” can happen on pages, Rasberry said this tends to occur more often over social issues rather than news. “People have always assumed edit wars [play out on] Wikipedia and it does not happen nearly as much as outsiders expect,” he said. “Wikipedia has both technological and social structures in place, which most people find agreeable and appropriate, and which permit many people to edit at once.”
“Administrators are very quick to block those who do not obey the rules, so if you’re coming to Wikipedia with mal-intent, you’re wasting your time because we will stop you from contributing to the site,” Moore said.
There are also challenges in giving editors full access to news sources. Rasberry said that because of news and magazine subscription costs, some Wikipedia editors may be unable to access, and therefore cite, those sources in their updates. “Access to media and interpreting media is a major bottleneck,” said Rasberry, adding that “news agencies [should] see Wikipedia as more of a collaborator than rival news source.”
Wikipedia volunteers have created lots of guidance on reliable news sources. A dedicated Wikipedia page on the topic notes that articles should be “based on reliable, published sources, making sure that all majority and significant minority views that have appeared in those sources are covered.”
“If no reliable sources can be found on a topic, Wikipedia should not have an article on it,” the page said.
Although Moore is known among friends, colleagues and fellow editors as a Wikipedia influencer, that title carries far less fame than what one can acquire on YouTube, Instagram or TikTok.
“I don’t spend all of my time contributing to Facebook and Twitter and these other platforms because I feel strongly about Wikipedia’s mission,” he said. “If it was a paid advertising site or if it had a different mission, I wouldn’t waste my time.”