Wikipedia And The Truth Troops


As the saying goes in the news business, check the source — especially if it wants to remain anonymous.

Community-driven encyclopedia Wikipedia.org needs to do more than check anonymous editors at the site’s door. Although Wikipedia recently changed its policy so that anonymous users can no longer create new pages, it has to make more changes if it wants to help the Web improve its reputation as a place to find accurate information.

Jimmy Wales, the founder of the project, says the community is doing just that.

The recent so-called Seigenthaler case pushed Wikipedia’s accuracy problem further into the mainstream while blowing an even bigger hole in the site’s credibility, thanks to a posting policy that lets anyone post whatever they want. Sure, the community is supposed to suss out fraudulent information, but it failed in this case.

In a recent Op-Ed for USA Today, John Seigenthaler Sr., the newspaper’s former editorial page editor, described his experience of trying to remove false and defamatory information about him from the site.

His editorial led off with this entry, which some anonymous person had posted to his Wikipedia bio:

“John Seigenthaler Sr. was the assistant to Attorney General Robert Kennedy in the early 1960s. For a brief time, he was thought to have been directly involved in the Kennedy assassinations of both John, and his brother, Bobby. Nothing was ever proven.”

After he pointed it out to the site’s managers, including founder Jimmy Wales, the libelous passage was removed.

As the 78-year-old Seigenthaler wrote in his Op-Ed, “One sentence in the biography was true. I was Robert Kennedy’s administrative assistant in the early 1960s. I also was his pallbearer. It was mind-boggling when my son, John Seigenthaler, journalist with NBC News, phoned later to say he found the same scurrilous text on Reference.com and Answers.com.”

Who posted the “toxic” content? Wikipedia doesn’t know, and probably never will. As Seigenthaler points out, the Communications Decency Act states that “no provider or user of an interactive computer service shall be treated as the publisher or speaker,” which means the poster’s ISP doesn’t have to reveal the poster’s name.

Wales says that’s one of the issues bedeviling Wikipedia’s efforts to block the “vandals and trolls” who constantly add malicious content to the site. Wikipedia has tried asking ISPs to close the accounts of malicious posters, but because of the Decency Act, the ISPs don’t have to do anything.

Although plenty of cases are challenging that law, the Seigenthaler/Wikipedia case has shown us that having the community of contributors and readers simply “call out” an inaccuracy on the site isn’t enough. Wikipedia needs to vet its content the same way the old-fashioned print world does.

Or maybe wikis will always be perceived as untrustworthy.

That’s been my experience with another wiki (one not associated with Wikipedia). When I Google my name, which anyone with anything close to a public profile should do routinely, it turns up on a wiki where a poster gripes that I mischaracterized his quotes in a story. I didn’t. He got mad and posted to a wiki.

Three years ago, I wrote a story for internetnews.com about an e-mail-blocking policy by AOL that was garnering all kinds of complaints from legitimate e-mailers, that is, from e-mailers who don’t spam.

The postings about the policy were piling up on Slashdot, so I contacted one of the posters who had been impacted by the policy and asked him to comment for my story.

I used one quote, verbatim, that explained his issue with AOL’s policy. He later wrote to me saying I had misrepresented him. I replied to his e-mails right away and politely explained why I disagreed: the story made clear that he was a legitimate e-mailer impacted by the policy.

Although we run corrections promptly on this site, even when we’ve messed up quotes, I checked and nothing was amiss in this case. So I refused to change the story to suit him.

When I discovered his complaint on the wiki, I tried to post my point of view about the issue. Funny, my side of the story never made it to the site. So much for the “free and open” concept on that wiki.

At least Wikipedia understands what’s at stake and is mulling even more changes to its posting policy.

Wales tells internetnews.com that unregistered editors are no longer allowed to create new pages. That has helped slow the flow of incoming new pages, giving the editors time to fact-check the sourcing.

In addition, come January, the site will launch a mechanism that lets people review articles more systematically, so that problematic posts can be flagged more quickly.

“What has been overlooked in this grand media storm,” he adds, “is that overall, Wikipedia’s content quality is quite high. It isn’t uniform, and using it as a full source is a bad idea.”

But the core of how this works is the community, he adds. “It’s a group of people who themselves formed an institution of trusted editors and people that we know. It’s much closer to the actual editing model than most people realize,” he says.

“For most of the active Wikipedians, there’s a real passion to get it right, and correct an error when they find it. What makes this interesting and newsworthy is that our processes failed,” he says. “The challenge for us is to figure out in this particular case what went wrong, how was it overlooked and how did it stay on site for so long. That’s the type of thing we do constantly.”

You can read one recent discussion about the policy changes on Wikipedia.

“The goal of Wikipedia is to create an encyclopedic information source adhering to a neutral point-of-view style of prose, with all information being referenced through the citation of reliable published sources, so as to maintain a standard of verifiability,” one section of that Wikipedia discussion says.

“However, the realities of international defamation laws demand that we pay attention to legal issues in order to ensure that our work can remain available, and to protect the project from legal liability. Any libelous content posted to Wikipedia potentially exposes the author and the site’s operators (The Wikimedia Foundation) to legal liability.”

It then urges anyone who wants to register as an editor to read up on libel laws, starting with U.S. libel and slander laws, by picking up “The Associated Press Stylebook and Briefing on Media Law.”

Anyone who has practiced anything closely resembling journalism knows this book well. If Wikipedia had made this its policy from the outset, it might not find itself in a pickle over credibility now.

With more than 2 million articles and more than 80 languages represented on the site, the editors have their hands full vetting the information.

No one wants to shut down the idea of free speech that gave rise to the Wikipedia project some five years ago. But at the very least, the case has pointed out the need for more stringent policies that keep spurious information from getting into print and keep tabs on the few who will always want to vandalize the site.

Otherwise, Wikipedia will continue to labor on as Information Not Quite Trusted.

Erin Joyce is executive editor of internetnews.com
