Providing social transparency to Wikipedia

WikiDashboard Background

The idea is that if we provide social transparency and enable attribution of work to individual editors in Wikipedia, then the credibility of page content will increase, and with it the overall level of trust in Wikipedia.

Wikipedia itself keeps track of these studies and openly discusses them on its own pages, which is a form of social transparency in itself. However, even Wales himself has been quoted as saying that while Wikipedia is useful for many things, he would like to make it known that he does not recommend it to college students for serious research. Indeed, the standard complaint I often hear about Wikipedia is that, because of its editorial policy (anyone can edit anything), it is an unreliable source of information.

The opposite point of view, however, has not been expressed or debated nearly as much: precisely because anyone can edit anything, and because anyone can examine the edit history and see who made each change, Wikipedia will become (or has already become) a reliable source of information. I think Michael Scott, a character on the popular TV show "The Office", puts it succinctly: "Wikipedia is the best thing ever. Anyone in the world can write anything they want about any subject. So you know you are getting the best possible information."

While tongue-in-cheek, the quote raises a valid point. Because the information is out there for anyone to examine and question, incorrect information can be fixed, and disputed points of view can be listed side by side. In fact, this is precisely the academic process for ascertaining truth. Scholars publish papers so that theories can be put forth and debated, facts can be examined, and ideas can be challenged. Without publication, and without the social transparency of attributing ideas and facts to individual researchers, there would be no scientific progress.

We're curious how the Web community will use this tool to surface social dynamics and editing patterns that might otherwise be difficult to find and analyze in Wikipedia. Please let us know by leaving a comment on our feedback forum. Alternatively, if you wish to contact us in private, email us at: wikidashboard [at] parc [dot] com

Bongwon Suh Twitter: @billsuh
Ed Chi Twitter: @edchi

Research Papers

A comparison of generated Wikipedia profiles using social labeling and automatic keyword extraction
ICWSM 2010
The singularity is not near: slowing growth of Wikipedia
WikiSym 2009
So you know you're getting the best possible information: a tool that increases Wikipedia credibility
CHI 2009
What's in Wikipedia? Mapping topics and conflict using socially annotated category structure
CHI 2009
Providing social transparency through visualizations in Wikipedia
Social Data Analysis Workshop at CHI 2008
Crowdsourcing user studies with Mechanical Turk
CHI 2008
Lifting the veil: improving accountability and social transparency in Wikipedia with WikiDashboard
CHI 2008
Can you ever trust a wiki? Impacting perceived trustworthiness in Wikipedia
CSCW 2008
Us vs. them: understanding social dynamics in Wikipedia with revert graph visualizations
IEEE VAST (Symposium on Visual Analytics Science and Technology) 2007
He says, she says: Conflict and coordination in Wikipedia
CHI 2007 - ACM Conference on Human Factors in Computing Systems
Power of the few vs. wisdom of the crowd: Wikipedia and the rise of the bourgeoisie
alt.chi at CHI 2007
Augmented social cognition: understanding social foraging and social sensemaking
HCIC Workshop
Disclaimer & Privacy Policy
Permission is granted to copy, distribute and/or modify this document under the terms of the GNU Free Documentation License, Version 1.2 or any later version published by the Free Software Foundation; a copy of the license is included in the section entitled "GNU Free Documentation License".

Copyright (c) Palo Alto Research Center, Inc. 2007-2010.