Incremental Vs. Discrete Content
A year or more ago I signed up as a member of the Citizendium, which was announced as an alternative to the Wikipedia. This alternative was started by Larry Sanger, who co-founded the Wikipedia with Jimmy Wales. Sanger felt that the Wikipedia, as it came to be, lacked crucial editing functions needed to keep it reliable and accurate. Because he felt his offspring would not change sufficiently in that direction, he reluctantly started Citizendium as a better Wikipedia.
A lot about Citizendium is similar to Wikipedia — the fact that anyone can contribute, the group-created unsigned articles, the neutral stance, and the permanent in-process nature of each article. But there are significant differences as well. At least in theory.
Sanger describes the Citizendium as “Wikipedia with editors and real names” of contributors. I am certain these two differences are real improvements. I would bet they are vital and inevitable in any sustainable wiki encyclopedia. Both raise the quality of articles over time, while reducing mindless vandalism and entropy. I think a reliable encyclopedia will demand these functions sooner or later. Which means Wikipedia will adopt them eventually.
But whether those two differences alone are sufficient for the survival of Citizendium, I can’t predict. I got a mass email from Sanger a few months ago, reminding me (and other early joiners like me) that their new wiki was up and going and to please stop by to contribute. I took the opportunity to check out Citizendium and ask Larry a couple of questions.
In the original email Sanger wrote: “We are after quality, not just quantity. Some of our articles are after just one year already better than what you find in ‘that other project,’ and in a few years, we’re going to have uniformly superior quality. We’re creating a better resource for the world–one you can feel proud of being a part of.”
I asked Sanger to point out to me some articles in Citizendium that were superior to Wikipedia’s, because on a very quick browse of Citizendium (CZ) I could not detect a significant difference. First, there aren’t very many complete articles in CZ, so even finding subjects to compare was difficult. Sanger sent back a pointer to a CZ page that listed approved articles. Approved articles are ones that have gone through a series of editorial decisions certifying the article as reliable, and are then fixed (frozen) in that approved version. (Further improvements to the approved article continue in a draft version sitting behind the fixed one, which can later be approved and sent to the front to replace the current version.)
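The approved/draft mechanism described above can be sketched in a few lines. This is a hypothetical model, not Citizendium’s actual software: readers see the frozen approved version, while edits accumulate in a draft behind it that editors can later promote.

```python
class Article:
    """Toy model of an approve/freeze/draft publishing workflow."""

    def __init__(self, text):
        self.approved = None   # frozen version shown to readers (None until first approval)
        self.draft = text      # in-progress version sitting behind it

    def edit(self, new_text):
        # Contributors only ever touch the draft, never the frozen copy.
        self.draft = new_text

    def approve(self):
        # Editors promote the current draft to the front; work then
        # continues on the draft toward the next approval.
        self.approved = self.draft

    def view(self):
        # Readers get the approved version once one exists.
        return self.approved if self.approved is not None else self.draft
```

For example, an edit made after approval stays invisible to readers until the draft is approved again.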
I opened a few from the short list of approved articles, then found the corresponding Wikipedia article, and did a quick A/B comparison. I did not have expert knowledge in any of the approved subjects, so I had to approach these as an ignorant novice. Approaching them that way, I could not discern any quality difference between the two sets of articles I examined. Wikipedia and Citizendium looked the same to me. Sure, there were lots of differences in details — and also surprising similarities in outline — but few differences that made a difference.
Sanger then specifically pointed me to several articles that “are far better than Wikipedia’s, or were” such as Life and Biology. I did note that these general articles were much longer and more detailed, almost short books. They covered too much territory for me to judge whether they were superior to the shorter Wikipedia entries (length is not synonymous with quality). From what I read, these articles are more comprehensive and thorough, and may provide more answers for more people. They were certainly not inferior to Wikipedia’s. But I did not see an obvious ten-fold increase in quality, which according to Peter Drucker is often what a competitor needs to succeed.
In discussing this with Sanger he said: “While many of our articles are in fact better than Wikipedia’s, it is not a very large percentage yet. But it really is unreasonable, of course, to expect our articles generally to be better than Wikipedia’s at this point, simply because most of our articles are not even complete; the number of person-hours we have put in on any one article, and per article, is completely dwarfed by the time that Wikipedia has spent. As long as we are accelerating in a way similar to Wikipedia in its early days–which we are, e.g., we have doubled our growth rate in the last 100 days–then we will after a similar number of years come within an order of magnitude of the labor Wikipedia has put in on its articles. Our developed articles will probably be generally superior well before that, though.”
Okay, fair enough. This argument — that these kinds of collaborative works proceed incrementally, and not discretely as in mainstream media — is legitimate. This is the reason that Wikipedia caught most of us off guard; we ignored it because for a very long time it was not very good. But it kept getting better very, very slowly, one sentence at a time. This kind of growth is invisible. Jimmy Wales makes the same argument now for Wikia Search, his newest bottom-up prosumer project, a collaboratively built search engine. He admitted to me that Wikia Search is no better than Google now (and therefore won’t win many switchers). But, he quickly adds, because it is in its early days, it will slowly, almost invisibly, become better like other incrementally improved creations, and will surprise us when it does.
Micro-incremental growth is an under-appreciated element of successful new media. This method is way beyond issuing beta versions, because there are no versions, just endless tiny modifications, some of which are not even improvements but simply changes. Incrementalism is the way the treasure trove of any archive (say the back issues of the New York Times) grows — one nano addition by one nano addition. It’s the “long now” approach to making content. It contrasts with the discrete model of launching versions, the norm for most creative works.
The key to continuous content creation is to have a survival model that permits long-term accretion. It requires a special kind of patience to say: we’ll endure several years, or maybe even a decade, when the work is not very good, yet we’ll still work on it. An unfinished cathedral is not very useful to anyone. But work continues because the result is visible in everyone’s mind, and there is no doubt it will be useful when completed. For this reason it is harder to incrementally produce a long-now tool than it is to accumulate long-now content. The environment for tools like software and applications changes so fast that it is difficult for them to remain useful after 10 years (will there even BE browser bookmarks in 10 years?). Content, on the other hand, can often increase in value over 10 years (think of the Times archive).
As we move all our content onto the world wide database, also known as the semantic web, the micro-incremental nature of this media will come into play. For a very long time, maybe a decade, the nano additions to the global database will appear insignificant and hardly worth doing. The world wide database will remain an unfinished cathedral for a long time — of little use to anyone. There will never be a beta version of the semantic web. (Or, as they say, it will be perpetual beta; same thing.) Instead, over a period of one or two years a decade from now, there will be the sudden realization that there is something “there” there in an embedded semantic-database structure. Its value will become (suddenly!) visible and spur more concerted effort to complete it — although of course, it will never be complete.
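The nano additions described above can be pictured as individual subject-predicate-object facts, the basic unit of semantic-web data. This toy sketch (not any particular standard’s API) shows why each addition looks insignificant alone but becomes queryable in aggregate:

```python
# A toy triple store: each add() is one "nano addition" to the database.
triples = set()

def add(subject, predicate, obj):
    # Store a single fact as a (subject, predicate, object) triple.
    triples.add((subject, predicate, obj))

def query(subject=None, predicate=None, obj=None):
    # Return every stored fact matching the fields that were given.
    return [t for t in triples
            if (subject is None or t[0] == subject)
            and (predicate is None or t[1] == predicate)
            and (obj is None or t[2] == obj)]

# Two tiny, independent contributions...
add("Citizendium", "foundedBy", "Larry Sanger")
add("Wikipedia", "cofoundedBy", "Larry Sanger")

# ...that together already answer a question neither answers alone:
# everything connected to Larry Sanger.
facts = query(obj="Larry Sanger")
```

No single triple is worth much; the value appears only once millions of them interlink.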