
Open Source: A Double Bind

by Steven Weber

In so-called democratic systems, whom can you trust?

Once upon a time, in a land far away, a wizard gave to the people of her village a magic parchment. Although it was covered with written wisdom from top to bottom, anyone in the village could change the words or add their own ideas to the parchment, and it would never run out of space. More magical still, the parchment kept a complete record of everything that had ever been written upon it. As the story of the parchment spread throughout the land, first hundreds and then thousands of people came to write upon it. They received neither grain nor gold for their contributions, but they still kept coming. Millions read it. It was celebrated throughout the land, for the wisdom upon the parchment had been created by and belonged to the people, not to their rulers.

Wikipedia is not a fairy tale—it is an extraordinary phenomenon. Only five years old, it has more than 900,000 online entries and 320 million words, compared to 120,000 articles and 77 million words in the online Encyclopedia Britannica, the largest traditional English-language encyclopedia. Along with Google, Wikipedia has become a compelling symbol of “democratic” and “open” possibilities within information technology.

Like most political symbols that emerge in rapidly changing times, the terms “democratic” and “open” mix overblown hype and dangerous blind spots with wonderful innovation. Let’s undermine the hype, illuminate the blind spots, and magnify the innovation. Systems such as Wikipedia and Google pose profound questions about the practical meanings of democracy and accountability in a highly connected information society. Answering those questions means getting serious about the notion of power—who has it, where does it come from, and how is it limited, checked, balanced, and held accountable in open systems like Wikipedia?

The answers are murky, and that is precisely the problem. Many people are impressed with the fact that Wikipedians do their work for free. I say, so what? It is well and good to champion the absence of direct monetary exchange in (some) open systems—money can certainly corrupt the search for truth. But power corrupts at least as badly. And power’s influence is sometimes easier to hide than money’s. I know how to “follow the money.” But who actually gets to decide when a Wikipedia article is being “corrupted” rather than “corrected”? Who defines “vandalism”? Who precisely is accountable for those choices? What does it take to fire them, kick them out of office, or revolt against them? Are we being quietly asked to take their essential benevolence on faith?

I don’t find Google’s corporate motto—“Don’t be evil”—a sufficient guarantee. Google’s self-proclaimed goal is to organize all the world’s information. The desire and power to organize is profound. Google won’t take money to push a website to the top of a search result, but the PageRank algorithm that generates and orders the list is a secret. Should it be? The Dewey Decimal system and the Library of Congress cataloging system are not secret. There is a deep public interest in making transparent the rules by which information is organized. Is this a problem for Google’s business model? Absolutely. But just as economic monopolies have special responsibilities to restrain themselves in markets, so do potential information monopolies, whose impact on society’s welfare could be considerably broader and deeper.

The French political theorist Michel Foucault worried a great deal about veiled power, and he was right to worry. When people try to reassure me by talking about “self-organization” on the Internet, I get even more worried. When organization comes from an editorial board, a boss, or an elected official, I know where power lies and how to hold it accountable, even if it’s not always easy and doesn’t always work. But in a “self-organizing system,” where is the accountability? What kinds of decisions would we forbid such a system to make? If you have faith in the notion that something recognizable as a democracy could organize and sustain itself, as if by magic, out of unfiltered opinions and decisions, you need to reread The Federalist Papers and consider the wisdom of James Madison.

What is called collaborative filtering (systems that rate the contributions of individuals in various ways, or assign them reputations over time) simply counts how many people agree with you, or are like you. Harnessing the “wisdom of the crowd” in this way certainly has attractions: It dampens the extremes by relying on a version of the law of large numbers. This type of system is good at guessing the weather and investing in stocks, because it is a recipe for average performance—and that is better than most experts do at forecasting rain and recessions. But it is no substitute for democratic deliberation or scientific argument, both of which aim for doing better than average, by upgrading the state of knowledge and wisdom within a society.
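
To see the mechanics behind this claim, consider a minimal sketch of one common variant, user-based collaborative filtering, written here in Python. The users, items, and scores are invented for illustration, and real systems are far more elaborate; the point is simply that the “prediction” such a system produces is nothing more than an agreement-weighted average of what similar people have already said.

    # A toy sketch of user-based collaborative filtering.
    # All data and names are hypothetical.

    # ratings[user][item] = the score that user gave that item
    ratings = {
        "alice": {"article_a": 5, "article_b": 1},
        "bob":   {"article_a": 4, "article_b": 2, "article_c": 5},
        "carol": {"article_a": 1, "article_b": 5, "article_c": 2},
    }

    def similarity(u, v):
        """Fraction of shared items on which two users roughly agree."""
        shared = set(ratings[u]) & set(ratings[v])
        if not shared:
            return 0.0
        agree = sum(1 for i in shared if abs(ratings[u][i] - ratings[v][i]) <= 1)
        return agree / len(shared)

    def predict(user, item):
        """Predict a rating as a similarity-weighted average of others' ratings."""
        num = den = 0.0
        for other, theirs in ratings.items():
            if other == user or item not in theirs:
                continue
            w = similarity(user, other)
            num += w * theirs[item]
            den += w
        return num / den if den else None

    # Alice's predicted view of article_c is just a weighted average of the
    # crowd's existing views -- "average performance", as the essay puts it.
    print(predict("alice", "article_c"))  # -> 5.0

Nothing in this procedure weighs evidence or arguments; it only measures agreement, which is exactly the gap between crowd-averaging and deliberation or science.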

Vibrant governance systems are living experiments. They learn over time how to get better at what they do, usually in response to failure. Laws are passed, constitutions are amended, and sometimes revolutions emerge and we start over again. Google needs to become more transparent. Wikipedia needs clearer mechanisms of accountability. Let’s keep pushing them along. Now that’s democracy.

Steven Weber is a professor of political science and director of the Institute of International Studies. He is the author of The Success of Open Source (Harvard University Press, 2004).

From the March/April 2006 “Can We Know Everything” issue of California.
