Saturday, February 12, 2011

Wisdom of the Rat - In Defense of Network Production

This post is pretty much straight from some NYU work. It discusses a critical notion of network production, the idea of publish-then-filter (based on market selection) rather than filter-then-publish (credential, then trust as authority). It ties into the broader thesis that as transaction costs decline, and with them the benefit of making upfront investments and decisions, it makes more and more sense to do these things ad hoc, tailored to each situation and each particular question.

In Pixar’s award-winning animated feature Ratatouille, genius is found in the most unlikely of places. Remy, the story’s rodent hero, comes from a long line of undiscriminating rats, who eat whatever they find, garbage or otherwise. But Remy is different. His profound love of food and flavor drives him to the culinary arts and a secret career in the strictly gated circles of Parisian haute cuisine. What sets Remy off on his journey is the motto of the chef Gusteau, “anyone can cook.” At first, Remy interprets this, optimistically enough, to mean that cooking is for everyone – that if they try, all people can understand its beauty and create great things. By the end of the film, though, it becomes clear the slogan embodies a narrower, if still encouraging, meaning. It’s not that cooking is easy and that with a little effort everyone can master it – some people will always be better than others (as the film's obliging foil Linguini ultimately comes to realize). But genius might be found anywhere, and you don’t know who that great cook is going to be. It might be someone unexpected or uncredentialed... perhaps someone not even human.

Remy’s journey is of course emblematic of the studio behind the film’s creation, the upstart Pixar, which believed it could bring beauty and inspiration to the oft-disregarded genre of animation. But it also brings valuable insight to this week’s discussion of new modes of production and how crowdsourcing is changing methods of labor on the Internet.

While I disagreed with much of it, I really enjoyed the video at the bottom of the Wolfshead post, featuring commentary from Andrew Keen and others criticizing crowdsourcing for diffusing knowledge by superseding traditional notions of credentialing. Keen compares enthusiasm for web 2.0 technologies, and their empowering of everyday individuals over established experts, to the philosophy of Rousseau. This he frames as a belief in the inner greatness of man and the innocence of youth (e.g. Emile), gradually corrupted by society. The analogy, however, is off. Keen paints web enthusiasts as Remy first interprets Gusteau’s motto... as believing that we all have it inside of us, that anyone can cook. This is not right. What the web enthusiast rejects is not the notion that an expert might be better informed, but merely the credentialing system that, due to historical constraints no longer in place, assumes this necessarily to be the case.

In Ratatouille, Remy’s culinary talent is portrayed as in conflict with the established French cooking scene. In the end, he’s basically right and they’re basically wrong. This is the point Keen can’t seem to sign onto. In the video, he and others constantly lament the idea that on Wikipedia, 14-year-old kids can correct well-established professors.

To begin with, the proposed conflict is again misleading... often it’s in fact traditional experts writing articles in their areas of focus, just doing so voluntarily. Moreover, something like Wikipedia covers areas far beyond the spheres in which ‘experts’ typically operate, for instance obscure MMOGs or very local subjects. In these cases, not only might a particular 14-year-old in fact be the most qualified to comment (if that 14-year-old were, say, the highest-ranked player in the game, or happened to be on hand at the scene), but even if he weren’t, it’s him or no one, and I’d think something is better than nothing.

But these are merely side points. In the web context, the real counter to Keen is that it’s not a question of justifying credentials, but rather of letting content stand for itself. If a professor writes a Wikipedia article on a subject she knows a great deal about, that’s great. If a 14-year-old then comes along and corrects some piece of it, the community has the opportunity to decide whether this correction is right or wrong. This doesn’t mean ignoring credentials... if the professor supported her facts with a link to her own or someone else’s published paper, whereas the 14-year-old had much less established references, the professor’s version should win out. It’s a question of opening the production process to market/democratic selection.
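To make the contrast concrete, here's a toy sketch in Python (entirely my own illustration; the weights, field names, and functions are invented, and this is certainly not how Wikipedia actually works): filter-then-publish gates on the contributor's credentials up front, while publish-then-filter lets every edit go live and selects afterward based on how well the content itself is supported.

```python
# Invented reference weights: the evaluation looks at the cited
# sources, not at who the contributor is.
REF_WEIGHT = {"published_paper": 3, "news_article": 1, "forum_post": 0}

def filter_then_publish(edit):
    """Old model: credential first, then trust as authority."""
    return edit if edit["credentialed"] else None

def evaluate(edit):
    """Score an edit by the strength of its references alone."""
    return sum(REF_WEIGHT.get(ref, 0) for ref in edit["refs"])

def publish_then_filter(edits):
    """New model: publish everything, then let community/market
    selection keep the best-supported version."""
    return max(edits, key=evaluate)
```

Under this toy scoring, the professor's paper-backed edit beats the 14-year-old's forum-sourced one, just as it should -- but if the 14-year-old showed up with the better references, his edit would win instead.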

Keen of course dismisses this as the core problem inherent in the Cult of the Amateur... that it creates what comedian Stephen Colbert calls ‘truthiness,’ where fact is determined by the whims of the crowd rather than by reality and the individuals qualified to judge it. This is misleading though, for there’s no such thing as objective reality. All ‘facts’ are subjective... they are simply proposals that society has generally accepted as true. Often, we designate certain individuals (like scientists or academics) to tell us what's true, and then simply trust them. But there’s no reason to believe these individuals always are 100% correct about everything they discuss. New media collaboration platforms such as Wikipedia offer the opportunity to balance this trust with input and evaluation from a much wider group.

To the extent there are systematic biases in any collaborative production system, such as those Wolfshead laments, we should work to correct them. And new media literacies are absolutely required to help us understand the limitations of this new form of information production and the appropriate ways to negotiate its products. But collaborative, democratized production is an evolutionary process that is expanding the record of human knowledge at a rate far beyond any previous effort. And empirical evidence suggests it’s working pretty well – independent studies have repeatedly found the quality of the average Wikipedia article to be on par with that of an average Encyclopaedia Britannica article. It would be a shame to disregard this incredible resource.

As Ratatouille’s Anton Ego points out, “the world is often unkind to new talents, new creations. The new needs friends.” Without a doubt, I believe Wikipedia and crowdsourced production in general belong in this category -- and, quite certainly, I intend to be one of them.

Sunday, February 6, 2011

The New Reading

This was a post I wrote in early October about new forms of literacy, responding to Nicholas Carr's argument in his article "Is Google Making Us Stupid?" in the Atlantic. The post builds off some of the work I'd done for my undergrad thesis around new media literacy, although it came before the 'distributed production epiphany,' so I'll definitely look to come back and give this an updated treatment sometime soon.

In the meantime, the basic idea is that Carr suggests we are being inundated by data and it's causing us to lose the ability to think deeply about things. I respond by suggesting that what we're actually losing (or perhaps, choosing to abandon) is simply the traditional notion of 'deep thinking' that involves acquiring and internalizing knowledge through passive textual decoding alone. Today, I suggest in this post and will elaborate in future ones, literacy demands the ability not only to receive knowledge, but to actively pursue, evaluate, combine, and deploy it. So while I think Carr is right to note this change in the way we interact with information, I think he and all of us should be careful to consider the profound opportunities and necessities of new systems, and not simply disregard them out of hand because they're different.

In his article “Is Google Making Us Stupid?,” Nicholas Carr laments what he sees to be the harmful, homogenizing effect proliferating access to information is having on human mental faculties. Inundated by data, he suggests, we jump around from topic to topic and are gradually losing the ability for deep thought.

I strongly disagree with Carr.

Presumably, what Carr is concerned with is critical thinking – our ability to acquire and deploy knowledge to impactful ends, the foundation of both individual and societal progress. Critical thinking has two components, which I’ll dub computation and experimentation. Computation is the simpler of the two... it involves executing a set of already-decided steps to determine some answer. It is by nature goal-directed. Experimentation is the process that determines which of these sets of steps should be executed. It is ‘random,’ or alternatively, creative in nature.

High-level thought processes involve recursive deployment of computation and experimentation. Imagine a scholar seeking to answer some fundamental, yet-to-be-understood question in her field. Likely, she would start by acquiring knowledge – reading up on topics around the question. She probably doesn’t know exactly where she’s going or exactly what knowledge she needs to obtain, but she’s creating a base dataset with which to begin her inquiry. Gradually, she will likely develop some overarching hypothesis. To prove it, she will head down some particular path. She will start reading up on some sub-branch of the field, design experiments to test the hypothesis, and execute them. If these don’t bear out, she will find some other path and try it, or perhaps she will disprove the hypothesis entirely, look for some new explanation, and then repeat the process until eventually she finds the answer.

At each point in this process where the scholar is selecting a new mode of inquiry – in choosing hypotheses, in choosing the paths by which to pursue them – she is engaged in experimentation. She is somehow synthesizing the information she’s internalized and using it to creatively generate an idea and a set of actions. At each point where she’s executing these sets of actions, she’s engaged in computation.
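As a toy illustration (the function names and the random-draw model of creativity here are my own assumptions, not a formal claim), the scholar's recursion might be sketched in Python like this:

```python
import random

def experimentation(knowledge):
    """Creatively pick a new hypothesis/path from internalized knowledge.
    Modeled as a random draw to reflect its 'random', creative nature."""
    return random.choice(knowledge)

def computation(hypothesis, experiment):
    """Execute a predetermined set of steps: run one experiment
    against the hypothesis and report whether it bears out."""
    return experiment(hypothesis)

def inquiry(knowledge, experiments):
    """Recursive deployment: experiment to choose a hypothesis,
    compute to test it, and repeat until one survives every test."""
    while True:
        hypothesis = experimentation(knowledge)
        if all(computation(hypothesis, e) for e in experiments):
            return hypothesis
```

The creative leaps are the calls to experimentation, the rote testing is computation, and the loop only terminates when some hypothesis passes every test – mirroring the try-a-path, fail, try-again cycle described above.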

In all modes of critical thinking, we use media as aids to both experimentation and computation. When a person is brainstorming and taking notes on a whiteboard, she’s using a physical encoding medium to augment the capabilities of her brain, freeing up processing power in her working memory by offloading the storage of certain data elements. The person may for instance be writing down each area she explores as she does so. This aids her experimentation by making clear which areas she’s already explored and which remain open for her to venture into.

With the invention of computers, though, media can take on the even more important role of handling computational processes altogether. As described above, computation is the set of elements in critical thinking that involve execution of predetermined steps. These steps – not involving any sort of randomness or human creativity – can be offloaded to machines. When children learn math at early stages, they are first taught the process of computation. They memorize multiplication tables and do long division with pencil and paper. When it’s time, though, for them to move on to more complex areas of mathematics such as algebra, calculus, and trigonometry, these more rudimentary computational processes get offloaded to a calculator.

This process of using external resources as part of mental processes is known as distributed cognition (examples thus far have included offloading only to media and computational devices, although distributed cognition can equally well involve offloading to other people). As with any complex information architecture, the key qualities of a distributed cognition setup are data storage, data access, and data processing. How much information can be stored? Where will it be stored and what does this mean for how easy it is to get to? What sort of manipulation on the data is done once retrieved?

The driving motivation for things like Carr’s ‘deep reading’ or a scholar spending years researching in an area to become expert in it is internalization of data for fast access and abstract processing. A literature scholar writing on a novel needn’t memorize its every word to efficiently analyze it. When she’s engaged in critical thinking on a particular paragraph, certainly she may read that paragraph deeply and try to bring much of it into her working memory cache where she can more easily manipulate its components toward creative and analytic ends. Generally though, the text is encoded in a permanent physical medium (it’s written down), and so when she turns to analyze the next paragraph, there’s no need for her to expend limited mental resources holding the exact data of that previous paragraph in memory.

Computers today are great at storage and computation – they can hold hundreds of gigabytes worth of data with perfect fidelity, indefinitely, and can execute trillions of serial processing tasks every second. The problem, though, is that they’re not good at creative thinking, which means we need to do it, and creative thinking requires fast access to data and computation – you need to be able to dart around among different ideas, to know quickly whether a particular path is even worth exploring. Historically, the time it would take to access data stored externally or launch an external computation process meant that the only way to achieve this sort of creative analysis was to internalize everything.

As access to information and computational processes proliferates, however, and becomes vastly more efficient, this changes. More and more of the lower levels of critical thinking can be offloaded to external, distributed sources – I can check the relevant wiki page when I need some fact about Georgia in the 1850s, I can see what the first Google link is when I type in “current social pressures in Botswana” – leaving us free to roam in the more important, more abstract and creative elements of critical thinking. Not only does information proliferation make this sort of offloading possible, it makes it ever more critical. For as information proliferates, it becomes impossible to internalize it all, and the value of any particular piece of information declines proportionately.

Distributed critical thinking is of course complex, and teaching it is one of the main goals of so-called “New Media Literacy” education programs. Relevant to Carr’s argument, new media literacy involves learning not only how to deploy distributed resources, but also when. Indeed, it will often still make sense to ‘read something deeply’ so as to internalize it, and individuals must practice and retain this skill. But the skill that is equally and ever more important is the ability to determine which content to process and how to do so. To learn how to jump from link to link, finding relevant tidbits that help inform the main argument and add components beside it. How to skip paragraphs entirely when they describe information you already possess. How to execute any number of information-consumption variants besides simply sitting down and reading a thing from point A to point B.

Certainly, not every act of digression is an enlightened demonstration of distributed cognition. Some people may well have become a bit ADD at times – reading a lot and internalizing little, high- or low-level. Each citizen of this century, though, owes it to herself to learn the skill of information negotiation, and we as a society owe it to one another to aid in this process. While Carr is right to call out what is clearly an evolution of literacy, and to flag one of its biggest traps, we must be sure to view it as such -- a call for smart, controlled change, and not merely a blind or nostalgic pining for a literacy of the past.


Ok team. After that flurry of initial activity (i.e., my single blog post), I admittedly did not keep quite as tight a regimen as I would have hoped. Decided it’s time to put some points on the board.

I have a bunch of posts I wrote for class that I figure I’ll start by fixing up a bit and posting here. A couple of them get pretty academic (out there?), but hopefully they’ll be interesting, and it seems like a good way to get some volume going.

As many of you know, I’ve been working hard of late on my distributed production ideas – my grand thesis for the future of everything. I’ll probably start making posts shorter and less polished, but will try to use this as a good brainstorming space for new thoughts as they come in (actual thesis work will also be going up in wiki form eventually).

Anyway, wish me luck sticking to it this time!