Read all about it! Crowd makes the news

March 10th, 2011

[Image: Newspaper and tea by Matt Callow @ Flickr]

“Researchers have always wondered what made hit songs, books and movies, just that, hits. What they’ve found is that quality had only little to do with it.”

At first glance, these 27 words look fairly ordinary. An interesting idea, but hardly revolutionary journalism. Even so, this innocent-looking sentence has caused quite a stir in the blogosphere. Why? Because it was researched, written and edited entirely by workers on Mechanical Turk.

Word processors
Here’s the scoop. Two journalists, Jim Giles and MacGregor Campbell, have begun an experiment in collaboration with researchers from Carnegie Mellon University (“collaboration” as in the researchers do the work and the journalists blog about it). The aim is to
“try and create an automated system for producing quality journalism using Mechanical Turk’s army of untrained workers”.

In practice, this means seeing if Turkers can produce a coherent 500-word scientific article – like something you’d find in Wired or New Scientist.

The experiment works like this: each part of the article-writing process (including researching, fact checking and writing) is divided into microtasks. Each task is completed numerous times by different workers. Other workers then vote on the best versions. The idea is that with enough built-in redundancy the crowd will act as a self-editing journalistic machine. The whole process is coordinated by an algorithm (or “robot boss”) that issues calls to workers in the same way a computer program would call functions.
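To make the “robot boss” idea a little more concrete, here is a minimal sketch in Python of how such a coordinator might work. Everything in it is hypothetical: the post_task and post_vote helpers stand in for whatever the real Mechanical Turk pipeline does, and the redundancy and voters parameters are made-up numbers. The point is just the pattern of issuing redundant microtasks and letting a second round of workers vote on the best result.

```python
import random
from collections import Counter

# Hypothetical stand-ins for calls to a real crowdsourcing platform.
# In practice these would post tasks to Mechanical Turk and fetch results.
def post_task(instructions, redundancy):
    """Pretend to send the same microtask to `redundancy` different workers."""
    return [f"draft of '{instructions}' by worker {i}" for i in range(redundancy)]

def post_vote(candidates, voters):
    """Pretend to ask `voters` workers to pick the best candidate."""
    votes = Counter(random.choice(candidates) for _ in range(voters))
    return votes.most_common(1)[0][0]

def robot_boss(article_plan, redundancy=5, voters=7):
    """Coordinate the pipeline: each step is done several times, then voted on."""
    results = {}
    for step in article_plan:                      # e.g. research, fact-check, write
        drafts = post_task(step, redundancy)       # redundant submissions
        results[step] = post_vote(drafts, voters)  # the crowd picks the best one
    return results

if __name__ == "__main__":
    plan = ["research the paper", "summarise key findings", "write opening sentence"]
    for step, winner in robot_boss(plan).items():
        print(f"{step}: {winner}")
```

The design choice worth noticing is that no single worker ever sees the whole article; the coordinator owns the plan, and quality control comes entirely from redundancy plus voting.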

Pressing problems
With only a single sentence produced so far, it’s too early to tell if journalists will soon be handing their press passes over to the crowd. Jim Giles himself has admitted that he’d be surprised if the experiment worked perfectly the first time.

The assembly-line style of crowdsourcing certainly sounds exciting, but it’s easy to spot some potential “functionality issues”. If an error is made – e.g. a worker enters a wrong name or date – will it be replicated throughout the writing process? Can readers trust crowd researchers to properly check facts? How does an article written by dozens of different people maintain a consistent style and tone (especially if the authors come from several different countries)?

Even if everything goes exactly to plan, and the crowd produces 500 words of totally flawless copy, I doubt that it will seriously affect professional journalism. Summing up a science paper is one thing, but interviewing witnesses and experts, gathering evidence, and then writing a well-reasoned, coherent, reliable article is quite another (for the sake of my argument, let’s assume this is actually what journalists are paid to do, as opposed to just following Charlie Sheen around with a microphone).

The future of churnalism?
There are some areas, however, where microtasked journalism could have a real impact. In recent years, content farms have become an online phenomenon. Controversial but incredibly successful, content farms specialize in churning out low-cost, search-engine-optimized articles. Currently, major players like Demand Media employ freelance writers to write the articles. Microworkers may be able to do the same job faster, cheaper and better (whether this would mean better quality content or just more spam results on Google, who knows).

Unless you happen to be Arianna Huffington (who sold the Huffington Post to AOL for a cool $315 million last month) the world of online content can be pretty tough. Reporters, freelancers, bloggers and tweeters are all in fierce competition for readers. Depending on how good the next 473 words turn out to be, things might be about to get even more crowded.


