Why You Should Make Crowdsourced Tasks Meaningful

One of the ideas at the heart of effective crowdsourcing is understanding the non-financial incentives that attract and inspire workers, particularly the most talented ones. Research from traditional workplaces shows that providing context for a task, so that people understand why it is being done, is a strong motivator. That is even more the case with crowdsourcing, where workers often have no idea how their work will be used. Now some very interesting research gives insight into how meaning affects workers on the crowdsourcing site Mechanical Turk.

Dana Chandler, who studies the economics of crowdsourcing, pointed me to a great study he has done, written up in the paper Breaking Monotony with Meaning: Motivation in Crowdsourcing Markets. The CrowdFlower blog has an excellent summary of the research.

In short, the project used Mechanical Turk to recruit pools of Indian and US workers. Some workers were simply told to identify parts of images with particular characteristics. Others were told they were identifying cancerous tumor cells. The rate of pay decreased with each successive task.

The chart below shows one of the key findings:

In short, giving meaning and context to the task increased the output of US workers, but did not have a significant impact on Indian workers. However, the quality of the work was the same in all cases.

While it was a very well-designed study, and it confirmed the general perception that developing-country workers are more focused on the financial return on their effort, I still believe that meaning is important for any worker, particularly for more complex tasks. The lesson: give as much context as possible to the tasks you crowdsource.

See the blog post or the full paper for more details of this very interesting study.