Georgia Institute of Technology

A study published January 5 in the early edition of the journal Proceedings of the National Academy of Sciences (PNAS) takes what may be the first comprehensive look at the crowd science trend, finding common threads in seven projects hosted on Zooniverse, now the most popular crowd science platform. The study’s findings regarding the contributions made by thousands of volunteers offer both encouragement and caution, describing the considerable value of donated time while noting the limitations of nonprofessional research assistance.
“We are seeing projects that couldn’t be done before, and we are seeing them done on a massive scale and at a fast speed,” said Henry Sauermann, an associate professor in the Scheller College of Business at the Georgia Institute of Technology. “However, these are not conventional laboratory research projects going online. It’s not a substitution of crowd science for conventional research projects.”
Though a few crowd science projects require technical knowledge from contributors, most expect little more than the ability to follow simple instructions – such as examining images of galaxies to note their shape, or reporting what animals are doing in photos. Assistance from crowd scientists is important because the sheer volume of work involved would otherwise put projects beyond the reach of conventional research teams.
“The key is to translate the complicated science into something that’s easily done by people who don’t need to understand the scientific details,” explained Sauermann, who has been studying the crowd science phenomenon. “The broad idea is to get people involved who have an interest in science, even if it is a fairly shallow interest. Anybody can participate as long as they have a computer and can do the basic tasks required.”
Though crowd science is attracting considerable interest right now, it is not a brand-new idea. For years, ornithologists have relied on amateur bird watchers to count populations of different species and report their locations. What’s new is the general public’s access to masses of scientific images and data, made possible by the broad reach of the Internet and the ubiquity of personal computers.
With support from the Alfred P. Sloan Foundation, Sauermann and co-author Chiara Franzoni from the Politecnico di Milano in Italy studied seven projects hosted on Zooniverse, one of several platforms providing crowd science infrastructure. They found that most volunteers spend relatively little time on the projects they support, with the majority of work done by a small fraction of the volunteers. But even brief involvement adds up when thousands of volunteers pitch in.
Examining the records of the seven projects over a 180-day study period, the researchers followed the activities of 100,386 participants who contributed a total of 129,500 hours of unpaid labor. At the rate normally paid to undergraduate students, that help would have been worth more than $1.5 million. The value of the labor contributed ranged from $22,717 to $654,130 per project.
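As a rough sanity check on those figures (the roughly $12-per-hour undergraduate wage below is an assumption inferred from the totals, not a rate stated in the article):

\[
129{,}500\ \text{hours} \times \$12/\text{hour} \approx \$1.55\ \text{million},
\qquad
\frac{\$1{,}500{,}000}{129{,}500\ \text{hours}} \approx \$11.58/\text{hour}
\]

So the “more than $1.5 million” total implies an hourly valuation of roughly $11 to $12, consistent with typical undergraduate research-assistant pay.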
The contributions received by projects varied dramatically over time, with large spikes in assistance tied to projects’ promotional efforts or news media coverage. While this variability may be problematic for some tasks, Sauermann said, it matters less for others, such as classifying large numbers of archived images.
Though the value of the unpaid labor seems attractive, there are costs involved. Projects must be designed for untrained volunteers, infrastructure must be set up, projects must be promoted, and project leaders need to interact with the community to ensure continued involvement.
“It’s not like simply outsourcing something,” Sauermann said. “It’s a big-time commitment on the part of the scientists to make these things happen. Because of the investment, this makes the most sense for projects that have a large scale, where a lot of outside help is needed.”
Though each project is different, crowd science volunteers usually handle tasks that computers can’t do because they require human judgment. For instance, in one project, volunteers were asked to characterize blurry images of animals. That kind of task is easy for humans, but difficult for computers. In other projects, humans noted unexpected objects that would have been ignored by computers.
What motivates people to share their time for research projects? Sauermann says many crowd science projects parallel hobbies such as astronomy or bird-watching. In other cases, people may contribute because they feel they are helping society.
“It’s a way to match people who are interested in looking at animal pictures or galaxies with people who need someone to look at animal pictures or galaxies,” Sauermann said. “When they get together, all parties benefit, and that’s what makes crowd science so promising.”
The phenomenon also provides an opportunity to build understanding and support for science.
“Many people don’t have a tangible connection to science,” he said. “Crowd science can give people the hands-on experience of science, and therefore a better appreciation of it.”
Hundreds of crowd science projects have been done so far, with astronomy and the life sciences the most popular fields. Sauermann expects “an explosion” of projects in other areas of science as word spreads about the success of the projects and more infrastructure is created to host them. But he cautions that researchers hoping to use volunteers must continue to present interesting opportunities.
“I think we are really at the beginning of something big,” he said. “But it’s going to be difficult to get people to participate in topics that don’t seem interesting or important to society. There may be areas of science that seem boring or unimportant. If asked to help with such projects, people may just prefer to watch television.”
CITATION: Henry Sauermann and Chiara Franzoni, “Crowd Science User Contribution Patterns and Their Implications,” Proceedings of the National Academy of Sciences (2015).