Citizen science is the generation of new information and knowledge through the participation of nonscientists in traditional scientific activities. At its best, it is a collaborative endeavor with benefits for the participants, scientists, and society as a whole. Citizen science is not new: Nonmeteorologists have been gathering weather data for centuries, and amateur ornithologists have recorded data for nearly as long (Crain et al. 2014). However, with the advent of low-cost information and communication technologies, there are new possibilities for expanding participation in data gathering, exchange, and analysis to other fields of environmental science. A number of questions remain about the balance of costs and benefits of including such participation in ongoing research or monitoring programs.

The engagement and training of community members in citizen science programs have been shown to provide a number of benefits to their proponents: researchers, monitoring agencies, and policymakers (Diduck and Sinclair 2002). These include the direct and indirect effects of more-informed and empowered communities and increased support for the land-use and resource decisionmaking processes of governments, companies, and institutions (Haywood and Besley 2014). Recent citizen science projects on hydrology, aquatic biodiversity, and water quality have shown that productive partnerships among scientists, agencies, and the public can increase the spatial and temporal resolution of environmental information (e.g., Buytaert et al. 2012).

But citizen science is not without its limitations. From a data-quality point of view, sampling bias and analytical challenges are key shortcomings of citizen science (Dickinson et al. 2012, Bird et al. 2014). From a program-development point of view, the costs of training and long-term engagement can be a central hurdle for agencies and researchers, yet these remain fundamental elements of a successful program that generates scientifically useful data.

We examine these limitations using data from FreshWater Watch (FWW), a global citizen science program exploring freshwater ecosystem conditions and dynamics. The program's research goals were designed to meet two objectives: (1) a global comparative analysis of freshwater ecosystem dynamics under varying degrees of anthropogenic pressure and (2) a shared platform to support ongoing local research and monitoring programs. Although the individual projects (25 and growing) differ in their specific objectives, they use the same core measurement methodologies, engagement and learning approaches, and data-upload and mapping tools (Castilla et al. 2015). The advantages of using a common platform lie in the comparability of the data (owing to a consistent approach), quality control (multiple laboratories involved across a range of environmental conditions), expanded communication (an international community of citizen scientists), and reduced costs.

To explore the relationship between the costs and benefits of a citizen science program, we compared the time invested in training and engagement by individual researchers with the equivalent time saved in sampling and measurement. The time invested by each project scientist includes training (a 7-hour field-based training day) and post-training online engagement and feedback. The return on investment depends on the number of data sets acquired by each trained team of citizen scientists. Considering the number of data sets acquired (14,000 from May 2013 to April 2016) and the average time required for data acquisition and travel (1 hour), this translates to nearly 28,000 researcher-equivalent hours, assuming a two-person researcher team. The actual time dedicated by the participants is greater, because the number of citizen scientists averages 3.4 per data set (see figure S1 in the supplemental materials). Given that lead scientists have delivered a total of 3000 hours of training activity (420 training days), the return on time invested in training is more than 9 hours of sampling for each hour of training. If the additional time dedicated to feedback and engagement by the lead scientists is included, this becomes 6 hours of sampling per hour invested. Regional differences are evident (figure S1): from a maximum return on time invested of 14 hours in Indonesia, Singapore, and Canada to a minimum of less than 3 hours in Malaysia, Australia, and the United Arab Emirates. The largest numbers of data sets came from the United Kingdom (2544), India (1855), China (1381), and Brazil (1257), where the return on time invested fell in the middle of this range. This variability results from differences in sampling approach (e.g., assigned or self-selected sampling sites) and engagement practices. Studies of the relative influence of different sampling and engagement practices (complexity of measurements, feedback frequency, and training-day conditions) would help optimize citizen science programs.
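
As a simple illustration of this return-on-time-invested arithmetic, the short sketch below reproduces the figures reported above from the numbers given in the text; the variable names and the script itself are illustrative only and are not part of the FWW analysis.

```python
# A minimal sketch of the return-on-time-invested calculation described above.
# All figures are taken from the text; variable names are illustrative only.

datasets = 14_000            # data sets acquired, May 2013 to April 2016
hours_per_dataset = 1.0      # average sampling and travel time per data set
team_size = 2                # researcher-equivalent team assumed per data set

researcher_equivalent_hours = datasets * hours_per_dataset * team_size  # ~28,000 h

training_days = 420
hours_per_training_day = 7.0
training_hours = training_days * hours_per_training_day                 # ~2,940 h

return_on_training = researcher_equivalent_hours / training_hours       # ~9.5 h per h

print(f"Researcher-equivalent sampling hours: {researcher_equivalent_hours:,.0f}")
print(f"Training hours invested: {training_hours:,.0f}")
print(f"Sampling hours gained per training hour: {return_on_training:.1f}")
```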

To explore the scientific limitations of citizen science, we compared the spatial and temporal distribution of FWW-generated data with regular UK Environment Agency (EA) monitoring in four subcatchments of the River Thames basin (table S1 in the supplemental materials) in 2015. A spatial comparison revealed a higher measurement density among the sites selected by citizen scientists, but these sites were more clustered (Kivelä et al. 2014). The extent of the FWW sites was marginally greater, with the two farthest sites being more distant than their EA equivalents. The combined effect of higher density and larger extent indicates that the FWW monitoring network has the potential to deliver robust information, but with some spatial biases that are not present in agency monitoring. Adding FWW sites to the EA network increased the scope and distribution of the sampling, allowing for improved spatial coverage. Temporally, the FWW citizen scientists sampled more frequently in the spring and summer months (figure S2), whereas EA data were acquired with greater regularity throughout the year, in line with expected statutory duties. Combining the FWW data with the EA data increased the frequency of measurements by 50 percent.
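
A minimal sketch of how such a spatial comparison can be framed is given below, assuming site coordinates are available as (x, y) pairs in projected metres; the coordinates, catchment area, and function names are hypothetical and do not reproduce the FWW or EA analysis pipeline.

```python
# Sketch: compare two monitoring networks by site density and spatial extent.
# Coordinates and catchment area below are hypothetical example values.
from itertools import combinations
from math import hypot

def extent(sites):
    """Maximum pairwise distance between sites (the two farthest sites)."""
    return max(hypot(x1 - x2, y1 - y2)
               for (x1, y1), (x2, y2) in combinations(sites, 2))

def density(sites, catchment_area_km2):
    """Number of monitoring sites per square kilometre of subcatchment."""
    return len(sites) / catchment_area_km2

# Hypothetical coordinates (metres) for two small networks in one subcatchment
fww_sites = [(0, 0), (120, 80), (150, 95), (160, 110), (900, 640)]
ea_sites = [(50, 40), (400, 300), (820, 600)]
area_km2 = 25.0

print("FWW extent (m):", extent(fww_sites), " density:", density(fww_sites, area_km2))
print("EA  extent (m):", extent(ea_sites), " density:", density(ea_sites, area_km2))
print("Combined density:", density(fww_sites + ea_sites, area_km2))
```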

Data generated by citizen scientists have the potential to support ecological data gathering and regulatory monitoring efforts. The costs of training and engaging citizen scientists can be worth the investment, in particular when common sampling, training, and data-management methods are used. However, biases in sampling location and frequency are likely to occur and need to be taken into consideration. Additional benefits are more difficult to quantify but are no less important: The FWW participants report an increased awareness of critical freshwater issues and a commitment to personal action. Citizen science is no panacea, but it can provide a cost-effective approach to improving our understanding of the state of the environment and therefore increase the knowledge base on which decisionmakers can act.

Supplemental material

The supplemental material is available online at http://bioscience.oxfordjournals.org/lookup/suppl/doi:10.1093/biosci/biw089/-/DC1.

We thank HSBC Bank for its financial support of FreshWater Watch under the HSBC Water Programme. We sincerely acknowledge the efforts of the citizen scientists who are active in the project, providing enthusiasm and fundamental data gathering, and of our partner scientists, who have worked closely with us for nearly 3 years.

References cited

Bird TJ, Bates AE, Lefcheck JS, Hill NA, Thomson RJ, Edgar GJ, Stuart-Smith RD, Wotherspoon S, Krkosek M, Stuart-Smith JF, Pecl GT. 2014. Statistical solutions for error and bias in global citizen science datasets. Biological Conservation 173: 144–154.

Buytaert W, Baez S, Bustamante M, Dewulf A. 2012. Web-based environmental simulation: Bridging the gap between scientific modeling and decision-making. Environmental Science and Technology 46: 1971–1976.

Castilla EP, Cunha DGF, Lee FWF, Loiselle S, Ho KC, Hall C. 2015. Quantification of phytoplankton bloom dynamics by citizen scientists in urban and peri-urban environments. Environmental Monitoring and Assessment 187 (art. 690).

Crain R, Cooper C, Dickinson JL. 2014. Citizen science: A tool for integrating studies of human and natural systems. Annual Review of Environment and Resources 39: 641–665.

Dickinson JL, Shirk J, Bonter D, Bonney R, Crain RL, Martin J, Phillips T, Purcell K. 2012. The current state of citizen science as a tool for ecological research and public engagement. Frontiers in Ecology and the Environment 10: 291–297.

Diduck A, Sinclair AJ. 2002. Public involvement in environmental assessment: The case of the nonparticipant. Environmental Management 29: 578–588.

Haywood BK, Besley JC. 2014. Education, outreach, and inclusive engagement: Towards integrated indicators of successful program outcomes in participatory science. Public Understanding of Science 23: 92–106.

Kivelä M, Arnaud-Haond S, Saramäki J. 2014. EDENetworks: A user-friendly software to build and analyse networks in biogeography, ecology, and population genetics. Molecular Ecology Resources 15: 117–122.