Final thoughts on choosing and using citizen science (continued)
and feedback. Although web developers can set up bespoke databases, there are many examples of mature technologies for databases and for visualisation (Roy et al. 2012). Broadly these can be divided into: 1) bespoke technologies that are designed for a specific purpose and audience; 2) adaptable template-type platforms where the project leader can modify the content within the bounds of the fixed parameters of the platform; and 3) technologies that have aspects of both. See the References and resources section for more information on these.
We strongly recommend that data are stored in a way that makes them easy to access and easy to share. Open-source tools can often be used to reduce costs, though we recommend the use of mature and well-supported technologies.
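As a minimal sketch of the "easy to access and share" principle, the snippet below stores records with a mature open-source tool (SQLite, available in Python's standard library) and exports them as plain CSV that any collaborator can open. The table and field names are purely illustrative, not taken from the guide.

```python
import csv
import sqlite3

# Store records in SQLite: mature, open-source, zero-configuration.
conn = sqlite3.connect(":memory:")  # use a file path in a real project
conn.execute(
    "CREATE TABLE records (species TEXT, count INTEGER, recorded_on TEXT)"
)
conn.executemany(
    "INSERT INTO records VALUES (?, ?, ?)",
    [("7-spot ladybird", 3, "2012-06-01"),
     ("Harlequin ladybird", 1, "2012-06-02")],
)

# Export to CSV, a plain-text format that makes sharing straightforward.
with open("records.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["species", "count", "recorded_on"])
    writer.writerows(conn.execute("SELECT * FROM records"))
```

Keeping the export step simple like this means the data remain usable even if the original database technology is later retired.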
Data protection needs to be considered when storing personal data online. It may be possible to avoid the issue by not collecting any personal information, but this limits the potential for communication with people and for personalised feedback. Advice must be sought to make sure that any online data storage in the UK complies with the Data Protection Act.
Validation, quality assurance and verification
One of the key aspects of data collected by citizen science projects is that they need to be 'of known quality'. 'Known quality' can be achieved either by guaranteeing accuracy (e.g. through verifying photographs) or by quantifying the degree of error or bias.
One of the most cost-effective ways of ensuring high data quality is to thoroughly test your protocols (Tweddle et al. 2012). Through this process you can quantify errors in measurement/identification and improve protocols where necessary. For some projects, records are only accepted if there is accompanying information (e.g. a photograph), especially for unusual records. This conservative approach may result in the discarding of genuinely interesting data points, so should be undertaken with care.
For other data, quality will be affected by random error and bias. Random error will increase the 'noise' in the data (for example, inaccuracy in making counts), thus making it more difficult to accurately discern signals from the data. However, most error is likely to be some form of bias (a systematic error) and this can vary due to many different factors, including people's experience. This bias needs to be quantified and explicitly accounted for in the analysis. One often overlooked source of error is the lack of a record. People are most likely to record the presence of something rather than its absence, or to record something out-of-the-ordinary, thus causing systematic bias in the data.
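The distinction between random error and bias can be quantified when a subset of records is checked against an expert's reference counts. The sketch below uses invented numbers purely for illustration: the mean difference from the reference estimates the bias, and the spread of differences around that mean estimates the random error.

```python
import statistics

# Illustrative paired data: volunteer counts vs. an expert's reference
# counts of the same sample units (invented numbers, for demonstration).
volunteer = [12, 8, 15, 10, 7, 14]
expert = [10, 9, 12, 10, 8, 11]

errors = [v - e for v, e in zip(volunteer, expert)]

# Bias (systematic error): the mean difference from the reference counts.
bias = statistics.mean(errors)

# Random error ('noise'): the spread of differences around that bias.
random_error = statistics.stdev(errors)

print(f"bias = {bias:+.2f} individuals per count")
print(f"random error (SD) = {random_error:.2f}")
```

A non-zero bias like this can be explicitly corrected for in the analysis, whereas random error mainly widens confidence intervals; this is one concrete way of making data quality 'known'.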
Communication
Communicating with the target audience is clearly a vital aspect of citizen science. Communication via the mass media is appealing for many organisers of citizen science, but it is risky to rely on journalists to promote a project. It is wise to explore alternative, more stable, routes of communication (e.g. newsletters of interest groups) in addition to the mass media.

Social media (e.g. Twitter and Facebook) have opened up new opportunities for promoting projects and communicating with participants, and news can spread quickly by 'word-of-mouth'. Workshops and training sessions can provide invaluable face-to-face contact with project participants. Varied approaches to communication will ensure projects are promoted in a way that meets the requirements of the diverse range of potential participants.