We're all looking at the stars: citizen science projects at ZooCon13

Last Saturday I escaped my desk to head to the Physics department at the University of Oxford for ZooCon13, and to be awed by what we're learning about space (and more terrestrial subjects) through the citizen science projects run by the Zooniverse. All the usual caveats about notes from events apply – in particular, assume any errors are mine and that everyone was much more intelligent and articulate than my notes make them sound. These notes are partly written for people in cultural heritage and the humanities who are interested in the design of crowdsourcing projects, and while I enjoyed the scientific presentations I am not even going to attempt to represent them! Chris Lintott live-blogged some of the talks on the day, so check out 'Live from ZooCon' for more. If you're familiar with citizen science you may well know a lot of these examples already – and if you're not, you can't really go wrong by looking at Zooniverse projects.

Aprajita Verma kicked off with SpaceWarps and 'Crowd-sourcing the Discovery of Gravitational Lenses with Citizen Scientists'. She explained the different ways gravitational lenses show up in astronomical images, and that 'strong gravitational lensing research is traditionally very labour-intensive' – computer algorithms generate lots of false positives, so you need people to help. SpaceWarps includes some simulated lenses (i.e. images of the sky with lenses added), mostly as a teaching tool (to provide more examples and increase familiarity with what lenses can look like) but also to make the task more interesting for participants. The SpaceWarps interface lets you know when you've missed a (presumably simulated) lens as well as noting the lenses you've marked. They had 2 million image classifications in the first week, and 8,500 citizen scientists have participated so far, 40% of whom have used 'Talk', the discussion feature. As discussed in their post 'What happens to your markers? A look inside the Space Warps Analysis Pipeline', they've analysed the results so far by placing markers on scales from astute to obtuse and from pessimistic to optimistic – it turns out most people are astute. Each image is reviewed by ten people, which gives them confidence in the results.
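The linked post explains the analysis far better than I could, but as a rough illustration for crowdsourcing-minded readers, here's a toy Python sketch of how per-volunteer skill, estimated from the simulated training images, might be used to weight each marker's contribution to an image's score. This is definitely not the actual Space Warps pipeline – every name and number below is made up.

```python
# Toy sketch of weighting volunteer markers by skill, loosely inspired by
# the Space Warps Analysis Pipeline post mentioned above. NOT the real
# Space Warps code; names and numbers are purely illustrative.

def skill_from_training(hits, misses, rejections, false_alarms):
    """Estimate P(marks 'lens' | image has a lens) and
    P(marks 'no lens' | image has no lens) from a volunteer's record
    on the simulated training images (Laplace-smoothed)."""
    p_lens = (hits + 1) / (hits + misses + 2)
    p_not_lens = (rejections + 1) / (rejections + false_alarms + 2)
    return p_lens, p_not_lens

def update(prob, said_lens, p_lens, p_not_lens):
    """Bayesian update of an image's probability of containing a lens
    after one volunteer's classification."""
    like_lens = p_lens if said_lens else 1 - p_lens
    like_not = (1 - p_not_lens) if said_lens else p_not_lens
    return like_lens * prob / (like_lens * prob + like_not * (1 - prob))

# An 'astute' volunteer is good at both spotting and rejecting lenses;
# an 'optimistic' one marks almost everything as a lens.
astute = skill_from_training(hits=18, misses=2, rejections=45, false_alarms=5)
optimistic = skill_from_training(hits=19, misses=1, rejections=10, false_alarms=40)

prob = 0.01  # prior: real lenses are rare
for skills, said_lens in [(astute, True)] * 6 + [(optimistic, True)] * 4:
    prob = update(prob, said_lens, *skills)
print(f"Posterior probability this image contains a lens: {prob:.3f}")
```

Weighting along these lines is why ten classifications per image can give real confidence: a handful of astute markers move the probability much further than any number of optimistic ones.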

Karen Masters talked about 'Cosmic Evolution in the Galaxy Zoo', taking us back to the first Galaxy Zoo project's hopes of attracting 30,000 volunteers and contrasting that with subsequent peer-reviewed papers that thanked 85,000, 160,000 or 200,000 volunteers. The project launched in 2007 (before the Zooniverse itself) to look at spiral vs elliptical galaxies, and it has all grown from there. The project has found rare objects, most famously the pea galaxies, and as further proof that the Zooniverse is doing 'real science online', the team have produced 36 peer-reviewed papers, some with 100+ citations. At least 50 more papers have been produced by others using their data.

Phil Brohan discussed 'New Users for Old Weather'. The Old Weather project is using data from historic ships' logs to help answer the question 'is this climate change or just weather?'. Some data was already known, but there's a 'metaphorical fog' caused by missing observations from the past. Since the BBC won't let him put a satellite in a Tardis, they've been creative about finding other sources to help lift 'the fog of ignorance'. This project has long fascinated me because it started off all about science – in Phil's words, 'when we started all this, I was only thinking about the weather' – but ended up being about history as well: 'these documents are intrinsically interesting'. He learnt what else was interesting about the logs from project participants, who discovered the stories of people, disasters and strange events that lay within them. The third thing the project has generated (after weather and history) is 'a lot of experts'. One example he gave was evidence of the 1918-19 Spanish flu epidemic on board ship, which was investigated after forum posts about it. There's still a lot to do – more logs to come, possibly including French and Dutch ones – and ideally things would speed up 'by a factor of ten'.

In Brooke Simmons' talk on 'Future plans for Galaxy Zoo', she raised the eternal issue of what to call participants in crowdsourcing: 'just call everyone collaborators'. 'Citizen scientists' makes a distinction between paid and unpaid scientists, as does 'volunteers'. She wants to help people do their own science, and the team are working on ways to make that easier than downloading data and learning how to use more complicated tools. As an example, she talked about people collecting 'galaxies with small bulges' and analysing the differences in bulges (like a souped-up Galaxy Zoo Navigator?). She also talked about Zoo Teach, with resources for learning at all ages.

After the break we learnt about 'The Planet 4 Invasion' – the climate and seasons of Mars – from Meg Schwamb, and about Solar Stormwatch in 'Only you can save planet Earth!' from Chris Davis, who was also presenting research by his student Kim Tucker-Wood (sp?). Who knew that the solar wind could take the tail off a comet?!

Next up was Chris Lintott on 'Planet Hunting with and without Kepler'. Science communication advice says 'don't show people graphs', and since Planet Hunters is all about looking at graphs for fun, he thought no-one would want to take part. However, the response has surprised him, and 'it turns out that stars are actually quite interesting as well'. In another example of participants going above and beyond the original scope of a project, Planet Hunters watched a talk streamed online on 'heartbeat binaries', then found 30 of them in archives and their own records and posted them on the forum. Now a group of Planet Hunters is working with the Kepler team to follow them up. (As an aside, he showed a screenshot of a forthcoming journal paper – the journal couldn't accept the idea that you could be a Planet Hunter without being part of an academic team, so they're listed as the Department of Astronomy at Yale.)

The final speaker was Rob Simpson on 'The Future of the Zooniverse'. To put things in context, he said the human race spends the equivalent of 16 years playing Angry Birds every day; people spend 2 months every day on the Zooniverse. In the past year, the human race spent 52 years of effort on the Zooniverse's 15 live projects (they've had 23 projects in total). The Andromeda project got through all its data in 22 days – other projects take longer, but still attract dedicated people. In the Zooniverse's immediate future are 'tools for (citizen) scientists' – adding the ability to do analysis in the browser, 'because people have a habit of finding things, just by being given access to the data'. They're also working on 'Letters' – public, citable versions of what might otherwise be detailed forum posts; as a form of publication, it puts findings 'in the domain'. They're helping people communicate with each other and embracing their 'machine overlords', using Galaxy Zoo classifications as training data for machine learning. As computers get more powerful, the division of work between machines and people will change, perhaps leaving the beautiful, tricky or complex bits for humans. [Update, June 29, 2013: Rob's posted about his talk on the Zooniverse blog, '52 Years of Human Effort', and corrected his original figure of 35 years to 52 years of human effort.]
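Rob didn't go into implementation details, but the 'machine overlords' idea is easy to sketch: crowd consensus labels become training data for a classifier, and the machine's uncertain cases get routed back to people. Here's a minimal, entirely hypothetical Python example – made-up features and labels, scikit-learn as an assumed tool, nothing below is Zooniverse code.

```python
# Minimal sketch (not Zooniverse code) of using crowd consensus labels to
# train a classifier, so a machine can take the easy cases and leave the
# tricky ones for humans. Features and labels here are synthetic stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 1000
features = rng.normal(size=(n, 5))          # stand-in for measured image features
labels = (features[:, 0] > 0).astype(int)   # stand-in for crowd consensus: 1 = spiral, 0 = elliptical

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.2, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

# Route confident predictions to the machine, uncertain ones back to people.
probs = model.predict_proba(X_test)[:, 1]
uncertain = (probs > 0.3) & (probs < 0.7)
print(f"Accuracy on held-out galaxies: {model.score(X_test, y_test):.2f}")
print(f"Fraction flagged for human review: {uncertain.mean():.2f}")
```

The interesting design question is where to set that uncertainty band: too narrow and the tricky, beautiful objects people are good at spotting never reach them; too wide and the machine isn't saving anyone time.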

At one point a speaker asked who in the room was a moderator on a Zooniverse project, and nearly everyone put their hand up. I felt a bit like giving them a round of applause because their hard work is behind the success of many projects. They're also a lovely, friendly bunch, as I discovered in the pub afterwards.

Conversations in the pub also reminded me of the flipside of people learning so much through these projects – sometimes people lose interest in the original task as their skills and knowledge grow, and it can be tricky to find time to contribute outside of moderating.  After a comment by Chris at another event I've been thinking about how you might match people to crowdsourcing projects or tasks – sometimes it might be about finding something that suits their love of the topic, or that matches the complexity or type of task they've previously enjoyed, or finding another unusual skill to learn, or perhaps building really solid stepping stones from their current tasks to more complex ones. But it's tricky to know what someone likes – I quite like transcribing text on sites like Trove or Notes from Nature, but I didn't like it much on Old Weather. And my own preferences change – I didn't think much of Ancient Lives the first time I saw it, but on another occasion I ended up getting completely absorbed in the task. Helping people find the right task and project is also a design issue for projects that have built an 'ecosystem' of parts that contribute to a larger programme, as discussed in 'Using crowdsourcing to manage crowdsourcing' in Frequently Asked Questions about crowdsourcing in cultural heritage and 'A suite of museum metadata games?' in Playing with Difficult Objects – Game Designs to Improve Museum Collections.

An event like ZooCon showed how much citizen science is leading the way – there are lots of useful lessons for humanities and cultural heritage crowdsourcing. If you've read this thinking 'I'd love to try it for my data, but x is a problem', try talking to someone about it – often there are computational techniques for solving similar problems, and if it's not already solved it might be interesting enough that people want to get involved and work with you on it.
